Sample records for MACCS computer code

  1. Input-output model for MACCS nuclear accident impacts estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Outkin, Alexander V.; Bixler, Nathan E.; Vargas, Vanessa N

    Since the original economic model for MACCS was developed, better-quality economic data (as well as the tools to gather and process them) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output-based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.
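
    The core of an Input-Output loss estimate of this kind can be shown in a few lines. The sketch below is a toy three-sector Leontief calculation with invented coefficients, not the REAcct implementation: direct output losses in the disrupted region are propagated through inter-industry demand via the Leontief inverse and then converted to GDP (value-added) losses.

    ```python
    # Hypothetical sketch of a Leontief input-output GDP-loss estimate;
    # all coefficients and losses below are invented for illustration.
    import numpy as np

    # Technical coefficients A[i, j]: dollars of input from sector i
    # needed per dollar of output from sector j (toy 3-sector economy).
    A = np.array([[0.10, 0.20, 0.05],
                  [0.15, 0.10, 0.10],
                  [0.05, 0.25, 0.15]])

    # Direct output lost in the disrupted region, by sector ($M/day).
    direct_loss = np.array([2.0, 5.0, 1.5])

    # The Leontief inverse propagates direct losses through demand chains.
    total_loss = np.linalg.solve(np.eye(3) - A, direct_loss)

    # Convert total output loss to GDP (value-added) loss per sector.
    value_added_ratio = np.array([0.55, 0.40, 0.60])
    gdp_loss_per_day = value_added_ratio @ total_loss

    disruption_days = 120  # assumed duration of business disruption
    print(f"Estimated GDP loss: ${gdp_loss_per_day * disruption_days:.1f}M")
    ```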

  2. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.
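
    To make the elicitation-and-propagation idea concrete, the sketch below aggregates hypothetical 5th/50th/95th percentile judgments from three experts with equal weights and pushes the pooled samples through a toy consequence model. The quantiles, the lognormal fit, and the model are all invented for illustration and are not the study's elicited values or aggregation scheme.

    ```python
    # Minimal sketch: equal-weight pooling of expert quantile judgments
    # followed by Monte Carlo propagation through a toy risk model.
    import numpy as np

    rng = np.random.default_rng(0)

    # Each row: one expert's 5th/50th/95th percentiles for an uncertain
    # parameter (say, a dose-response slope). All values are invented.
    expert_quantiles = np.array([[0.2, 1.0, 4.0],
                                 [0.5, 1.5, 6.0],
                                 [0.1, 0.8, 3.0]])

    def sample_expert(q05, q50, q95, n):
        """Sample a lognormal fitted to an expert's 5/50/95 quantiles."""
        mu = np.log(q50)
        sigma = (np.log(q95) - np.log(q05)) / (2 * 1.645)
        return rng.lognormal(mu, sigma, n)

    # Equal weights: draw the same number of samples from each expert.
    n = 10_000
    pooled = np.concatenate([sample_expert(*q, n) for q in expert_quantiles])

    # Propagate through a (toy) consequence model: effect = slope * dose.
    dose = 0.05
    effects = pooled * dose
    print("Aggregated 5/50/95:", np.percentile(effects, [5, 50, 95]))
    ```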

  3. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  4. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  5. Evaluation of severe accident risks: Quantification of major input parameters: MACCS (MELCOR Accident Consequence Code System) input

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprung, J.L.; Jow, H-N; Rollstin, J.A.

    1990-12-01

    Estimation of offsite accident consequences is the customary final step in a probabilistic assessment of the risks of severe nuclear reactor accidents. Recently, the Nuclear Regulatory Commission reassessed the risks of severe accidents at five US power reactors (NUREG-1150). Offsite accident consequences for NUREG-1150 source terms were estimated using the MELCOR Accident Consequence Code System (MACCS). Before these calculations were performed, most MACCS input parameters were reviewed, and for each parameter reviewed, a best-estimate value was recommended. This report presents the results of these reviews. Specifically, recommended values and the basis for their selection are presented for MACCS atmospheric and biospheric transport, emergency response, food pathway, and economic input parameters. Dose conversion factors and health effect parameters are not reviewed in this report. 134 refs., 15 figs., 110 tabs.

  6. SYNTAX score based on coronary computed tomography angiography may have a prognostic value in patients with complex coronary artery disease: An observational study from a retrospective cohort.

    PubMed

    Suh, Young Joo; Han, Kyunghwa; Chang, Suyon; Kim, Jin Young; Im, Dong Jin; Hong, Yoo Jin; Lee, Hye-Jeong; Hur, Jin; Kim, Young Jin; Choi, Byoung Wook

    2017-09-01

    The SYNergy between percutaneous coronary intervention with TAXus and cardiac surgery (SYNTAX) score is an invasive coronary angiography (ICA)-based score for quantifying the complexity of coronary artery disease (CAD). Although the SYNTAX score was originally developed based on ICA, recent publications have reported that coronary computed tomography angiography (CCTA) is a feasible modality for the estimation of the SYNTAX score. The aim of our study was to investigate the prognostic value of the SYNTAX score, based on CCTA, for the prediction of major adverse cardiac and cerebrovascular events (MACCEs) in patients with complex CAD. The current study was approved by the institutional review board of our institution, and informed consent was waived for this retrospective cohort study. We included 251 patients (173 men, mean age 66.0 ± 9.29 years) who had complex CAD [3-vessel disease or left main (LM) disease] on CCTA. The SYNTAX score was obtained on the basis of CCTA. Follow-up clinical outcome data regarding composite MACCEs were also obtained. Cox proportional hazards models were developed to predict the risk of MACCEs based on clinical variables, treatment, and computed tomography (CT)-SYNTAX scores. During the median follow-up period of 1517 days, there were 48 MACCEs. Univariate Cox hazards models demonstrated that MACCEs were associated with advanced age, low body mass index (BMI), and dyslipidemia (P < .2). In patients with LM disease, MACCEs were associated with a higher SYNTAX score. In patients with a CT-SYNTAX score ≥23, patients who underwent coronary artery bypass graft surgery (CABG) and percutaneous coronary intervention had significantly lower hazard ratios than patients who were treated with medication alone. In the multivariate Cox hazards model, advanced age, low BMI, and a higher SYNTAX score showed increased hazard ratios for MACCE, while treatment with CABG showed a lower hazard ratio (P < .2). On the basis of our results, the CT-SYNTAX score can be a useful method for noninvasively predicting MACCEs in patients with complex CAD, especially in patients with LM disease.
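
    A Cox proportional-hazards analysis of this kind can be reproduced in outline with the lifelines package. The tiny dataframe below, including its columns (followup_days, macce, ct_syntax, age), is a hypothetical stand-in, so this shows only the mechanics, not the study's data or results.

    ```python
    # Illustrative Cox proportional-hazards fit with lifelines on
    # invented data; the penalizer stabilizes the toy-sized sample.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "followup_days": [1520, 900, 1700, 400, 1300, 1100, 600, 1650, 1450, 700],
        "macce":         [0,    1,   0,    1,   0,    1,    1,   0,    0,    1],
        "ct_syntax":     [18,   34,  22,   41,  33,   29,   17,  36,   20,   45],
        "age":           [61,   74,  66,   79,  58,   70,   59,  72,   64,   68],
    })

    cph = CoxPHFitter(penalizer=0.1)   # ridge penalty for the small sample
    cph.fit(df, duration_col="followup_days", event_col="macce")
    cph.print_summary()                # exp(coef) column gives hazard ratios
    ```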

  7. Prognostic value of MACC1 and proficient mismatch repair status for recurrence risk prediction in stage II colon cancer patients: the BIOGRID studies.

    PubMed

    Rohr, U-P; Herrmann, P; Ilm, K; Zhang, H; Lohmann, S; Reiser, A; Muranyi, A; Smith, J; Burock, S; Osterland, M; Leith, K; Singh, S; Brunhoeber, P; Bowermaster, R; Tie, J; Christie, M; Wong, H-L; Waring, P; Shanmugam, K; Gibbs, P; Stein, U

    2017-08-01

    We assessed the novel MACC1 gene to further stratify stage II colon cancer patients with proficient mismatch repair (pMMR). Four cohorts with 596 patients were analyzed: Charité 1 discovery cohort was assayed for MACC1 mRNA expression and MMR in cryo-preserved tumors. Charité 2 comparison cohort was used to translate MACC1 qRT-PCR analyses to FFPE samples. In the BIOGRID 1 training cohort MACC1 mRNA levels were related to MACC1 protein levels from immunohistochemistry in FFPE sections; also analyzed for MMR. Chemotherapy-naïve pMMR patients were stratified by MACC1 mRNA and protein expression to establish risk groups based on recurrence-free survival (RFS). Risk stratification from BIOGRID 1 was confirmed in the BIOGRID 2 validation cohort. Pooled BIOGRID datasets produced a best effect-size estimate. In BIOGRID 1, using qRT-PCR and immunohistochemistry for MACC1 detection, pMMR/MACC1-low patients had a lower recurrence probability versus pMMR/MACC1-high patients (5-year RFS of 92% and 67% versus 100% and 68%, respectively). In BIOGRID 2, longer RFS was confirmed for pMMR/MACC1-low versus pMMR/MACC1-high patients (5-year RFS of 100% versus 90%, respectively). In the pooled dataset, 6.5% of patients were pMMR/MACC1-low with no disease recurrence, resulting in a 17% higher 5-year RFS [95% confidence interval (CI) (12.6%-21.3%)] versus pMMR/MACC1-high patients (P = 0.037). Outcomes were similar for pMMR/MACC1-low and deficient MMR (dMMR) patients (5-year RFS of 100% and 96%, respectively). MACC1 expression stratifies colon cancer patients with unfavorable pMMR status. Stage II colon cancer patients with pMMR/MACC1-low tumors have a similar favorable prognosis to those with dMMR with potential implications for the role of adjuvant therapy.

  8. Statin and rottlerin small-molecule inhibitors restrict colon cancer progression and metastasis via MACC1.

    PubMed

    Juneja, Manisha; Kobelt, Dennis; Walther, Wolfgang; Voss, Cynthia; Smith, Janice; Specker, Edgar; Neuenschwander, Martin; Gohlke, Björn-Oliver; Dahlmann, Mathias; Radetzki, Silke; Preissner, Robert; von Kries, Jens Peter; Schlag, Peter Michael; Stein, Ulrike

    2017-06-01

    MACC1 (Metastasis Associated in Colon Cancer 1) is a key driver and prognostic biomarker for cancer progression and metastasis in a large variety of solid tumor types, particularly colorectal cancer (CRC). However, no MACC1 inhibitors have been identified yet. Therefore, we aimed to target MACC1 expression using a luciferase reporter-based high-throughput screening with the ChemBioNet library of more than 30,000 compounds. The small molecules lovastatin and rottlerin emerged as the most potent MACC1 transcriptional inhibitors. They remarkably inhibited MACC1 promoter activity and expression, resulting in reduced cell motility. Lovastatin impaired the binding of the transcription factors c-Jun and Sp1 to the MACC1 promoter, thereby inhibiting MACC1 transcription. Most importantly, in CRC-xenografted mice, lovastatin and rottlerin restricted MACC1 expression and liver metastasis. This is, to the best of our knowledge, the first identification of inhibitors restricting cancer progression and metastasis via the novel target MACC1. This drug repositioning might be of therapeutic value for CRC patients.

  9. ChemoPy: freely available python package for computational biology and chemoinformatics.

    PubMed

    Cao, Dong-Sheng; Xu, Qing-Song; Hu, Qian-Nan; Liang, Yi-Zeng

    2013-04-15

    Molecular representation for small molecules has been routinely used in QSAR/SAR, virtual screening, database search, ranking, drug ADME/T prediction, and other drug discovery processes. To facilitate extensive studies of drug molecules, we developed a freely available, open-source Python package called chemoinformatics in python (ChemoPy) for calculating the commonly used structural and physicochemical features. It computes 16 drug feature groups composed of 19 descriptors that include 1135 descriptor values. In addition, it provides seven types of molecular fingerprint systems for drug molecules, including topological fingerprints, electro-topological state (E-state) fingerprints, MACCS keys, FP4 keys, atom pairs fingerprints, topological torsion fingerprints, and Morgan/circular fingerprints. By applying the semi-empirical quantum chemistry program MOPAC, ChemoPy can also compute a large number of 3D molecular descriptors conveniently. The Python package ChemoPy is freely available via http://code.google.com/p/pychem/downloads/list, and it runs on Linux and MS-Windows. Supplementary data are available at Bioinformatics online.
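
    The MACCS keys mentioned here are a standard 166-bit structural fingerprint. As a minimal sketch, they can be computed with RDKit (used in place of ChemoPy, whose own API is not shown in the record):

    ```python
    # Compute and compare MACCS keys for two molecules using RDKit.
    from rdkit import Chem, DataStructs
    from rdkit.Chem import MACCSkeys

    aspirin = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")
    caffeine = Chem.MolFromSmiles("Cn1cnc2c1c(=O)n(C)c(=O)n2C")

    fp1 = MACCSkeys.GenMACCSKeys(aspirin)    # 167-bit ExplicitBitVect
    fp2 = MACCSkeys.GenMACCSKeys(caffeine)

    print("bits set:", fp1.GetNumOnBits(), fp2.GetNumOnBits())
    print("Tanimoto similarity:", DataStructs.TanimotoSimilarity(fp1, fp2))
    ```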

  10. MACC1 - a novel target for solid cancers.

    PubMed

    Stein, Ulrike

    2013-09-01

    The metastatic dissemination of primary tumors is directly linked to patient survival in many tumor entities. The previously undescribed gene metastasis-associated in colon cancer 1 (MACC1) was discovered by genome-wide analyses in colorectal cancer (CRC) tissues. MACC1 is a tumor stage-independent predictor for CRC metastasis linked to metastasis-free survival. In this review, the discovery of MACC1 is briefly presented, followed by the broad confirmation of these data supporting MACC1 as a remarkable new biomarker for disease prognosis and prediction of therapy response in CRC and in a variety of additional solid cancers. Lastly, the potential clinical utility of MACC1 as a target for prevention or restriction of tumor progression and metastasis is envisioned. MACC1 has been identified as a prognostic biomarker in a variety of solid cancers. MACC1 correlates with tumor formation and progression, development of metastases, and patient survival, representing a decisive driver of tumorigenesis and metastasis. MACC1 has also been demonstrated to be of predictive value for therapy response. MACC1 is a promising therapeutic target for anti-tumor and anti-metastatic intervention strategies in solid cancers. Its clinical utility, however, must be demonstrated in clinical trials.

  11. Metastasis-associated in colon cancer-1 promotes vasculogenic mimicry in gastric cancer by upregulating TWIST1/2

    PubMed Central

    Wang, Lin; Lin, Li; Chen, Xi; Sun, Li; Liao, Yulin; Huang, Na; Liao, Wangjun

    2015-01-01

    Vasculogenic mimicry (VM) is a blood supply modality that is strongly associated with the epithelial-mesenchymal transition (EMT), TWIST1 activation and tumor progression. We previously reported that metastasis-associated in colon cancer-1 (MACC1) induced the EMT and was associated with a poor prognosis of patients with gastric cancer (GC), but it remains unknown whether MACC1 promotes VM and regulates the TWIST signaling pathway in GC. In this study, we investigated MACC1 expression and VM by immunohistochemistry in 88 patients with stage IV GC, and also investigated the role of TWIST1 and TWIST2 in MACC1-induced VM by using nude mice with GC xenografts and GC cell lines. We found that the VM density was significantly increased in the tumors of patients who died of GC and was positively correlated with MACC1 immunoreactivity (p < 0.05). The 3-year survival rate was only 8.6% in patients whose tumors showed double positive staining for MACC1 and VM, whereas it was 41.7% in patients whose tumors were negative for both MACC1 and VM. Moreover, nuclear expression of MACC1, TWIST1, and TWIST2 was upregulated in GC tissues compared with matched adjacent non-tumorous tissues (p < 0.05). Overexpression of MACC1 increased TWIST1/2 expression and induced typical VM in the GC xenografts of nude mice and in GC cell lines. MACC1 enhanced TWIST1/2 promoter activity and facilitated VM, while silencing of TWIST1 or TWIST2 inhibited VM. Hepatocyte growth factor (HGF) increased the nuclear translocation of MACC1, TWIST1, and TWIST2, while a c-Met inhibitor reduced these effects. These findings indicate that MACC1 promotes VM in GC by regulating the HGF/c-Met-TWIST1/2 signaling pathway, which means that MACC1 and this pathway are potential new therapeutic targets for GC. PMID:25895023

  12. MACC1 regulates Fas mediated apoptosis through STAT1/3 - Mcl-1 signaling in solid cancers.

    PubMed

    Radhakrishnan, Harikrishnan; Ilm, Katharina; Walther, Wolfgang; Shirasawa, Senji; Sasazuki, Takehiko; Daniel, Peter T; Gillissen, Bernhard; Stein, Ulrike

    2017-09-10

    MACC1 was identified as a novel player in cancer progression and metastasis, but its role in death receptor-mediated apoptosis is still unexplored. We show that MACC1 knockdown sensitizes cancer cells to death receptor-mediated apoptosis. For the first time, we provide evidence for STAT signaling as a MACC1 target. MACC1 knockdown drastically reduced STAT1/3 activating phosphorylation, thereby regulating the expression of its apoptosis targets Mcl-1 and Fas. STAT signaling inhibition by the JAK1/2 inhibitor ruxolitinib mimicked MACC1 knockdown-mediated molecular signatures and apoptosis sensitization to Fas activation. Despite the increased Fas expression, the reduced Mcl-1 expression was instrumental in apoptosis sensitization. This reduced Mcl-1-mediated apoptosis sensitization was Bax and Bak dependent. MACC1 knockdown also increased TRAIL-induced apoptosis. MACC1 overexpression enhanced STAT1/3 phosphorylation and increased Mcl-1 expression, which was abrogated by ruxolitinib. The central role of Mcl-1 was strengthened by the resistance of Mcl-1 overexpressing cells to apoptosis induction. The clinical relevance of Mcl-1 regulation by MACC1 was supported by their positive expression correlation in patient-derived tumors. Altogether, we reveal a novel death receptor-mediated apoptosis regulatory mechanism by MACC1 in solid cancers through modulation of the STAT1/3-Mcl-1 axis.

  13. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.
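
    The propagation step described above can be sketched as follows: sample an uncertain washout coefficient from a distribution and push each sample through a Gaussian plume calculation. Every number below is illustrative rather than an elicited value, and the plume geometry is simplified (fixed dispersion widths instead of distance-dependent ones).

    ```python
    # Sketch: Monte Carlo propagation of a sampled washout coefficient
    # through a simplified Gaussian plume model (GPM). Values invented.
    import numpy as np

    rng = np.random.default_rng(1)

    def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
        """Gaussian plume concentration with ground reflection (Bq/m^3).
        sigma_y and sigma_z would normally grow with downwind distance."""
        lateral = np.exp(-y**2 / (2 * sigma_y**2))
        vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                    + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
        return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Ground-level air concentration on the plume axis for a 1e10 Bq/s release.
    C = plume_concentration(Q=1e10, u=5.0, y=0.0, z=0.0,
                            H=50.0, sigma_y=200.0, sigma_z=80.0)

    # Washout coefficient (1/s) sampled from a lognormal; wet deposition
    # rate taken as washout * concentration * effective column depth.
    washout = rng.lognormal(mean=np.log(1e-4), sigma=1.0, size=5000)
    deposition = washout * C * 500.0
    print("deposition rate 5/50/95:", np.percentile(deposition, [5, 50, 95]))
    ```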

  14. Clinicopathological and prognostic significance of metastasis-associated in colon cancer-1 (MACC1) overexpression in colorectal cancer: a meta-analysis

    PubMed Central

    Zhao, Yang; Dai, Cong; Wang, Meng; Kang, Huafeng; Lin, Shuai; Yang, Pengtao; Liu, Xinghan; Liu, Kang; Xu, Peng; Zheng, Yi; Li, Shanli; Dai, Zhijun

    2016-01-01

    Metastasis-associated in colon cancer-1 (MACC1) has been reported to be overexpressed in diverse human malignancies, and increasing evidence suggests that its overexpression is associated with the development and progression of many human tumors. However, the prognostic and clinicopathological value of MACC1 in colorectal cancer remains inconclusive. Therefore, we conducted this meta-analysis to investigate the effect of MACC1 overexpression on clinicopathological features and survival outcomes in colorectal cancer. PubMed, CNKI, and Wanfang databases were searched for relevant articles published up to December 2015. Correlations of MACC1 expression level with overall survival (OS), disease-free survival (DFS), and clinicopathological features were analyzed. In this meta-analysis, fifteen studies with a total of 2,161 colorectal cancer patients were included. Our results showed that MACC1 overexpression was significantly associated with poorer OS and DFS. Moreover, MACC1 overexpression was significantly associated with gender, localization, TNM stage, T stage, and N stage. Together, our meta-analysis showed that MACC1 overexpression was significantly associated with poor survival rates, regional invasion, and lymph-node metastasis. MACC1 expression level can serve as a novel prognostic factor in colorectal cancer patients. PMID:27542234

  15. A Near-real-time Data Transport System for Selected Stations in the Magnetometer Array for Cusp and Cleft Studies (MACCS)

    NASA Astrophysics Data System (ADS)

    Engebretson, M. J.; Valentic, T. A.; Stehle, R. H.; Hughes, W. J.

    2004-05-01

    The Magnetometer Array for Cusp and Cleft Studies (MACCS) is a two-dimensional array of eight fluxgate magnetometers that was established in 1992-1993 in the Eastern Canadian Arctic from 75° to over 80° MLAT to study electrodynamic interactions between the solar wind and Earth's magnetosphere and high-latitude ionosphere. A ninth site in Nain, Labrador, extends coverage down to 66° between existing Canadian and Greenland stations. Originally designed as part of NSF's GEM (Geospace Environment Modeling) Program, MACCS has contributed to the study of transients and waves at the magnetospheric boundary and in the near-cusp region as well as to large, cooperative studies of ionospheric convection and substorm processes. Because of the limitations of existing telephone lines to each site, it has not been possible to economically access MACCS data promptly; instead, each month's collected data are recorded and mailed to the U.S. for processing and eventual posting on a publicly accessible web site, http://space.augsburg.edu/space. As part of its recently renewed funding, NSF has supported the development of a near-real-time data transport system using the Iridium satellite network, which will be implemented at two MACCS sites in summer 2004. At the core of the new MACCS communications system is the Data Transport Network, software developed with NSF-ITR funding to automate the transfer of scientific data from remote field stations over unreliable, bandwidth-constrained network connections. The system utilizes a store-and-forward architecture based on sending data files as attachments to Usenet messages. This scheme not only isolates the instruments from network outages, but also provides a consistent framework for organizing and accessing multiple data feeds. Client programs are able to subscribe to data feeds to perform tasks such as system health monitoring, data processing, web page updates, and e-mail alerts. The MACCS sites will employ the Data Transport Network on a small local Linux-based computer connected to an Iridium transceiver. Between 3 and 5 MB of data per day will be collected from the magnetometers and delivered in near-real-time for automatic distribution to modelers and index developers. More information about the Data Transport Network can be found at http://transport.sri.com/TransportDevel .
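
    The store-and-forward pattern at the heart of this design can be sketched in a few lines. Everything below (the paths, the link check, the forwarding step) is a hypothetical placeholder rather than the actual Data Transport Network, which posts files as attachments to Usenet messages.

    ```python
    # Minimal store-and-forward sketch: instrument files accumulate in a
    # local spool and are forwarded when the link is up, so data survive
    # network outages. All names here are illustrative placeholders.
    import shutil
    from pathlib import Path

    SPOOL = Path("spool")          # hypothetical local spool directory
    SENT = SPOOL / "sent"

    def link_is_up() -> bool:
        return True                # placeholder for a modem status check

    def forward(path: Path) -> None:
        print(f"posting {path.name} as a message attachment")  # placeholder

    def pump_spool() -> None:
        """Forward spooled files oldest-first; stop and retry on outage."""
        SPOOL.mkdir(exist_ok=True)
        SENT.mkdir(parents=True, exist_ok=True)
        for f in sorted(SPOOL.glob("*.dat")):
            if not link_is_up():
                break              # leave remaining files for the next cycle
            forward(f)
            shutil.move(str(f), SENT / f.name)   # keep a delivered copy

    if __name__ == "__main__":
        pump_spool()               # a field daemon would call this periodically
    ```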

  16. The role of metastasis-associated in colon cancer 1 (MACC1) in endometrial carcinoma tumorigenesis and progression.

    PubMed

    Chen, Shuo; Zong, Zhi-Hong; Wu, Dan-Dan; Sun, Kai-Xuan; Liu, Bo-Liang; Zhao, Yang

    2017-04-01

    Metastasis-associated in colon cancer-1 (MACC1) has recently been identified as a key regulator in the progression of many cancers. However, its role in endometrial carcinoma (EC) remains unknown. MACC1 expression was determined in EC and normal endometrial tissues by immunohistochemistry. EC cell phenotypes and related molecules were examined after MACC1 downregulation by small interfering RNA (siRNA) or microRNA (miRNA) transfection. We found that MACC1 was more highly expressed in EC tissues than in normal samples, and its expression differed significantly by FIGO stage (I and II vs. III and IV), depth of myometrial infiltration (<1/2 vs. ≥1/2), and lymph node metastasis (negative vs. positive); in addition, MACC1 overexpression was correlated with lower cumulative and relapse-free survival rates. MACC1 downregulation by siRNA transfection significantly induced G1 phase arrest and suppressed EC cell proliferation, migration, and invasion. In addition, MACC1 downregulation also reduced expression of Cyclin D1 and Cyclin-dependent Kinase 2 (CDK2), N-cadherin (N-Ca), α-SMA, matrix metalloproteinase 2 (MMP2), and MMP9, but increased expression of E-cadherin (E-Ca). Bioinformatic predictions and dual-luciferase reporter assays indicated that MACC1 is a possible target of miR-23b. MiR-23b overexpression reduced MACC1 expression in vitro, induced G1 phase arrest, and suppressed cell proliferation, migration, and invasion. MiR-23b transfection also reduced Cyclin D1, CDK2, N-Ca, α-SMA, MMP2, and MMP9 expression, but increased E-Ca expression. Furthermore, a nude mouse xenograft assay showed that miR-23b overexpression suppressed tumour growth through downregulating MACC1 expression. Taken together, our results demonstrate for the first time that MACC1 may be a new and important diagnostic and therapeutic target in endometrial carcinoma.

  17. Investigation of MACC1 Gene Expression in Head and Neck Cancer and Cancer Stem Cells.

    PubMed

    Evran, Ebru; Şahin, Hilal; Akbaş, Kübra; Çiğdem, Sadik; Gündüz, Esra

    2016-12-01

    By investigating the MACC1 gene (metastasis-associated in colon cancer 1) in cancer stem cells (CSC) resistant to chemotherapy and in cancer cells (CS) sensitive to chemotherapy, we determined a steady expression in both types of cells in head and neck cancer. In conformity with this result, we examined whether this gene could be a competitor gene for chemotherapy. According to the literature, the MACC1 gene shows a clear expression in head and neck cancer cells [1]. Here we examined MACC1 expression in CSC and investigated it as a possible biomarker. Our experiments were performed in the UT-SCC-74 primary head and neck cancer cell line. We examined MACC1 gene expression by real-time PCR in both isolated CSC and CS. Expression of the MACC1 gene in cancer stem cells showed a two-fold increase compared with cancer cells. Based on the positive expression of MACC1 in both CS and CSC, this gene may serve as a potential biomarker in head and neck cancer. By comparing the results of this study with the novel features of MACC1, two important hypotheses could be examined. The first hypothesis is that MACC1 is a possible transcription factor in colon cancer, which drives high expression in head and neck CSC and affects the expression of three biomarkers of the CSC control group. The second hypothesis is that the positive expression of MACC1 in patients with a malignant prognosis of tongue cancer, which belongs to the head and neck cancer types, drives a faster development of CSC into cancer cells.

  18. [The value of the SYNTAX score in predicting outcomes in patients undergoing percutaneous coronary intervention].

    PubMed

    Gao, Yue-chun; Yu, Xian-peng; He, Ji-qiang; Chen, Fang

    2012-01-01

    To assess the value of the SYNTAX score in predicting major adverse cardiac and cerebrovascular events (MACCE) among patients with three-vessel or left-main coronary artery disease undergoing percutaneous coronary intervention, 190 patients with three-vessel or left-main coronary artery disease undergoing percutaneous coronary intervention (PCI) with Cypher select drug-eluting stents were enrolled. The SYNTAX score and the clinical SYNTAX score were retrospectively calculated. Our clinical endpoint focused on MACCE, a composite of death, nonfatal myocardial infarction (MI), stroke, and repeat revascularization. The value of the SYNTAX score and the clinical SYNTAX score to predict MACCE was studied for each. 29 patients were observed to suffer from MACCE, accounting for 18.5% of the overall 190 patients. MACCE rates of the low (≤ 20.5), intermediate (21.0 - 31.0), and high (≥ 31.5) tertiles according to SYNTAX score were 9.1%, 16.2%, and 30.9%, respectively. Both univariate and multivariate analyses showed that the SYNTAX score was an independent predictor of MACCE. MACCE rates of the low (≤ 19.5), intermediate (19.6 - 29.1), and high (≥ 29.2) tertiles according to clinical SYNTAX score were 14.9%, 9.8%, and 30.6%, respectively. Both univariate and multivariate analyses showed that the clinical SYNTAX score was an independent predictor of MACCE. ROC analysis showed that both the SYNTAX score (AUC = 0.667, P = 0.004) and the clinical SYNTAX score (AUC = 0.636, P = 0.020) had predictive value for MACCE. Both the SYNTAX score and the clinical SYNTAX score could be independent risk predictors for MACCE among patients with three-vessel or left-main coronary artery disease undergoing percutaneous coronary intervention. The clinical SYNTAX score failed to show better predictive ability than the SYNTAX score in this group of patients.

  19. MISR GoMACCS Products

    Atmospheric Science Data Center

    2016-11-25

    Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS) is an intensive ... study area encompasses Texas and the northwestern Gulf of Mexico during July, August, September, and October, 2006. The Multi-angle ...

  20. Circulating metastasis associated in colon cancer 1 transcripts in gastric cancer patient plasma as diagnostic and prognostic biomarker

    PubMed Central

    Burock, Susen; Herrmann, Pia; Wendler, Ina; Niederstrasser, Markus; Wernecke, Klaus-Dieter; Stein, Ulrike

    2015-01-01

    AIM: To evaluate the diagnostic and prognostic value of circulating Metastasis Associated in Colon Cancer 1 (MACC1) transcripts in plasma of gastric cancer patients. METHODS: We provide for the first time a blood-based assay for transcript quantification of the metastasis inducer MACC1 in a prospective study of gastric cancer patient plasma. MACC1 is a strong prognostic biomarker for tumor progression and metastasis in a variety of solid cancers. We conducted a study to define the diagnostic and prognostic power of MACC1 transcripts using 76 plasma samples from gastric cancer patients, either newly diagnosed with gastric cancer, newly diagnosed with metachronous metastasis of gastric cancer, as well as follow-up patients. Findings were controlled by using plasma samples from 54 tumor-free volunteers. Plasma was separated, RNA was isolated, and levels of MACC1 as well as S100A4 transcripts were determined by quantitative RT-PCR. RESULTS: Based on the levels of circulating MACC1 transcripts in plasma we significantly discriminated tumor-free volunteers and gastric cancer patients (P < 0.001). Levels of circulating MACC1 transcripts were increased in gastric cancer patients of each disease stage, compared to tumor-free volunteers: patients with tumors without metastasis (P = 0.005), with synchronous metastasis (P = 0.002), with metachronous metastasis (P = 0.005), and patients during follow-up (P = 0.021). Sensitivity was 0.68 (95%CI: 0.45-0.85) and specificity was 0.89 (95%CI: 0.77-0.95), respectively. Importantly, gastric cancer patients with high circulating MACC1 transcript levels in plasma demonstrated significantly shorter survival when compared with patients demonstrating low MACC1 levels (P = 0.0015). Furthermore, gastric cancer patients with high circulating transcript levels of MACC1 as well as of S100A4 in plasma demonstrated significantly shorter survival when compared with patients demonstrating low levels of both biomarkers or with only one biomarker elevated (P = 0.001). CONCLUSION: Levels of circulating MACC1 transcripts in plasma of gastric cancer patients are of diagnostic value and are prognostic for patient survival in a prospective study. PMID:25574109
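
    The reported sensitivity and specificity follow from a simple cutoff on the measured transcript level. The sketch below illustrates that computation on invented MACC1 levels and an invented cutoff, not the study's assay or data.

    ```python
    # Toy sensitivity/specificity calculation for a plasma transcript cutoff.
    import numpy as np

    patients = np.array([3.1, 4.8, 2.2, 5.5, 1.9, 4.1, 3.7, 0.8])  # invented
    controls = np.array([1.0, 0.7, 1.6, 0.9, 2.1, 0.5, 1.2, 1.1])  # invented
    cutoff = 2.0                                                   # invented

    tp = np.sum(patients >= cutoff)      # true positives
    fn = np.sum(patients < cutoff)       # false negatives
    tn = np.sum(controls < cutoff)       # true negatives
    fp = np.sum(controls >= cutoff)      # false positives

    print(f"sensitivity = {tp / (tp + fn):.2f}")   # cf. 0.68 reported above
    print(f"specificity = {tn / (tn + fp):.2f}")   # cf. 0.89 reported above
    ```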

  21. MISR Regional GoMACCS Imagery Overview

    Atmospheric Science Data Center

    2016-08-24

    Visualizations of select MISR Level 3 data for special regional ... version used in support of the GoMACCS Campaign. More information about the Level 1 and Level 2 products subsetted for the GoMACCS ...

  22. A Web Server for MACCS Magnetometer Data

    NASA Technical Reports Server (NTRS)

    Engebretson, Mark J.

    1998-01-01

    NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.

  23. MACC1 mediates chemotherapy sensitivity of 5-FU and cisplatin via regulating MCT1 expression in gastric cancer.

    PubMed

    Wang, Chunlin; Wen, Zhaowei; Xie, Jianming; Zhao, Yang; Zhao, Liang; Zhang, Shuyi; Liu, Yajing; Xue, Yan; Shi, Min

    2017-04-08

    Chemotherapeutic insensitivity is a main obstacle for effective treatment of gastric cancer (GC); the underlying mechanism remains to be investigated. Metastasis-associated in colon cancer-1 (MACC1), a transcription factor highly expressed in GC, is found to be related to chemotherapy sensitivity. Monocarboxylate transporter 1 (MCT1), a plasma membrane protein co-transporting lactate and H+, mediates drug sensitivity by regulating lactate metabolism. Targeting MCT1 has recently been regarded as a promising way to treat cancers, and an MCT1 inhibitor has entered clinical trials for GC treatment. However, the correlation of these two genes and their combined effects on chemotherapy sensitivity have not been clarified. In this study, we found that MACC1 and MCT1 were both highly expressed in GC and exhibited a positive correlation in clinical samples. Further, we demonstrated that MACC1 could mediate sensitivity to 5-FU and cisplatin in GC cells, and MACC1-mediated MCT1 regulation was closely related to this sensitivity. The MCT1 inhibitor AZD3965 recovered the sensitivity to 5-FU and cisplatin in GC cells overexpressing MACC1. These results suggest that MACC1 can influence chemotherapy sensitivity by regulating MCT1 expression, providing new ideas and strategies for GC treatment.

  24. High MACC1 expression in combination with mutated KRAS G13 indicates poor survival of colorectal cancer patients.

    PubMed

    Ilm, Katharina; Kemmner, Wolfgang; Osterland, Marc; Burock, Susen; Koch, Gudrun; Herrmann, Pia; Schlag, Peter M; Stein, Ulrike

    2015-02-14

    The metastasis-associated in colon cancer 1 (MACC1) gene has been identified as prognostic biomarker for colorectal cancer (CRC). Here, we aimed at the refinement of risk assessment by separate and combined survival analyses of MACC1 expression with any of the markers KRAS mutated in codon 12 (KRAS G12) or codon 13 (KRAS G13), BRAF V600 mutation and MSI status in a retrospective study of 99 CRC patients with tumors UICC staged I, II and III. We showed that only high MACC1 expression (HR: 6.09, 95% CI: 2.50-14.85, P < 0.001) and KRAS G13 mutation (HR: 5.19, 95% CI: 1.06-25.45, P = 0.042) were independent prognostic markers for shorter metastasis-free survival (MFS). Accordingly, Cox regression analysis revealed that patients with high MACC1 expression and KRAS G13 mutation exhibited the worst prognosis (HR: 14.48, 95% CI: 3.37-62.18, P < 0.001). Patients were classified based on their molecular characteristics into four clusters with significant differences in MFS (P = 0.003) by using the SPSS 2-step cluster function and Kaplan-Meier survival analysis. According to our results, patients with high MACC1 expression and mutated KRAS G13 exhibited the highest risk for metachronous metastases formation. Moreover, we demonstrated that the "Traditional pathway" with an intermediate risk for metastasis formation can be further subdivided by assessing MACC1 expression into a low and high risk group with regard to MFS prognosis. This is the first report showing that identification of CRC patients at high risk for metastasis is possible by assessing MACC1 expression in combination with KRAS G13 mutation.

  25. Validation of reactive gases and aerosols in the MACC global analysis and forecast system

    NASA Astrophysics Data System (ADS)

    Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.

    2015-02-01

    The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in-situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols and greenhouse gases, and is based on the Integrated Forecast System of the ECMWF. The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past three years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.
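
    The scoring side of such a validation activity typically reduces to a handful of paired-sample statistics. The sketch below computes generic bias, RMSE, and correlation scores on invented observation/forecast pairs; the exact MACC scoring suite is the one documented in the paper, not this code.

    ```python
    # Generic forecast-validation scores on paired model/observation samples.
    import numpy as np

    def scores(model, obs):
        """Return mean bias, RMSE, and Pearson correlation."""
        model, obs = np.asarray(model, float), np.asarray(obs, float)
        bias = np.mean(model - obs)
        rmse = np.sqrt(np.mean((model - obs) ** 2))
        corr = np.corrcoef(model, obs)[0, 1]
        return bias, rmse, corr

    obs = np.array([38.0, 42.0, 55.0, 61.0, 47.0])    # e.g. surface O3, ppb
    fcst = np.array([35.5, 45.0, 52.0, 66.0, 44.0])   # co-located forecast
    b, r, c = scores(fcst, obs)
    print(f"bias={b:+.1f} ppb  rmse={r:.1f} ppb  corr={c:.2f}")
    ```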

  26. Validation of reactive gases and aerosols in the MACC global analysis and forecast system

    NASA Astrophysics Data System (ADS)

    Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.

    2015-11-01

    The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols, and greenhouse gases, and is based on the Integrated Forecasting System of the European Centre for Medium-Range Weather Forecasts (ECMWF). The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past 3 years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high-pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.

  27. Performance of magnetic activated carbon composite as peroxymonosulfate activator and regenerable adsorbent via sulfate radical-mediated oxidation processes.

    PubMed

    Oh, Wen-Da; Lua, Shun-Kuang; Dong, Zhili; Lim, Teik-Thye

    2015-03-02

    Magnetic activated carbon composite (CuFe2O4/AC, MACC) was prepared by a co-precipitation-calcination method. The MACC consisted of porous micro-particle morphology with homogeneously distributed CuFe2O4 and possessed a high magnetic saturation moment (8.1 emu g⁻¹). The performance of MACC was evaluated as a catalyst and regenerable adsorbent via peroxymonosulfate (PMS, Oxone®) activation for methylene blue (MB) removal. The optimum CuFe2O4/AC w/w ratio was 1:1.5, giving excellent performance, and the composite can be reused for at least 3 cycles. The presence of common inorganic ions, namely Cl⁻ and NO3⁻, did not exert significant influence on MB degradation, but humic acid decreased the MB degradation rate. As a regenerable adsorbent, negligible difference in regeneration efficiency was observed when a higher Oxone® dosage was employed, but a better efficiency was obtained at a lower MACC loading. The factors hindering complete MACC regeneration are MB adsorption irreversibility and AC surface modification by PMS, making it less favorable for subsequent MB adsorption. With an additional mild heat treatment (150 °C) after regeneration, 82% of the active sites were successfully regenerated. A kinetic model incorporating simultaneous first-order desorption, second-order adsorption, and pseudo-first-order degradation processes was numerically solved to describe the rate of regeneration. The regeneration rate increased linearly with increasing Oxone®:MACC ratio. The MACC could potentially serve as a catalyst for PMS activation and a regenerable adsorbent.
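
    The structure of the kinetic model named above (first-order desorption, second-order adsorption, pseudo-first-order degradation) can be sketched as a small ODE system. The rate constants and initial loading below are invented and the units normalized, so this illustrates the model's form, not its fitted parameters.

    ```python
    # Toy ODE system of the type described: desorption, re-adsorption,
    # and degradation of dissolved dye, solved numerically with SciPy.
    from scipy.integrate import solve_ivp

    k_des, k_ads, k_deg = 0.05, 0.002, 0.10   # illustrative rate constants

    def rhs(t, y):
        q, c = y                        # q: adsorbed MB, c: dissolved MB
        desorption = k_des * q          # first-order desorption from the AC
        adsorption = k_ads * c**2       # second-order re-adsorption
        degradation = k_deg * c         # pseudo-first-order radical attack
        return [adsorption - desorption,
                desorption - adsorption - degradation]

    sol = solve_ivp(rhs, (0.0, 120.0), y0=[50.0, 0.0])
    q_end = sol.y[0, -1]
    print(f"fraction of sites regenerated: {1.0 - q_end / 50.0:.0%}")
    ```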

  28. Improved operator agreement and efficiency using the minimum area contour change method for delineation of hyperintense multiple sclerosis lesions on FLAIR MRI

    PubMed Central

    2013-01-01

    Background: Activity of disease in patients with multiple sclerosis (MS) is monitored by detecting and delineating hyper-intense lesions on MRI scans. The Minimum Area Contour Change (MACC) algorithm was created with two main goals: a) to improve inter-operator agreement on outlining regions of interest (ROIs) and b) to automatically propagate longitudinal ROIs from the baseline scan to a follow-up scan. Methods: The MACC algorithm first identifies an outer bound for the solution path, forms a high number of iso-contour curves based on equally spaced contour values, and then selects the best contour value to outline the lesion. The MACC software was tested on a set of 17 FLAIR MRI images evaluated by a pair of human experts and a longitudinal dataset of 12 pairs of T2-weighted Fluid Attenuated Inversion Recovery (FLAIR) images that had lesion analysis ROIs drawn by a single expert operator. Results: In the tests where two human experts evaluated the same MRI images, the MACC program markedly reduced inter-operator outline error. In the longitudinal part of the study, the MACC program created ROIs on follow-up scans that were in close agreement with the original expert's ROIs. Finally, in a post-hoc analysis of 424 follow-up scans, 91% of propagated MACC ROIs were accepted by an expert, and only 9% of the final accepted ROIs had to be created or edited by the expert. Conclusion: When used with an expert operator's verification of automatically created ROIs, MACC can be used to improve inter-operator agreement and decrease analysis time, which should improve data collected and analyzed in multicenter clinical trials. PMID:24004511
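
    The contour-selection idea can be sketched as follows: scan equally spaced intensity levels inside an outer bound, track the enclosed area at each level, and keep the level where the area changes least. The synthetic image and the pixel-count area proxy below are illustrative simplifications of the published algorithm, not its implementation.

    ```python
    # Conceptual sketch of minimum-area-contour-change level selection
    # on a synthetic "lesion" image, using scikit-image for contours.
    import numpy as np
    from skimage import measure, draw

    # Synthetic FLAIR-like patch: a bright disc on a noisy background.
    img = np.random.default_rng(2).normal(0.2, 0.05, (64, 64))
    rr, cc = draw.disk((32, 32), 10)
    img[rr, cc] += 0.8

    levels = np.linspace(img.min() + 0.1, img.max() - 0.1, 40)
    areas = [np.count_nonzero(img >= lv) for lv in levels]  # area proxy

    # Keep the level where the enclosed area is most stable.
    diffs = np.abs(np.diff(areas))
    best = levels[np.argmin(diffs) + 1]
    contours = measure.find_contours(img, best)
    print(f"chosen level {best:.2f}, {len(contours)} contour(s)")
    ```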

  29. AIRS Views of Anthropogenic and Biomass Burning CO: INTEX-B/MILAGRO and TEXAQS/GoMACCS

    NASA Astrophysics Data System (ADS)

    McMillan, W. W.; Warner, J.; Wicks, D.; Barnet, C.; Sachse, G.; Chu, A.; Sparling, L.

    2006-12-01

    Utilizing the Atmospheric InfraRed Sounder's (AIRS) unique spatial and temporal coverage, we present observations of anthropogenic and biomass burning CO emissions as observed by AIRS during the 2006 field experiments INTEX-B/MILAGRO and TEXAQS/GoMACCS. AIRS daily CO maps covering more than 75% of the planet demonstrate the near-global transport of these emissions. AIRS day/night coverage of significant portions of the Earth often shows substantial changes in 12 hours or less. However, the coarse vertical resolution of AIRS-retrieved CO complicates its interpretation. For example, extensive CO emissions are evident from Asia during April and May 2006, but it is difficult to determine the relative contributions of biomass burning in Thailand vs. domestic and industrial emissions from China. Similarly, AIRS sometimes sees enhanced CO over and downwind of Mexico City and other populated areas. The low information content of AIRS retrievals and their decreasing sensitivity in the boundary layer can result in underestimates of CO total columns and free tropospheric abundances. Building on our analyses of INTEX-A/ICARTT data from 2004, we present comparisons with INTEX-B/MILAGRO and TEXAQS/GoMACCS in situ aircraft measurements and other satellite CO observations. The combined analysis of AIRS CO, water vapor, and O3 retrievals; MODIS aerosol optical depths; and forward trajectory computations illuminates a variety of dynamical processes in the troposphere.

  30. Prognostic Value of MACC1 in Digestive System Neoplasms: A Systematic Review and Meta-Analysis

    PubMed Central

    Wu, Zhenzhen; Zhou, Rui; Su, Yuqi; Sun, Li; Liao, Yulin; Liao, Wangjun

    2015-01-01

    Metastasis associated in colon cancer 1 (MACC1), a newly identified oncogene, has been associated with poor survival of cancer patients by multiple studies. However, the prognostic value of MACC1 in digestive system neoplasms needs systematic evidence to verify. Therefore, we aimed to provide further evidence on this topic by systematic review and meta-analysis. Literature search was conducted in multiple databases and eligible studies analyzing survival data and MACC1 expression were included for meta-analysis. Hazard ratio (HR) for clinical outcome was chosen as an effect measure of interest. According to our inclusion criteria, 18 studies with a total of 2,948 patients were identified. Pooled HRs indicated that high MACC1 expression significantly correlates with poorer OS in patients with digestive system neoplasms (HR = 1.94; 95% CI: 1.49–2.53) as well as poorer relapse-free survival (HR = 1.94, 95% CI: 1.33–2.82). The results of subgroup studies categorized by methodology, anatomic structure, and cancer subtype for pooled OS were all consistent with the overall pooled HR for OS as well. No publication bias was detected according to test of funnel plot asymmetry and Egger's test. In conclusion, high MACC1 expression may serve as a prognostic biomarker to guide individualized management in clinical practice for digestive system neoplasms. PMID:26090393
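
    The pooled hazard ratios quoted above come from standard inverse-variance weighting of per-study log hazard ratios. The worked sketch below applies the fixed-effect version of that computation to invented study results:

    ```python
    # Fixed-effect inverse-variance pooling of log hazard ratios.
    import numpy as np

    # (HR, lower 95% CI, upper 95% CI) for a few hypothetical studies.
    studies = [(2.1, 1.4, 3.1), (1.7, 1.1, 2.6), (2.4, 1.3, 4.4)]

    log_hr = np.array([np.log(hr) for hr, lo, hi in studies])
    # SE recovered from the CI width on the log scale: (ln hi - ln lo) / (2 * 1.96).
    se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for _, lo, hi in studies])

    w = 1.0 / se**2                          # inverse-variance weights
    pooled = np.sum(w * log_hr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"pooled HR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
    ```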

  31. Prognostic Value of MACC1 in Digestive System Neoplasms: A Systematic Review and Meta-Analysis.

    PubMed

    Wu, Zhenzhen; Zhou, Rui; Su, Yuqi; Sun, Li; Liao, Yulin; Liao, Wangjun

    2015-01-01

    Metastasis associated in colon cancer 1 (MACC1), a newly identified oncogene, has been associated with poor survival of cancer patients by multiple studies. However, the prognostic value of MACC1 in digestive system neoplasms needs systematic evidence to verify. Therefore, we aimed to provide further evidence on this topic by systematic review and meta-analysis. Literature search was conducted in multiple databases and eligible studies analyzing survival data and MACC1 expression were included for meta-analysis. Hazard ratio (HR) for clinical outcome was chosen as an effect measure of interest. According to our inclusion criteria, 18 studies with a total of 2,948 patients were identified. Pooled HRs indicated that high MACC1 expression significantly correlates with poorer OS in patients with digestive system neoplasms (HR = 1.94; 95% CI: 1.49-2.53) as well as poorer relapse-free survival (HR = 1.94, 95% CI: 1.33-2.82). The results of subgroup studies categorized by methodology, anatomic structure, and cancer subtype for pooled OS were all consistent with the overall pooled HR for OS as well. No publication bias was detected according to test of funnel plot asymmetry and Egger's test. In conclusion, high MACC1 expression may serve as a prognostic biomarker to guide individualized management in clinical practice for digestive system neoplasms.

  32. In-depth characterization of the salivary adenoid cystic carcinoma transcriptome with emphasis on dominant cell type.

    PubMed

    Bell, Diana; Bell, Achim H; Bondaruk, Jolanta; Hanna, Ehab Y; Weber, Randall S

    2016-05-15

    Adenoid cystic carcinoma (ACC), one of the most common salivary gland malignancies, arises from the intercalated ducts, which are composed of inner ductal epithelial cells and outer myoepithelial cells. The objective of this study was to determine the genomic subtypes of ACC with emphasis on dominant cell type, to identify potential specific biomarkers for each subtype, and to improve the understanding of this disease. A whole-genome expression study was performed based on 42 primary salivary ACCs and 5 normal salivary glands. RNA from these specimens was subjected to expression profiling with RNA sequencing, and results were analyzed to identify transcripts in epithelial-dominant ACC (E-ACC), myoepithelial-dominant ACC (M-ACC), and all ACC that were expressed differentially compared with the transcripts in normal salivary tissue. In total, the authors identified 430 differentially expressed transcripts that were unique to E-ACC, 392 that were unique to M-ACC, and 424 that were common to both M-ACC and E-ACC. The sets of E-ACC-specific and M-ACC-specific transcripts were sufficiently large to define and differentiate E-ACC from M-ACC. Ingenuity pathway analysis identified known cancer-related genes for 60% of the E-ACC transcripts, 69% of the M-ACC transcripts, and 68% of the transcripts that were common to both E-ACC and M-ACC. Three sets of highly expressed candidate genes (distal-less homeobox 6 (DLX6) for E-ACC; keratin 16 (KRT16), SRY box 11 (SOX11), and v-myb avian myeloblastosis viral oncogene homolog (MYB) for M-ACC; and engrailed 1 (EN1) and statherin (STATH), common to both E-ACC and M-ACC) were further validated at the protein level. The current results enabled the authors to identify novel potential therapeutic targets and biomarkers in E-ACC and M-ACC individually, with the implication that EN1, DLX6, and OTX1 (orthodenticle homeobox 1) are potential drivers of these cancers. Cancer 2016;122:1513-22.

  33. High power vertical stacked diode laser development using macro-channel water cooling and hard solder bonding technology

    NASA Astrophysics Data System (ADS)

    Yu, Dongshan; Liang, Xuejie; Wang, Jingwei; Li, Xiaoning; Nie, Zhiqiang; Liu, Xingsheng

    2017-02-01

    A novel macro-channel cooler (MaCC) has been developed for packaging high-power vertical stacked diode lasers (HPDL), which eliminates many of the issues in commercially available copper micro-channel coolers (MCC). The MaCC coolers, which do not require deionized water as coolant, were carefully designed for compact size and superior thermal dissipation capability. Indium-free packaging technology was adopted throughout product design and the fabrication process to minimize the risk of solder electromigration and thermal fatigue at high current density and long pulse width under QCW operation. A single MaCC unit with peak output power of up to 700 W/bar at pulse widths in the microsecond range and 200 W/bar at pulse widths in the millisecond range has been recorded. Characteristic comparisons of thermal resistivity, spectrum, near field, and lifetime have been conducted between a MaCC product and its counterpart MCC product. A QCW lifetime test (30 ms, 10 Hz, 30% duty cycle) has also been conducted with distilled water as coolant. A vertical 40-MaCC stack product has been fabricated; a total output power of 9 kilowatts has been recorded under QCW mode (3 ms, 30 Hz, 9% duty cycle).

  34. Ten-Year Cross-Sectional Study of Mechanically Assisted Crevice Corrosion in 1352 Consecutive Patients With Metal-on-Polyethylene Total Hip Arthroplasty.

    PubMed

    Hussey, Daniel K; McGrory, Brian J

    2017-08-01

    Mechanically assisted crevice corrosion (MACC) in metal-on-polyethylene total hip arthroplasty (THA) is of concern, but its prevalence, etiology, and natural history are incompletely understood. From January 2003 to December 2012, 1352 consecutive THA surgeries using a titanium stem, cobalt-chromium alloy femoral head, and highly cross-linked polyethylene liner from a single manufacturer were performed. Patients were followed at 1-year and 5-year intervals for surveillance, but were also seen earlier if they had symptoms. Any patient with osteolysis >1 cm (n = 3) or unexplained pain (n = 85) underwent examination, radiographs, complete blood count, erythrocyte sedimentation rate, and C-reactive protein, as well as tests for serum cobalt and chromium levels. Symptomatic MACC was present in 43 of 1352 patients (3.2%). Prevalence of MACC by year of implant ranged from 0% (0 of 61, 2003; 0 of 138, 2005) to 10.5% (17 of 162; 2009). The M/L Taper stem had a greater prevalence (4.9%) of MACC than all other Zimmer (Zimmer, Inc, Warsaw, IN) 12/14 trunnion stem types combined (1.2%; P < .001). Twenty-seven of 43 (62.8%) patients have undergone revision surgery, and 16 of 43 (37.2%) patients have opted for ongoing surveillance. Comparing symptomatic THA patients with and without MACC, no demographic, clinical, or radiographic differences were found. MACC was significantly more common with zero-length femoral heads (compared with both -3.5 mm and +3.5 mm heads). The prevalence of MACC in metal-on-polyethylene hips is higher in this cross-sectional study than previously reported, with a significantly higher prevalence in patients with the M/L Taper style stem and in THA performed in 2009 and between 2009 and 2012 with this manufacturer.

  15. Relation of Stature to Outcomes in Korean Patients Undergoing Primary Percutaneous Coronary Intervention for Acute ST-Elevation Myocardial Infarction (from the INTERSTELLAR Registry).

    PubMed

    Moon, Jeonggeun; Suh, Jon; Oh, Pyung Chun; Lee, Kyounghoon; Park, Hyun Woo; Jang, Ho-Jun; Kim, Tae-Hoon; Park, Sang-Don; Kwon, Sung Woo; Kang, Woong Chol

    2016-07-15

    Although epidemiologic studies have shown an impact of height on the occurrence and/or prognosis of cardiovascular diseases, the underlying mechanism is unclear. In addition, the relation in patients with ST-segment elevation myocardial infarction (STEMI) who undergo primary percutaneous coronary intervention (PCI) remains unknown. We sought to assess the influence of height on outcomes of patients with acute STEMI undergoing primary PCI and to provide a pathophysiological explanation. In total, 1,490 patients with STEMI undergoing primary PCI were analyzed. Major adverse cardiac and cerebrovascular events (MACCE) were defined as all-cause mortality, nonfatal myocardial infarction, nonfatal stroke, and unplanned hospitalization for heart failure (HF). Patients were divided into (1) MACCE (+) versus MACCE (-) groups and (2) first- to third-tertile groups according to height. The MACCE (+) group was shorter than the MACCE (-) group (164 ± 8 vs 166 ± 8 cm, p = 0.012). The prognostic impact of short stature was significant in older (≥70 years) male patients even after adjusting for co-morbidities (hazard ratio 0.951, 95% confidence interval 0.912 to 0.991, p = 0.017). The first-tertile group showed the worst MACCE-free survival (p = 0.035), and most cases of MACCE were HF (17 [3%] vs 6 [1%] vs 2 [0%], p = 0.004). On post-PCI echocardiography, left atrial volume and the ratio of early diastolic mitral velocity to early diastolic mitral annulus velocity showed an inverse relation with height (p <0.001 for all) despite similar left ventricular ejection fraction. In conclusion, short stature is associated with the occurrence of HF after primary PCI for STEMI, and its influence is prominent in aged male patients, presumably because of its correlation with diastolic dysfunction. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. MACCS : Multi-Mission Atmospheric Correction and Cloud Screening tool for high-frequency revisit data processing

    NASA Astrophysics Data System (ADS)

    Petrucci, B.; Huc, M.; Feuvrier, T.; Ruffel, C.; Hagolle, O.; Lonjou, V.; Desjardins, C.

    2015-10-01

    For the production of Level-2A products during Sentinel-2 commissioning in the Sentinel-2 Technical Expertise Center at CNES, CESBIO proposed to adapt the Venus Level-2 processor, taking advantage of the similarities between the two missions: image acquisition at high frequency (2 days for Venus, 5 days with the two Sentinel-2 satellites), high resolution (5 m for Venus; 10, 20, and 60 m for Sentinel-2), and image acquisition under constant viewing conditions. Thus the Multi-Mission Atmospheric Correction and Cloud Screening (MACCS) tool was born: based on the CNES Orfeo Toolbox library, the Venμs processor, which was already able to process Formosat-2 and VENμS data, was adapted to process Sentinel-2 and Landsat 5-7 data. Since then, a great effort has been made to review the MACCS software architecture in order to ease the addition of new missions that also acquire images at high resolution, high revisit, and under constant viewing angles, such as Spot4/Take5 and Landsat 8. The recursive, multi-temporal algorithm is implemented in a core that is the same for all sensors and that combines several processing steps: estimation of the cloud, cloud shadow, water, and snow masks; estimation of water vapor content and aerosol optical thickness; and atmospheric correction. This core is accessed via a number of plug-ins in which the specificities of the sensor and of the user project are taken into account: product formats, algorithmic processing chains, and parameters. After a presentation of the MACCS architecture and functionality, the paper gives an overview of the production facilities integrating MACCS and their associated specificities: interest in this tool has grown worldwide, and MACCS will be used for extensive production within the THEIA land data center and the Agri-S2 project. Finally, the paper focuses on the use of MACCS during the Sentinel-2 In-Orbit Test phase, showing the first Level-2A products.
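
    The core-plus-plug-ins design described above (one sensor-independent, recursive multi-temporal core; per-mission plug-ins for formats and parameters) can be illustrated with a minimal sketch. All names below are invented for illustration and are not the MACCS API:

        # Illustrative sketch of a shared multi-temporal core with
        # per-sensor plug-ins, in the spirit of the MACCS design described
        # above. All names are invented; this is not the MACCS API.

        from dataclasses import dataclass, field

        @dataclass
        class SensorPlugin:
            """Per-mission specifics: resolution, revisit, parameters."""
            name: str
            resolution_m: float
            revisit_days: int
            params: dict = field(default_factory=dict)

        PLUGINS: dict = {}

        def register(plugin: SensorPlugin) -> None:
            PLUGINS[plugin.name] = plugin

        def process_time_series(sensor: str, level1_scenes: list) -> list:
            """Sensor-independent core: the same chain (masks, water vapour,
            AOT, atmospheric correction) runs for every mission; the plug-in
            only supplies sensor specifics. The AOT step is recursive: each
            date reuses the previous date's estimate, which is what makes a
            high revisit frequency valuable."""
            plugin = PLUGINS[sensor]
            level2, previous_aot = [], plugin.params.get("default_aot", 0.1)
            for scene in level1_scenes:
                masks = {"cloud": scene.get("cloudy", False)}  # stub mask step
                aot = 0.5 * previous_aot + 0.5 * scene.get("aot_obs", previous_aot)
                level2.append({"scene": scene["id"], "masks": masks, "aot": aot})
                previous_aot = aot  # multi-temporal recursion
            return level2

        register(SensorPlugin("Sentinel-2", resolution_m=10, revisit_days=5))
        print(process_time_series("Sentinel-2",
                                  [{"id": "S2_D1", "aot_obs": 0.2},
                                   {"id": "S2_D2", "cloudy": True}]))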

  17. MISR Regional GoMACCS Products

    Atmospheric Science Data Center

    2016-08-24

    ... parameters from one Level 1 or Level 2 product. Further information about the Level 1 and Level 2 data products can be found on the MISR GoMACCS data table. Images available on this web site include the following parameters: ...

  18. The Mobile Advanced Command and Control Station (MACCS) Experimental Testbed

    DTIC Science & Technology

    2007-10-01

    ... were selected. Vehicle: Dodge Sprinter 2500 high-roof (a Mercedes-Benz vehicle); electrical equipment and habitability equipment: Crossroads Coaches ... this innovative, mobile, experimental testbed. IMPACT/APPLICATIONS: While MACCS clearly supports the research agenda for both HAL and ONR (as well as ...

  19. Efficacy and safety of aspirin, clopidogrel, and warfarin after coronary artery stenting in Korean patients with atrial fibrillation.

    PubMed

    Suh, Soon Yong; Kang, Woong Chol; Oh, Pyung Chun; Choi, Hanul; Moon, Chan Il; Lee, Kyounghoon; Han, Seung Hwan; Ahn, Taehoon; Choi, In Suck; Shin, Eak Kyun

    2014-09-01

    There are limited data on the optimal antithrombotic therapy for patients with atrial fibrillation (AF) undergoing coronary stenting. We reviewed 203 patients (62.6% men; mean age 68.3 ± 10.1 years) treated between 2003 and 2012 and recorded their clinical and demographic characteristics. Clinical follow-up included major adverse cardiac and cerebrovascular events (MACCE; cardiac death, myocardial infarction, target lesion revascularization, and stroke), stent thrombosis, and bleeding. The most common comorbidities were hypertension (70.4%), diabetes mellitus (35.5%), and congestive heart failure (26.6%). Sixty-three percent of patients had a stroke risk corresponding to a CHADS2 score of 2 or higher. At discharge, dual-antiplatelet therapy (aspirin, clopidogrel) was used in 166 patients (81.8%; group I), whereas 37 patients (18.2%) were discharged on triple therapy (aspirin, clopidogrel, warfarin; group II). The mean follow-up period was 42.0 ± 29.0 months. The mean international normalized ratio (INR) in group II was 1.83 ± 0.41. The overall MACCE rate was 16.3%, with stroke in 3.4%. The incidence of MACCE (19.3% vs 2.7%, P = 0.012) and cardiac death (11.4% vs 0%, P = 0.028) was higher in group I than in group II. Major and any bleeding, however, did not differ between the two groups. In multivariate analysis, the absence of warfarin therapy (odds ratio 7.8, 95% confidence interval 1.02-59.35; P = 0.048) was an independent predictor of MACCE. By Kaplan-Meier survival analysis, warfarin therapy was associated with a lower risk of MACCE (P = 0.024). In patients with AF undergoing coronary artery stenting, MACCE were reduced by warfarin therapy without increased bleeding, which might be related to tighter control at a lower INR value.

  20. Does geographical variability influence five-year MACCE rates in the multicentre SYNTAX revascularisation trial?

    PubMed

    Roy, Andrew K; Chevalier, Bernard; Lefèvre, Thierry; Louvard, Yves; Segurado, Ricardo; Sawaya, Fadi; Spaziano, Marco; Neylon, Antoinette; Serruys, Patrick A; Dawkins, Keith D; Kappetein, Arie Pieter; Mohr, Friedrich-Wilhelm; Colombo, Antonio; Feldman, Ted; Morice, Marie-Claude

    2017-09-20

    The use of multiple geographical sites in randomised cardiovascular trials may lead to important heterogeneity in treatment effects. This study aimed to determine whether treatment effects from different geographical recruitment regions had a significant impact on five-year MACCE rates in the SYNTAX trial. Five-year SYNTAX results (n=1,800) were analysed for geographical variability by site and country for the effect of treatment (CABG vs. PCI) on MACCE rates. Fixed, random, and linear mixed models were used to test clinical covariate effects, such as diabetes, lesion characteristics, and procedural factors. Comparing five-year MACCE rates, the pooled odds ratio (OR) between study sites was 0.58 (95% CI: 0.47-0.71), and between countries 0.59 (95% CI: 0.45-0.73). By homogeneity testing, no individual site (χ²=93.8, p=0.051) or country (χ²=25.7, p=0.080) differences were observed. For random effects models, the intraclass correlation was minimal (ICC site=5.1%, ICC country=1.5%, p<0.001), indicating minimal geographical heterogeneity, with a hazard ratio of 0.70 (95% CI: 0.59-0.83). Baseline risk (smoking, diabetes, PAD) did not influence regional five-year MACCE outcomes (ICC 1.3%-5.2%), nor did revascularisation of the left main vs. three-vessel disease (p=0.241), across site or country subgroups. For CABG patients, the number of arterial (p=0.49) or venous (p=0.38) conduits used also made no difference. Geographical variability had no significant treatment effect on MACCE rates at five years. These findings highlight the generalisability of the five-year outcomes of the SYNTAX study.

  1. Comorbidities and Ventricular Dysfunction Drive Excess Mid-Term Morbidity in an Indigenous Australian Coronary Revascularisation Cohort.

    PubMed

    Wiemers, Paul D; Marney, Lucy; White, Nicole; Bough, Georgina; Hustig, Alistair; Tan, Wei; Cheng, Ching-Siang; Kang, Dong; Yadav, Sumit; Tam, Robert; Fraser, John F

    2018-04-24

    There is a paucity of data regarding longer-term morbidity outcomes in Indigenous Australian patients undergoing coronary artery bypass grafting (CABG). No comparative data on re-infarction, stroke, or reintervention rates exist. Outcome data following percutaneous coronary intervention (PCI) are also extremely limited. Addressing this gap in knowledge is the major aim of our study. This was a single-centre cohort study conducted at the Townsville Hospital, Australia, which provides tertiary adult cardiac surgical services to the northern parts of the state of Queensland. It incorporated consecutive patients (n=350) undergoing isolated CABG procedures from 2008 to 2010, 20.9% (73/350) of whom were Indigenous Australians. The main outcome measure was major adverse cardiac or cerebrovascular events (MACCE) at mid-term follow-up (mean, 38.9 months). The incidence of MACCE among Indigenous Australian patients was approximately twice that of non-Indigenous patients at mid-term follow-up (36.7% vs. 18.6%; p=0.005; OR 2.525 [1.291-4.880]). Following adjustment for preoperative and operative variables, Indigenous Australian status itself was not significantly associated with MACCE (AOR 1.578 [0.637-3.910]). Significant associations with MACCE included renal impairment (AOR 2.198 [1.010-4.783]) and moderate-to-severe left ventricular impairment (AOR 3.697 [1.820-7.508]). An association between diabetes and MACCE failed to reach statistical significance (AOR 1.812 [0.941-3.490]). Indigenous Australians undergoing CABG suffer an excess of MACCE at longer-term follow-up. High rates of comorbidities in the Indigenous Australian population likely play an aetiological role. Copyright © 2018. Published by Elsevier B.V.

  2. Integrative marker analysis allows risk assessment for metastasis in stage II colon cancer.

    PubMed

    Nitsche, Ulrich; Rosenberg, Robert; Balmert, Alexander; Schuster, Tibor; Slotta-Huspenina, Julia; Herrmann, Pia; Bader, Franz G; Friess, Helmut; Schlag, Peter M; Stein, Ulrike; Janssen, Klaus-Peter

    2012-11-01

    The aim was individualized risk assessment in patients with UICC stage II colon cancer based on a panel of molecular genetic alterations. Risk assessment in patients with colon cancer and localized disease (UICC stage II) is not sufficiently reliable, and the development of metachronous metastasis is assumed to be governed largely by individual tumor genetics. Fresh-frozen tissue from 232 patients (T3-4, N0, M0) with complete tumor resection and a median follow-up of 97 months was analyzed for microsatellite stability and for KRAS exon 2 and BRAF exon 15 mutations. Gene expression of the WNT-pathway surrogate marker osteopontin and the metastasis-associated genes SASH1 and MACC1 was determined for 179 patients. The results were correlated with metachronous distant metastasis risk (n = 22 patients). Mutations of KRAS were detected in 30% of patients, mutations of BRAF in 15%, and microsatellite instability in 26%. Risk of recurrence was associated with KRAS mutation (P = 0.033), microsatellite-stable tumors (P = 0.015), decreased expression of SASH1 (P = 0.049), and increased expression of MACC1 (P < 0.001). MACC1 was the only independent parameter for recurrence prediction (hazard ratio: 6.2; 95% confidence interval: 2.4-16; P < 0.001). Integrative 2-step cluster analysis allocated patients into 4 groups according to their tumor genetics. KRAS mutation, BRAF wild type, microsatellite stability, and high MACC1 expression defined the group with the highest risk of recurrence (16%; 7 of 43), whereas BRAF wild type, microsatellite instability, and low MACC1 expression defined the group with the lowest risk (4%; 1 of 26). MACC1 expression predicts the development of metastases, outperforming microsatellite stability status as well as KRAS/BRAF mutation status.

  3. Weighting Composite Endpoints in Clinical Trials: Essential Evidence for the Heart Team

    PubMed Central

    Tong, Betty C.; Huber, Joel C.; Ascheim, Deborah D.; Puskas, John D.; Ferguson, T. Bruce; Blackstone, Eugene H.; Smith, Peter K.

    2013-01-01

    Background: Coronary revascularization trials often use a composite endpoint of major adverse cardiac and cerebrovascular events (MACCE). The usual practice in analyzing data with a composite endpoint is to assign equal weights to each of the individual MACCE elements. Non-inferiority margins are used to offset effects of presumably less important components, but their magnitudes are subject to bias. This study describes the relative importance of MACCE elements from a patient perspective. Methods: A discrete choice experiment was conducted. Survey respondents were presented with a scenario that would make them eligible for the SYNTAX 3-Vessel Disease cohort. Respondents chose among pairs of procedures that differed on the 3-year probability of MACCE, potential for increased longevity, and procedure/recovery time. Conjoint analysis derived relative weights for these attributes. Results: In all, 224 respondents completed the survey. The attributes did not have equal weight. Risk of death was most important (relative weight 0.23), followed by stroke (0.18), potential increased longevity and recovery time (0.17 each), MI (0.14), and risk of repeat revascularization (0.11). Applying these weights to the SYNTAX 3-year endpoints resulted in a persistent but decreased margin of difference in MACCE favoring CABG compared with PCI. When labeled only as "Procedure A" and "Procedure B," 87% of respondents chose CABG over PCI. When the procedures were labeled as "Coronary Stent" and "Coronary Bypass Surgery," only 73% chose CABG. Procedural preference varied with demographics, gender, and familiarity with the procedures. Conclusions: MACCE elements do not carry equal weight in a composite endpoint from a patient perspective. Using a weighted composite endpoint increases the validity of statistical analyses and trial conclusions. Patients are subject to bias by labels when considering coronary revascularization. PMID:22795064
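
    The weighting idea is plain arithmetic: each MACCE element's event rate is scaled by its patient-derived weight before the arms are compared. A minimal sketch using the four MACCE-element weights reported above, renormalised to sum to 1 (a choice of this sketch, not of the study); the event rates are placeholders, not SYNTAX data:

        # Sketch: equal-weight vs. patient-weighted MACCE composite, using
        # the four MACCE-element weights reported above (death 0.23, stroke
        # 0.18, MI 0.14, repeat revascularization 0.11), renormalised to sum
        # to 1 -- a choice of this sketch. Event rates are placeholders, NOT
        # SYNTAX results.

        RAW = {"death": 0.23, "stroke": 0.18, "mi": 0.14, "repeat_revasc": 0.11}
        WEIGHTS = {k: v / sum(RAW.values()) for k, v in RAW.items()}

        def equal_weight(rates: dict) -> float:
            # Conventional practice: every element counts the same.
            return sum(rates.values()) / len(rates)

        def patient_weight(rates: dict) -> float:
            # Elements scaled by patient-derived importance.
            return sum(WEIGHTS[k] * rates[k] for k in RAW)

        cabg = {"death": 0.04, "stroke": 0.03, "mi": 0.03, "repeat_revasc": 0.06}
        pci  = {"death": 0.05, "stroke": 0.02, "mi": 0.04, "repeat_revasc": 0.14}
        for arm, rates in (("CABG-like", cabg), ("PCI-like", pci)):
            print(arm, round(equal_weight(rates), 4), round(patient_weight(rates), 4))

    Because repeat revascularization carries the smallest weight, down-weighting it narrows a composite gap that is driven mainly by that element, which is the behaviour the abstract reports for SYNTAX.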

  4. Electrolytic conditioning of a magnesium aluminum chloride complex for reversible magnesium deposition

    DOE PAGES

    Barile, Christopher J.; Barile, Elizabeth C.; Zavadil, Kevin R.; ...

    2014-12-04

    In this report we describe the electrochemistry of Mg deposition and dissolution from the magnesium aluminum chloride complex (MACC). The results define the requirements for reversible Mg deposition and definitively establish that voltammetric cycling of the electrolyte significantly alters its composition and performance. Elemental analysis, scanning electron microscopy, and energy-dispersive X-ray spectroscopy (SEM-EDS) results demonstrate that irreversible Mg and Al deposits form during early cycles. Electrospray ionization-mass spectrometry (ESI-MS) data show that inhibitory oligomers develop in THF-based solutions. These oligomers form via the well-established mechanism of cationic ring-opening polymerization of THF, both during the initial synthesis of the MACC and under resting conditions. In contrast, MACC solutions in 1,2-dimethoxyethane (DME), an acyclic solvent, do not evolve as dramatically at open circuit potential. Furthermore, we propose a mechanism describing how the conditioning process of the MACC in THF improves its performance, by both tuning the Mg:Al stoichiometry and eliminating oligomers.

  5. Marginal abatement cost curve for nitrogen oxides incorporating controls, renewable electricity, energy efficiency, and fuel switching.

    PubMed

    Loughlin, Daniel H; Macpherson, Alexander J; Kaufman, Katherine R; Keaveny, Brian N

    2017-10-01

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs are typically developed by sorting control technologies by their relative cost-effectiveness. Other potentially important abatement measures, such as renewable electricity, energy efficiency, and fuel switching (RE/EE/FS), are often not incorporated into MACCs, as it is difficult to quantify their costs and abatement potential. In this paper, a U.S. energy system model is used to develop a MACC for nitrogen oxides (NOx) that incorporates both traditional controls and these additional measures. The MACC is decomposed by sector, and the relative cost-effectiveness of RE/EE/FS and traditional controls is compared. RE/EE/FS are shown to have the potential to increase emission reductions beyond what is possible when applying traditional controls alone; furthermore, a portion of RE/EE/FS appear to be cost-competitive with traditional controls for abating air pollutant emissions.
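
    The construction summarised above (sort measures by cost per ton, then accumulate their abatement) is easy to sketch. The measures and numbers below are invented for illustration and are not the paper's energy system model:

        # Sketch: building a marginal abatement cost curve by sorting
        # measures by $/ton and accumulating abatement. All measures and
        # numbers are invented for illustration.

        measures = [
            # (name, abatement potential in tons NOx, marginal cost in $/ton)
            ("Energy efficiency",      45_000,   800),  # an RE/EE/FS measure
            ("Low-NOx burners",        60_000, 1_200),  # a traditional control
            ("SCR retrofit",          120_000, 2_500),  # a traditional control
            ("Renewable electricity",  90_000, 3_800),  # an RE/EE/FS measure
            ("Fuel switching",         30_000, 5_500),  # an RE/EE/FS measure
        ]

        def build_macc(items):
            """Return (name, cumulative abatement, marginal cost) steps,
            cheapest measure first -- the sorting step described above."""
            curve, cumulative = [], 0
            for name, tons, cost in sorted(items, key=lambda m: m[2]):
                cumulative += tons
                curve.append((name, cumulative, cost))
            return curve

        for name, cum_tons, cost in build_macc(measures):
            print(f"{name:22s} {cum_tons:>9,} t cumulative at ${cost:,}/t")

    Reading the resulting step curve left to right gives the cheapest path to any target abatement level; interleaving RE/EE/FS with traditional controls, as in the paper, simply adds more steps to sort.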

  6. Insights Gained from Forensic Analysis with MELCOR of the Fukushima-Daiichi Accidents.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Nathan C.; Gauntt, Randall O.

    Since the accidents at Fukushima-Daiichi, Sandia National Laboratories has been modeling these accident scenarios using the severe accident analysis code MELCOR. MELCOR is a widely used computer code developed at Sandia National Laboratories since ~1982 for the U.S. Nuclear Regulatory Commission. Insights from the modeling of these accidents are being used to better inform future code development and potentially improve accident management. To date, the need to better capture in-vessel thermal-hydraulics and ex-vessel melt coolability and concrete interactions has led to the implementation of new models. The most recent analyses, presented in this paper, have been in support of the Organization for Economic Cooperation and Development Nuclear Energy Agency's (OECD/NEA) Benchmark Study of the Accident at the Fukushima Daiichi Nuclear Power Station (BSAF) project. The goal of this project is to accurately capture the source term from all three releases and then model the atmospheric dispersion. To do this, a forensic approach is used in which available plant data and release timings inform the modeled MELCOR accident scenario. For example, containment failures, core slumping events, and lower head failure timings are all enforced parameters in these analyses. This approach is fundamentally different from the blind code assessment often used in standard problem exercises. The timings of these events are informed by representative spikes or decreases in plant data. The combination of improvements to the MELCOR source code resulting from previous accident analyses and this forensic approach has allowed Sandia to generate representative and plausible source terms for all three accidents at Fukushima Daiichi out to three weeks after the accident, capturing both early and late releases. In particular, using the source terms developed with MELCOR as input to the MACCS code, which models atmospheric dispersion and deposition, we are able to reasonably capture the deposition of radionuclides to the northwest of the reactor site.

  7. Marginal abatement cost curves for NOx that account for renewable electricity, energy efficiency, and fuel switching

    EPA Science Inventory

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs typically are developed by sorting end-of-pipe controls by their resp...

  8. Regional and sectoral marginal abatement cost curves for NOx incorporating controls, renewable electricity, energy efficiency and fuel switching

    EPA Science Inventory

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs typically are developed by sorting end-of-pipe controls by their resp...

  9. Marginal abatement cost curve for NOx incorporating controls, renewable electricity, energy efficiency and fuel switching

    EPA Science Inventory

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs typically are developed by sorting end-of-pipe controls by their resp...

  10. Marginal abatement cost curve for NOx incorporating controls, renewable electricity, energy efficiency and fuel switching

    EPA Science Inventory

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs typically are developed by sorting end-of-pipe controls by their rela...

  11. Clinical outcomes of patients with hypothyroidism undergoing percutaneous coronary intervention

    PubMed Central

    Zhang, Ming; Sara, Jaskanwal D.S.; Matsuzawa, Yasushi; Gharib, Hossein; Bell, Malcolm R.; Gulati, Rajiv; Lerman, Lilach O.

    2016-01-01

    Aims: The aim of this study was to investigate the association between hypothyroidism and major adverse cardiovascular and cerebral events (MACCE) in patients undergoing percutaneous coronary intervention (PCI). Methods and results: Two thousand four hundred and thirty patients who underwent PCI were included. Subjects were divided into two groups: hypothyroidism (n = 686), defined either as a history of hypothyroidism or thyroid-stimulating hormone (TSH) ≥5.0 mU/mL, and euthyroidism (n = 1744), defined as no history of hypothyroidism and/or 0.3 mU/mL ≤ TSH < 5.0 mU/mL. Patients with hypothyroidism were further categorized as untreated (n = 193), or those taking thyroid replacement therapy (TRT) with adequate replacement (0.3 mU/mL ≤ TSH < 5.0 mU/mL, n = 175) or inadequate replacement (TSH ≥ 5.0 mU/mL, n = 318). Adjusted hazard ratios (HRs) were calculated using Cox proportional hazards models. Median follow-up was 3.0 years (interquartile range, 0.5-7.0). After adjustment for covariates, the risk of MACCE and its constituent parts was higher in patients with hypothyroidism compared with those with euthyroidism (MACCE: HR: 1.28, P = 0.0001; myocardial infarction (MI): HR: 1.25, P = 0.037; heart failure: HR: 1.46, P = 0.004; revascularization: HR: 1.26, P = 0.0008; stroke: HR: 1.62, P = 0.04). Compared with untreated patients or those with inadequate replacement, adequately treated hypothyroid patients had a lower risk of MACCE (HR: 0.69, P = 0.005; HR: 0.78, P = 0.045), cardiac death (HR: 0.43, P = 0.008), MI (HR: 0.50, P = 0.0004; HR: 0.60, P = 0.02), and heart failure (HR: 0.50, P = 0.02; HR: 0.52, P = 0.017). Conclusion: Hypothyroidism is associated with a higher incidence of MACCE compared with euthyroidism in patients undergoing PCI. Maintaining adequate control on TRT is beneficial in preventing MACCE. PMID:26757789
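
    The grouping rules in the Methods reduce to a pair of threshold tests on history and TSH. A minimal sketch spelling them out (the abstract's "and/or" in the euthyroidism definition is read here as "and"; the handling of suppressed TSH without a history is an assumption of this sketch):

        # Sketch: patient categorisation per the definitions in the abstract.
        # TSH in mU/mL. The "excluded" branch (no history, TSH < 0.3) is an
        # assumption of this sketch; the abstract does not state it.

        def thyroid_group(history_hypo: bool, on_trt: bool, tsh: float) -> str:
            """Hypothyroid = history of hypothyroidism or TSH >= 5.0;
            euthyroid = no history and 0.3 <= TSH < 5.0."""
            hypothyroid = history_hypo or tsh >= 5.0
            if not hypothyroid:
                return "euthyroid" if 0.3 <= tsh < 5.0 else "excluded (TSH < 0.3)"
            if not on_trt:
                return "hypothyroid, untreated"
            return ("hypothyroid, adequate TRT"
                    if 0.3 <= tsh < 5.0 else "hypothyroid, inadequate TRT")

        print(thyroid_group(False, False, 2.1))  # euthyroid
        print(thyroid_group(True,  True,  1.0))  # hypothyroid, adequate TRT
        print(thyroid_group(True,  True,  6.4))  # hypothyroid, inadequate TRT
        print(thyroid_group(False, False, 7.2))  # hypothyroid, untreated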

  12. Differential Event Rates and Independent Predictors of Long-Term Major Cardiovascular Events and Death in 5795 Patients With Unprotected Left Main Coronary Artery Disease Treated With Stents, Bypass Surgery, or Medication: Insights From a Large International Multicenter Registry.

    PubMed

    Kang, Se Hun; Ahn, Jung-Min; Lee, Cheol Hyun; Lee, Pil Hyung; Kang, Soo-Jin; Lee, Seung-Whan; Kim, Young-Hak; Lee, Cheol Whan; Park, Seong-Wook; Park, Duk-Woo; Park, Seung-Jung

    2017-07-01

    Identifying predictive factors for major cardiovascular events and death in patients with unprotected left main coronary artery disease is of great clinical value for risk stratification and possible guidance for tailored preventive strategies. The Interventional Research Incorporation Society-Left MAIN Revascularization registry included 5795 patients with unprotected left main coronary artery disease (percutaneous coronary intervention, n=2850; coronary-artery bypass grafting, n=2337; medication alone, n=608). We analyzed the incidence and independent predictors of major adverse cardiac and cerebrovascular events (MACCE; a composite of death, MI, stroke, or repeat revascularization) and all-cause mortality in each treatment stratum. During follow-up (median, 4.3 years), the rates of MACCE and death were substantially higher in the medical group than in the percutaneous coronary intervention and coronary-artery bypass grafting groups (P < 0.001). In the percutaneous coronary intervention group, the 3 strongest predictors for MACCE were chronic renal failure, old age (≥65 years), and previous heart failure; those for all-cause mortality were chronic renal failure, old age, and low ejection fraction. In the coronary-artery bypass grafting group, old age, chronic renal failure, and low ejection fraction were the 3 strongest predictors of MACCE and death. In the medication group, old age, low ejection fraction, and diabetes mellitus were the 3 strongest predictors of MACCE and death. Among patients with unprotected left main coronary artery disease, the key clinical predictors for MACCE and death were generally similar regardless of index treatment. This study provides effect estimates for clinically relevant predictors of long-term clinical outcomes in real-world left main coronary artery patients, providing possible guidance for tailored preventive strategies. URL: https://clinicaltrials.gov. Unique identifier: NCT01341327. © 2017 American Heart Association, Inc.

  13. Severity of OSAS, CPAP and cardiovascular events: A follow-up study.

    PubMed

    Baratta, Francesco; Pastori, Daniele; Fabiani, Mario; Fabiani, Valerio; Ceci, Fabrizio; Lillo, Rossella; Lolli, Valeria; Brunori, Marco; Pannitteri, Gaetano; Cravotto, Elena; De Vito, Corrado; Angelico, Francesco; Del Ben, Maria

    2018-05-01

    Previous studies have suggested obstructive sleep apnoea syndrome (OSAS) to be a major risk factor for incident cardiovascular events. However, the relationship between OSAS severity, the use of continuous positive airway pressure (CPAP) treatment, and the development of cardiovascular disease is still a matter of debate. The aim was to test the association between OSAS and cardiovascular events in patients with concomitant cardio-metabolic diseases, and the potential impact of CPAP therapy on cardiovascular outcomes. This was a prospective observational cohort study of consecutive outpatients with suspected metabolic disorders who had a complete clinical and biochemical workup, including polysomnography because of heavy snoring and possible OSAS. The primary endpoint was a composite of major adverse cardiovascular and cerebrovascular events (MACCE). Median follow-up was 81.3 months in 434 patients (2,701.2 person-years); 83 had primary snoring, 84 mild, 93 moderate, and 174 severe OSAS. The incidence of MACCE was 0.8% per year (95% confidence interval [CI] 0.2-2.1) in primary snorers and 2.1% per year (95% CI 1.5-2.8) in those with OSAS. A positive association was observed between event-free survival and OSAS severity (log-rank test; P = .041). A multivariable Cox regression analysis showed obesity (HR = 8.011, 95% CI 1.071-59.922, P = .043), moderate OSAS (vs non-OSAS, HR = 3.853, 95% CI 1.069-13.879, P = .039), and severe OSAS (vs non-OSAS, HR = 3.540, 95% CI 1.026-12.217, P = .045) to be predictors of MACCE. No significant association was observed between CPAP treatment and MACCE (log-rank test; P = .227). Our findings support the role of moderate/severe OSAS as a risk factor for incident MACCE. CPAP treatment was not associated with a lower rate of MACCE. © 2018 Stichting European Society for Clinical Investigation Journal Foundation.

  14. GLANCE - calculatinG heaLth impActs of atmospheric pollutioN in a Changing climatE

    NASA Astrophysics Data System (ADS)

    Vogel, Leif; Faria, Sérgio; Markandya, Anil

    2016-04-01

    Current estimates of annual global premature deaths from poor air quality are in the range of 2.6-4.4 million, and 2050 projections are expected to double relative to 2010 levels. In Europe, the annual economic burden is estimated at around €750 billion. Climate change will further exacerbate air pollution burdens; a better understanding of the economic impacts on human societies has therefore become an area of intense investigation. European research efforts have been carried out within the MACC project series, which started in 2005. The outcome of this work has been integrated into a European capacity for Earth observation, the Copernicus Atmosphere Monitoring Service (CAMS). In MACC/CAMS, key pollutant concentrations are computed at the European scale and globally by employing advanced chemistry-transport models. The project GLANCE (calculatinG heaLth impActs of atmospheric pollutioN in a Changing climatE) aims at developing an integrated assessment model for calculating the health impacts and damage costs of air pollution at different physical scales. It combines MACC/CAMS (assimilated Earth observations, an ensemble of chemical transport models, and state-of-the-art ECMWF weather forecasting) with downscaling based on in-situ network measurements. Strengthening modelled projections through integration with empirical evidence reduces errors and uncertainties in the health impact projections and the subsequent economic cost assessment. In addition, GLANCE will yield improved data accuracy at different time resolutions. The project is a multidisciplinary effort which brings together expertise from the natural sciences and socio-economic fields. Here, its general approach is presented together with first results for the years 2007-2012 on the European scale. The results on health impacts and economic burdens are compared to existing assessments.

  15. The Interplay of Al and Mg Speciation in Advanced Mg Battery Electrolyte Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    See, Kimberly A.; Chapman, Karena W.; Zhu, Lingyang

    2016-01-13

    Mg batteries are an attractive alternative to Li-based energy storage due to the possibility of higher volumetric capacities, with the added advantage of using sustainable materials. A promising emerging electrolyte for Mg batteries is the magnesium aluminum chloride complex (MACC), which shows high Mg electrodeposition and stripping efficiencies and relatively high anodic stability. As prepared, MACC is inactive with respect to Mg deposition; however, efficient Mg electrodeposition can be achieved following an electrolytic conditioning process. Through the use of Raman spectroscopy, surface-enhanced Raman spectroscopy, 27Al and 35Cl nuclear magnetic resonance spectroscopy, and pair distribution function analysis, we explore the active vs inactive complexes in the MACC electrolyte and demonstrate the codependence of Al and Mg speciation. These techniques report significant changes occurring in the bulk speciation of the conditioned electrolyte relative to the as-prepared solution. Analysis shows that the active Mg complex in conditioned MACC is very likely the [Mg2(μ-Cl)3·6THF]+ complex that is observed in the solid-state structure. Additionally, conditioning creates free Cl- in the electrolyte solution, and we suggest that the free Cl- adsorbs at the electrode surface to enhance Mg electrodeposition.

  16. Effect of growth phase on the fatty acid compositions of four species of marine diatoms

    NASA Astrophysics Data System (ADS)

    Liang, Ying; Mai, Kangsen

    2005-04-01

    The fatty acid compositions of four species of marine diatoms (Chaetoceros gracilis MACC/B13, Cylindrotheca fusiformis MACC/B211, Phaeodactylum tricornutum MACC/B221, and Nitzschia closterium MACC/B222), cultivated at 22°C ± 1°C and a salinity of 28 in f/2 medium and harvested in the exponential growth phase, the early stationary phase, and the late stationary phase, were determined. The results showed that growth phase has a significant effect on most fatty acid contents in the four species. The proportions of the 16:0 and 16:1n-7 fatty acids increased, while those of 16:3n-4 and eicosapentaenoic acid (EPA) decreased, with increasing culture age in all species studied. The subtotal of saturated fatty acids (SFA) increased with increasing culture age in all species with the exception of B13. The subtotal of monounsaturated fatty acids (MUFA) increased while that of polyunsaturated fatty acids (PUFA) decreased with culture age in all four species; MUFA reached their lowest value, and PUFA their highest, in the exponential growth phase.

  17. Air Support Control Officer Individual Position Training Simulation

    DTIC Science & Technology

    2017-06-01

    (Extract, including entries from the report's acronym list:) ... analysis, design, development, implementation, evaluation; ASCO: air support control officer; ASLT: air support liaison team; ASNO: air support net operator; ... instructional system design; LSTM: long short-term memory; MACCS: Marine Air Command and Control System; MAGTF: Marine Air Ground Task Force; MASS: Marine Air ... information to designated MACCS agencies. ASCOs play an important part in facilitating the safe and successful conduct of air operations in DASC-controlled ...

  18. Clinical outcomes of patients with hypothyroidism undergoing percutaneous coronary intervention.

    PubMed

    Zhang, Ming; Sara, Jaskanwal D S; Matsuzawa, Yasushi; Gharib, Hossein; Bell, Malcolm R; Gulati, Rajiv; Lerman, Lilach O; Lerman, Amir

    2016-07-07

    The aim of this study was to investigate the association between hypothyroidism and major adverse cardiovascular and cerebral events (MACCE) in patients undergoing percutaneous coronary intervention (PCI). Two thousand four hundred and thirty patients who underwent PCI were included. Subjects were divided into two groups: hypothyroidism (n = 686) defined either as a history of hypothyroidism or thyroid-stimulating hormone (TSH) ≥5.0 mU/mL, and euthyroidism (n = 1744) defined as no history of hypothyroidism and/or 0.3 mU/mL ≤ TSH < 5.0 mU/mL. Patients with hypothyroidism were further categorized as untreated (n = 193), or those taking thyroid replacement therapy (TRT) with adequate replacement (0.3 mU/mL ≤ TSH < 5.0 mU/mL, n = 175) or inadequate replacement (TSH ≥ 5.0 mU/mL, n = 318). Adjusted hazard ratios (HRs) were calculated using Cox proportional hazards models. Median follow-up was 3.0 years (interquartile range, 0.5-7.0). After adjustment for covariates, the risk of MACCE and its constituent parts was higher in patients with hypothyroidism compared with those with euthyroidism (MACCE: HR: 1.28, P = 0.0001; myocardial infarction (MI): HR: 1.25, P = 0.037; heart failure: HR: 1.46, P = 0.004; revascularization: HR: 1.26, P = 0.0008; stroke: HR: 1.62, P = 0.04). Compared with untreated patients or those with inadequate replacement, adequately treated hypothyroid patients had a lower risk of MACCE (HR: 0.69, P = 0.005; HR: 0.78, P = 0.045), cardiac death (HR: 0.43, P = 0.008), MI (HR: 0.50, P = 0.0004; HR: 0.60, P = 0.02), and heart failure (HR: 0.50, P = 0.02; HR: 0.52, P = 0.017). Hypothyroidism is associated with a higher incidence of MACCE compared with euthyroidism in patients undergoing PCI. Maintaining adequate control on TRT is beneficial in preventing MACCE. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2016. For permissions please email: journals.permissions@oup.com.

  19. The Minnesota Adolescent Community Cohort Study: Design and Baseline Results

    PubMed Central

    Forster, Jean; Chen, Vincent; Perry, Cheryl; Oswald, John; Willmorth, Michael

    2014-01-01

    The Minnesota Adolescent Community Cohort (MACC) Study is a population-based, longitudinal study that enrolled 3,636 youth from Minnesota and 605 youth from comparison states, aged 12 to 16 years, in 2000-2001. Participants have been surveyed by telephone semi-annually about their tobacco-related attitudes and behaviors. The goals of the study are to evaluate the effects of the Minnesota Youth Tobacco Prevention Initiative and its shutdown on youth smoking patterns, and to better define the patterns of development of tobacco use in adolescents. A multilevel sample was constructed representing individuals, local jurisdictions, and the entire state, and data are collected to characterize each of these levels. This paper presents the details of the multilevel study design. We also provide baseline information about MACC participants, including demographics and tobacco-related attitudes and behaviors, describe smoking prevalence at the local level, and compare MACC participants to the state as a whole. PMID:21360063

  20. Marginal abatement cost curves for NOx that account for ...

    EPA Pesticide Factsheets

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs are typically developed by sorting end-of-pipe controls by their respective cost-effectiveness. Alternative measures, such as renewable electricity, energy efficiency, and fuel switching (RE/EE/FS), are not considered, as it is difficult to quantify their abatement potential. In this paper, we demonstrate the use of an energy system model to develop a MACC for nitrogen oxides (NOx) that incorporates both end-of-pipe controls and these alternative measures. We decompose the MACC by sector and evaluate the cost-effectiveness of RE/EE/FS relative to end-of-pipe controls. RE/EE/FS are shown to produce considerable emission reductions after end-of-pipe controls have been exhausted. Furthermore, some RE/EE/FS are shown to be cost-competitive with end-of-pipe controls. The paper demonstrates how the MARKAL energy system model can be used to evaluate the potential role of RE/EE/FS in achieving NOx reductions: for this particular analysis, RE/EE/FS increase the quantity of NOx reductions available at a given marginal cost (ranging from $5k per ton to $40k per ton) by approximately 50%.

  1. Tropospheric chemistry in the integrated forecasting system of ECMWF

    NASA Astrophysics Data System (ADS)

    Flemming, J.; Huijnen, V.; Arteta, J.; Bechtold, P.; Beljaars, A.; Blechschmidt, A.-M.; Josse, B.; Diamantakis, M.; Engelen, R. J.; Gaudel, A.; Inness, A.; Jones, L.; Katragkou, E.; Marecal, V.; Peuch, V.-H.; Richter, A.; Schultz, M. G.; Stein, O.; Tsikerdekis, A.

    2014-11-01

    A representation of atmospheric chemistry has been included in the Integrated Forecasting System (IFS) of the European Centre for Medium-Range Weather Forecasts (ECMWF). The new chemistry modules complement the aerosol modules of the IFS for atmospheric composition, which is named C-IFS. C-IFS for chemistry supersedes a coupled system in which the chemical transport model (CTM) Model for OZone and Related chemical Tracers 3 was two-way coupled to the IFS (IFS-MOZART). This paper contains a description of the new on-line implementation, an evaluation with observations, and a comparison of the performance of C-IFS with MOZART and with a re-analysis of atmospheric composition produced by IFS-MOZART within the Monitoring Atmospheric Composition and Climate (MACC) project. The chemical mechanism of C-IFS is an extended version of the Carbon Bond 2005 (CB05) chemical mechanism as implemented in the CTM Transport Model 5 (TM5). CB05 describes tropospheric chemistry with 54 species and 126 reactions. Wet deposition and lightning nitrogen monoxide (NO) emissions are modelled in C-IFS using the detailed input of the IFS physics package. A one-year simulation by C-IFS, MOZART, and the MACC re-analysis is evaluated against ozonesondes, carbon monoxide (CO) aircraft profiles, European surface observations of ozone (O3), CO, sulphur dioxide (SO2), and nitrogen dioxide (NO2), as well as satellite retrievals of CO, tropospheric NO2, and formaldehyde. Anthropogenic emissions from the MACC/CityZen (MACCity) inventory and biomass burning emissions from the Global Fire Assimilation System (GFAS) data set were used in the simulations by both C-IFS and MOZART. C-IFS (CB05) showed improved performance with respect to MOZART for CO, upper tropospheric O3, and wintertime SO2, and was of similar accuracy for other evaluated species. C-IFS (CB05) is about ten times more computationally efficient than IFS-MOZART.

  2. Tropospheric chemistry in the Integrated Forecasting System of ECMWF

    NASA Astrophysics Data System (ADS)

    Flemming, J.; Huijnen, V.; Arteta, J.; Bechtold, P.; Beljaars, A.; Blechschmidt, A.-M.; Diamantakis, M.; Engelen, R. J.; Gaudel, A.; Inness, A.; Jones, L.; Josse, B.; Katragkou, E.; Marecal, V.; Peuch, V.-H.; Richter, A.; Schultz, M. G.; Stein, O.; Tsikerdekis, A.

    2015-04-01

    A representation of atmospheric chemistry has been included in the Integrated Forecasting System (IFS) of the European Centre for Medium-Range Weather Forecasts (ECMWF). The new chemistry modules complement the aerosol modules of the IFS for atmospheric composition, which is named C-IFS. C-IFS for chemistry supersedes a coupled system in which chemical transport model (CTM) Model for OZone and Related chemical Tracers 3 was two-way coupled to the IFS (IFS-MOZART). This paper contains a description of the new on-line implementation, an evaluation with observations and a comparison of the performance of C-IFS with MOZART and with a re-analysis of atmospheric composition produced by IFS-MOZART within the Monitoring Atmospheric Composition and Climate (MACC) project. The chemical mechanism of C-IFS is an extended version of the Carbon Bond 2005 (CB05) chemical mechanism as implemented in CTM Transport Model 5 (TM5). CB05 describes tropospheric chemistry with 54 species and 126 reactions. Wet deposition and lightning nitrogen monoxide (NO) emissions are modelled in C-IFS using the detailed input of the IFS physics package. A 1 year simulation by C-IFS, MOZART and the MACC re-analysis is evaluated against ozonesondes, carbon monoxide (CO) aircraft profiles, European surface observations of ozone (O3), CO, sulfur dioxide (SO2) and nitrogen dioxide (NO2) as well as satellite retrievals of CO, tropospheric NO2 and formaldehyde. Anthropogenic emissions from the MACC/CityZen (MACCity) inventory and biomass burning emissions from the Global Fire Assimilation System (GFAS) data set were used in the simulations by both C-IFS and MOZART. C-IFS (CB05) showed an improved performance with respect to MOZART for CO, upper tropospheric O3, and wintertime SO2, and was of a similar accuracy for other evaluated species. C-IFS (CB05) is about 10 times more computationally efficient than IFS-MOZART.

  3. Prognostic Implications of Dual Platelet Reactivity Testing in Acute Coronary Syndrome.

    PubMed

    de Carvalho, Leonardo P; Fong, Alan; Troughton, Richard; Yan, Bryan P; Chin, Chee-Tang; Poh, Sock-Cheng; Mejin, Melissa; Huang, Nancy; Seneviratna, Aruni; Lee, Chi-Hang; Low, Adrian F; Tan, Huay-Cheem; Chan, Siew-Pang; Frampton, Christopher; Richards, A Mark; Chan, Mark Y

    2018-02-01

    Studies on platelet reactivity (PR) testing commonly test PR only after percutaneous coronary intervention (PCI) has been performed. There are few data on pre- and post-PCI testing. Data on simultaneous testing of aspirin and adenosine diphosphate antagonist response are conflicting. We investigated the prognostic value of combined serial assessments of high on-aspirin PR (HASPR) and high on-adenosine diphosphate receptor antagonist PR (HADPR) in patients with acute coronary syndrome (ACS). HASPR and HADPR were assessed in 928 ACS patients before (initial test) and 24 hours after (final test) coronary angiography, with or without revascularization. Patients with HASPR on the initial test, compared with those without, had significantly higher intraprocedural thrombotic events (IPTE) (8.6 vs. 1.2%, p ≤ 0.001) and higher 30-day major adverse cardiovascular and cerebrovascular events (MACCE; 5.2 vs. 2.3%, p = 0.05), but not 12-month MACCE (13.0 vs. 15.1%, p = 0.50). Patients with initial HADPR, compared with those without, had significantly higher IPTE (4.4 vs. 0.9%, p = 0.004), but not 30-day (3.5 vs. 2.3%, p = 0.32) or 12-month MACCE (14.0 vs. 12.5%, p = 0.54). The c-statistic of the Global Registry of Acute Coronary Events (GRACE) score alone, GRACE score + ASPR test, and GRACE score + ADPR test for discriminating 30-day MACCE was 0.649, 0.803, and 0.757, respectively. Final ADPR was associated with 30-day MACCE among patients with intermediate-to-high GRACE scores (adjusted odds ratio [OR]: 4.50, 95% confidence interval [CI]: 1.14-17.66), but not low GRACE scores (adjusted OR: 1.19, 95% CI: 0.13-10.79). In conclusion, both HASPR and HADPR predict ischaemic events in ACS. This predictive utility is time-dependent and risk-dependent. Schattauer GmbH Stuttgart.
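
    The c-statistics quoted above are areas under the ROC curve for discriminating 30-day MACCE with and without the added platelet-reactivity marker. A hedged sketch of that comparison on synthetic data (only the cohort size comes from the abstract; the data model and coefficients are invented, and scikit-learn is used in place of the study's statistical software):

        # Sketch: c-statistic (ROC AUC) of a risk score alone vs. the score
        # plus an added platelet-reactivity marker, on synthetic data.
        # Only the cohort size (928) comes from the abstract; the data model
        # and coefficients below are invented.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 928
        grace = rng.normal(120, 30, n)           # synthetic GRACE-like score
        haspr = rng.binomial(1, 0.2, n)          # synthetic reactivity flag
        logit = -8 + 0.04 * grace + 1.2 * haspr  # invented event model
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        for X in (grace.reshape(-1, 1), np.column_stack([grace, haspr])):
            auc = roc_auc_score(y, LogisticRegression().fit(X, y)
                                   .predict_proba(X)[:, 1])
            print(f"c-statistic: {auc:.3f}")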

  4. microRNA-598 inhibits cell proliferation and invasion of glioblastoma by directly targeting metastasis associated in colon cancer-1.

    PubMed

    Wang, Ning; Zhang, Yang; Liang, Huaxin

    2018-02-14

    The dysregulation of microRNA (miRNA) expression is closely related to tumorigenesis and tumour development in glioblastoma (GBM). In this study, we found that miRNA-598 (miR-598) expression was significantly downregulated in GBM tissues and cell lines. Restoring miR-598 expression inhibited cell proliferation and invasion in GBM. Moreover, we validated that metastasis associated in colon cancer-1 (MACC1) is a novel target of miR-598 in GBM. Recovered MACC1 expression reversed the inhibitory effects of miR-598 overexpression on GBM cells. In addition, miR-598 overexpression suppressed activation of the Met/AKT pathway in GBM. Our results provide compelling evidence that miR-598 serves a tumour-suppressive role in GBM and that its anti-oncogenic effects are mediated chiefly through direct suppression of MACC1 expression and regulation of the Met/AKT signalling pathway. Therefore, miR-598 is a potential target in the treatment of GBM.

  5. Anterior Cingulate Glutamate Is Reduced by Acamprosate Treatment in Patients With Alcohol Dependence.

    PubMed

    Frye, Mark A; Hinton, David J; Karpyak, Victor M; Biernacka, Joanna M; Gunderson, Lee J; Feeder, Scott E; Choi, Doo-Sup; Port, John D

    2016-12-01

    Although the precise mechanism of action of acamprosate remains unclear, its antidipsotropic effect is mediated in part through glutamatergic neurotransmission. We evaluated the effect of 4 weeks of acamprosate treatment on proton magnetic resonance spectroscopy glutamate levels in the midline anterior cingulate cortex (MACC) in a cohort of 13 subjects with alcohol dependence (confirmed by a structured interview based on the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision). We compared metabolite levels with those of a group of 16 healthy controls. The Pennsylvania Alcohol Craving Scale was used to assess craving intensity. At baseline, before treatment, the mean cerebrospinal fluid-corrected MACC glutamate (Glu) level was significantly elevated in subjects with alcohol dependence compared with controls (P = 0.004). Four weeks of acamprosate treatment reduced glutamate levels (P = 0.025), an effect that was not observed in subjects who did not take acamprosate. At baseline, there was a significant positive correlation between craving, measured by the Pennsylvania Alcohol Craving Scale, and MACC Glu levels (P = 0.019). Overall, these data suggest a normalizing effect of acamprosate on the hyperglutamatergic state observed in recently withdrawn patients with alcohol dependence, and a positive association between MACC glutamate levels and craving intensity in early abstinence. Further research is needed to evaluate the use of these findings in clinical practice, including monitoring of craving intensity and individualized selection of treatment with antidipsotropic medications in subjects with alcohol dependence.

  6. Profiling of Resistance Patterns & Oncogenic Signaling Pathways in Evaluation of Cancers of the Thorax and Therapeutic Target Identification

    DTIC Science & Technology

    2010-06-01

    ... mutation signature is prognostic in EGFR wild-type lung adenocarcinomas and identifies metastasis associated in colon cancer 1 (MACC1) as an EGFR ... T790M mutation (N=7, blue curve) (AUC: area under the curve). Figure 3: the EGFR dependency signature is a favorable prognostic factor. EGFR index ... developed. The signature was shown to be prognostic regardless of EGFR status. The results also suggest MACC1 to be a regulator of MET in NSCLC.

  7. Evaluation of the MACC operational forecast system - potential and challenges of global near-real-time modelling with respect to reactive gases in the troposphere

    NASA Astrophysics Data System (ADS)

    Wagner, A.; Blechschmidt, A.-M.; Bouarar, I.; Brunke, E.-G.; Clerbaux, C.; Cupeiro, M.; Cristofanelli, P.; Eskes, H.; Flemming, J.; Flentje, H.; George, M.; Gilge, S.; Hilboll, A.; Inness, A.; Kapsomenakis, J.; Richter, A.; Ries, L.; Spangl, W.; Stein, O.; Weller, R.; Zerefos, C.

    2015-12-01

    The Monitoring Atmospheric Composition and Climate (MACC) project represents the European Union's Copernicus Atmosphere Monitoring Service (CAMS) (http://www.copernicus.eu/), which became fully operational during 2015. The global near-real-time MACC model production run for aerosol and reactive gases provides daily analyses and 5-day forecasts of atmospheric composition fields. It is the only operational assimilation system worldwide producing global analyses and forecasts of reactive gases and aerosol fields. We have investigated the ability of the MACC analysis system to simulate tropospheric concentrations of reactive gases covering the period between 2009 and 2012. A validation was performed based on carbon monoxide (CO), nitrogen dioxide (NO2), and ozone (O3) surface observations from the Global Atmosphere Watch (GAW) network, the O3 surface observations from the European Monitoring and Evaluation Programme (EMEP), and, furthermore, NO2 tropospheric columns, as well as CO total columns, derived from satellite sensors. The MACC system proved capable of reproducing reactive gas concentrations with consistent quality, albeit with a seasonally dependent bias compared with surface and satellite observations: for northern hemispheric surface O3 mixing ratios, positive biases appear during the warm seasons and negative biases during the cold parts of the year, with monthly modified normalised mean biases (MNMBs) ranging between -30 and 30% at the surface. Model biases are likely to result from difficulties in the simulation of vertical mixing at night and deficiencies in the model's dry deposition parameterisation. Observed tropospheric columns of NO2 and CO could be reproduced correctly during the warm seasons but are mostly underestimated by the model during the cold seasons, when anthropogenic emissions are at their highest, especially over the US, Europe, and Asia. Monthly MNMBs in the satellite data evaluation range between -110 and 40% for NO2 and reach -20% for CO over the investigated regions. The underestimation is likely to result from a combination of errors in the dry deposition parameterisation, certain limitations in the current emission inventories, and an insufficiently established seasonality in the emissions.
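
    For reference, the modified normalised mean bias (MNMB) quoted above is, as conventionally defined in the MACC/CAMS evaluation literature (the abstract itself does not restate the formula), for forecasts f_i and observations o_i over N pairs:

        \mathrm{MNMB} = \frac{2}{N} \sum_{i=1}^{N} \frac{f_i - o_i}{f_i + o_i}

    The metric is bounded between -2 and +2 and is symmetric with respect to over- and under-prediction, which is why monthly values can be quoted on a percentage scale, as in the -30 to 30% figures above.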

  8. Angiographic outcomes following stenting or coronary artery bypass surgery of the left main coronary artery: fifteen-month outcomes from the synergy between PCI with TAXUS express and cardiac surgery left main angiographic substudy (SYNTAX-LE MANS).

    PubMed

    Morice, Marie-Claude; Feldman, Ted E E; Mack, Michael J; Ståhle, Elisabeth; Holmes, David R; Colombo, Antonio; Morel, Marie-Angèle; van den Brand, Marcel; Serruys, Patrick W; Mohr, Friedrich; Carrié, Didier; Fournial, Gérard; James, Stefan; Leadley, Katrin; Dawkins, Keith D; Kappetein, A Pieter

    2011-10-30

    The SYNTAX-LE MANS substudy prospectively evaluated 15-month angiographic and clinical outcomes in patients with treated left main (LM) disease. In the SYNTAX trial, 1,800 patients with three-vessel and/or LM disease were randomised to either CABG or PCI; of these, 271 LM patients were prospectively assigned to receive a 15-month angiogram. The primary endpoint for the CABG arm was the ratio of ≥50% to <100% obstructed/occluded grafts bypassing LM lesions to the number placed. The primary endpoint for the PCI arm was the proportion of patients with ≤50% diameter stenosis ('patent' stents) of treated LM lesions. Per protocol, no formal comparison between CABG and PCI arms was intended based on the differing primary endpoints. Available 15-month angiograms were analysed for 114 CABG and 149 PCI patients. At 15 months, 9.9% (26/263) of CABG grafts were 100% occluded and an additional 5.7% (15/263) were ≥50% to <100% occluded. Overall, 27.2% (31/114) of patients had ≥1 obstructed/occluded graft. The 15-month CABG MACCE rate was 8.8% (10/114) and MACCE at 15 months was not significantly associated with graft obstruction/occlusion (p=0.85). In the PCI arm, 92.4% (134/145) of patients had ≤50% diameter LM stenosis at 15 months (89.7% [87/97] distal LM lesions and 97.9% [47/48] non-distal LM lesions). The 15-month PCI MACCE rate was 12.8% (20/156) and this was significantly associated with lack of stent patency at 15 months (p<0.001), mainly due to repeat revascularisation. At 15 months, 15.6% (41/263) of grafts were at least 50% obstructed but this was not significantly associated with MACCE; 92.4% (134/145) of patients had stents that remained patent at 15 months, and stent restenosis was significantly associated with MACCE, predominantly due to revascularisation.

  9. 2-year results of the AUTAX (Austrian Multivessel TAXUS-Stent) registry beyond the SYNTAX (synergy between percutaneous coronary intervention with TAXUS and cardiac surgery) study.

    PubMed

    Gyöngyösi, Mariann; Christ, Günter; Lang, Irene; Kreiner, Gerhard; Sochor, Heinz; Probst, Peter; Neunteufl, Thomas; Badr-Eslam, Rosa; Winkler, Susanne; Nyolczas, Noemi; Posa, Aniko; Leisch, Franz; Karnik, Ronald; Siostrzonek, Peter; Harb, Stefan; Heigert, Matthias; Zenker, Gerald; Benzer, Werner; Bonner, Gerhard; Kaider, Alexandra; Glogar, Dietmar

    2009-08-01

    The multicenter AUTAX (Austrian Multivessel TAXUS-Stent) registry investigated the 2-year clinical and angiographic outcomes of patients with multivessel coronary artery disease after implantation of TAXUS Express stents (Boston Scientific, Natick, Massachusetts) in a "real-world" setting. The AUTAX registry included patients with 2- or 3-vessel disease, with or without previous percutaneous coronary intervention (PCI) and concomitant surgery. Patients (n = 441, 64 +/- 12 years, 78% men) (n = 1,080 lesions) with possible complete revascularization by PCI were prospectively included. Median clinical follow-up was 753 (quartiles 728 to 775) days after PCI in 95.7% of patients, with control angiography in 78% at 6 months. The primary end point was the composite of major adverse cardiac (nonfatal acute myocardial infarction [AMI], all-cause mortality, target lesion revascularization [TLR]) and cerebrovascular events (MACCE). Potential risk factor effects on 2-year MACCE were evaluated using Cox regression. Complete revascularization was successful in 90.5%, with left main PCI in 6.8%. Rates of acute, subacute, and late stent thrombosis were 0.7%, 0.5%, and 0.5%. Two-year follow-up identified AMI (1.4%), death (3.6%), stroke (0.2%), and TLR (13.1%), for a composite MACCE rate of 18.3%. The binary restenosis rate was 10.8%. The median cumulative SYNTAX score was 23.0 (range 12.0 to 56.5). The SYNTAX score did not predict TLR or MACCE, due to the lack of scoring of restenotic or bypass stenoses (29.8%). Age (hazard ratio [HR]: 1.03, p = 0.019) and acute coronary syndrome (HR: 2.1, p = 0.001) were significant predictors of 2-year MACCE. Incomplete revascularization predicted death or AMI (HR: 3.84, p = 0.002). With the aim of complete revascularization, TAXUS stent implantation can be safe for patients with multivessel disease. The AUTAX registry, which included patients with post-PCI lesions, provides additional information to the SYNTAX (Synergy Between Percutaneous Coronary Intervention With TAXUS and Cardiac Surgery) study. (Austrian Multivessel TAXUS-Stent Registry; NCT00738686).

  10. Influence of sleep-disordered breathing assessed by pulse oximetry on long-term clinical outcomes in patients who underwent percutaneous coronary intervention.

    PubMed

    Yatsu, Shoichiro; Naito, Ryo; Kasai, Takatoshi; Matsumoto, Hiroki; Shitara, Jun; Shimizu, Megumi; Murata, Azusa; Kato, Takao; Suda, Shoko; Hiki, Masaru; Sai, Eiryu; Miyauchi, Katsumi; Daida, Hiroyuki

    2018-03-31

    Sleep-disordered breathing (SDB) has been recognized as an important risk factor for coronary artery disease (CAD). However, SDB has not been fully examined in this setting because access to formal sleep studies is limited. Nocturnal pulse oximetry has been suggested as a useful tool for evaluating SDB. Therefore, the aim of this study was to investigate the influence of SDB, assessed by nocturnal pulse oximetry, on clinical outcomes in patients who underwent percutaneous coronary intervention (PCI). We conducted a prospective, multicenter, observational cohort study in which SDB was assessed by finger pulse oximetry in patients who underwent PCI from January 2014 to December 2016. SDB was defined as a 4% oxygen desaturation index of ≥5. The primary endpoint was a major adverse cardiac or cerebrovascular event (MACCE), defined as a composite of all-cause mortality, acute coronary syndrome, and/or stroke. Of 539 patients, 296 (54.9%) had SDB. MACCE occurred in 32 patients (5.8%) during a median follow-up of 1.9 years. The cumulative incidence of MACCE was significantly higher in patients with SDB (P = 0.0134). In a stepwise multivariable Cox proportional hazards model, the presence of SDB was a significant predictor of MACCE (hazard ratio 2.26; 95% confidence interval 1.05-5.4; P = 0.036). SDB determined by nocturnal pulse oximetry was associated with worse clinical outcomes in patients who underwent PCI. Screening for SDB with nocturnal pulse oximetry is therefore considered important for risk stratification in patients with CAD.
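
    A 4% oxygen desaturation index can be computed directly from an SpO2 trace. Scoring rules differ between laboratories, so the sketch below (moving-average baseline, dips of at least 4 percentage points, events per hour of recording; all names and thresholds illustrative) is one plausible reading, not the study's actual algorithm:

    ```python
    # Hedged sketch: counting 4% desaturation events from an SpO2 trace.
    import numpy as np

    def odi4(spo2, fs_hz=1.0, baseline_win_s=120, min_drop=4.0):
        """4% oxygen desaturation index: events per hour of recording."""
        spo2 = np.asarray(spo2, dtype=float)
        win = max(1, int(baseline_win_s * fs_hz))
        # Moving-average baseline (a common, simple choice).
        baseline = np.convolve(spo2, np.ones(win) / win, mode="same")
        below = spo2 <= baseline - min_drop
        # Count rising edges: each entry into the desaturated state is one event.
        events = np.count_nonzero(below[1:] & ~below[:-1]) + int(below[0])
        hours = len(spo2) / fs_hz / 3600.0
        return events / hours

    # Example: 8 h of 1 Hz data with one synthetic desaturation dip.
    rng = np.random.default_rng(0)
    trace = 97 + rng.normal(0, 0.3, 8 * 3600)
    trace[5000:5060] -= 6
    print(f"4% ODI = {odi4(trace):.2f} events/h")
    ```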

  11. Access to MISR Aerosol Data and Imagery for the GoMACCS Field Study

    NASA Astrophysics Data System (ADS)

    Ritchey, N.; Watkinson, T.; Davis, J.; Walter, J.; Protack, S.; Matthews, J.; Smyth, M.; Rheingans, B.; Gaitley, B.; Ferebee, M.; Haberer, S.

    2006-12-01

    NASA Langley Atmospheric Science Data Center (ASDC) and NASA Jet Propulsion Laboratory (JPL) Multi-angle Imaging SpectroRadiometer (MISR) teams collaborated to provide special data products and images in an innovative approach for the Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS) field campaign. GoMACCS was an intensive field study focused on providing a better understanding of the sources and atmospheric processes responsible for the formation and distribution of ozone and aerosols in the atmosphere, the influence these species have on the radiative forcing of regional and global climate, and their impact on human health and regional haze. The study area encompassed Texas and the northwestern Gulf of Mexico. Numerous U.S. Government agencies, universities and commercial entities participated in the field campaign, which took place from August through September 2006. Aerosol and meteorological measurements were provided by a network of instruments on land, buoys and ships, by airborne in situ and remote instruments, and by satellite retrievals. MISR's role in GoMACCS was to provide satellite retrievals of aerosol and cloud properties and imagery as quickly as possible after data acquisition. The diverse group of scientific participants created unique opportunities for ASDC and MISR to develop special data products and images that were easily accessible to all participants. Examples of the data products, images and access methods, as well as the data and imagery flow, will be presented. Additional information about ASDC and MISR is available from the following web sites: http://eosweb.larc.nasa.gov and http://www-misr.jpl.nasa.gov/.

  12. A regional air quality forecasting system over Europe: the MACC-II daily ensemble production

    NASA Astrophysics Data System (ADS)

    Marécal, V.; Peuch, V.-H.; Andersson, C.; Andersson, S.; Arteta, J.; Beekmann, M.; Benedictow, A.; Bergström, R.; Bessagnet, B.; Cansado, A.; Chéroux, F.; Colette, A.; Coman, A.; Curier, R. L.; Denier van der Gon, H. A. C.; Drouin, A.; Elbern, H.; Emili, E.; Engelen, R. J.; Eskes, H. J.; Foret, G.; Friese, E.; Gauss, M.; Giannaros, C.; Guth, J.; Joly, M.; Jaumouillé, E.; Josse, B.; Kadygrov, N.; Kaiser, J. W.; Krajsek, K.; Kuenen, J.; Kumar, U.; Liora, N.; Lopez, E.; Malherbe, L.; Martinez, I.; Melas, D.; Meleux, F.; Menut, L.; Moinat, P.; Morales, T.; Parmentier, J.; Piacentini, A.; Plu, M.; Poupkou, A.; Queguiner, S.; Robertson, L.; Rouïl, L.; Schaap, M.; Segers, A.; Sofiev, M.; Thomas, M.; Timmermans, R.; Valdebenito, Á.; van Velthoven, P.; van Versendaal, R.; Vira, J.; Ung, A.

    2015-03-01

    This paper describes the pre-operational analysis and forecasting system developed during the MACC (Monitoring Atmospheric Composition and Climate) project and continued in the MACC-II (Monitoring Atmospheric Composition and Climate: Interim Implementation) European project to provide air quality services for the European continent. The paper gives an overall picture of its status at the end of MACC-II (summer 2014). The system is based on seven state-of-the-art models developed and run in Europe (CHIMERE, EMEP, EURAD-IM, LOTOS-EUROS, MATCH, MOCAGE and SILAM), which are used to calculate multi-model ensemble products. The MACC-II system provides daily 96 h forecasts with hourly outputs of 10 chemical species/aerosols (O3, NO2, SO2, CO, PM10, PM2.5, NO, NH3, total NMVOCs and PAN + PAN precursors) over 8 vertical levels from the surface to 5 km height. The hourly analysis at the surface is done a posteriori for the past day using a selection of representative air quality data from European monitoring stations. The performance of the system is assessed daily, weekly and seasonally (every 3 months) through statistical indicators calculated using the available representative air quality data from European monitoring stations. Results for a case study show the ability of the median ensemble to forecast regional ozone pollution events. The same period is used to illustrate that the median ensemble generally outperforms each of the individual models and that it remains robust even if two of the seven models are missing. The seasonal performance of the individual models and of the multi-model ensemble has been monitored since September 2009 for ozone, NO2 and PM10 and shows an overall improvement over time. The changes in the skill of the ensemble over the past two summers for ozone and the past two winters for PM10 are discussed in the paper. While the evolution of the ozone scores is not significant, there are improvements for PM10 over the past two winters that can be at least partly attributed to new developments on aerosols in the seven individual models. Nevertheless, the year-to-year changes in model and ensemble skill are also linked to the variability of the meteorological conditions and of the set of observations used to calculate the statistical indicators. In parallel, a scientific analysis of the results of the seven models and of the ensemble is carried out over the Mediterranean area because of the specificity of its meteorology and emissions. The system is robust in terms of production availability. Major efforts have been made in MACC-II towards the operationalisation of all its components. Foreseen developments and research for improving its performance are discussed in the conclusion.
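
    The ensemble product described here is, at its core, a grid-point-wise median across the member models. A minimal sketch under that reading (array shapes and member values invented, not MACC-II output):

    ```python
    # Hedged sketch: a grid-point-wise multi-model median ensemble.
    import numpy as np

    # Hypothetical stack of surface-O3 forecasts: (n_models, ny, nx).
    rng = np.random.default_rng(1)
    members = 60 + 10 * rng.normal(size=(7, 4, 5))   # seven models, small grid

    ens_median = np.median(members, axis=0)   # robust to a single outlier model

    # Robustness check mentioned above: drop two members and recompute.
    ens_median_5 = np.median(members[:5], axis=0)
    print(f"max change when two members are missing: "
          f"{np.abs(ens_median - ens_median_5).max():.2f}")
    ```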

  13. Lipoprotein(a) levels predict adverse vascular events after acute myocardial infarction.

    PubMed

    Mitsuda, Takayuki; Uemura, Yusuke; Ishii, Hideki; Takemoto, Kenji; Uchikawa, Tomohiro; Koyasu, Masayoshi; Ishikawa, Shinji; Miura, Ayako; Imai, Ryo; Iwamiya, Satoshi; Ozaki, Yuta; Kato, Tomohiro; Shibata, Rei; Watarai, Masato; Murohara, Toyoaki

    2016-12-01

    Lipoprotein(a) [Lp(a)], the level of which is genetically determined, has been reported as an independent risk factor for atherosclerotic vascular disease. However, the prognostic value of Lp(a) for secondary vascular events in patients with established coronary artery disease has not been fully elucidated. This 3-year observational study included a total of 176 patients with ST-elevation myocardial infarction (STEMI) whose Lp(a) levels were measured within 24 h after primary percutaneous coronary intervention. We divided the enrolled patients into two groups according to Lp(a) level and investigated the association between Lp(a) and the incidence of major adverse cardiac and cerebrovascular events (MACCE). Kaplan-Meier analysis demonstrated that patients with higher Lp(a) levels had a higher incidence of MACCE than those with lower Lp(a) levels (log-rank P = 0.034). Multivariate Cox regression analysis revealed that Lp(a) levels were independently correlated with the occurrence of MACCE after adjusting for other classical risk factors for atherosclerotic vascular disease (hazard ratio 1.030, 95% confidence interval: 1.011-1.048, P = 0.002). In receiver-operating characteristic (ROC) curve analysis, the cutoff value maximizing the predictive power of Lp(a) was 19.0 mg/dl (area under the curve = 0.674, sensitivity 69.2%, specificity 62.0%). Evaluating Lp(a) in addition to the established coronary risk factors improved their predictive value for the occurrence of MACCE. In conclusion, Lp(a) levels at admission independently predict secondary vascular events in patients with STEMI. Lp(a) might provide useful information for the development of secondary prevention strategies in patients with myocardial infarction.
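
    A cutoff like the one reported above is typically found by maximizing Youden's J (sensitivity + specificity − 1) along the ROC curve. A hedged sketch on synthetic data (not the study's Lp(a) measurements):

    ```python
    # Hedged sketch: picking a biomarker cutoff by maximizing Youden's J.
    # Requires: pip install scikit-learn numpy
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(2)
    lpa = np.concatenate([rng.lognormal(2.6, 0.6, 120),    # no MACCE
                          rng.lognormal(3.1, 0.6, 56)])    # MACCE
    event = np.concatenate([np.zeros(120), np.ones(56)])

    fpr, tpr, thresholds = roc_curve(event, lpa)
    j = tpr - fpr                       # Youden's J at each candidate threshold
    best = np.argmax(j)
    print(f"AUC = {roc_auc_score(event, lpa):.3f}, "
          f"cutoff ~ {thresholds[best]:.1f} mg/dl, "
          f"sens = {tpr[best]:.2f}, spec = {1 - fpr[best]:.2f}")
    ```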

  14. Mid-latitude storm track variability and its influence on atmospheric composition

    NASA Astrophysics Data System (ADS)

    Knowland, K. E.; Doherty, R. M.; Hodges, K.

    2013-12-01

    Using the storm-tracking algorithm TRACK (Hodges, 1994, 1995, 1999), we have studied the behaviour of storm tracks in the North Atlantic basin, using 850-hPa relative vorticity from the ERA-Interim Reanalysis (Dee et al., 2011). We have correlated surface ozone measurements at rural coastal sites in Europe with the storm track data to explore the role that mid-latitude cyclones and their transport of pollutants play in determining surface air quality in Western Europe. To further investigate this relationship, we have applied TRACK to the Monitoring Atmospheric Composition and Climate (MACC) Reanalysis dataset (Inness et al., 2013). The MACC Reanalysis is a 10-year dataset that couples a chemistry transport model (MOZART-3; Stein 2009, 2012) to an extended version of the European Centre for Medium-Range Weather Forecasts' (ECMWF) Integrated Forecast System (IFS). Storm tracks in the MACC Reanalysis compare well to those derived from the ERA-Interim Reanalysis for the same 10-year period, as both are based on the ECMWF IFS. We also compare surface ozone values from MACC with the surface ozone measurements studied previously. Using TRACK, we follow ozone (O3) and carbon monoxide (CO) through the life cycle of storms from North America to Western Europe. Along the storm tracks, we examine the distribution of CO and O3 within 6 degrees of the centre of each storm and vertically at different pressure levels in the troposphere. We aim to better understand the mechanisms by which pollution is vented from the boundary layer to the free troposphere, as well as the transport of pollutants to rural areas. Our hope is to give policy makers more detailed information on how climate variability associated with storm tracks between 1979 and 2013 may affect air quality in the Northeast USA and Western Europe.

  15. Seasonal and interannual variability of carbon monoxide based on MOZAIC observations, MACC reanalysis, and model simulations over an urban site in India

    NASA Astrophysics Data System (ADS)

    Sheel, Varun; Sahu, L. K.; Kajino, M.; Deushi, M.; Stein, O.; Nedelec, P.

    2014-07-01

    The spatial and temporal variations of carbon monoxide (CO) are analyzed over a tropical urban site, Hyderabad (17°27'N, 78°28'E) in central India. We have used vertical profiles from the Measurement of ozone and water vapor by Airbus in-service aircraft (MOZAIC) observations, the Monitoring Atmospheric Composition and Climate (MACC) reanalysis, and two chemical transport model simulations (the Model for Ozone And Related Tracers (MOZART) and the MRI global Chemistry Climate Model (MRI-CCM2)) for the years 2006-2008. In the lower troposphere, the CO mixing ratio showed strong seasonality, with higher levels (>300 ppbv) during the winter and premonsoon seasons associated with a stable anticyclonic circulation, while lower CO values (up to 100 ppbv) were observed in the monsoon season. In the planetary boundary layer (PBL), the seasonal distribution of CO shows the impact of both local meteorology and emissions. While PBL CO is predominantly influenced by strong winds bringing regional background air from marine and biomass burning regions, under calm conditions CO levels are elevated by local emissions. In the free troposphere, on the other hand, the seasonal variation reflects the impact of long-range transport associated with the Intertropical Convergence Zone and biomass burning. The interannual variations were mainly due to the transition from El Niño to La Niña conditions. The overall modified normalized mean biases (MNMBs; normalization based on the observed and model mean values) with respect to the observed CO profiles were lower for the MACC reanalysis than for the MOZART and MRI-CCM2 models. CO in the PBL was consistently underestimated by the MACC reanalysis during all seasons, while MOZART and MRI-CCM2 show both positive and negative biases depending on the season.
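
    The modified normalised mean bias used here, and in the MACC evaluation record later in this listing, is commonly defined as MNMB = (2/N) Σ (f − o)/(f + o), a symmetric measure bounded by ±200%. A minimal sketch with invented values, assuming that standard definition:

    ```python
    # Hedged sketch: the modified normalised mean bias (MNMB) as commonly
    # defined in air-quality model evaluation, with f the model values and
    # o the observations. Values below are illustrative, not MOZAIC data.
    import numpy as np

    def mnmb(forecast, observed):
        f, o = np.asarray(forecast, float), np.asarray(observed, float)
        return 2.0 * np.mean((f - o) / (f + o))

    co_model = np.array([180.0, 220.0, 150.0, 300.0])   # ppbv, hypothetical
    co_obs   = np.array([200.0, 240.0, 160.0, 280.0])
    print(f"MNMB = {100 * mnmb(co_model, co_obs):+.1f}%")
    ```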

  16. Previous cerebrovascular disease is an important predictor of clinical outcomes in elderly patients with percutaneous coronary interventions: The Nobori-Biolimus eluting stent prospective multicenter 1-year observational registry in South Korea

    PubMed Central

    Kim, Yong Hoon; Her, Ae-Young; Kim, Byeong-Keuk; Shin, Dong-Ho; Kim, Jung-Sun; Ko, Young-Guk; Choi, Donghoon; Hong, Myeong-Ki; Jang, Yangsoo

    2017-01-01

    Objective: The appropriate selection of elderly patients for revascularization has become increasingly important because this subset of patients is more likely to experience a major adverse cardiac or cerebrovascular event after percutaneous coronary intervention (PCI). The objective of this study was to identify important independent risk factors predicting clinical outcomes in elderly patients after successful PCI, particularly in a South Korean population. Methods: This was a prospective, multicenter, observational study. A total of 1,884 consecutive patients who underwent successful PCI with Nobori® Biolimus A9-eluting stents were enrolled between April 2010 and December 2012. They were divided into two groups according to age: patients <75 years old (younger group) and ≥75 years old (elderly group). The primary endpoint was major adverse cardiac or cerebrovascular events (MACCE) at 1 year after the index PCI. Results: The 1-year cumulative incidence of MACCE (12.9% vs. 4.3%, p<0.001) and total death (7.1% vs. 1.5%, p<0.001) was significantly higher in the elderly group than in the younger group. Previous cerebrovascular disease was significantly correlated with MACCE in elderly patients 1 year after PCI (hazard ratio, 2.804; 95% confidence interval, 1.290–6.093; p=0.009). Conclusion: Previous cerebrovascular disease is an important independent predictor of MACCE in elderly patients at 1 year after PCI with Nobori® Biolimus A9-eluting stents, especially in a South Korean population. Careful PCI with intensive monitoring and management may therefore improve major clinical outcomes in elderly patients with previous cerebrovascular disease. PMID:28554989

  17. The prognostic role of stress echocardiography in a contemporary population and the clinical significance of limited apical ischaemia.

    PubMed

    Papachristidis, Alexandros; Roper, Damian; Cassar Demarco, Daniela; Tsironis, Ioannis; Papitsas, Michael; Byrne, Jonathan; Alfakih, Khaled; Monaghan, Mark J

    2016-12-01

    In this study, we aimed to reassess the prognostic value of stress echocardiography (SE) in a contemporary population and to evaluate the clinical significance of limited apical ischaemia, which had not previously been studied. We included 880 patients who underwent SE. Follow-up data with regard to MACCE (cardiac death, myocardial infarction, any repeat revascularisation and cerebrovascular accident) were collected over the 12 months after the SE. Mortality data were recorded over 27.02 ± 4.6 months (range 5.5-34.2 months). We sought to investigate the predictors of MACCE and all-cause mortality. In a multivariable analysis, only a positive SE result was predictive of MACCE (HR, 3.71; P = 0.012). The positive SE group was divided into two subgroups: (a) inducible ischaemia limited to the apical segments ('apical ischaemia') and (b) ischaemia in any other segments, with or without apical involvement ('other positive'). The subgroup of patients with apical ischaemia had a significantly worse outcome than patients with a negative SE (HR, 3.68; P = 0.041) but a similar outcome to the 'other positive' subgroup. However, when investigated with invasive coronary angiography, their prevalence of coronary artery disease (CAD) and rate of revascularisation were considerably lower. Only age (HR, 1.07; P < 0.001) was correlated with all-cause mortality. SE remains a strong predictor of patient outcome in a contemporary population. A positive SE result was the only predictor of 12-month MACCE. Patients with limited apical ischaemia have outcomes similar to patients with ischaemia in other segments, despite a lower prevalence of CAD and a lower revascularisation rate. © 2016 The authors.

  18. Evaluation of the MACC operational forecast system - potential and challenges of global near-real-time modelling with respect to reactive gases in the troposphere

    NASA Astrophysics Data System (ADS)

    Wagner, A.; Blechschmidt, A.-M.; Bouarar, I.; Brunke, E.-G.; Clerbaux, C.; Cupeiro, M.; Cristofanelli, P.; Eskes, H.; Flemming, J.; Flentje, H.; George, M.; Gilge, S.; Hilboll, A.; Inness, A.; Kapsomenakis, J.; Richter, A.; Ries, L.; Spangl, W.; Stein, O.; Weller, R.; Zerefos, C.

    2015-03-01

    Monitoring Atmospheric Composition and Climate (MACC/MACC-II) currently represents the European Union's Copernicus Atmosphere Monitoring Service (CAMS) (http://www.copernicus.eu), which will become fully operational in the course of 2015. The global near-real-time MACC model production run for aerosol and reactive gases provides daily analyses and 5-day forecasts of atmospheric composition fields. It is the only assimilation system worldwide that operationally produces global analyses and forecasts of reactive gases and aerosol fields. We have investigated the ability of the MACC analysis system to simulate tropospheric concentrations of reactive gases (CO, O3, and NO2) over the period 2009-2012. A validation was performed against CO and O3 surface observations from the Global Atmosphere Watch (GAW) network, O3 surface observations from the European Monitoring and Evaluation Programme (EMEP), NO2 tropospheric columns derived from the satellite sensors SCIAMACHY and GOME-2, and CO total columns derived from the satellite sensor MOPITT. The MACC system proved capable of reproducing reactive gas concentrations with consistent quality, albeit with a seasonally dependent bias relative to surface and satellite observations: for Northern Hemispheric surface O3 mixing ratios, positive biases appear during the warm seasons and negative biases during the cold parts of the year, with monthly Modified Normalised Mean Biases (MNMBs) ranging between -30 and 30% at the surface. Model biases are likely to result from difficulties in the simulation of vertical mixing at night and deficiencies in the model's dry deposition parameterization. Observed tropospheric columns of NO2 and CO were reproduced correctly during the warm seasons but are mostly underestimated by the model during the cold seasons, when anthropogenic emissions are at their highest, especially over the US, Europe and Asia. Monthly MNMBs of the satellite data evaluation range between -110 and 40% for NO2 and reach at most -20% for CO over the investigated regions. The underestimation is likely to result from a combination of errors in the dry deposition parameterization, certain limitations in the current emission inventories, and an insufficiently established seasonality in the emissions.

  19. Treatment of complex coronary artery disease in patients with diabetes: 5-year results comparing outcomes of bypass surgery and percutaneous coronary intervention in the SYNTAX trial.

    PubMed

    Kappetein, Arie Pieter; Head, Stuart J; Morice, Marie-Claude; Banning, Adrian P; Serruys, Patrick W; Mohr, Friedrich-Wilhelm; Dawkins, Keith D; Mack, Michael J

    2013-05-01

    This prespecified subgroup analysis examined the effect of diabetes on left main coronary disease (LM) and/or three-vessel disease (3VD) in patients treated with percutaneous coronary intervention (PCI) or coronary artery bypass grafting (CABG) in the SYNTAX trial. Patients (n = 1800) with LM and/or 3VD were randomized to receive either PCI with TAXUS Express paclitaxel-eluting stents or CABG. Five-year outcomes in subgroups with (n = 452) or without (n = 1348) diabetes were examined: major adverse cardiac or cerebrovascular events (MACCE), the composite safety end-point of all-cause death/stroke/myocardial infarction (MI), and the individual MACCE components (death, stroke, MI and repeat revascularization). Event rates were estimated with Kaplan-Meier analyses. In diabetic patients, 5-year rates were significantly higher for PCI vs CABG for MACCE (PCI: 46.5% vs CABG: 29.0%; P < 0.001) and repeat revascularization (PCI: 35.3% vs CABG: 14.6%; P < 0.001). There was no difference in the composite of all-cause death/stroke/MI (PCI: 23.9% vs CABG: 19.1%; P = 0.26) or its individual components all-cause death (PCI: 19.5% vs CABG: 12.9%; P = 0.065), stroke (PCI: 3.0% vs CABG: 4.7%; P = 0.34) or MI (PCI: 9.0% vs CABG: 5.4%; P = 0.20). In non-diabetic patients, rates with PCI were also higher for MACCE (PCI: 34.1% vs CABG: 26.3%; P = 0.002) and repeat revascularization (PCI: 22.8% vs CABG: 13.4%; P < 0.001), but not for the composite end-point of all-cause death/stroke/MI (PCI: 19.8% vs CABG: 15.9%; P = 0.069). There were no differences in all-cause death (PCI: 12.0% vs CABG: 10.9%; P = 0.48) or stroke (PCI: 2.2% vs CABG: 3.5%; P = 0.15), but the rate of MI (PCI: 9.9% vs CABG: 3.4%; P < 0.001) was significantly increased in the PCI arm in non-diabetic patients. In both diabetic and non-diabetic patients, PCI resulted in higher rates of MACCE and repeat revascularization at 5 years. Although PCI is a potential treatment option in patients with less-complex lesions, CABG should be the revascularization option of choice for patients with more-complex anatomic disease, especially with concurrent diabetes.
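
    Event rates in SYNTAX-style analyses are Kaplan-Meier estimates, i.e., one minus the estimated survival function at the horizon of interest. A hedged sketch with synthetic follow-up times (not trial data), using the lifelines package:

    ```python
    # Hedged sketch: Kaplan-Meier estimation of a 5-year event rate.
    # Requires: pip install lifelines numpy
    import numpy as np
    from lifelines import KaplanMeierFitter

    rng = np.random.default_rng(3)
    time_to_event = rng.exponential(8.0, 200).clip(max=5.0)  # years, capped at 5
    observed = time_to_event < 5.0                           # False = censored

    kmf = KaplanMeierFitter()
    kmf.fit(time_to_event, event_observed=observed, label="synthetic arm")
    print(f"5-year event rate ~ {1 - kmf.predict(5.0):.1%}")
    ```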

  20. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission's Advisory Committee on Reactor Safeguards, a 'high' source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of "zero" results).
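
    The LHS-versus-SRS comparison can be reproduced in miniature: draw replicates of size 1,000 with each scheme, push them through a model, and check how stable the replicate statistics are. The sketch below uses a toy stand-in response, not MACCS2:

    ```python
    # Hedged sketch: comparing Latin hypercube sampling (LHS) with simple
    # random sampling (SRS) for propagating parameter uncertainty.
    # Requires: scipy >= 1.7 for scipy.stats.qmc.
    import numpy as np
    from scipy.stats import qmc

    def toy_consequence(x):
        # Hypothetical monotone response to two uncertain inputs in [0, 1).
        return 1e-7 * (0.5 + x[:, 0]) * np.exp(2.0 * x[:, 1])

    n, reps = 1000, 3
    schemes = {
        "LHS": lambda seed: qmc.LatinHypercube(d=2, seed=seed).random(n),
        "SRS": lambda seed: np.random.default_rng(seed).random((n, 2)),
    }
    for name, draw in schemes.items():
        means = [toy_consequence(draw(seed)).mean() for seed in range(reps)]
        print(f"{name}: replicate means {np.round(means, 10)}")
    # The LHS replicate means typically scatter less, i.e. converge faster.
    ```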

  1. Adjusted Levenberg-Marquardt method application to methane retrieval from IASI/METOP spectra

    NASA Astrophysics Data System (ADS)

    Khamatnurova, Marina; Gribanov, Konstantin

    2016-04-01

    The Levenberg-Marquardt method [1], with an iteratively adjusted damping parameter and simultaneous evaluation of averaging kernels, together with a technique for parameter selection, is developed and applied to the retrieval of methane vertical profiles in the atmosphere from IASI/METOP spectra. The retrieved methane vertical profiles are then used to calculate the total atmospheric column amount. NCEP/NCAR reanalysis data provided by ESRL (NOAA, Boulder, USA) [2] are taken as the initial guess for the retrieval algorithm. Surface temperature and the temperature and humidity vertical profiles are retrieved before the methane vertical profile retrieval for each selected spectrum. A modified version of the FIRE-ARMS software package [3] was used for the numerical experiments. To adjust parameters and validate the method, we used ECMWF MACC reanalysis data [4]. Methane columnar values retrieved from cloudless IASI spectra demonstrate good agreement with MACC columnar values. The comparison is performed for IASI spectra measured in May 2012 over Western Siberia. Application of the method to current IASI/METOP measurements is discussed. 1. Ma C., Jiang L. Some Research on Levenberg-Marquardt Method for the Nonlinear Equations // Applied Mathematics and Computation. 2007. V. 184. P. 1032-1040. 2. http://www.esrl.noaa.gov/psd 3. Gribanov K.G., Zakharov V.I., Tashkun S.A., Tyuterev Vl.G. A New Software Tool for Radiative Transfer Calculations and its Application to IMG/ADEOS data // JQSRT. 2001. V. 68. № 4. P. 435-451. 4. http://www.ecmwf.int
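
    The core of the method is the standard damped Gauss-Newton update of Levenberg-Marquardt, with the damping parameter adjusted iteratively according to whether a step reduces the residual. A generic sketch of that algorithm (not the authors' retrieval code; the test problem is invented):

    ```python
    # Hedged sketch: Levenberg-Marquardt with a simple adaptive damping rule.
    import numpy as np

    def levenberg_marquardt(residual, jac, x0, lam=1e-3, n_iter=50):
        x = np.asarray(x0, float)
        for _ in range(n_iter):
            r, J = residual(x), jac(x)
            A = J.T @ J + lam * np.eye(x.size)       # damped normal equations
            step = np.linalg.solve(A, -J.T @ r)
            if np.sum(residual(x + step) ** 2) < np.sum(r ** 2):
                x, lam = x + step, lam * 0.5   # accept: relax toward Gauss-Newton
            else:
                lam *= 10.0                    # reject: steer toward gradient descent
        return x

    # Fit y = a * exp(b * t) to noisy data.
    t = np.linspace(0, 1, 30)
    y = 2.0 * np.exp(-1.5 * t) + np.random.default_rng(4).normal(0, 0.01, t.size)
    res = lambda p: p[0] * np.exp(p[1] * t) - y
    jac = lambda p: np.column_stack([np.exp(p[1] * t), p[0] * t * np.exp(p[1] * t)])
    print(levenberg_marquardt(res, jac, [1.0, -1.0]))   # ~ [2.0, -1.5]
    ```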

  2. Diabetes mellitus: long-term prognostic value of whole-body MR imaging for the occurrence of cardiac and cerebrovascular events.

    PubMed

    Bamberg, Fabian; Parhofer, Klaus G; Lochner, Elena; Marcus, Roy P; Theisen, Daniel; Findeisen, Hannes M; Hoffmann, Udo; Schönberg, Stefan O; Schlett, Christopher L; Reiser, Maximilian F; Weckbach, Sabine

    2013-12-01

    To study the predictive value of whole-body magnetic resonance (MR) imaging for the occurrence of cardiac and cerebrovascular events in a cohort of patients with diabetes mellitus (DM). This HIPAA-compliant study was approved by the institutional review board. Informed consent was obtained from all patients before enrollment into the study. The authors followed up 65 patients with DM (types 1 and 2) who underwent a comprehensive, contrast material-enhanced whole-body MR imaging protocol, including brain, cardiac, and vascular sequences at baseline. Follow-up was performed by phone interview. The primary endpoint was a major adverse cardiac and cerebrovascular event (MACCE), which was defined as composite cardiac-cerebrovascular death, myocardial infarction, cerebrovascular event, or revascularization. MR images were assessed for the presence of systemic atherosclerotic vessel changes, white matter lesions, and myocardial changes. Kaplan-Meier survival and Cox regression analyses were performed to determine associations. Follow-up was completed in 61 patients (94%; median age, 67.5 years; 30 women [49%]; median follow-up, 70 months); 14 of the 61 patients (23%) experienced MACCE. Although normal whole-body MR imaging excluded MACCE during the follow-up period (0%; 95% confidence interval [CI]: 0%, 17%), any detectable ischemic and/or atherosclerotic changes at whole-body MR imaging (prevalence, 66%) conferred a cumulative event rate of 20% at 3 years and 35% at 6 years. Whole-body MR imaging summary estimate of disease was strongly predictive for MACCE (one increment of vessel score and each territory with atherosclerotic changes: hazard ratio, 13.2 [95% CI: 4.5, 40.1] and 3.9 [95% CI: 2.2, 7.5], respectively), also beyond clinical characteristics as well as individual cardiac or cerebrovascular MR findings. These initial data indicate that disease burden as assessed with whole-body MR imaging confers strong prognostic information in patients with DM. Online supplemental material is available for this article. © RSNA, 2013.

  3. [Percutaneous coronary intervention of unprotected left main coronary artery compared with coronary artery bypass grafting; 3 years of experience in the National Institute of Cardiology, Mexico].

    PubMed

    López-Aguilar, Carlos; Abundes-Velasco, Arturo; Eid-Lidt, Guering; Piña-Reyna, Yigal; Gaspar-Hernández, Jorge

    The best revascularisation method for the unprotected left main coronary artery is a current and evolving topic. A total of 2,439 percutaneous coronary interventions (PCI) were registered during a 3-year period. The study included all patients with PCI of the unprotected left main coronary artery (n=48), matched with patients who underwent coronary artery bypass grafting (CABG) (n=50). Major adverse cerebral and cardiac events (MACCE) were assessed in hospital and during a 16-month outpatient follow-up. Cardiovascular risk was greater in the PCI group: logEuroSCORE 16±21 vs. 5±6, P=.001; clinical SYNTAX score 77±74 vs. 53±39, P=.04. On admission, the PCI group had a higher frequency of ST-segment elevation myocardial infarction (STEMI) and cardiogenic shock. In-hospital MACCE were similar in both groups (14% vs. 18%, P=.64). STEMI was less frequent in the PCI group (0% vs. 10%, P=.03). On excluding patients presenting with cardiogenic shock, cardiovascular events were lower in the PCI group (2.3% vs. 18%, P=.01), with a non-significant decrease in overall and cardiac mortality (2.3% vs. 12%, P=.08 and 2.3% vs. 8%, P=.24). MACCE were similar in both groups during the outpatient phase (15% vs. 12%, P=.46). Survival free of MACCE, overall death and cardiac death was comparable between groups (log rank, P=.38, P=.44 and P=.16, respectively). Even though the clinical and peri-procedural risk profile of the PCI patients was higher, in-hospital and out-of-hospital efficacy and safety were comparable with CABG. Copyright © 2016 Instituto Nacional de Cardiología Ignacio Chávez. Published by Masson Doyma México S.A. All rights reserved.

  4. Cost-effectiveness of percutaneous coronary intervention with drug-eluting stents in patients with multivessel coronary artery disease compared to coronary artery bypass surgery five-years after intervention

    PubMed Central

    Krenn, Lisa; Kopp, Christoph; Glogar, Dietmar; Lang, Irene M; Delle-Karth, Georg; Neunteufl, Thomas; Kreiner, Gerhard; Kaider, Alexandra; Bergler-Klein, Jutta; Khorsand, Aliasghar; Nikfardjam, Mariam; Laufer, Günther; Maurer, Gerald; Gyöngyösi, Mariann

    2014-01-01

    Objectives: The cost-effectiveness of percutaneous coronary intervention (PCI) using drug-eluting stents (DES) versus coronary artery bypass surgery (CABG) was analyzed in patients with multivessel coronary artery disease over a 5-year follow-up. Background: By reducing the revascularization rate and its associated costs, DES implantation might be attractive from a health-economics perspective compared with CABG. Methods: Consecutive patients with multivessel DES-PCI (n = 114, 3.3 ± 1.2 DES/patient) or CABG (n = 85, 2.7 ± 0.9 grafts/patient) were included prospectively. The primary endpoint was the cost-benefit of multivessel DES-PCI over CABG, and the incremental cost-effectiveness ratio (ICER) was calculated. The secondary endpoint was the incidence of major adverse cardiac and cerebrovascular events (MACCE), including acute myocardial infarction (AMI), all-cause death, revascularization, and stroke. Results: Despite the use of multiple DES, in-hospital costs were significantly lower for PCI than for CABG, with a difference of €4,551/patient between the groups. At 5 years, overall costs remained higher for CABG patients (mean difference €5,400 between groups). Cost-effectiveness planes including all patients, or the subgroups of elderly patients, diabetic patients, or SYNTAX score >32, indicated that CABG is a more effective but more costly treatment mode for multivessel disease. At the 5-year follow-up, a higher incidence of MACCE (37.7% vs. 25.8%; log rank P = 0.048) and a trend towards more AMI/death/stroke (25.4% vs. 21.2%; log rank P = 0.359) were observed with PCI compared with CABG. The ICER indicated €45,615 to prevent one MACCE, or €126,683 to prevent one AMI/death/stroke, if CABG is performed. Conclusions: Cost-effectiveness analysis of DES-PCI vs. CABG demonstrated that CABG is the most effective, but most costly, treatment for preventing MACCE in patients with multivessel disease. © 2014 Wiley Periodicals, Inc. PMID:24403120
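
    The ICER quoted above is the incremental cost divided by the incremental effect. Using the abstract's €5,400 five-year cost difference and the 37.7% vs. 25.8% MACCE rates, with effect taken as the probability of avoiding a MACCE, the arithmetic looks like this:

    ```python
    # Hedged sketch: the incremental cost-effectiveness ratio,
    # ICER = (cost_A - cost_B) / (effect_A - effect_B), for CABG (A) vs PCI (B).
    def icer(d_cost, d_effect):
        """Extra cost per extra unit of effect."""
        return d_cost / d_effect

    d_cost = 5400.0                        # EUR, 5-year mean cost difference
    d_effect = (1 - 0.258) - (1 - 0.377)   # MACCE-free probability, CABG - PCI
    print(f"ICER ~ EUR {icer(d_cost, d_effect):,.0f} per MACCE avoided")
    # ~ EUR 45,000, consistent with the EUR 45,615 reported above.
    ```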

  5. Observational Prospective study to esTIMAte the rates of outcomes in patients undergoing PCI with drug-eluting stent implantation who take statins -follow-up (OPTIMA II).

    PubMed

    Karpov, Yu; Logunova, N; Tomilova, D; Buza, V; Khomitskaya, Yu

    2017-02-01

    The OPTIMA II study sought to evaluate rates of major adverse cardiac and cerebrovascular events (MACCEs) during the long-term follow-up of chronic statin users who underwent percutaneous coronary intervention (PCI) with implantation of a drug-eluting stent (DES). OPTIMA II was a non-interventional, observational study conducted at a single center in the Russian Federation. Included patients were aged ≥18 years with stable angina, had received long-term (≥1 month) statin therapy prior to elective PCI with DES implantation, and had participated in the original OPTIMA study. Patients received treatment for stable angina after PCI as per routine clinical practice at the study site. Study data were collected from patient medical records and a routine visit 4 years after PCI (ClinicalTrials.gov: NCT02099565). The primary outcome was the rate of MACCEs 4 years after PCI. Overall, 543 patients agreed to participate in the study (90.2% of patients in the original OPTIMA study). The mean (± standard deviation) duration of follow-up from the date of PCI to data collection was 4.42 ± 0.58 (range: 0.28-5.56) years. The frequency of MACCEs (including data from patients who died) was 30.8% (95% confidence interval: 27.0-34.7); half of the MACCEs occurred in the first year of follow-up. After PCI, the majority of patients had no clinical signs of angina. Overall, 24.3% of patients discontinued statin intake in the 4 years after PCI, and only 7.7% achieved the low-density lipoprotein (LDL) cholesterol goal of <1.8 mmol/L. Key limitations of this study relate to its observational nature: the sample size was small, the clinical results were derived from outpatient and hospital medical records, only one follow-up visit was performed at the end of the study (after 4 years of follow-up), only depersonalized medical information was made available for statistical analysis, and adherence to statin treatment was evaluated on the basis of a patient questionnaire. Long-term follow-up of patients who underwent PCI with DES implantation demonstrated MACCEs in nearly one-third of patients, which is comparable to data from other studies. PCI was associated with relief from angina or minimal angina frequency, but compliance with statin therapy and achievement of LDL cholesterol targets 4 years after PCI were suboptimal.

  6. Stochastic Modeling of Radioactive Material Releases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrus, Jason; Pope, Chad

    2015-09-01

    Nonreactor nuclear facilities operated under the approval authority of the U.S. Department of Energy use unmitigated hazard evaluations to determine if potential radiological doses associated with design basis events challenge or exceed dose evaluation guidelines. Unmitigated design basis events that sufficiently challenge dose evaluation guidelines, or exceed the guidelines for members of the public or workers, merit selection of safety structures, systems, or components or other controls to prevent or mitigate the hazard. Idaho State University, in collaboration with Idaho National Laboratory, has developed a portable and simple-to-use software application called SODA (Stochastic Objective Decision-Aide) that stochastically calculates the radiation dose associated with hypothetical radiological material release scenarios. Rather than producing a point estimate of the dose, SODA produces a dose distribution result to allow a deeper understanding of the dose potential. SODA allows users to select the distribution type and parameter values for all of the input variables used to perform the dose calculation. SODA then randomly samples each distribution input variable and calculates the overall resulting dose distribution. In cases where an input variable distribution is unknown, a traditional single point value can be used. SODA was developed using the MATLAB coding framework. The application has a graphical user interface, can be installed on both Windows and Mac computers, and does not require MATLAB to function. SODA provides improved risk understanding, leading to better informed decision making associated with establishing nuclear facility material-at-risk limits and safety structure, system, or component selection. It is important to note that SODA does not replace or compete with codes such as MACCS or RSAC; rather, it is viewed as an easy-to-use supplemental tool to help improve risk understanding and support better informed decisions. The work was funded through a grant from the DOE Nuclear Safety Research and Development Program.
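
    SODA's approach (sample every uncertain input from a user-chosen distribution, then report the resulting dose distribution rather than a point estimate) is plain Monte Carlo. A minimal sketch with a hypothetical multiplicative dose model and made-up parameter distributions, not SODA's actual models or values:

    ```python
    # Hedged sketch: Monte Carlo propagation of input uncertainty to a dose
    # distribution. The dose equation and all distributions are illustrative.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 100_000

    # Hypothetical inputs: material at risk (Ci), release fraction,
    # dispersion factor (s/m^3), breathing rate (m^3/s), dose conversion (rem/Ci).
    mar   = rng.triangular(80, 100, 120, n)
    rf    = rng.uniform(1e-3, 1e-2, n)
    chi_q = rng.lognormal(np.log(1e-4), 0.5, n)
    br    = np.full(n, 3.3e-4)            # a fixed point value is also allowed
    dcf   = rng.lognormal(np.log(50.0), 0.3, n)

    dose = mar * rf * chi_q * br * dcf    # rem, one value per sampled scenario
    print(f"mean {dose.mean():.2e} rem, "
          f"95th percentile {np.percentile(dose, 95):.2e} rem")
    ```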

  7. Consistent Evaluation of ACOS-GOSAT, BESD-SCIAMACHY, CarbonTracker, and MACC Through Comparisons to TCCON

    NASA Technical Reports Server (NTRS)

    Kulawik, Susan; Wunch, Debra; O’Dell, Christopher; Frankenberg, Christian; Reuter, Maximilian; Chevallier, Frederic; Oda, Tomohiro; Sherlock, Vanessa; Buchwitz, Michael; Osterman, Greg

    2016-01-01

    Consistent validation of satellite CO2 estimates is a prerequisite for using multiple satellite CO2 measurements for joint flux inversion, and for establishing an accurate long-term atmospheric CO2 data record. Harmonizing satellite CO2 measurements is particularly important since the differences in instruments, observing geometries, sampling strategies, etc., imbue different measurement characteristics in the various satellite CO2 data products. We focus on validating model and satellite observation attributes that impact flux estimates and CO2 assimilation, including accurate error estimates, correlated and random errors, overall biases, biases by season and latitude, the impact of coincidence criteria, validation of seasonal cycle phase and amplitude, yearly growth, and daily variability. We evaluate the dry-air mole fraction (XCO2) for the Greenhouse gases Observing SATellite (GOSAT) (Atmospheric CO2 Observations from Space, ACOS b3.5) and the SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY) (Bremen Optimal Estimation DOAS, BESD v2.00.08), as well as the CarbonTracker (CT2013b) simulated CO2 mole fraction fields and the Monitoring Atmospheric Composition and Climate (MACC) CO2 inversion system (v13.1), and compare these to Total Carbon Column Observing Network (TCCON) observations (GGG2012/2014). We find standard deviations of 0.9, 0.9, 1.7, and 2.1 parts per million vs. TCCON for CT2013b, MACC, GOSAT, and SCIAMACHY, respectively, with the single-observation errors 1.9 and 0.9 times the predicted errors for GOSAT and SCIAMACHY, respectively. We quantify how satellite error drops with data averaging by fitting the model error² = a² + b²/n (with n the number of observations averaged, a the systematic (correlated) error, and b the random (uncorrelated) error); a and b are estimated for each satellite, coincidence criterion, and hemisphere. Biases at individual stations have a year-to-year variability of 0.3 parts per million, with biases larger than the TCCON predicted bias uncertainty of 0.4 parts per million at many stations. We find that GOSAT and CT2013b under-predict the seasonal cycle amplitude in the Northern Hemisphere (NH) between 46 and 53 degrees north latitude, MACC over-predicts between 26 and 37 degrees north latitude, and CT2013b under-predicts the seasonal cycle amplitude in the Southern Hemisphere (SH). The seasonal cycle phase indicates whether a data set or model lags another data set in time. We find that the GOSAT measurements improve the seasonal cycle phase substantially over the prior, while SCIAMACHY measurements improve the phase significantly for just two of seven sites. The models reproduce the measured seasonal cycle phase well except at Lauder_125HR (CT2013b) and Darwin (MACC). We compare the variability within one day between TCCON and the models in June-July-August; there is correlation between 0.2 and 0.8 in the NH, with the models showing 10-50 percent of the variability of TCCON at different stations and CT2013b showing more variability than MACC. This paper highlights findings that provide inputs for estimating flux errors in model assimilations, and places where models and satellites need further investigation, e.g., the SH for models and 45-67 degrees north latitude for GOSAT and CT2013b.
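
    The averaging model error² = a² + b²/n can be fitted by regressing the variance of n-observation averages on 1/n. The sketch below does this on synthetic satellite-minus-TCCON differences with known components; all numbers are invented, not ACOS or BESD data:

    ```python
    # Hedged sketch: separating correlated (a) and uncorrelated (b) error
    # components by fitting sigma^2 = a^2 + b^2 / n.
    import numpy as np

    rng = np.random.default_rng(6)
    a_true, b_true = 0.8, 1.6          # ppm, systematic and random components

    # Std. dev. of n-averaged differences for several averaging sizes. The
    # systematic part is drawn once per set, so it does not average down.
    ns = np.array([1, 2, 4, 8, 16, 32])
    sigmas = []
    for n in ns:
        diffs = a_true * rng.normal(size=2000) \
              + b_true * rng.normal(size=(2000, n)).mean(axis=1)
        sigmas.append(diffs.std())
    sigmas = np.array(sigmas)

    # Linear least squares on sigma^2 = a^2 + b^2 * (1/n).
    A = np.column_stack([np.ones_like(ns, float), 1.0 / ns])
    a2, b2 = np.linalg.lstsq(A, sigmas**2, rcond=None)[0]
    print(f"a ~ {np.sqrt(a2):.2f} ppm, b ~ {np.sqrt(b2):.2f} ppm")
    ```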

  8. Impact of anaemia on long-term outcomes in patients treated with first- and second-generation drug-eluting stents; Katowice-Zabrze Registry.

    PubMed

    Wańha, Wojciech; Kawecki, Damian; Roleder, Tomasz; Pluta, Aleksandra; Marcinkiewicz, Kamil; Dola, Janusz; Morawiec, Beata; Krzych, Łukasz; Pawłowski, Tomasz; Smolka, Grzegorz; Ochała, Andrzej; Nowalany-Kozielska, Ewa; Tendera, Michał; Wojakowski, Wojciech

    2016-01-01

    Coexisting anaemia is associated with an increased risk of major adverse cardiac and cerebrovascular events (MACCE) and bleeding complications after percutaneous coronary intervention (PCI), especially in patients with acute coronary syndrome. The aim was to assess the impact of anaemia on one-year MACCE in patients with coronary artery disease (CAD) treated with first- and second-generation drug-eluting stents (DES). The registry included 1,916 consecutive patients (UA: n = 1,502, 78.3%; NSTEMI: n = 283, 14.7%; STEMI/LBBB: n = 131, 6.8%) treated with either first- (34%) or second-generation (66%) DES. The study population was divided into two groups: 217 patients (11%) presenting with anaemia and 1,699 (89%) without anaemia prior to PCI. Anaemia was defined according to World Health Organization criteria (haemoglobin [Hb] level < 13 g/dL for men and < 12 g/dL for women). Patients with anaemia were older (69, IQR: 61-75 vs. 62, IQR: 56-70 years; p < 0.001), had a higher prevalence of comorbidities, namely diabetes (44.7% vs. 36.4%, p = 0.020), chronic kidney disease (31.3% vs. 19.4%; p < 0.001) and peripheral artery disease (10.1% vs. 5.4%, p = 0.005), and had lower left ventricular ejection fraction values (50, IQR: 40-57% vs. 55, IQR: 45-60%; p < 0.001). No difference between genders in the frequency of anaemia was found. Patients with anaemia more often had prior myocardial infarction (MI) (57.6% vs. 46.4%; p = 0.002) and coronary artery bypass grafting (31.3% vs. 19.4%; p < 0.001) than patients without anaemia. They also more often had multivessel disease on angiography (36.4% vs. 26.1%; p = 0.001) and more complex CAD as measured by the SYNTAX score (21, IQR: 12-27 points vs. 14, IQR: 8-22 points; p = 0.001). In-hospital risk of acute heart failure (2.7% vs. 0.7%; p = 0.006) and bleeding requiring transfusion (3.2% vs. 0.5%; p < 0.001) was significantly higher in patients with anaemia. One-year follow-up showed a higher rate of death in patients with anaemia; however, there were no differences in MI, stroke, target vessel revascularisation (TVR) or MACCE compared with patients with normal Hb. There were no differences according to DES type (first vs. second generation) in the population of patients with anaemia. In patients with anaemia there is a significantly higher risk of death at 12-month follow-up, but anaemia has no impact on the incidence of MI, repeat revascularisation, stroke or MACCE. There is no advantage of second-generation over first-generation DES in terms of MACCE and TVR in patients with anaemia.

  9. Green Infrastructure Barriers and Opportunities in the Macatawa Watershed, Michigan

    EPA Pesticide Factsheets

    The project supports MACC outreach and implementation efforts of the watershed management plan by facilitating communication with local municipal staff and educating local decision makers about green infrastructure.

  10. Impact of Chronic Obstructive Pulmonary Disease on Long-Term Outcome in Patients with Coronary Artery Disease Undergoing Percutaneous Coronary Intervention.

    PubMed

    Zhang, Ming; Cheng, Yun-Jiu; Zheng, Wei-Ping; Liu, Guang-Hui; Chen, Huai-Sheng; Ning, Yu; Zhao, Xin; Su, Li-Xiao; Liu, Li-Juan

    2016-01-01

    Objective. The aim of this study was to investigate the association between COPD and major adverse cardiovascular and cerebral events (MACCE) in patients undergoing percutaneous coronary intervention (PCI). Methods. 2,362 patients who underwent PCI were included in this study. Subjects were divided into 2 groups: with COPD (n = 233) and without COPD (n = 2,129). Cox proportional hazards models were used to determine the effect of COPD on the incidence of MACCE. Results. The patients with COPD were older (P < 0.0001) and were more likely to be current smokers (P = 0.02) and to have hypertension (P = 0.02) and diabetes mellitus (P = 0.01). The prevalence of serious cardiovascular comorbidity was higher in the patients with COPD, including a history of MI (P = 0.02) and HF (P < 0.0001). Compared with the non-COPD group, the COPD group showed a higher risk of all-cause death (hazard ratio (HR): 2.45, P < 0.0001), cardiac death (HR: 2.53, P = 0.0002), MI (HR: 1.387, P = 0.027), and HF (HR: 2.25, P < 0.0001). Conclusions. Patients with CAD and concomitant COPD have a higher incidence of MACCE (all-cause death, cardiac death, MI, and HF) compared to patients without COPD. Patients with a history of COPD have higher in-hospital and long-term mortality rates after PCI than those without COPD.

  11. Global data set of biogenic VOC emissions calculated by the MEGAN model over the last 30 years

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sindelarova, K.; Granier, Claire; Bouarar, I.

    The Model of Emissions of Gases and Aerosols from Nature (MEGANv2.1), together with the Modern-Era Retrospective Analysis for Research and Applications (MERRA) meteorological fields, was used to create a global emission dataset of biogenic VOCs available on a monthly basis for the time period 1980-2010. This dataset is called MEGAN-MACC. The model estimated a mean annual total BVOC emission of 760 Tg(C) yr⁻¹, consisting of isoprene (70%), monoterpenes (11%), methanol (6%), acetone (3%), sesquiterpenes (2.5%) and other BVOC species each contributing less than 2%. Several sensitivity model runs were performed to study the impact of different model inputs and model settings on the isoprene estimates, resulting in differences of ±17% of the reference isoprene total. A greater impact was observed for the sensitivity run applying a parameterization of soil moisture deficit, which led to a 50% reduction of isoprene emissions on a global scale, most significantly in specific regions of Africa, South America and Australia. MEGAN-MACC estimates are comparable to the results of previous studies. More detailed comparison with other isoprene inventories indicated significant spatial and temporal differences between the datasets, especially for Australia, Southeast Asia and South America. MEGAN-MACC estimates of isoprene and α-pinene showed reasonable agreement with surface flux measurements in the Amazon, and the model was able to capture the seasonal variation of emissions in this region.

  12. Comparison of different antithrombotic regimens for patients with atrial fibrillation undergoing drug-eluting stent implantation.

    PubMed

    Gao, Fei; Zhou, Yu Jie; Wang, Zhi Jian; Shen, Hua; Liu, Xiao Li; Nie, Bin; Yan, Zhen Xian; Yang, Shi Wei; Jia, De An; Yu, Miao

    2010-04-01

    The optimal antithrombotic strategy for patients with atrial fibrillation (AF) undergoing drug-eluting stent (DES) implantation is unknown. A total of 622 consecutive AF patients undergoing DES implantation were prospectively enrolled. Among them, 142 patients (TT group) continued triple antithrombotic therapy comprising aspirin, clopidogrel and warfarin after discharge; 355 patients (DT group) had dual antiplatelet therapy; and 125 patients (WS group) were discharged with warfarin and a single antiplatelet agent. The target INR was set at 1.8-2.5 and was regularly monitored after discharge. The TT group had a significant reduction in stroke and major adverse cardiac and cerebral events (MACCE) (8.8% vs. 20.1% vs. 14.9%, P=0.010) compared with either the DT or WS group. In the Cox regression analysis, treatment with warfarin (hazard ratio (HR) 0.49; 95% confidence interval (CI) 0.31-0.77; P=0.002) and a baseline CHADS2 score ≥2 (HR 2.09; 95% CI 1.27-3.45; P=0.004) were independent predictors of MACCE. Importantly, the incidence of major bleeding was comparable among the 3 groups (2.9% vs. 1.8% vs. 2.5%, P=0.725), although the overall bleeding rate was increased in the TT group. Kaplan-Meier analysis indicated that the TT group had the best net clinical outcome. The cardiovascular benefit of triple antithrombotic therapy was confirmed by the reduced MACCE rate, and its major bleeding risk may be acceptable if the INR is closely monitored.

  13. Preliminary risks associated with postulated tritium release from production reactor operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.; Horton, W.H.

    1988-01-01

    The Probabilistic Risk Assessment (PRA) of Savannah River Plant (SRP) reactor operation is assessing the off-site risk due to tritium releases during postulated full or partial loss of heavy water moderator accidents. Other sources of tritium in the reactor are less likely to contribute to off-site risk in non-fuel-melting accident scenarios. Preliminary determination of the frequency of average partial moderator loss (including incidents with leaks as small as 0.5 kg) yields an estimate of approximately 1 per reactor year. The full moderator loss frequency is conservatively chosen as 5 × 10⁻³ per reactor year. Conditional consequences, determined with a version of the MACCS code modified to handle tritium, are found to be insignificant. The 95th percentile individual cancer risk is 4 × 10⁻⁸ per reactor year within 16 km of the release point. The full moderator loss accident contributes about 75% of the evaluated risks. 13 refs., 4 figs., 5 tabs.

  14. Evaluation of the ability of the MACC-II Reanalysis to reproduce the distribution of O3 and CO in the UTLS as measured by MOZAIC-IAGOS

    NASA Astrophysics Data System (ADS)

    Gaudel, A.; Clark, H.; Thouret, V.; Eskes, H.; Huijnen, V.; Nedelec, P.

    2013-12-01

    Tropospheric ozone is probably one of the most important trace gases in the atmosphere. It plays a major role in the chemistry of the troposphere by exerting a strong influence on the concentrations of oxidants such as the hydroxyl radical (OH), and it is the third most important greenhouse gas after carbon dioxide and methane. Its radiative impact is of particular importance in the Upper Troposphere / Lower Stratosphere (UTLS), the most critical region regarding climate change. Carbon monoxide (CO) is one of the major ozone precursors in the troposphere, originating from all types of combustion. In the UTLS, it also has implications for stratospheric chemistry and indirect radiative forcing effects (as a chemical precursor of CO2 and O3). Assessing the global distribution (and possibly trends) of O3 and CO in this region of the atmosphere, combining high-resolution in situ data with the most appropriate global 3D model to further quantify the different sources and their origins, is therefore of particular interest. This is one of the objectives of the MOZAIC-IAGOS (http://www.iagos.fr) and MACC-II (http://www.gmes-atmosphere.eu) European programs. The aircraft of the MOZAIC program have collected simultaneous O3 and CO data regularly all over the world since the end of 2001. Most of the data are recorded in northern mid-latitudes, in the UTLS region (as commercial aircraft cruise at altitudes between 9 and 12 km). MACC-II aims to provide information services covering air quality, climate forcing and stratospheric ozone, UV radiation and solar-energy resources, using near-real-time analysis and forecasting products, and reanalyses. The validation reports of the MACC models are published regularly (http://www.gmes-atmosphere.eu/services/gac/nrt/ and http://www.gmes-atmosphere.eu/services/gac/reanalysis/). We present and discuss the performance of the MACC reanalysis, comprising the ECMWF Integrated Forecasting System (IFS) coupled to the CTM MOZART with 4D-Var data assimilation, in reproducing ozone and CO in the UTLS, as evaluated against the observations of MOZAIC between 2003 and 2008. In the UT, the model tends to overestimate O3 by about 30-40% in the mid-latitudes and polar regions. This applies broadly to all seasons but is more marked in DJF and MAM. In tropical regions, the model underestimates UT ozone by about 20% in all seasons, most strongly in JJA. Upper-tropospheric CO is globally underestimated by the model in all seasons, by 10-20%. In the Southern Hemisphere, this is particularly the case in SON in the regions of wildfires in South Africa. In the Northern Hemisphere, the zonal gradient of CO between the US, Europe and Asia is not well captured by the model, especially in MAM.

  15. The development of a classification system for maternity models of care.

    PubMed

    Donnolley, Natasha; Butler-Henderson, Kerryn; Chapman, Michael; Sullivan, Elizabeth

    2016-08-01

    A lack of standard terminology or means to identify and define models of maternity care in Australia has prevented accurate evaluation of outcomes for mothers and babies in different models of maternity care. As part of the Commonwealth-funded National Maternity Data Development Project, a classification system was developed utilising a data set specification that defines the characteristics of models of maternity care. The Maternity Care Classification System (MaCCS) was developed using a participatory action research design that built upon the published and grey literature. The study identified the characteristics that differentiate models of care and classified models into eleven different Major Model Categories. The MaCCS will enable individual health services, local health districts (networks), and jurisdictional and national health authorities to make better-informed decisions for planning, policy development and delivery of maternity services in Australia. © The Author(s) 2016.

  16. Use of the RenalGuard system to prevent contrast-induced AKI: A meta-analysis.

    PubMed

    Mattathil, Stephanie; Ghumman, Saad; Weinerman, Jonathan; Prasad, Anand

    2017-10-01

    Contrast-induced acute kidney injury (CI-AKI) following cardiovascular interventions results in increased morbidity and mortality. RenalGuard (RG) is a novel, closed-loop system that balances volume administration with forced diuresis to maintain a high urine output. We performed a meta-analysis of the existing data comparing use of RG to conventional volume expansion. Ten studies were found eligible, of which four were randomized controlled trials (RCTs). Of an aggregate sample size of 1,585 patients, 698 were enrolled in the four RCTs and 887 belonged to the remaining registries included in this meta-analysis. Primary outcomes were CI-AKI incidence and relative risk; mortality, dialysis, and major adverse cardiovascular events (MACCE) were secondary outcomes. A random-effects model was used and the data were evaluated for publication bias. RG was associated with a significant risk reduction in CI-AKI compared to control (RR: 0.30, 95% CI: 0.18-0.50, P < 0.01); the incidence of CI-AKI was 7.7% with RG versus 23.6% in the control group (P < 0.01). Use of RG was also associated with decreased mortality (RR: 0.43, 95% CI: 0.18-0.99, P = 0.05), dialysis (RR: 0.20, 95% CI: 0.06-0.61, P = 0.01), and MACCE (RR: 0.42, 95% CI: 0.27-0.65, P < 0.01) compared to control. RG significantly reduces rates of CI-AKI compared to standard volume expansion and is also associated with decreased rates of death, dialysis, and MACCE. © 2017, Wiley Periodicals, Inc.
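
    A random-effects summary RR like the one above is typically obtained by DerSimonian-Laird pooling of log relative risks. A hedged sketch with invented per-study counts, not the ten studies analysed here:

    ```python
    # Hedged sketch: DerSimonian-Laird random-effects pooling of log RRs.
    import numpy as np

    # (events_treatment, n_treatment, events_control, n_control) per study
    studies = [(8, 100, 25, 100), (5, 80, 18, 85), (12, 150, 30, 140), (4, 60, 11, 62)]

    log_rr, var = [], []
    for a, n1, c, n2 in studies:
        log_rr.append(np.log((a / n1) / (c / n2)))
        var.append(1/a - 1/n1 + 1/c - 1/n2)       # variance of log RR
    log_rr, var = np.array(log_rr), np.array(var)

    # Between-study variance tau^2 via the DerSimonian-Laird estimator.
    w = 1 / var
    mu_fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - mu_fixed) ** 2)
    df = len(studies) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_re = 1 / (var + tau2)                        # random-effects weights
    mu = np.sum(w_re * log_rr) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    print(f"pooled RR = {np.exp(mu):.2f} "
          f"(95% CI {np.exp(mu - 1.96*se):.2f}-{np.exp(mu + 1.96*se):.2f})")
    ```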

  17. Construction of a Calibrated Probabilistic Classification Catalog: Application to 50k Variable Sources in the All-Sky Automated Survey

    NASA Astrophysics Data System (ADS)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; Brink, Henrik; Crellin-Quick, Arien

    2012-12-01

    With growing data volumes from synoptic surveys, astronomers must necessarily become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28-class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.
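
    Calibration here means that, among sources assigned probability p of belonging to a class, a fraction close to p actually belongs to it. The sketch below checks this with a simple reliability table on synthetic labels and probabilities; it does not use MACC or ASAS data.

      import numpy as np

      def reliability_table(probs, outcomes, n_bins=10):
          """Mean predicted probability vs. observed frequency per bin."""
          probs = np.asarray(probs)
          outcomes = np.asarray(outcomes, dtype=float)
          bins = np.linspace(0.0, 1.0, n_bins + 1)
          idx = np.clip(np.digitize(probs, bins) - 1, 0, n_bins - 1)
          rows = []
          for b in range(n_bins):
              sel = idx == b
              if sel.any():
                  rows.append((probs[sel].mean(), outcomes[sel].mean()))
          return rows

      rng = np.random.default_rng(0)
      p = rng.uniform(size=5000)
      y = rng.uniform(size=5000) < p    # perfectly calibrated by construction
      for mean_p, freq in reliability_table(p, y):
          print(f"predicted {mean_p:.2f}  observed {freq:.2f}")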

  18. CONSTRUCTION OF A CALIBRATED PROBABILISTIC CLASSIFICATION CATALOG: APPLICATION TO 50k VARIABLE SOURCES IN THE ALL-SKY AUTOMATED SURVEY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.

    2012-12-15

    With growing data volumes from synoptic surveys, astronomers must necessarily become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28-class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.

  19. A comparison of two brands of clopidogrel in patients with drug-eluting stent implantation.

    PubMed

    Park, Yae Min; Ahn, Taehoon; Lee, Kyounghoon; Shin, Kwen-Chul; Jung, Eul Sik; Shin, Dong Su; Kim, Myeong Gun; Kang, Woong Chol; Han, Seung Hwan; Choi, In Suck; Shin, Eak Kyun

    2012-07-01

    Although generic clopidogrel is widely used, the clinical efficacy and safety of generic versus original clopidogrel have not been well evaluated. The aim of this study was to evaluate the clinical outcomes of 2 oral formulations of clopidogrel 75 mg tablets in patients with coronary artery disease (CAD) undergoing drug-eluting stent (DES) implantation. Between July 2006 and February 2009, 428 patients who underwent implantation with DES for CAD and completed >1 year of clinical follow-up were enrolled in this study. Patients were divided into 2 groups based on treatment formulation: Platless® (test formulation, n=211) or Plavix® (reference formulation, n=217). The incidences of 1-year major adverse cardiovascular and cerebrovascular events (MACCE) and stent thrombosis (ST) were retrospectively reviewed. Baseline demographic and procedural characteristics were not significantly different between the two treatment groups. The incidence of 1-year MACCE was 8.5% (19/211: 2 deaths, 4 myocardial infarctions (MIs), 2 strokes, and 11 target vessel revascularizations (TVRs)) in the Platless® group vs. 7.4% (16/217: 4 deaths, 1 MI, 2 strokes, and 9 TVRs) in the Plavix® group (p=0.66). The incidence of 1-year ST was 0.5% (1 definite, subacute ST) in the Platless® group vs. 0% in the Plavix® group (p=0.49). In this study, the 2 tablet preparations of clopidogrel showed similar rates of MACCE, but additional prospective randomized studies with pharmacodynamics and platelet reactivity are needed to conclude whether generic clopidogrel may replace original clopidogrel.

  20. SecPop Version 4: Sector Population, Land Fraction and Economic Estimation Program: Users' Guide, Model Manual and Verification Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Scott; Bixler, Nathan E.; McFadden, Katherine Letizia

    In 1973 the U.S. Environmental Protection Agency (EPA) developed SecPop to calculate population estimates in support of a study on air quality. The Nuclear Regulatory Commission (NRC) adopted this program to support siting reviews for nuclear power plant construction and license applications. Currently, SecPop is used to prepare site data input files for offsite consequence calculations with the MELCOR Accident Consequence Code System (MACCS). SecPop enables the use of site-specific population, land use, and economic data for a polar grid defined by the user. Updated versions of SecPop have been released to use U.S. decennial census population data: SECPOP90 was released in 1997 to use 1990 population and economic data, and SECPOP2000 was released in 2003 to use 2000 population data and 1997 economic data. This report describes the current code version, SecPop 4.3.0, which uses 2010 population data and both 2007 and 2012 economic data; it is also compatible with 2000 census and 2002 economic data. This report contains guidance for the installation and use of the code as well as a description of the theory, models, and algorithms involved. It also contains appendices that describe the development of the 2010 census file, the 2007 county file, and the 2012 county file. Finally, an appendix is included that describes the validation assessments performed.
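
    The polar-grid bookkeeping described above reduces to assigning each census point to a (ring, sector) cell around the site and accumulating its population. The sketch below illustrates that step only, with a made-up grid specification and made-up points; it is not SecPop's implementation.

      import numpy as np

      radii_km = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])  # ring outer radii
      n_sectors = 16                                          # compass sectors

      def polar_cell(east_km, north_km):
          """Return the (ring, sector) cell of a point relative to the site."""
          r = np.hypot(east_km, north_km)
          ring = int(np.searchsorted(radii_km, r))            # 0 = innermost
          bearing = np.degrees(np.arctan2(east_km, north_km)) % 360.0
          sector = int(bearing // (360.0 / n_sectors))
          return ring, sector

      grid = np.zeros((len(radii_km) + 1, n_sectors))         # last ring = beyond
      for east, north, pop in [(0.4, 0.8, 120), (-3.0, 1.5, 860), (12.0, -7.0, 4300)]:
          ring, sector = polar_cell(east, north)
          grid[ring, sector] += pop
      print(f"{grid.sum():.0f} people assigned to the grid")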

  1. A study of cellular counting to determine minimum thresholds for adequacy for liquid-based cervical cytology using a survey and counting protocol.

    PubMed

    Kitchener, Henry C; Gittins, Matthew; Desai, Mina; Smith, John H F; Cook, Gary; Roberts, Chris; Turnbull, Lesley

    2015-03-01

    Liquid-based cytology (LBC) for cervical screening would benefit from laboratory practice guidelines that define specimen adequacy for the reporting of slides. The evidence base required to define cell adequacy should incorporate both ThinPrep™ (TP; Hologic, Inc., Bedford, MA, USA) and SurePath™ (SP; BD Diagnostics, Burlington, NC, USA), the two LBC systems used in the UK cervical screening programmes. The objectives of this study were to determine (1) current practice for reporting LBC in England, Wales and Scotland, (2) a reproducible method for cell counting, (3) the cellularity of slides classified as inadequate, negative or abnormal, and (4) the impact of varying cellularity on the likelihood of detecting cytological abnormalities. The study involved four separate arms, one for each objective. (1) A questionnaire survey of laboratories was conducted. (2) A standard counting protocol was developed and used by three experienced cytopathologists to determine a reliable and reproducible cell counting method. (3) Slide sets that included a range of cytological abnormalities were each sent to three laboratories for cell counting, to study the correlation between cell counts and reported cytological outcomes. (4) LBC samples were diluted either with fluid only (unmixed) or with a sample containing normal cells (mixed), to study the impact on reporting of reducing either the total cell count or the relative proportion of abnormal to normal cells. The study was conducted within the cervical screening programmes in England, Wales and Scotland, using only routinely obtained cervical screening samples, in 56 participating NHS cervical cytology laboratories; there was no clinical intervention. The main outcome measures were (1) the reliability of the counting method, (2) the correlation of reported cytology grades with cellularity and (3) the levels of detection of abnormal cells in progressively diluted cervical samples. Laboratory practice varied in terms of the threshold of cellular adequacy and of morphological markers of adequacy. While SP laboratories generally used a minimum acceptable cell count (MACC) of 15,000, the MACC employed by TP laboratories varied between 5000 and 15,000. The cell counting study showed that a standard protocol achieved moderate to strong inter-rater reproducibility. Analysis of slide reporting from laboratories revealed that a large proportion of the samples reported as inadequate had cell counts above a threshold of 15,000 for SP, and 5000 and 10,000 for TP. Inter-rater unanimity was greater among more cellular preparations. Dilution studies demonstrated greater detection of abnormalities in slides with counts above the MACC and among slides with more than 25 dyskaryotic cells. Variation in laboratory practice demonstrates a requirement for evidence-based standards for designating a MACC. This study has indicated that a MACC of 15,000 for SP and 5000 for TP achieves a balance in terms of maintaining sensitivity and low inadequacy rates. The findings of this study should inform the development of laboratory practice guidelines. The study was funded by the National Institute for Health Research Health Technology Assessment programme.
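
    The cell counts discussed above rest on simple scaling arithmetic: cells counted in a sample of microscope fields are scaled by the ratio of the preparation area to the counted area. The sketch below shows that arithmetic with illustrative parameters; it is not the study's counting protocol.

      import math

      def estimated_cellularity(cells_counted, n_fields, field_diam_mm, prep_diam_mm):
          """Scale a count over n circular fields up to the whole preparation."""
          field_area = math.pi * (field_diam_mm / 2.0) ** 2
          prep_area = math.pi * (prep_diam_mm / 2.0) ** 2
          density = cells_counted / (n_fields * field_area)   # cells per mm^2
          return density * prep_area

      # e.g. 60 cells over 10 fields of 0.5 mm diameter on a 13 mm circle:
      total = estimated_cellularity(60, 10, 0.5, 13.0)
      print(f"estimated slide cellularity: {total:.0f} cells")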

  2. NSR&D Program Fiscal Year 2015 Funded Research Stochastic Modeling of Radioactive Material Releases Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrus, Jason P.; Pope, Chad; Toston, Mary

    2016-12-01

    Nonreactor nuclear facilities operating under the approval authority of the U.S. Department of Energy use unmitigated hazard evaluations to determine if potential radiological doses associated with design basis events challenge or exceed dose evaluation guidelines. Unmitigated design basis events that sufficiently challenge dose evaluation guidelines, or exceed the guidelines for members of the public or workers, merit selection of safety structures, systems, or components or other controls to prevent or mitigate the hazard. Idaho State University, in collaboration with Idaho National Laboratory, has developed a portable and simple-to-use software application called SODA (Stochastic Objective Decision-Aide) that stochastically calculates the radiation dose distribution associated with hypothetical radiological material release scenarios. Rather than producing a point estimate of the dose, SODA produces a dose distribution result to allow a deeper understanding of the dose potential. SODA allows users to select the distribution type and parameter values for all of the input variables used to perform the dose calculation. Users can also specify custom distributions through a user-defined distribution option. SODA then randomly samples each distribution input variable and calculates the overall resulting dose distribution. In cases where an input variable distribution is unknown, a traditional single point value can be used. SODA, developed using the MATLAB coding framework, has a graphical user interface and can be installed on both Windows and Mac computers. SODA is a standalone software application and does not require MATLAB to function. SODA provides improved risk understanding, leading to better informed decision making associated with establishing nuclear facility material-at-risk limits and safety structure, system, or component selection. It is important to note that SODA does not replace or compete with codes such as MACCS or RSAC; rather, it is viewed as an easy-to-use supplemental tool to help improve risk understanding and support better informed decisions. The SODA development project was funded through a grant from the DOE Nuclear Safety Research and Development Program.
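
    The core idea, sampling every uncertain input and propagating the samples through a dose equation, fits in a few lines. The sketch below uses a generic five-factor-style source term with placeholder distributions and values; SODA's actual models, distributions, and interface are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000                                   # Monte Carlo samples

      # Placeholder input distributions (not SODA's):
      mar = rng.lognormal(np.log(50.0), 0.5, n)     # material at risk (g)
      arf = rng.uniform(1e-4, 1e-3, n)              # airborne release fraction
      rf = rng.triangular(0.1, 0.5, 1.0, n)         # respirable fraction
      chi_q = np.full(n, 1e-4)                      # dispersion factor (s/m3),
                                                    # a fixed point value here
      dcf = 1.0e2                                   # dose conversion placeholder

      dose = mar * arf * rf * chi_q * dcf           # illustrative dose measure
      print(f"mean {dose.mean():.3g}, "
            f"95th percentile {np.percentile(dose, 95):.3g}")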

  3. The IAGOS information system

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Schultz, Martin; Brötz, Björn; Rauthe-Schöch, Armin; Thouret, Valérie

    2015-04-01

    IAGOS (In-service Aircraft for a Global Observing System) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft, and the IAGOS database is an essential part of the global atmospheric monitoring network. Data access follows an open access policy based on the submission of research requests, which are reviewed by the PIs. The IAGOS database (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data centre Ether (CNES and CNRS). In the framework of the IGAS project (IAGOS for Copernicus Atmospheric Service), interoperability with international portals and other databases is being implemented in order to improve IAGOS data discovery. The IGAS data network is composed of three data centres: the IAGOS database in Toulouse, including IAGOS-core data and, since January 2015, IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the MACC data centre in Jülich (http://join.iek.fz-juelich.de). The MACC (Monitoring Atmospheric Composition and Climate) project is a prominent user of the IGAS data network. In June 2015 a new version of the IAGOS database will be released, providing improved services such as download in NetCDF or NASA Ames formats; graphical tools (maps, scatter plots, etc.); standardized metadata (ISO 19115); and improved user management. The link with the MACC data centre, through JOIN (Jülich OWS Interface), will make it possible to combine model outputs with IAGOS data for intercomparison. The interoperability within the IGAS data network, implemented through numerous web services, will improve the functionality of each data centre's web interface.

  4. Comparative Effectiveness of Blood Pressure-lowering Drugs in Patients who have Already Suffered From Stroke

    PubMed Central

    Wang, Wei-Ting; You, Li-Kai; Chiang, Chern-En; Sung, Shih-Hsien; Chuang, Shao-Yuan; Cheng, Hao-Min; Chen, Chen-Huan

    2016-01-01

    Hypertension is the most important risk factor for stroke and stroke recurrence. However, the preferred blood pressure (BP)-lowering drug class for patients who have suffered a stroke has yet to be determined. To investigate the relative effects of BP-lowering therapies [angiotensin-converting enzyme inhibitors (ACEI), angiotensin receptor blockers (ARB), β blockers, calcium channel blockers (CCBs), diuretics, and combinations of these drugs] in patients with a prior stroke history, we performed a systematic review and meta-analysis, using both traditional frequentist and Bayesian random-effects models and meta-regression, of randomized controlled trials (RCTs) on the outcomes of recurrent stroke, coronary heart disease (CHD), and any major adverse cardiac and cerebrovascular events (MACCE). Trials were identified from searches of published hypertension guidelines, electronic databases, and previous systematic reviews. Fifteen RCTs comprising 39,329 participants with previous stroke were identified. Compared with placebo, only ACEI combined with diuretics significantly reduced recurrent stroke events [odds ratio (OR) = 0.54, 95% credibility interval (95% CI) 0.33-0.90]. On the basis of the distribution of posterior probabilities, the treatment ranking consistently identified ACEI combined with diuretics as the preferred BP-lowering strategy for the reduction of recurrent stroke and CHD (31% and 35%, respectively). For preventing MACCE, diuretics appeared to be the preferred agent for stroke survivors (34%). Moreover, the meta-regression analysis failed to demonstrate a statistically significant association between BP reduction and any of the outcomes (P = 0.1618 for total stroke, 0.4933 for CHD, and 0.2411 for MACCE). Evidence from RCTs supports the use of diuretic-based treatment, especially when combined with ACEI, for the secondary prevention of recurrent stroke and any vascular events in patients who have suffered a stroke. PMID:27082571

  5. Global height-resolved methane retrievals from the Infrared Atmospheric Sounding Interferometer (IASI) on MetOp

    NASA Astrophysics Data System (ADS)

    Siddans, Richard; Knappett, Diane; Kerridge, Brian; Waterfall, Alison; Hurley, Jane; Latter, Barry; Boesch, Hartmut; Parker, Robert

    2017-11-01

    This paper describes the global height-resolved methane (CH4) retrieval scheme for the Infrared Atmospheric Sounding Interferometer (IASI) on MetOp, developed at the Rutherford Appleton Laboratory (RAL). The scheme precisely fits measured spectra in the 7.9 micron region to retrieve information on two independent layers centred in the upper and lower troposphere. It also uses nitrous oxide (N2O) spectral features in the same spectral interval to directly retrieve effective cloud parameters, mitigating errors in retrieved methane due to residual cloud and other geophysical variables. The scheme has been applied to analyse IASI measurements between 2007 and 2015. Results are compared to model fields from the MACC greenhouse gas inversion and to independent measurements from satellite (GOSAT), airborne (HIPPO) and ground-based (TCCON) sensors. The estimated error on the methane mixing ratio ranges from 20 to 100 ppbv in the lower-tropospheric layer and from 30 to 40 ppbv in the upper-tropospheric layer, and the error on the derived column average ranges from 20 to 40 ppbv. Vertical sensitivity extends through the lower troposphere, though it decreases near the surface. Systematic differences with the other datasets are typically < 10 ppbv regionally and < 5 ppbv globally. In the Southern Hemisphere, a bias of around 20 ppbv is found with respect to MACC, which is not explained by vertical sensitivity and is not found in comparisons of IASI to TCCON. Comparisons to HIPPO and MACC support the assertion that the two layers can be independently retrieved and confirm that the estimated random errors on the column- and layer-averaged amounts are realistic. The data have been made publicly available via the Centre for Environmental Data Analysis (CEDA) data archive (Siddans, 2016).

  6. A new method for assessing surface solar irradiance: Heliosat-4

    NASA Astrophysics Data System (ADS)

    Qu, Z.; Oumbe, A.; Blanc, P.; Lefèvre, M.; Wald, L.; Schroedter-Homscheidt, M.; Gesell, G.

    2012-04-01

    Downwelling shortwave irradiance at the surface (SSI) is increasingly assessed by means of satellite-derived estimates of the optical properties of the atmosphere. Performance is currently judged satisfactory, but there is an increasing need for the assessment of the direct and diffuse components of the SSI. MINES ParisTech and the German Aerospace Center (DLR) are currently developing the Heliosat-4 method to assess the SSI and its components more accurately than current practices allow. The method is composed of two parts: a clear-sky module based on the radiative transfer model libRadtran, and a cloud-ground module using two-stream and delta-Eddington approximations for clouds together with a database of ground albedo. Advanced products derived from geostationary satellites and recent Earth observation missions are the inputs to the Heliosat-4 method: cloud optical depth, cloud phase, cloud type and cloud coverage from APOLLO of DLR; aerosol optical depth, aerosol type, clear-sky water vapour and ozone from MACC products (FP7); and ground albedo from MODIS of NASA. In this communication, we briefly present Heliosat-4 and focus on its performance. The results of Heliosat-4 for the period 2004-2010 are compared to measurements made at five stations within the Baseline Surface Radiation Network. Extensive statistical analysis as well as case studies are performed in order to better understand Heliosat-4, obtain an in-depth view of its performance, understand its advantages compared with existing methods, and identify its shortcomings for future improvement. The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement no. 218793 (MACC project) and no. 283576 (MACC-II project).
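
    The two-part structure (clear-sky irradiance modified by a cloud term) can be caricatured in a few lines. The formulas below are crude placeholders chosen only to show the decomposition; Heliosat-4 itself relies on libRadtran-based computations and delta-Eddington cloud treatment, not these expressions.

      import numpy as np

      def clear_sky_ghi(cos_sza, aod=0.1, turbidity_like=3.0):
          """Very rough clear-sky global horizontal irradiance (W/m2)."""
          i0 = 1361.0                                # solar constant
          air_mass = 1.0 / np.maximum(cos_sza, 0.05)
          return i0 * cos_sza * np.exp(-0.09 * turbidity_like * (air_mass - 1.0) - aod)

      def all_sky_ghi(ghi_clear, cloud_optical_depth):
          """Crude cloud transmittance decreasing with optical depth."""
          t_cloud = 1.0 / (1.0 + 0.75 * cloud_optical_depth)
          return ghi_clear * t_cloud

      ghi_c = clear_sky_ghi(cos_sza=0.8, aod=0.2)
      print(f"clear sky: {ghi_c:.0f} W/m2, "
            f"under tau=10 cloud: {all_sky_ghi(ghi_c, 10.0):.0f} W/m2")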

  7. Triple antithrombotic therapy versus dual antiplatelet therapy in patients with atrial fibrillation undergoing drug-eluting stent implantation.

    PubMed

    Kang, Dong Oh; Yu, Cheol Woong; Kim, Hee Dong; Cho, Jae Young; Joo, Hyung Joon; Choi, Rak Kyong; Park, Jin Sik; Lee, Hyun Jong; Kim, Je Sang; Park, Jae Hyung; Hong, Soon Jun; Lim, Do-Sun

    2015-08-01

    The optimal antithrombotic regimen in patients with atrial fibrillation (AF) undergoing drug-eluting stent (DES) implantation for complex coronary artery disease is unclear. We compared the net clinical outcomes of triple antithrombotic therapy (TAT; aspirin, thienopyridine, and warfarin) and dual antiplatelet therapy (DAPT; aspirin and thienopyridine) in AF patients who had undergone DES implantation. A total of 367 patients were enrolled and analyzed retrospectively; 131 patients (35.7%) received TAT and 236 patients (64.3%) received DAPT. DAPT and warfarin were maintained for a minimum of 12 and 24 months, respectively. The primary endpoint was the 2-year net clinical outcome, a composite of major bleeding and major adverse cardiac and cerebrovascular events (MACCE). Propensity score-matching analysis was carried out in 99 patient pairs. The 2-year net clinical outcomes of the TAT group were worse than those of the DAPT group (34.3 vs. 21.1%, P=0.006), mainly due to a higher incidence of major bleeding (16.7 vs. 4.6%, P<0.001), without any significant increase in MACCE (22.1 vs. 17.7%, P=0.313). In the multivariate analysis, TAT was an independent predictor of worse net clinical outcomes (odds ratio 1.63, 95% confidence interval 1.06-2.50) and major bleeding (odds ratio 3.54, 95% confidence interval 1.65-7.58). After propensity score matching, the TAT group still had worse net clinical outcomes and a higher incidence of major bleeding compared with the DAPT group. In AF patients undergoing DES implantation, prolonged administration of TAT may be harmful due to the substantial increase in the risk of major bleeding without any reduction in MACCE.

  8. Clinical events after interruption of anticoagulation in patients with atrial fibrillation: An analysis from the ENGAGE AF-TIMI 48 trial.

    PubMed

    Cavallari, Ilaria; Ruff, Christian T; Nordio, Francesco; Deenadayalu, Naveen; Shi, Minggao; Lanz, Hans; Rutman, Howard; Mercuri, Michele F; Antman, Elliott M; Braunwald, Eugene; Giugliano, Robert P

    2018-04-15

    Patients with atrial fibrillation (AF) who interrupt anticoagulation are at high risk of thromboembolism and death. Patients enrolled in the ENGAGE AF-TIMI 48 trial (a randomized comparison of edoxaban vs. warfarin) who interrupted study anticoagulant for >3 days were identified. Clinical events (ischemic stroke/systemic embolism, major cardiac and cerebrovascular events [MACCE]) were analyzed from day 4 after interruption until day 34 or study drug resumption. During 2.8 years of median follow-up, 13,311 (63%) patients interrupted study drug for >3 days. After excluding those who received open-label anticoagulation during the at-risk window, the population for analysis included 9148 patients. The rates of ischemic stroke/systemic embolism and MACCE post interruption were substantially greater than in patients who never interrupted (15.42 vs. 0.26 and 60.82 vs. 0.36 per 100 patient-years, respectively; adjusted p < .001). Patients who interrupted study drug for an adverse event (44.1% of the cohort), compared with those who interrupted for other reasons, had an increased risk of MACCE (adjusted HR 2.75; 95% CI 2.02-3.74; p < .0001), but similar rates of ischemic stroke/systemic embolism. Rates of clinical events after interruption of warfarin and edoxaban were similar. Interruption of study drug was frequent in patients with AF and was associated with a substantial risk of major cardiac and cerebrovascular events over the ensuing 30 days. This risk was particularly high in patients who interrupted as a result of an adverse event; these patients deserve close monitoring and resumption of anticoagulation as soon as it is safe to do so. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Impact of remote ischaemic preconditioning on major clinical outcomes in patients undergoing cardiovascular surgery: A meta-analysis with trial sequential analysis of 32 randomised controlled trials.

    PubMed

    Wang, Shifei; Li, Hairui; He, Nvqin; Sun, Yili; Guo, Shengcun; Liao, Wangjun; Liao, Yulin; Chen, Yanmei; Bin, Jianping

    2017-01-15

    The impact of remote ischaemic preconditioning (RIPC) on major clinical outcomes in patients undergoing cardiovascular surgery remains controversial. We systematically reviewed the available evidence to evaluate the potential benefits of RIPC in such patients. PubMed, Embase, and Cochrane Library databases were searched for relevant randomised controlled trials (RCTs) conducted between January 2006 and March 2016. The pooled population of patients who underwent cardiovascular surgery was divided into the RIPC and control groups. Trial sequential analysis was applied to judge data reliability. The pooled relative risks (RRs) with 95% confidence intervals (CIs) between the groups were calculated for all-cause mortality, major adverse cardiovascular and cerebral events (MACCEs), myocardial infarction (MI), and renal failure. RIPC was not associated with improvement in all-cause mortality (RR 1.04; 95% CI 0.82-1.31; I² = 26%; P > 0.05) or MACCE incidence (RR 0.90; 95% CI 0.71-1.14; I² = 40%; P > 0.05) after cardiovascular surgery, and both results were assessed by trial sequential analysis as sufficient and conclusive. Nevertheless, RIPC was associated with a significantly lower incidence of MI (RR 0.87; 95% CI 0.76-1.00; I² = 13%; P ≤ 0.05). However, after excluding a study that contributed strongly to heterogeneity, RIPC was associated with increased rates of renal failure (RR 1.53; 95% CI 1.12-2.10; I² = 5%; P ≤ 0.05). In patients undergoing cardiovascular surgery, RIPC reduced the risk of postoperative MI, but not of MACCEs or all-cause mortality, a discrepancy likely related to the higher rate of renal failure associated with RIPC. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Red light regulation of ethylene biosynthesis and gravitropism in etiolated pea stems

    NASA Technical Reports Server (NTRS)

    Steed, C. L.; Taylor, L. K.; Harrison, M. A.

    2004-01-01

    During gravitropism, the accumulation of auxin in the lower side of the stem causes increased growth and the subsequent curvature, while the gaseous hormone ethylene plays a modulating role in regulating the kinetics of growth asymmetries. Light also contributes to the control of gravitropic curvature, potentially through its interaction with ethylene biosynthesis. In this study, red-light pulse treatment of etiolated pea epicotyls was evaluated for its effect on ethylene biosynthesis during gravitropic curvature. Ethylene biosynthesis analysis included measurements of ethylene; the ethylene precursor 1-aminocyclopropane-1-carboxylic acid (ACC); malonyl-conjugated ACC (MACC); and expression levels of pea ACC oxidase (Ps-ACO1) and ACC synthase (Ps-ACS1, Ps-ACS2) genes by reverse transcriptase-polymerase chain reaction analysis. Red-pulsed seedlings were given a 6 min pulse of 11 µmol m⁻² s⁻¹ red light 15 h prior to horizontal reorientation, for consistency with the timeline of red-light inhibition of ethylene production. Red-pulse treatment significantly reduced ethylene production and MACC levels in epicotyl tissue. However, there was no effect of red-pulse treatment on ACC level or on expression of the ACS or ACO genes. During gravitropic curvature, ethylene production increased from 60 to 120 min after horizontal placement in both control and red-pulsed epicotyls. In red-pulsed tissues, ACC levels increased by 120 min after horizontal reorientation, accompanied by decreased MACC levels in the lower portion of the epicotyl. Overall, our results demonstrate that ethylene production in etiolated epicotyls increases after the initiation of curvature. This ethylene increase may inhibit cell growth in the lower portion of the epicotyl and contribute to the tip straightening and reduced overall curvature observed after the initial 60 min of curvature in etiolated pea epicotyls.

  11. Elevated troponin predicts long-term adverse cardiovascular outcomes in hypertensive crisis: a retrospective study.

    PubMed

    Pattanshetty, Deepak J; Bhat, Pradeep K; Aneja, Ashish; Pillai, Dilip P

    2012-12-01

    Hypertensive crisis is associated with poor clinical outcomes. Elevated troponin, frequently observed in hypertensive crisis, may be attributed to myocardial supply-demand mismatch or obstructive coronary artery disease (CAD). However, in patients presenting with hypertensive crisis and an elevated troponin, the prevalence of CAD and the long-term adverse cardiovascular outcomes are unknown. We sought to assess the impact of elevated troponin on cardiovascular outcomes and evaluate the role of troponin as a predictor of obstructive CAD in patients with hypertensive crisis. Patients who presented with hypertensive crisis (n = 236) were screened retrospectively. Baseline and follow-up data including the event rates were obtained using electronic patient records. Those without an assay for cardiac Troponin I (cTnI) (n = 65) were excluded. Of the remaining 171 patients, those with elevated cTnI (cTnI ≥ 0.12 ng/ml) (n = 56) were compared with those with normal cTnI (cTnI < 0.12 ng/ml) (n = 115) at 2 years for the occurrence of major adverse cardiac or cerebrovascular events (MACCE) (composite of myocardial infarction, unstable angina, hypertensive crisis, pulmonary edema, stroke or transient ischemic attack). At 2 years, MACCE occurred in 40 (71.4%) patients with elevated cTnI compared with 44 (38.3%) patients with normal cTnI [hazard ratio: 2.77; 95% confidence interval (CI): 1.79-4.27; P < 0.001]. Also, patients with elevated cTnI were significantly more likely to have underlying obstructive CAD (odds ratio: 8.97; 95% CI: 1.4-55.9; P < 0.01). In patients with hypertensive crisis, elevated cTnI confers a significantly greater risk of long-term MACCE, and is a strong predictor of obstructive CAD.

  12. MISR Regional GoMACCS Map Projection

    Atmospheric Science Data Center

    2017-03-29

    Archived data-center page describing the map projection used for MISR regional GoMACCS imagery (sections: Overview, Products, Data Quality, Map Projection, File Format, View Data). The page notes that exact projection handling is needed for high-precision work and points to the HDF-EOS library, GCTP, and IDL as conversion tools.

  13. Deconvolution of magnetic acoustic change complex (mACC).

    PubMed

    Bardy, Fabrice; McMahon, Catherine M; Yau, Shu Hui; Johnson, Blake W

    2014-11-01

    The aim of this study was to design a novel experimental approach to investigate the morphological characteristics of auditory cortical responses elicited by rapidly changing synthesized speech sounds. Six sound-evoked magnetoencephalographic (MEG) responses were measured to a synthesized train of speech sounds using the vowels /e/ and /u/ in 17 normal-hearing young adults. Responses were measured to: (i) the onset of the speech train; (ii) an F0 increment; (iii) an F0 decrement; (iv) an F2 decrement; (v) an F2 increment; and (vi) the offset of the speech train, using short (jittered around 135 ms) and long (1500 ms) stimulus onset asynchronies (SOAs). The least squares (LS) deconvolution technique was used to disentangle the overlapping MEG responses in the short SOA condition only. Comparison of the morphology of the recovered cortical responses in the short and long SOA conditions showed high similarity, suggesting that the LS deconvolution technique was successful in disentangling the MEG waveforms. Waveform latencies and amplitudes differed between the two SOA conditions and were influenced by the spectro-temporal properties of the sound sequence. The magnetic acoustic change complex (mACC) for the short SOA condition showed significantly lower amplitudes and shorter latencies compared with the long SOA condition. The F0 transition showed a larger reduction in amplitude from long to short SOA than the F2 transition. Lateralization of the cortical responses was observed under some stimulus conditions and appeared to be associated with the spectro-temporal properties of the acoustic stimulus. The LS deconvolution technique provides a new tool to study the properties of the auditory cortical response to rapidly changing sound stimuli. The presence of cortical auditory evoked responses for rapid transitions of synthesized speech stimuli suggests that the temporal code is preserved at the level of the auditory cortex. Further, the reduced amplitudes and shorter latencies might reflect intrinsic properties of cortical neurons in response to rapidly presented sounds. This is the first demonstration of the separation of overlapping cortical responses to rapidly changing speech sounds and offers a potential new biomarker for the discrimination of rapid sound transitions. Crown Copyright © 2014. Published by Elsevier Ireland Ltd. All rights reserved.

  14. Marginal abatement cost curves for NOx incorporating both controls and alternative measures

    EPA Science Inventory

    A marginal abatement cost curve (MACC) traces out the efficient marginal abatement cost level for any aggregate emissions target when a least-cost approach is implemented. In order for it to represent the efficient MAC level, all abatement opportunities across all sectors and locations ...

  15. Consistent evaluation of GOSAT, SCIAMACHY, CarbonTracker, and MACC through comparisons to TCCON

    DOE PAGES

    Kulawik, S. S.; Wunch, D.; O'Dell, C.; ...

    2015-06-22

    Consistent validation of satellite CO2 estimates is a prerequisite for using multiple satellite CO2 measurements for joint flux inversion and for establishing an accurate long-term atmospheric CO2 data record. We focus on validating model and satellite observation attributes that impact flux estimates and CO2 assimilation, including accurate error estimates, correlated and random errors, overall biases, biases by season and latitude, the impact of coincidence criteria, validation of seasonal cycle phase and amplitude, yearly growth, and daily variability. We evaluate the dry air mole fraction (XCO2) for GOSAT (ACOS b3.5) and SCIAMACHY (BESD v2.00.08) as well as the CarbonTracker (CT2013b) simulated CO2 mole fraction fields and the MACC CO2 inversion system (v13.1), and compare these to TCCON observations (GGG2014). We find standard deviations of 0.9, 0.9, 1.7, and 2.1 ppm versus TCCON for CT2013b, MACC, GOSAT, and SCIAMACHY, respectively, with single-target errors 1.9 and 0.9 times the predicted errors for GOSAT and SCIAMACHY, respectively. When satellite data are averaged and interpreted according to error² = a² + b²/n (where n is the number of observations averaged, a is the systematic (correlated) error, and b is the random (uncorrelated) error), we find that the correlated error term a = 0.6 ppm and the uncorrelated error term b = 1.7 ppm for GOSAT, and a = 1.0 ppm, b = 1.4 ppm for SCIAMACHY regional averages. Biases at individual stations have year-to-year variability of ~0.3 ppm, with biases larger than the TCCON predicted bias uncertainty of 0.4 ppm at many stations. Using fitting software, we find that GOSAT underpredicts the seasonal cycle amplitude in the Northern Hemisphere (NH) between 46-53° N. In the Southern Hemisphere (SH), CT2013b underestimates the seasonal cycle amplitude. Biases are calculated for 3-month intervals and indicate the months that contribute to the observed amplitude differences. The seasonal cycle phase indicates whether a dataset or model lags another dataset in time. We calculate this at a subset of stations where there are adequate satellite data and find that the GOSAT retrieved phase improves substantially over the prior, and the SCIAMACHY retrieved phase improves substantially for 2 of 7 sites. The models reproduce the measured seasonal cycle phase well except at Lauder125 (CT2013b), Darwin (MACC) and Izana (+10 days, CT2013b), and at Bremen and Four Corners, which are highly influenced by local effects. We compare the within-day variability of TCCON and the models in JJA; correlations are between 0.2 and 0.8 in the NH, with the models showing 10-100% of the variability of TCCON at different stations (except Bremen and Four Corners, which show no variability compared to TCCON) and CT2013b showing more variability than MACC. This paper highlights findings that provide inputs to estimate flux errors in model assimilations, and points to areas where models and satellites need further investigation, e.g. the SH for models and 45-67° N for GOSAT.
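
    The error-aggregation model error² = a² + b²/n is linear in 1/n once squared, so a and b can be recovered from binned statistics by least squares. The sketch below does this on synthetic numbers chosen to be consistent with the GOSAT values quoted above (a ≈ 0.6 ppm, b ≈ 1.7 ppm); it is not the paper's fitting software.

      import numpy as np

      n = np.array([1, 2, 5, 10, 20, 50])                     # soundings per average
      sigma = np.array([1.80, 1.35, 1.00, 0.83, 0.72, 0.65])  # ppm, synthetic

      # sigma^2 = a^2 + b^2 * (1/n): linear regression on 1/n.
      design = np.column_stack([np.ones(len(n)), 1.0 / n])
      coef, *_ = np.linalg.lstsq(design, sigma**2, rcond=None)
      a, b = np.sqrt(coef)
      print(f"a = {a:.2f} ppm (correlated), b = {b:.2f} ppm (uncorrelated)")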

  16. The EDIT-COMGEOM Code

    DTIC Science & Technology

    1975-09-01

    This report assumes familiarity with the GIFT and MAGIC computer codes. EDIT-COMGEOM is a FORTRAN computer code that converts the target description data used by the MAGIC computer code into target description data that can be used by the GIFT computer code.

  17. Assessment of the MACC reanalysis and its influence as chemical boundary conditions for regional air quality modeling in AQMEII-2

    EPA Science Inventory

    The Air Quality Model Evaluation International Initiative (AQMEII) has now reached its second phase, which is dedicated to the evaluation of online coupled chemistry-meteorology models. Sixteen modeling groups from Europe and five from North America have run regional air quality models ...

  18. Validating the EXCEL hypothesis: a propensity score matched 3-year comparison of percutaneous coronary intervention versus coronary artery bypass graft in left main patients with SYNTAX score ≤32.

    PubMed

    Capodanno, Davide; Caggegi, Anna; Capranzano, Piera; Cincotta, Glauco; Miano, Marco; Barrano, Gionbattista; Monaco, Sergio; Calvo, Francesco; Tamburino, Corrado

    2011-06-01

    The aim of this study is to verify the study hypothesis of the EXCEL trial by comparing percutaneous coronary intervention (PCI) and coronary artery bypass graft (CABG) in an EXCEL-like population of patients. The upcoming EXCEL trial will test the hypothesis that left main patients with SYNTAX score ≤ 32 experience similar rates of 3-year death, myocardial infarction (MI), or cerebrovascular accidents (CVA) following revascularization by PCI or CABG. We compared the 3-year rates of death/MI/CVA and death/MI/CVA/target vessel revascularization (MACCE) in 556 patients with left main disease and SYNTAX score ≤ 32 undergoing PCI (n = 285) or CABG (n = 271). To account for confounders, outcome parameters underwent extensive statistical adjustment. The unadjusted incidence of death/MI/CVA was similar between PCI and CABG (12.7% vs. 8.4%, P = 0.892), while MACCE were higher in the PCI group compared to the CABG group (27.0% vs. 11.8%, P < 0.001). After propensity score matching, PCI was not associated with a significant increase in the rate of death/MI/CVA (11.8% vs. 10.7%, P = 0.948), while MACCE were more frequently noted among patients treated with PCI (28.8% vs. 14.1%, P = 0.002). Adjustment by means of the SYNTAX score and EuroSCORE, covariates with and without the propensity score, and the propensity score alone did not significantly change these findings. In an EXCEL-like cohort of patients with left main disease, there appears to be clinical equipoise between PCI and CABG in terms of death/MI/CVA. However, even in patients with SYNTAX score ≤ 32, CABG is superior to PCI when target vessel revascularization is included in the combined endpoint. Copyright © 2011 Wiley-Liss, Inc.

  19. Impact of dual antiplatelet therapy after coronary artery bypass surgery on 1-year outcomes in the Arterial Revascularization Trial.

    PubMed

    Benedetto, Umberto; Altman, Douglas G; Gerry, Stephen; Gray, Alastair; Lees, Belinda; Flather, Marcus; Taggart, David P

    2017-09-01

    There is still little evidence to support routine dual antiplatelet therapy (DAPT) with P2Y12 antagonists following coronary artery bypass grafting (CABG). The Arterial Revascularization Trial (ART) was designed to compare 10-year survival after bilateral versus single internal thoracic artery grafting. We aimed to gain insight into the effect of DAPT (with clopidogrel) following CABG on 1-year outcomes by performing a post hoc analysis of the ART. Among patients enrolled in the ART (n = 3102), 609 (21%) and 2308 (79%) were discharged on DAPT or aspirin alone, respectively. The primary end-point was the incidence of major adverse cerebrovascular and cardiac events (MACCE) at 1 year, including cardiac death, myocardial infarction, cerebrovascular accident and reintervention; the safety end-point was bleeding requiring hospitalization. Propensity score (PS) matching was used to create comparable groups. Among 609 PS-matched pairs, MACCE occurred in 34 (5.6%) and 34 (5.6%) patients in the DAPT and aspirin-alone groups, respectively, with no significant difference between the 2 groups [hazard ratio (HR) 0.97, 95% confidence interval (CI) 0.59-1.59; P = 0.90]. Only 188 (31%) subjects completed 1 year of DAPT, and in this subgroup the MACCE rate was 5.8% (HR 1.11, 95% CI 0.53-2.30; P = 0.78). In the overall sample, the bleeding rate was higher in the DAPT group (2.3% vs 1.1%; P = 0.02), although this difference was no longer significant after matching (2.3% vs 1.8%; P = 0.54). Based on these findings, when compared with aspirin alone, DAPT with clopidogrel prescribed at discharge was not associated with a significant reduction in adverse cardiac and cerebrovascular events at 1 year following CABG. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
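
    For readers unfamiliar with the matching step, the sketch below pairs each treated subject with the nearest-propensity control (1:1, with replacement, for brevity). The data, the plain gradient-ascent logistic fit, and the matching rule are all illustrative simplifications, not the ART analysis.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 400
      x = rng.normal(size=(n, 3))                       # covariates
      treated = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-(x[:, 0] - 0.3)))

      # Logistic regression P(treated | x) by simple gradient ascent.
      xd = np.column_stack([np.ones(n), x])
      w = np.zeros(4)
      for _ in range(500):
          p = 1.0 / (1.0 + np.exp(-xd @ w))
          w += 0.1 * xd.T @ (treated - p) / n
      score = 1.0 / (1.0 + np.exp(-xd @ w))             # propensity scores

      t_idx = np.where(treated)[0]
      c_idx = np.where(~treated)[0]
      pairs = [(i, c_idx[np.argmin(np.abs(score[c_idx] - score[i]))])
               for i in t_idx]
      print(f"{len(pairs)} matched pairs; first pair scores: "
            f"{score[pairs[0][0]]:.3f} vs {score[pairs[0][1]]:.3f}")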

  1. Monitoring Air Quality over China: Evaluation of the modeling system of the PANDA project

    NASA Astrophysics Data System (ADS)

    Bouarar, Idir; Katinka Petersen, Anna; Brasseur, Guy; Granier, Claire; Xie, Ying; Wang, Xuemei; Fan, Qi; Wang, Lili

    2015-04-01

    Air pollution has become a pressing problem in Asia, and specifically in China, due to the rapid increase in anthropogenic emissions driven by the growth of China's economic activity and the increasing demand for energy over the past decade. Observed levels of particulate matter and ozone regularly exceed World Health Organization (WHO) air quality guidelines in many parts of the country, leading to an increased risk of respiratory illnesses and other health problems. The EU-funded project PANDA aims to establish a team of European and Chinese scientists to monitor air pollution over China and to elaborate air quality indicators in support of European and Chinese policies. PANDA combines state-of-the-art air pollution modeling with space-based and surface observations of chemical species to improve methods for monitoring air quality. The modeling system of the PANDA project follows a downscaling approach: global models such as MOZART and the MACC system provide initial and boundary conditions for regional WRF-Chem and EMEP simulations over East Asia. WRF-Chem simulations at higher resolution (e.g. 20 km) are then performed over a smaller domain covering East China, and initial and boundary conditions from this run are used to perform simulations at finer resolution (e.g. 5 km) over specific megacities such as Shanghai. Here we present results of model simulations for January and July 2010 performed during the first year of the project. We show an intercomparison of the global (MACC, EMEP) and regional (WRF-Chem) simulations and a comprehensive evaluation against satellite measurements (NO2, CO) and in-situ data (O3, CO, NOx, PM10 and PM2.5) at several surface stations. Using the WRF-Chem model, we demonstrate that model performance is influenced not only by the resolution (e.g. 60 km, 20 km) but also by the emission inventories used (MACCity, HTAPv2), their resolution and diurnal variation, and by the choice of initial and boundary conditions (e.g. MOZART, MACC analysis).

  2. The Influence of the North Atlantic Oscillation on Tropospheric Distributions of Ozone and Carbon Monoxide.

    NASA Astrophysics Data System (ADS)

    Knowland, K. E.; Doherty, R. M.; Hodges, K.

    2015-12-01

    The influence of the North Atlantic Oscillation (NAO) on the tropospheric distributions of ozone (O3) and carbon monoxide (CO) has been quantified. The Monitoring Atmospheric Composition and Climate (MACC) Reanalysis, a combined meteorology and composition dataset for the period 2003-2012 (Inness et al., 2013), is used to investigate the composition of the troposphere and lower stratosphere in relation to the location of the storm track, as well as other meteorological parameters over the North Atlantic associated with the different NAO phases. Cyclone tracks in the MACC Reanalysis compare well with those in the widely used ERA-Interim Reanalysis for the same 10-year period (cyclone tracking performed using the algorithm of Hodges (1995, 1999)), as both are based on the European Centre for Medium-Range Weather Forecasts (ECMWF) Integrated Forecast System (IFS). A seasonal analysis is performed whereby the MACC reanalysis meteorological fields and O3 and CO mixing ratios are weighted by the monthly NAO index values. The location of the main storm track, which tilts toward high latitudes (toward the Arctic) during positive NAO phases and takes a more zonal, mid-latitude path (toward Europe) during negative NAO phases, controls the location of both horizontal and vertical transport across the North Atlantic and into the Arctic. During positive NAO seasons, the persistence of cyclones over the North Atlantic, coupled with a stronger Azores High, promotes strong horizontal transport across the North Atlantic throughout the troposphere. In all seasons, significantly more intense cyclones occur at higher latitudes (north of ~50°N) during the positive phase of the NAO and in the southern mid-latitudes during the negative phase. This affects the location of stratospheric intrusions within the descending dry airstream behind the cold front of an extratropical cyclone, and the venting of low-level pollution up into the free troposphere within the warm conveyor belt airstream, which rises ahead of the cold front.
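
    The NAO-weighted compositing described above can be expressed compactly: weight each monthly field by the positive (or negative) part of the monthly NAO index and normalize. The sketch below applies this to random stand-in fields; reading the MACC reanalysis and the NAO index is not shown.

      import numpy as np

      rng = np.random.default_rng(1)
      months = 120
      nao = rng.normal(size=months)                 # monthly NAO index (stand-in)
      field = rng.normal(size=(months, 30, 60))     # e.g. O3 anomalies [t, lat, lon]

      w_pos = np.clip(nao, 0.0, None)               # weights for positive phase
      w_neg = np.clip(-nao, 0.0, None)              # weights for negative phase
      comp_pos = np.tensordot(w_pos, field, axes=1) / w_pos.sum()
      comp_neg = np.tensordot(w_neg, field, axes=1) / w_neg.sum()
      print("grid-mean composite difference (pos - neg):",
            float((comp_pos - comp_neg).mean()))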

  3. Factors affecting cardiovascular and cerebrovascular complications of carotid artery stenting in Northern Michigan: A retrospective study.

    PubMed

    Mammo, Dalia F; Cheng, Chin-I; Ragina, Neli P; Alani, Firas

    This study seeks to identify factors associated with periprocedural complications of carotid artery stenting (CAS) to better understand CAS complication rates and optimize patient outcomes. Periprocedural complications include major adverse cardiovascular and cerebrovascular events (MACCE), i.e. myocardial infarction (MI), stroke, or death. We retrospectively analyzed 181 patients from Northern Michigan who underwent CAS. Rates of stroke, MI, and death occurring within 30 days post-procedure were examined. Associations of open vs. closed cell stent type, demographics, comorbidities, and symptomatic carotid stenosis were compared to determine significance. All patients had three NIH Stroke Scale (NIHSS) exams: at baseline, 24 h post-procedure, and at the one-month visit. Cardiac enzymes were measured twice in all patients within 24 h post-procedure. All patients were treated with dual anti-platelet therapy for at least 6 months post-procedure. Three patients (1.66%) experienced a major complication within one month post-procedure: one MI (0.55%), one stroke (0.55%), and one death (0.55%). The following factors were not associated with the occurrence of MACCE within 30 days post-procedure: stent design (open vs. closed cell) (p=1.000), age ≥80 (p=0.559), smoking history (p=0.569), hypertension (p=1.000), diabetes (p=1.000), and symptomatic carotid stenosis (p=0.254). Age of 80 years or above, symptomatic carotid stenosis, open-cell stent design, and history of diabetes, smoking, or hypertension were not found to have an association with MACCE within 1 month after CAS. Future studies using a greater sample size will be beneficial to better assess the periprocedural complication risks of CAS, while also considering the effects of operator experience and technological advancements on decreasing periprocedural complication rates. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Effect of Concentration on the Electrochemistry and Speciation of the Magnesium Aluminum Chloride Complex Electrolyte Solution.

    PubMed

    See, Kimberly A; Liu, Yao-Min; Ha, Yeyoung; Barile, Christopher J; Gewirth, Andrew A

    2017-10-18

    Magnesium batteries offer an opportunity to use naturally abundant Mg and achieve large volumetric capacities reaching over four times that of conventional Li-based intercalation anodes. High volumetric capacity is enabled by the use of a Mg metal anode, in which charge is stored via electrodeposition and stripping processes; however, electrolytes that support efficient Mg electrodeposition and stripping are few and are often prepared from highly reactive compounds. One interesting electrolyte solution that supports Mg deposition and stripping without the use of highly reactive reagents is the magnesium aluminum chloride complex (MACC) electrolyte. The MACC exhibits high Coulombic efficiencies and low deposition overpotentials following an electrolytic conditioning protocol that stabilizes the species necessary for such behavior. Here, we discuss the effect of the MgCl2 and AlCl3 concentrations on the deposition overpotential, current density, and the conditioning process. Higher concentrations of MACC exhibit enhanced Mg electrodeposition current density and much faster conditioning. An increase in the salt concentrations causes a shift in the complex equilibria involving both cations. The conditioning process is strongly dependent on the concentration, suggesting that the electrolyte is activated through a change in speciation of the electrolyte complexes and not simply through the annihilation of electrolyte impurities. Additionally, the presence of [Mg2(μ-Cl)3·6THF]+ in the electrolyte solution is again confirmed through careful analysis of experimental Raman spectra coupled with simulation, and through direct observation of the complex in sonic spray ionization mass spectrometry. Importantly, we suggest that the ~210 cm-1 mode commonly observed in the Raman spectra of many Mg electrolytes is indicative of the C3v-symmetric [Mg2(μ-Cl)3·6THF]+. The 210 cm-1 mode is present in many electrolytes containing MgCl2, so its assignment is of broad interest to the Mg electrolyte community.

  5. Methodology for Air Quality Forecast Downscaling from Regional- to Street-Scale

    NASA Astrophysics Data System (ADS)

    Baklanov, Alexander; Nuterman, Roman; Mahura, Alexander; Amstrup, Bjarne; Hansen Saas, Bent; Havskov Sørensen, Jens; Lorenzen, Thomas; Weismann, Jakob

    2010-05-01

    The most serious air pollution events occur in cities, where high population density combines with high air pollution, e.g. from vehicles. The pollutants can lead to serious human health problems, including asthma, irritation of the lungs, bronchitis, pneumonia, decreased resistance to respiratory infections, and premature death. In particular, air pollution is associated with increases in cardiovascular disease and lung cancer. In 2000, WHO estimated that between 2.5% and 11% of total annual deaths are caused by exposure to air pollution. However, European-scale air quality models are not suited for local forecasts, as their grid cells are typically of the order of 5 to 10 km and they generally lack a detailed representation of urban effects. Two suites are used in the framework of the EC FP7 project MACC (Monitoring of Atmosphere Composition and Climate) to demonstrate how downscaling from the European MACC ensemble to local-scale air quality forecasts will be carried out: one illustrates capabilities for the city of Copenhagen (Denmark); the second focuses on the city of Bucharest (Romania). This work is devoted to the first suite and to the methodological aspects of downscaling from the regional scale (Europe/Denmark) to the urban scale (Copenhagen), and from the urban scale down to the street scale. The first results of downscaling according to the proposed methodology are presented. The potential for downscaling European air quality forecasts by operating urban and street-level forecast models is evaluated. This will provide strong support for the continuous improvement of the regional forecast modelling systems for air quality in Europe, and underline clear perspectives for future regional air quality core and downstream services for end-users. At the end of the MACC project, requirements on how to downscale European air quality forecasts to the city and street levels with different approaches will be formulated.

  6. The safety, efficacy and cost-effectiveness of stress echocardiography in patients with high pretest probability of coronary artery disease.

    PubMed

    Papachristidis, Alexandros; Demarco, Daniela Cassar; Roper, Damian; Tsironis, Ioannis; Papitsas, Michael; Byrne, Jonathan; Alfakih, Khaled; Monaghan, Mark J

    2017-01-01

    In this study, we assess the clinical and cost-effectiveness of stress echocardiography (SE), as well as the place of SE in patients with a high pretest probability (PTP) of coronary artery disease (CAD). We investigated 257 patients with no history of CAD who underwent SE and had a PTP risk score >61% (high PTP). According to the National Institute for Health and Care Excellence guidance (NICE CG95, 2010), these patients should be investigated directly with an invasive coronary angiogram (ICA). We investigated these patients with SE initially and then with ICA when appropriate. Follow-up data with regard to Major Adverse Cardiac and Cerebrovascular Events (MACCE, defined as cardiovascular mortality, cerebrovascular accident (CVA), myocardial infarction (MI) and late revascularisation for acute coronary syndrome/unstable angina) were recorded for a period of 12 months following the SE. The tariff for SE and ICA is £300 and £1400, respectively. 106 patients had a positive SE (41.2%), and 61 of them (57.5%) had further investigation with ICA. 15 (24.6%) of these patients were revascularised. The average cost per patient for investigations was £654.09. If NICE guidance had been followed, the cost would have been significantly higher at £1400 (p<0.001). Overall, 5 MACCE (2.0%) were recorded: 4 (3.8%) in the group with positive SE (2 CVAs and 2 MIs) and 1 (0.7%) in the group with negative SE (1 CVA). There was no MI and no need for revascularisation in the negative SE group. Our approach of investigating patients who present with de novo chest pain and high PTP with SE initially, and subsequently with ICA when appropriate, reduces the cost significantly (a saving of £745.91 per patient) with a very low rate of MACCE. However, this study is underpowered to assess the safety of SE.
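
    A worked check of the cost figures quoted above, using only numbers from the abstract; the naive blended cost at the end is an illustrative assumption and need not reproduce the reported average exactly (some patients may have had repeat or additional tests).

        # All tariffs and counts are taken from the abstract above.
        n_patients   = 257
        tariff_se    = 300.0     # £ per stress echocardiogram
        tariff_ica   = 1400.0    # £ per invasive coronary angiogram
        avg_reported = 654.09    # reported average investigation cost per patient (£)

        # Saving versus sending every patient straight to ICA, as NICE CG95 implies
        saving = tariff_ica - avg_reported
        print(f"saving per patient: £{saving:.2f}")   # £745.91, matching the abstract

        # Naive blend (assumption): everyone gets SE, 61 patients proceed to ICA
        naive = (n_patients * tariff_se + 61 * tariff_ica) / n_patients
        print(f"naive blended cost: £{naive:.2f}")    # ≈£632; the reported £654.09
                                                      # likely includes further tests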

  7. Phytohormone Interaction Modulating Fruit Responses to Photooxidative and Heat Stress on Apple (Malus domestica Borkh.).

    PubMed

    Torres, Carolina A; Sepúlveda, Gloria; Kahlaoui, Besma

    2017-01-01

    Sun-related physiological disorders such as sun damage on apples (Malus domestica Borkh) are caused by cumulative photooxidative and heat stress during the growing season, triggering morphological, physiological, and biochemical changes in fruit tissues not only while the fruit is on the tree but also after it has been harvested. The objective of this work was to establish the interaction of auxin (indole-3-acetic acid; IAA), abscisic acid (ABA), jasmonic acid (JA), salicylic acid (SA), and ethylene (ET) and its precursor ACC (free and conjugated, MACC) during the development of sun-injury-related disorders pre- and post-harvest on apples. Peel tissue was extracted from fruit growing under different sun exposures (Non-exposed, NE; Exposed, EX) and with sun injury symptoms (Moderate, Mod). Sampling was carried out every 15 days from 75 days after full bloom (DAFB) until 120 days post-harvest in cold storage (1°C, >90% RH). Concentrations of IAA, ABA, JA, and SA were determined using UHPLC mass spectrometry, and ET and ACC (free and conjugated MACC) using gas chromatography. IAA was found not to be related directly to sun injury development, but it decreased by 60% in sun-exposed tissue and during fruit development. ABA, JA, SA, and ethylene concentrations were significantly higher (P ≤ 0.05) in Mod tissue, but their concentrations, except for ethylene, were not affected by sun exposure. ACC and MACC concentrations increased until 105 DAFB in all sun exposure categories. During post-harvest, the ethylene climacteric peak was delayed in EX compared to Mod tissue. ABA and SA concentrations remained stable throughout storage in both tissues. JA increased dramatically post-harvest in both EX and Mod tissues and in both orchards, confirming its role in low-temperature tolerance. The results suggest that ABA, JA, and SA, together with ethylene, modulate some of the abiotic stress defense responses of sun-exposed fruit during photooxidative and heat stress on the tree.

  8. Phytohormone Interaction Modulating Fruit Responses to Photooxidative and Heat Stress on Apple (Malus domestica Borkh.)

    PubMed Central

    Torres, Carolina A.; Sepúlveda, Gloria; Kahlaoui, Besma

    2017-01-01

    Sun-related physiological disorders such as sun damage on apples (Malus domestica Borkh) are caused by cumulative photooxidative and heat stress during the growing season, triggering morphological, physiological, and biochemical changes in fruit tissues not only while the fruit is on the tree but also after it has been harvested. The objective of this work was to establish the interaction of auxin (indole-3-acetic acid; IAA), abscisic acid (ABA), jasmonic acid (JA), salicylic acid (SA), and ethylene (ET) and its precursor ACC (free and conjugated, MACC) during the development of sun-injury-related disorders pre- and post-harvest on apples. Peel tissue was extracted from fruit growing under different sun exposures (Non-exposed, NE; Exposed, EX) and with sun injury symptoms (Moderate, Mod). Sampling was carried out every 15 days from 75 days after full bloom (DAFB) until 120 days post-harvest in cold storage (1°C, >90% RH). Concentrations of IAA, ABA, JA, and SA were determined using UHPLC mass spectrometry, and ET and ACC (free and conjugated MACC) using gas chromatography. IAA was found not to be related directly to sun injury development, but it decreased by 60% in sun-exposed tissue and during fruit development. ABA, JA, SA, and ethylene concentrations were significantly higher (P ≤ 0.05) in Mod tissue, but their concentrations, except for ethylene, were not affected by sun exposure. ACC and MACC concentrations increased until 105 DAFB in all sun exposure categories. During post-harvest, the ethylene climacteric peak was delayed in EX compared to Mod tissue. ABA and SA concentrations remained stable throughout storage in both tissues. JA increased dramatically post-harvest in both EX and Mod tissues and in both orchards, confirming its role in low-temperature tolerance. The results suggest that ABA, JA, and SA, together with ethylene, modulate some of the abiotic stress defense responses of sun-exposed fruit during photooxidative and heat stress on the tree. PMID:29491868

  9. TriGuard™ HDH embolic deflection device for cerebral protection during transcatheter aortic valve replacement.

    PubMed

    Samim, Mariam; van der Worp, Bart; Agostoni, Pierfrancesco; Hendrikse, Jeroen; Budde, Ricardo P J; Nijhoff, Freek; Ramjankhan, Faiz; Doevendans, Pieter A; Stella, Pieter R

    2017-02-15

    This study aims to evaluate the safety and performance of the new embolic deflection device TriGuard™ HDH in patients undergoing TAVR. Transcatheter aortic valve replacement (TAVR) is associated with a high incidence of new cerebral ischemic lesions. The use of an embolic protection device may reduce the frequency of TAVR-related embolic events. This prospective, single-arm feasibility pilot study included 14 patients with severe symptomatic aortic stenosis scheduled for TAVR. Cerebral diffusion-weighted magnetic resonance imaging (DWI) was planned in all patients one day before and at day 4 (±2) after the procedure. Major adverse cerebral and cardiac events (MACCEs) were recorded for all patients. The primary endpoints of this study were (I) device performance success, defined as coverage of the aortic arch takeoffs throughout the entire TAVR procedure, and (II) MACCE occurrence. Secondary endpoints included the number and volume of new cerebral ischemic lesions on DWI. Thirteen patients underwent transfemoral TAVR and one patient a transapical procedure. An Edwards SAPIEN valve prosthesis was implanted in 8 (57%) patients and a Medtronic CoreValve prosthesis in the remaining 6 (43%). Predefined performance success of the TriGuard™ HDH device was achieved in 9 (64%) patients. The composite endpoint MACCE occurred in none of the patients. Post-procedural DWI was performed in 11 patients. Comparing the DWI of these patients to a historical control group showed no reduction in lesion number [median 5.5 vs. 5.0, P = 0.857]; however, there was a significant reduction in lesion volume per patient [median 13.8 vs. 25.1, P = 0.049]. This study showed the feasibility and safety of using the TriGuard™ HDH for cerebral protection during TAVR. The device did not decrease the number of post-procedural new cerebral DWI lesions, but its use was associated with decreased lesion volume compared with unprotected TAVR. © 2016 Wiley Periodicals, Inc.

  10. Systematic review of preoperative physical activity and its impact on postcardiac surgical outcomes.

    PubMed

    Kehler, D Scott; Stammers, Andrew N; Tangri, Navdeep; Hiebert, Brett; Fransoo, Randy; Schultz, Annette S H; Macdonald, Kerry; Giacomontonio, Nicholas; Hassan, Ansar; Légaré, Jean-Francois; Arora, Rakesh C; Duhamel, Todd A

    2017-08-11

    The objective of this systematic review was to study the impact of preoperative physical activity levels on the following postoperative outcomes in adult cardiac surgical patients: (1) major adverse cardiac and cerebrovascular events (MACCEs), (2) adverse events within 30 days, (3) hospital length of stay (HLOS), (4) intensive care unit length of stay (ICU LOS), (5) activities of daily living (ADLs), (6) quality of life, (7) cardiac rehabilitation attendance and (8) physical activity behaviour. A systematic search of MEDLINE, Embase, AgeLine and the Cochrane library for cohort studies was conducted. Eleven studies (n=5733 patients) met the inclusion criteria. Only self-reported physical activity tools were used. Few studies used multivariate analyses to compare active versus inactive patients prior to surgery. When comparing patients who were active versus inactive preoperatively, there were mixed findings for MACCE, 30-day adverse events, HLOS and ICU LOS. Of the studies that adjusted for confounding variables, five studies found a protective, independent association between physical activity and MACCE (n=1), 30-day postoperative events (n=2), HLOS (n=1) and ICU LOS (n=1), but two studies found no protective association for 30-day postoperative events (n=1) and postoperative ADLs (n=1). No studies investigated whether activity status before surgery impacted quality of life or cardiac rehabilitation attendance postoperatively. Three studies found that active patients prior to surgery were more likely to be inactive postoperatively. Due to the mixed findings, the literature does not presently support an association between self-reported preoperative physical activity behaviour and postoperative cardiac surgical outcomes. Future studies should objectively measure physical activity, clearly define outcomes and adjust for clinically relevant variables. Trial registration number NCT02219815. PROSPERO number CRD42015023606. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  11. An Analysis of CPA Firm Recruiters' Perceptions of Online Masters of Accounting Degrees

    ERIC Educational Resources Information Center

    Metrejean, Eddie; Noland, Thomas G.

    2011-01-01

    Online education continues to grow at a rapid pace. Assessment of the effectiveness of online programs is needed to differentiate legitimate programs from diploma mills. The authors examined the perceptions of CPA firm recruiters on whether an online Master of Accounting (MACC) matters in the hiring decision. Results show that recruiters do not…

  12. Joint Interdiction

    DTIC Science & Technology

    2016-09-09

    [Fragmentary indexing and body text recovered from the source document. Acronym list excerpt: … law enforcement detachment (USCG); LEO — law enforcement operations; LOC — line of communications; MACCS — Marine air command and control system; MAS… Body fragments: … enemy command and control [C2], intelligence, fires, reinforcing units, lines of communications [LOCs], logistics, and other operational- and tactical- … enemy naval, engineering, and personnel resources to the tasks of repairing and recovering damaged equipment, facilities, and LOCs. It can draw the …]

  13. PsyScript: a Macintosh application for scripting experiments.

    PubMed

    Bates, Timothy C; D'Oliveiro, Lawrence

    2003-11-01

    PsyScript is a scriptable application allowing users to describe experiments in Apple's compiled high-level object-oriented AppleScript language, while still supporting millisecond or better within-trial event timing (delays can be in milliseconds or refresh-based, and PsyScript can wait on external I/O, such as eye movement fixations). Because AppleScript is object oriented and system-wide, PsyScript experiments support complex branching, code reuse, and integration with other applications. Included AppleScript-based libraries support file handling and stimulus randomization and sampling, as well as more specialized tasks, such as adaptive testing. Advanced features include support for the BBox serial port button box and a low-cost USB-based digital I/O card for millisecond timing; recording of any number and type of responses within a trial, including novel responses such as graphics tablet drawing; use of the Macintosh sound facilities to provide an accurate voice key and to save voice responses to disk; scriptable image creation; support for flicker-free animation; and gaze-dependent masking. The application is open source, allowing researchers to enhance the feature set and verify internal functions. Both the application and the source are available for free download at www.maccs.mq.edu.au/~tim/psyscript/.

  14. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions. (auth)

  15. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.

    1995-12-31

    In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western codes", VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and to assess whether VNIIEF codes are suitable for safety assessment computations for RBMK-type reactors. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), covering cell, polycell, and burnup computations; (2) 3D computations of static states with the KORAT-3D and NEU codes and comparison with results computed with the NESTLE code (USA); these computations were performed in the geometry and with the neutron constants presented by the American party; and (3) 3D computations of neutron kinetics with the KORAT-3D and NEU codes. The kinetics computations were performed in two formulations, both developed in collaboration with NIKIET. The formulation of the first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of control and protection system (CPS) control movement in the core.

  16. Development and application of the GIM code for the Cyber 203 computer

    NASA Technical Reports Server (NTRS)

    Stalnaker, J. F.; Robinson, M. A.; Rawlinson, E. G.; Anderson, P. G.; Mayne, A. W.; Spradley, L. W.

    1982-01-01

    The GIM computer code for fluid dynamics research was developed. Enhancement of the computer code, implicit algorithm development, turbulence model implementation, chemistry model development, interactive input module coding, and wing/body flowfield computation are described. The GIM quasi-parabolic code development was completed, and the code was used to compute a number of example cases. Turbulence models, both algebraic and differential-equation based, were added to the basic viscous code. An equilibrium reacting chemistry model and an implicit finite-difference scheme were also added. Development was completed on the interactive module for generating the input data for GIM. Solutions for inviscid hypersonic flow over a wing/body configuration are also presented.

  17. Report from Hawai'i: The Rising Tide of Arts Education in the Islands

    ERIC Educational Resources Information Center

    Wood, Paul

    2005-01-01

    The establishment of Maui Arts & Cultural Center (MACC), a community arts facility that prioritizes education at the top of its mission, has been a significant factor in the growth of arts education in Hawai'i. This article describes the role such a facility can play in the kind of educational reform that people envision, and the author's…

  18. Comparative Evaluation of the Impact of WRF-NMM and WRF-ARW Meteorology on CMAQ Simulations for O3 and Related Species During the 2006 TexAQS/GoMACCS Campaign

    EPA Science Inventory

    In this paper, the impact of meteorology derived from the Weather Research and Forecasting (WRF) Non-hydrostatic Mesoscale Model (NMM) and the WRF Advanced Research WRF (ARW) meteorological models on Community Multiscale Air Quality (CMAQ) simulations for ozone and its related prec...

  19. Implementation of a 3D mixing layer code on parallel computers

    NASA Technical Reports Server (NTRS)

    Roe, K.; Thakur, R.; Dang, T.; Bogucz, E.

    1995-01-01

    This paper summarizes our progress and experience in the development of a computational fluid dynamics code on parallel computers to simulate three-dimensional, spatially developing mixing layers. In this initial study, the three-dimensional time-dependent Euler equations are solved using a finite-volume explicit time-marching algorithm. The code was first programmed in Fortran 77 for sequential computers and was then converted for use on parallel computers using the conventional message-passing technique, although we have not been able to compile the code with the present version of HPF compilers.

  20. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  1. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  2. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  3. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  4. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  5. Corruption in Myanmar - Holding a Country and its People from Economic Prosperity

    DTIC Science & Technology

    2014-10-30

    [Fragmentary text recovered from the source document: … censorship laws and freedom of information, by banning independent newspapers, thereby repressing efforts towards democracy even further. … The SPP … censorship laws, insisting state officials return embezzled funds, signing and ratifying the United Nations Convention against Corruption (UNCAC), and … instill a culture of change. For example, in Malaysia, the government formed the Malaysian Anti-Corruption Commission (MACC), an independent watch…]

  6. Effects of continuous positive airway pressure on anxiety, depression, and major cardiac and cerebro-vascular events in obstructive sleep apnea patients with and without coronary artery disease.

    PubMed

    Lee, Ming-Chung; Shen, Yu-Chih; Wang, Ji-Hung; Li, Yu-Ying; Li, Tzu-Hsien; Chang, En-Ting; Wang, Hsiu-Mei

    2017-01-01

    Obstructive sleep apnea (OSA) is associated with adverse cardiovascular outcomes and a high prevalence of anxiety and depression. This study investigated the effects of continuous positive airway pressure (CPAP) on the severity of anxiety and depression in OSA patients with or without coronary artery disease (CAD) and on the rate of cardio- and cerebrovascular events in those with OSA and CAD. This prospective study included patients with moderate-to-severe OSA, with or without a recent diagnosis of CAD; all were started on CPAP therapy. Patients completed the Chinese versions of the Beck Anxiety Inventory (BAI) and Beck Depression Inventory-II (BDI-II) at baseline and after 6 months of follow-up. The occurrence of major adverse cardiac and cerebrovascular events (MACCE) was assessed every 3 months up to 1 year. BAI scores decreased from 8.5 ± 8.4 at baseline to 5.4 ± 6.9 at 6 months in CPAP-compliant OSA patients without CAD (P < 0.05). BAI scores also decreased from 20.7 ± 14.9 to 16.1 ± 14.5 in CPAP-compliant OSA patients with CAD. BDI-II scores decreased in CPAP-compliant OSA patients without CAD (from 11.1 ± 10.7 at baseline to 6.6 ± 9.5 at 6 months) and in CPAP-compliant OSA patients with CAD (from 20.4 ± 14.3 to 15.9 ± 7.3). In addition, there was a large effect size (ES) for the BAI and BDI-II after 6 months of CPAP treatment in OSA patients with CAD, and a large ES in OSA patients under CPAP treatment overall. In OSA patients with CAD, the occurrence of MACCE was significantly lower in CPAP-compliant patients than in CPAP-noncompliant patients (11% in compliant vs. 50% in noncompliant patients; P < 0.05). CPAP improved anxiety and depression in OSA patients regardless of CAD. In OSA patients with CAD, CPAP-compliant patients had a lower 1-year rate of MACCE than CPAP-noncompliant patients.

  7. Percutaneous coronary intervention vs coronary artery bypass grafting for left main coronary artery disease? A systematic review and meta-analysis of randomized controlled trials.

    PubMed

    Sharma, Sharan P; Dahal, Khagendra; Khatra, Jaspreet; Rosenfeld, Alan; Lee, Juyong

    2017-06-01

    It is not clear whether percutaneous coronary intervention (PCI) is as effective and safe as coronary artery bypass grafting (CABG) for left main coronary artery disease. We aimed to perform a systematic review and meta-analysis of all randomized controlled trials (RCTs) that compared PCI and CABG in left main coronary disease. We searched PubMed, EMBASE, Cochrane, Scopus and relevant references for RCTs (inception through November 20, 2016, without language restrictions) and performed a meta-analysis using a random-effects model. All-cause mortality, myocardial infarction, revascularization rate, stroke, and major adverse cardiac and cerebrovascular events (MACCE) were the measured outcomes. Six RCTs with a total population of 4700 were analyzed. There was no difference in all-cause mortality at 30-day, one-year, and five-year (1.8% vs 1.1%; OR 0.60; 95% CI: 0.26-1.39; P=.23; I²=9%) follow-up between PCI and CABG. The CABG group had fewer myocardial infarctions (MI) at five-year follow-up than the PCI group (5% vs 2.5%; OR 2.04; CI: 1.30-3.19; P=.002; I²=1%). The revascularization rate favored CABG at one-year (8.6% vs 4.5%; OR 2; CI: 1.46-2.73; P<.0001; I²=45%) and five-year (15.9% vs 9.9%; OR 1.73; CI: 1.36-2.20; P<.0001; I²=0%) follow-up. Although the stroke rate was lower in the PCI group at 1 year, there was no difference at longer follow-up. MACCE at 5 years favored CABG (24% vs 18%; OR 1.45; CI: 1.19-1.76; P=.0001; I²=0%). On subgroup analysis, MACCE did not differ between the two groups among patients with low-to-intermediate SYNTAX scores, whereas it was higher with PCI among patients with high SYNTAX scores. Percutaneous coronary intervention could be as safe and effective as CABG in a select group of patients with left main coronary artery disease. © 2017 John Wiley & Sons Ltd.

  8. Computer Description of Black Hawk Helicopter

    DTIC Science & Technology

    1979-06-01

    [Keywords: combinatorial geometry models; Black Hawk helicopter; GIFT computer code; geometric description of targets.] ABSTRACT: ... The description was made using the technique of combinatorial geometry (COM-GEOM) and will be used as input to the GIFT computer code, which generates ... The data used by the COVART computer code was generated by the Geometric Information for Targets (GIFT) computer code. This report documents ...

  9. User manual for semi-circular compact range reflector code: Version 2

    NASA Technical Reports Server (NTRS)

    Gupta, Inder J.; Burnside, Walter D.

    1987-01-01

    A computer code has been developed at the Ohio State University ElectroScience Laboratory to analyze a semi-circular paraboloidal reflector with or without a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the reflector or its individual components at a given distance from the center of the paraboloid. The code computes the fields along a radial, horizontal, vertical or axial cut at that distance. Thus, it is very effective in computing the size of the sweet spot for a semi-circular compact range reflector. This report describes the operation of the code. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capability and to serve as sample input/output sets.

  10. Hanford meteorological station computer codes: Volume 9, The quality assurance computer codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burk, K.W.; Andrews, G.L.

    1989-02-01

    The Hanford Meteorological Station (HMS) was established in 1944 on the Hanford Site to collect and archive meteorological data and provide weather forecasts and related services for Hanford Site operations; it is located approximately 1/2 mile east of the 200 West Area and is operated by PNL for the US Department of Energy. Meteorological data are collected from various sensors and equipment located on and off the Hanford Site. These data are stored in data bases on the Digital Equipment Corporation (DEC) VAX 11/750 at the HMS (hereafter referred to as the HMS computer). Files from those data bases are routinely transferred to the Emergency Management System (EMS) computer at the Unified Dose Assessment Center (UDAC). To ensure the quality and integrity of the HMS data, a set of Quality Assurance (QA) computer codes has been written. The codes will be routinely used by the HMS system manager or the data base custodian. The QA codes provide detailed output files that will be used in correcting erroneous data. The following sections in this volume describe the implementation and operation of the QA computer codes. The appendices contain detailed descriptions, flow charts, and source code listings of each computer code. 2 refs.

  11. User's manual for semi-circular compact range reflector code

    NASA Technical Reports Server (NTRS)

    Gupta, Inder J.; Burnside, Walter D.

    1986-01-01

    A computer code was developed to analyze a semi-circular paraboloidal reflector antenna with a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the antenna or its individual components at a given distance from the center of the paraboloid. Thus, it is very effective in computing the size of the sweet spot for RCS or antenna measurement. The operation of the code is described. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capability and to serve as sample input/output sets.

  12. Highly fault-tolerant parallel computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spielman, D.A.

    We re-introduce the coded model of fault-tolerant computation in which the input and output of a computational device are treated as words in an error-correcting code. A computational device correctly computes a function in the coded model if its input and output, once decoded, are a valid input and output of the function. In the coded model, it is reasonable to hope to simulate all computational devices by devices whose size is greater by a constant factor but which are exponentially reliable even if each of their components can fail with some constant probability. We consider fine-grained parallel computations in which each processor has a constant probability of producing the wrong output at each time step. We show that any parallel computation that runs for time t on w processors can be performed reliably on a faulty machine in the coded model using w·log^O(1) w processors and time t·log^O(1) w. The failure probability of the computation will be at most t·exp(−w^(1/4)). The codes used to communicate with our fault-tolerant machines are generalized Reed-Solomon codes and can thus be encoded and decoded in O(n·log^O(1) n) sequential time and are independent of the machine they are used to communicate with. We also show how coded computation can be used to self-correct many linear functions in parallel with arbitrarily small overhead.
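
    As a minimal illustration of the coded model's starting point — inputs and outputs living in an error-correcting code — the sketch below encodes a message with a classical Reed-Solomon code, corrupts a few symbols, and decodes it exactly. It uses the third-party Python package reedsolo (an assumption; install with pip install reedsolo) and is not the generalized Reed-Solomon machinery of the paper.

        # Classical Reed-Solomon round trip with simulated faults; not the
        # paper's construction, just the coded-input/coded-output idea.
        from reedsolo import RSCodec

        rsc = RSCodec(10)                    # 10 parity symbols: corrects up to 5 errors
        encoded = rsc.encode(b"fault-tolerant")

        corrupted = bytearray(encoded)
        for i in (0, 5, 9):                  # flip bits in three symbols
            corrupted[i] ^= 0xFF

        decoded = rsc.decode(bytes(corrupted))[0]   # recent reedsolo returns a tuple
        assert decoded == b"fault-tolerant"
        print("recovered:", decoded)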

  13. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
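
    The report does not spell out SCOPE's calibration model; the sketch below is a generic, hypothetical stand-in for the idea of calibrating a resource predictor once per machine/code pair and then predicting new problems. All timing numbers are invented.

        # Fit a power law t_cpu ≈ a * n^b to calibration runs (log-log least
        # squares), then predict the cost of a larger problem. Illustrative
        # only; not SCOPE's actual model.
        import numpy as np

        n_dof = np.array([500, 1000, 2000, 4000, 8000])   # calibration problem sizes
        t_cpu = np.array([1.2, 2.9, 7.1, 17.5, 43.0])     # measured CPU seconds (made up)

        b, log_a = np.polyfit(np.log(n_dof), np.log(t_cpu), 1)
        predict = lambda n: np.exp(log_a) * n ** b

        print(f"predicted CPU time for 16000 DOF: {predict(16000):.1f} s")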

  14. A generalized one-dimensional computer code for turbomachinery cooling passage flow calculations

    NASA Technical Reports Server (NTRS)

    Kumar, Ganesh N.; Roelke, Richard J.; Meitner, Peter L.

    1989-01-01

    A generalized one-dimensional computer code for analyzing the flow and heat transfer in turbomachinery cooling passages was developed. This code is capable of handling rotating cooling passages with turbulators, 180 degree turns, pin fins, finned passages, by-pass flows, tip cap impingement flows, and flow branching. The code is an extension of a one-dimensional code developed by P. Meitner. In the subject code, correlations for both heat transfer coefficient and pressure loss computations were developed to model each of the above-mentioned types of coolant passages. The code has the capability of independently computing the friction factor and heat transfer coefficient on each side of a rectangular passage. Either the mass flow at the inlet to the channel or the exit plane pressure can be specified. For a specified inlet total temperature, inlet total pressure, and exit static pressure, the code computes the flow rates through the main branch and the subbranches, and the flow through the tip cap for impingement cooling, in addition to computing the coolant pressure, temperature, and heat transfer coefficient distribution in each coolant flow branch. Predictions from the subject code for both nonrotating and rotating passages agree well with experimental data. The code was used to analyze the cooling passage of a research cooled radial rotor.
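
    The report's correlations are not reproduced here, but the kind of one-dimensional relation such a code rests on can be sketched: given inlet total conditions and an exit static pressure, compute the mass flow, assuming isentropic flow of a perfect gas in a single unchoked channel (a textbook relation, not the NASA code's correlation set).

        # 1D isentropic mass flow from total conditions and exit static pressure.
        import math

        def isentropic_mass_flow(p_t, T_t, p_exit, area, gamma=1.4, R=287.0):
            """Mass flow [kg/s] through `area` [m^2] for total pressure p_t [Pa],
            total temperature T_t [K], exit static pressure p_exit [Pa]."""
            # Mach number from p_t/p = (1 + (g-1)/2 M^2)^(g/(g-1)), unchoked flow
            M = math.sqrt(2.0 / (gamma - 1.0)
                          * ((p_t / p_exit) ** ((gamma - 1.0) / gamma) - 1.0))
            T = T_t / (1.0 + 0.5 * (gamma - 1.0) * M * M)   # exit static temperature
            rho = p_exit / (R * T)                          # exit density
            V = M * math.sqrt(gamma * R * T)                # exit velocity
            return rho * V * area

        print(isentropic_mass_flow(p_t=4e5, T_t=800.0, p_exit=3e5, area=1e-4))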

  15. Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †

    PubMed Central

    Murdani, Muhammad Harist; Hong, Bonghee

    2018-01-01

    In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space. PMID:29587366
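
    A hedged sketch of the weighted-sum idea described above; the weights, normalization, and road term below are illustrative assumptions, not the paper's calibrated metric.

        # Combine normalized centroid distance with a road-connectivity term.
        from math import radians, sin, cos, asin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance between two ZIP centroids in kilometres."""
            dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
            a = (sin(dlat / 2) ** 2
                 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
            return 2 * 6371.0 * asin(sqrt(a))

        def combined_distance(zip_a, zip_b, w_centroid=0.7, w_road=0.3, d_max=50.0):
            """Weighted-sum distance: smaller means closer. More intersecting
            roads shrink the effective distance between ZIP codes."""
            d = haversine_km(zip_a["lat"], zip_a["lon"],
                             zip_b["lat"], zip_b["lon"]) / d_max   # normalize
            n_roads = zip_a["roads"].get(zip_b["code"], 0)         # intersecting roads
            road_term = 1.0 / (1.0 + n_roads)                      # (0, 1]; 1 if none
            return w_centroid * d + w_road * road_term

        a = {"code": "46241", "lat": 39.72, "lon": -86.25, "roads": {"46221": 4}}
        b = {"code": "46221", "lat": 39.69, "lon": -86.20, "roads": {"46241": 4}}
        print(combined_distance(a, b))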

  16. Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †.

    PubMed

    Murdani, Muhammad Harist; Kwon, Joonho; Choi, Yoon-Ho; Hong, Bonghee

    2018-03-24

    In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space.

  17. Comparative Evaluation of the Impact of WRF/NMM and WRF/ARW Meteorology on CMAQ Simulations for PM2.5 and its Related Precursors during the 2006 TexAQS/GoMACCS Study

    EPA Science Inventory

    This study presents a comparative evaluation of the impact of WRF-NMM and WRF-ARW meteorology on CMAQ simulations of PM2.5, its composition and related precursors over the eastern United States with the intensive observations obtained by aircraft (NOAA WP-3), ship and ...

  18. Unveiling the High Energy Obscured Universe: Hunting Collapsed Objects Physics

    NASA Technical Reports Server (NTRS)

    Ubertini, P.; Bazzano, A.; Cocchi, M.; Natalucci, L.; Bassani, L.; Caroli, E.; Stephen, J. B.; Caraveo, P.; Mereghetti, S.; Villa, G.

    2005-01-01

    A large part of the energy from space comes from collapsing stars (SNe, hypernovae) and collapsed stars (black holes, neutron stars and white dwarfs). The peak of their energy release is in the hard X-ray and gamma-ray wavelengths, where photons are insensitive to absorption and can travel from the edge of the Universe or the central core of the Galaxy without losing the primordial information of energy, time signature and polarization. The most efficient process for producing energetic photons is gravitational accretion of matter from a "normal" star onto a collapsed companion (L ≈ G·M_coll·(dM_acc/dt)/R_disc ~ (dM_acc/dt)·c²), exceeding by far the capability of nuclear reactions to generate high energy quanta. Thus our natural laboratory for "in situ" investigations is collapsed objects, in which matter and radiation co-exist in extreme conditions of temperature and density due to gravitationally bent geometry and magnetic fields. This is a unique opportunity to study the physics of accretion flows in stellar-mass and super-massive black holes (SMBHs), plasmoids generated in relativistic jets in galactic microQSOs and AGNs, ionised plasma interacting at the touching point of a weakly magnetized NS surface, the GRB/supernova connection, and the mysterious origins of "dark" GRBs and X-ray flashes.

  19. Culotte stenting for coronary bifurcation lesions with 2nd and 3rd generation everolimus-eluting stents: the CELTIC Bifurcation Study.

    PubMed

    Walsh, Simon J; Hanratty, Colm G; Watkins, Stuart; Oldroyd, Keith G; Mulvihill, Niall T; Hensey, Mark; Chase, Alex; Smith, Dave; Cruden, Nick; Spratt, James C; Mylotte, Darren; Johnson, Tom; Hill, Jonathan; Hussein, Hafiz M; Bogaerts, Kris; Morice, Marie-Claude; Foley, David P

    2018-05-24

    The aim of this study was to provide contemporary outcome data for patients with de novo coronary disease and Medina 1,1,1 lesions who were treated with a culotte two-stent technique, and to compare the performance of two modern-generation drug-eluting stent (DES) platforms, the 3-connector XIENCE and the 2-connector SYNERGY. Patients with Medina 1,1,1 bifurcation lesions who had disease that was amenable to culotte stenting were randomised 1:1 to treatment with XIENCE or SYNERGY DES. A total of 170 patients were included. Technical success and final kissing balloon inflation occurred in >96% of cases. Major adverse cardiovascular or cerebrovascular events (MACCE: a composite of death, myocardial infarction [MI], cerebrovascular accident [CVA] and target vessel revascularisation [TVR]) occurred in 5.9% of patients by nine months. The primary endpoint was a composite of death, MI, CVA, target vessel failure (TVF), stent thrombosis and binary angiographic restenosis. At nine months, the primary endpoint occurred in 19% of XIENCE patients and 16% of SYNERGY patients (p=0.003 for non-inferiority for platform performance). MACCE rates for culotte stenting using contemporary everolimus-eluting DES are low at nine months. The XIENCE and SYNERGY stents demonstrated comparable performance for the primary endpoint.

  20. Simultaneous Traveling Convection Vortex (TCV) Events and Pc 1-2 Wave Bursts at Cusp/Cleft Latitudes observed in Arctic Canada and Svalbard

    NASA Astrophysics Data System (ADS)

    Posch, J. L.; Witte, A. J.; Engebretson, M. J.; Murr, D.; Lessard, M.; Raita, T.; Singer, H. J.

    2010-12-01

    Traveling convection vortices (TCVs), which appear in ground magnetometer records at near-cusp latitudes as solitary ~5 mHz pulses, are now known to originate in instabilities in the ion foreshock just upstream of Earth's bow shock. They can also stimulate compressions or relaxations of the dayside magnetosphere (evident in geosynchronous satellite data). These transient compressions can in turn sharply increase the growth rate of electromagnetic ion cyclotron (EMIC) waves, which also appear in ground records at near-cusp latitudes as bursts of Pc 1-2 pulsations. In this study we have identified simultaneous TCV - Pc 1-2 burst events occurring from 2008 through the first 7 months of 2010 in Eastern Arctic Canada and Svalbard, using a combination of fluxgate magnetometers (MACCS and IMAGE) and search coil magnetometers in each region. Magnetometer observations at GOES 10 and 12, at longitudes near the MACCS sites, are also used to characterize the strength of the magnetic perturbations. There is no direct proportionality between the amplitudes of TCV and Pc 1-2 wave events in either region, consistent with the highly variable densities and pitch-angle distributions of ring current / plasma sheet plasma in the outer dayside magnetosphere.

  1. Evaluation of a new microphysical aerosol module in the ECMWF Integrated Forecasting System

    NASA Astrophysics Data System (ADS)

    Woodhouse, Matthew; Mann, Graham; Carslaw, Ken; Morcrette, Jean-Jacques; Schulz, Michael; Kinne, Stefan; Boucher, Olivier

    2013-04-01

    The Monitoring Atmospheric Composition and Climate II (MACC-II) project will provide a system for monitoring and predicting atmospheric composition. As part of the first phase of MACC, the GLOMAP-mode microphysical aerosol scheme (Mann et al., 2010, GMD) was incorporated within the ECMWF Integrated Forecasting System (IFS). The two-moment modal GLOMAP-mode scheme includes new particle formation, condensation, coagulation, cloud-processing, and wet and dry deposition. GLOMAP-mode is already incorporated as a module within the TOMCAT chemistry transport model and within the UK Met Office HadGEM3 general circulation model. The microphysical, process-based GLOMAP-mode scheme allows an improved representation of aerosol size and composition and can simulate aerosol evolution in the troposphere and stratosphere. The new aerosol forecasting and re-analysis system (known as IFS-GLOMAP) will also provide improved boundary conditions for regional air quality forecasts, and will benefit from assimilation of observed aerosol optical depths in near real time. Presented here is an evaluation of the performance of the IFS-GLOMAP system in comparison to in situ aerosol mass and number measurements, and remotely-sensed aerosol optical depth measurements. Future development will provide a fully-coupled chemistry-aerosol scheme, and the capability to resolve nitrate aerosol.

  2. Volume accumulator design analysis computer codes

    NASA Technical Reports Server (NTRS)

    Whitaker, W. D.; Shimazaki, T. T.

    1973-01-01

    The computer codes, VANEP and VANES, were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kwe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAU's under conditions of possible modes of failure which still permit continued system operation.

  3. "Hour of Code": Can It Change Students' Attitudes toward Programming?

    ERIC Educational Resources Information Center

    Du, Jie; Wimmer, Hayden; Rada, Roy

    2016-01-01

    The Hour of Code is a one-hour introduction to computer science organized by Code.org, a non-profit dedicated to expanding participation in computer science. This study investigated the impact of the Hour of Code on students' attitudes towards computer programming and their knowledge of programming. A sample of undergraduate students from two…

  4. Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses

    ERIC Educational Resources Information Center

    Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan

    2013-01-01

    Given the increasing importance of soft skills in the computing profession, there is good reason to provide students with more opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…

  5. Guidelines for developing vectorizable computer programs

    NASA Technical Reports Server (NTRS)

    Miner, E. W.

    1982-01-01

    Some fundamental principles for developing computer programs which are compatible with array-oriented computers are presented. The emphasis is on basic techniques for structuring computer codes which are applicable in FORTRAN and do not require a special programming language or exact a significant penalty on a scalar computer. Researchers who are using numerical techniques to solve problems in engineering can apply these basic principles and thus develop transportable computer programs (in FORTRAN) which contain much vectorizable code. The vector architecture of the ASC is discussed so that the requirements of array processing can be better appreciated. The "vectorization" of a finite-difference viscous shock-layer code is used as an example to illustrate the benefits and some of the difficulties involved. Increases in computing speed with vectorization are illustrated with results from the viscous shock-layer code and from a finite-element shock tube code. The applicability of these principles was substantiated through running programs on other computers with array-associated computing characteristics, such as the Hewlett-Packard (H-P) 1000-F.
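
    A minimal modern illustration of the array-oriented principle the report describes, written in Python/NumPy rather than the report's FORTRAN: the element-by-element loop is replaced by a single whole-array expression, which is what vectorizing hardware and compilers exploit.

        # Same computation two ways: a scalar-style loop versus one
        # array-oriented expression over the whole array.
        import numpy as np

        n = 1_000_000
        x = np.linspace(0.0, 1.0, n)

        # Scalar-style loop: one element per iteration
        y_loop = np.empty(n)
        for i in range(n):
            y_loop[i] = 3.0 * x[i] ** 2 + 2.0 * x[i] + 1.0

        # Array-oriented form: the whole array in one expression
        y_vec = 3.0 * x**2 + 2.0 * x + 1.0

        assert np.allclose(y_loop, y_vec)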

  6. The Helicopter Antenna Radiation Prediction Code (HARP)

    NASA Technical Reports Server (NTRS)

    Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.

    1990-01-01

    The first nine months' effort in the development of a user-oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user-friendly interface, employing modern computer graphics, to aid the user in describing the helicopter geometry, selecting the method of computation, constructing the desired high or low frequency model, and displaying the results.

  7. Enhanced fault-tolerant quantum computing in d-level systems.

    PubMed

    Campbell, Earl T

    2014-12-05

    Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.

  8. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1992-01-01

    Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
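
    As a hedged, generic illustration of the multigrid concept discussed above (a two-grid correction for the 1D Poisson equation, not the Proteus implementation or its compressible-flow equations): a damped-Jacobi smoother removes high-frequency error on the fine grid, the smooth residual is solved on a coarser grid, and the correction is interpolated back.

        # Two-grid cycle for -u'' = f on [0, 1] with homogeneous Dirichlet BCs.
        import numpy as np

        def jacobi(u, f, h, sweeps, omega=2.0 / 3.0):
            for _ in range(sweeps):
                u[1:-1] += omega * (0.5 * (u[:-2] + u[2:] + h * h * f[1:-1]) - u[1:-1])
            return u

        def two_grid(u, f, h):
            u = jacobi(u, f, h, sweeps=3)                    # pre-smooth
            r = np.zeros_like(u)                             # residual of -u'' = f
            r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)
            rc = r[::2].copy()                               # restrict to coarse grid
            m = rc.size - 2                                  # coarse interior points
            A = (np.diag(2.0 * np.ones(m)) - np.diag(np.ones(m - 1), 1)
                 - np.diag(np.ones(m - 1), -1)) / (2 * h) ** 2
            ec = np.zeros(rc.size)
            ec[1:-1] = np.linalg.solve(A, rc[1:-1])          # exact coarse solve
            e = np.interp(np.arange(u.size), np.arange(u.size)[::2], ec)
            return jacobi(u + e, f, h, sweeps=3)             # correct, post-smooth

        n = 65
        h = 1.0 / (n - 1)
        x = np.linspace(0.0, 1.0, n)
        f = np.pi ** 2 * np.sin(np.pi * x)                   # exact solution sin(pi x)
        u = np.zeros(n)
        for _ in range(10):
            u = two_grid(u, f, h)
        print("max error:", np.abs(u - np.sin(np.pi * x)).max())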

  9. Implementation of radiation shielding calculation methods. Volume 1: Synopsis of methods and summary of results

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.

    1971-01-01

    The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.

  10. Nonuniform code concatenation for universal fault-tolerant quantum computing

    NASA Astrophysics Data System (ADS)

    Nikahd, Eesa; Sedighi, Mehdi; Saheb Zamani, Morteza

    2017-09-01

    Using transversal gates is a straightforward and efficient technique for fault-tolerant quantum computing. Since transversal gates alone cannot be computationally universal, they must be combined with other approaches such as magic state distillation, code switching, or code concatenation to achieve universality. In this paper we propose an alternative approach for universal fault-tolerant quantum computing, mainly based on the code concatenation approach proposed in [T. Jochym-O'Connor and R. Laflamme, Phys. Rev. Lett. 112, 010505 (2014), 10.1103/PhysRevLett.112.010505], but in a nonuniform fashion. The proposed approach is described based on nonuniform concatenation of the 7-qubit Steane code with the 15-qubit Reed-Muller code, as well as the 5-qubit code with the 15-qubit Reed-Muller code, which lead to two 49-qubit and 47-qubit codes, respectively. These codes can correct any arbitrary single physical error with the ability to perform a universal set of fault-tolerant gates, without using magic state distillation.

  11. Green's function methods in heavy ion shielding

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Costen, Robert C.; Shinn, Judy L.; Badavi, Francis F.

    1993-01-01

    An analytic solution to heavy ion transport in terms of a Green's function is used to generate a highly efficient computer code for space applications. The efficiency of the computer code is accomplished by a nonperturbative technique extending the Green's function over the solution domain. The computer code can also be applied to accelerator boundary conditions to allow code validation in laboratory experiments.

  12. Analytical modeling of operating characteristics of premixing-prevaporizing fuel-air mixing passages. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Chiappetta, L. M.; Edwards, D. E.; Mcvey, J. B.

    1982-01-01

    A user's manual describing the operation of three computer codes (ADD code, PTRAK code, and VAPDIF code) is presented. The general features of the computer codes, the input/output formats, run streams, and sample input cases are described.

  13. Automated apparatus and method of generating native code for a stitching machine

    NASA Technical Reports Server (NTRS)

    Miller, Jeffrey L. (Inventor)

    2000-01-01

    A computer system automatically generates CNC code for a stitching machine. The computer determines the locations of a present stitching point and a next stitching point. If a constraint is not found between the present stitching point and the next stitching point, the computer generates code for making a stitch at the next stitching point. If a constraint is found, the computer generates code for changing a condition (e.g., direction) of the stitching machine's stitching head.
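
    A hypothetical sketch of the generation rule described above; the point representation, constraint test, and the emitted codes (e.g., the "M50" block) are illustrative inventions, not the patented implementation.

        # Walk the stitch points; emit a stitch block unless a constraint lies
        # between the present and next point, in which case first emit a block
        # that changes the stitching head's condition.
        from dataclasses import dataclass

        @dataclass
        class Point:
            x: float
            y: float

        def constraint_between(a, b, constraints):
            """Placeholder test for a constraint on segment a-b (assumption)."""
            return any(c(a, b) for c in constraints)

        def generate_cnc(points, constraints):
            program = []
            for present, nxt in zip(points, points[1:]):
                if constraint_between(present, nxt, constraints):
                    program.append("M50 ; change head condition (illustrative code)")
                program.append(f"G01 X{nxt.x:.3f} Y{nxt.y:.3f} ; stitch at next point")
            return program

        pts = [Point(0, 0), Point(10, 0), Point(10, 5)]
        print("\n".join(generate_cnc(pts, constraints=[])))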

  14. Computer codes developed and under development at Lewis

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1992-01-01

    The objective of this summary is to provide a brief description of: (1) codes developed or under development at LeRC; and (2) the development status of IPACS with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes: (1) the code acronym; (2) select physics descriptors; (3) current enhancements; and (4) present (9/91) code status with respect to its availability and documentation. The computer codes list is grouped by related functions such as: (1) composite mechanics; (2) composite structures; (3) integrated and 3-D analysis; (4) structural tailoring; and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base-readiness) for assessing the structural integrity/durability/reliability of propulsion systems. These codes serve two other very important functions: they provide an effective means of technology transfer; and they constitute a depository of corporate memory.

  15. Operational source receptor calculations for large agglomerations

    NASA Astrophysics Data System (ADS)

    Gauss, Michael; Shamsudheen, Semeena V.; Valdebenito, Alvaro; Pommier, Matthieu; Schulz, Michael

    2016-04-01

    For air quality policy, an important question is how much of the air pollution within an urbanized region can be attributed to local sources and how much of it is imported through long-range transport. This is critical information for a correct assessment of the effectiveness of potential emission measures. The ratio between indigenous and long-range transported air pollution for a given region depends on its geographic location, the size of its area, the strength and spatial distribution of emission sources, the time of the year, but also - very strongly - on the current meteorological conditions, which change from day to day and thus make it important to provide such calculations in near-real-time to support short-term legislation. Similarly, long-term analysis over longer periods (e.g. one year), or of specific air quality episodes in the past, can help to scientifically underpin multi-regional agreements and long-term legislation. Within the European MACC projects (Monitoring Atmospheric Composition and Climate) and the transition to the operational CAMS service (Copernicus Atmosphere Monitoring Service), the computationally efficient EMEP MSC-W air quality model has been applied, with detailed emission data and comprehensive calculations of chemistry and microphysics, driven by high-quality meteorological forecast data (up to 96-hour forecasts), to provide source-receptor calculations on a regular basis in forecast mode. In its current state, the product allows the user to choose among different regions and regulatory pollutants (e.g. ozone and PM) to assess the effectiveness of hypothetical emission reductions implemented immediately, either within the agglomeration or outside it. The effects are visualized as bar charts showing the resulting changes in air pollution levels within the agglomeration as a function of time (hourly resolution, 0 to 4 days into the future). The bar charts not only allow assessment of the effects of emission reduction measures but also indicate the relative importance of indigenous versus imported air pollution. The calculations are currently performed weekly by MET Norway for the Paris, London, Berlin, Oslo, Po Valley and Rhine-Ruhr regions, and the results are provided free of charge at the MACC website (http://www.gmes-atmosphere.eu/services/aqac/policy_interface/regional_sr/). A proposal to extend this service to all EU capitals on a daily basis within the Copernicus Atmosphere Monitoring Service is currently under review. The tool is an important example illustrating the increased application of scientific tools to operational services that support air quality policy. This paper describes the tool in more detail, focusing on the experimental setup, underlying assumptions, uncertainties, computational demand, and its usefulness for air quality policy. Options to apply the tool to agglomerations outside the EU are also discussed (with reference to, e.g., PANDA, a European-Chinese collaboration project).
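
    A hedged sketch of the underlying attribution logic (a brute-force two-run comparison; the EMEP MSC-W system's actual setup is more elaborate, and all concentrations below are invented): a baseline forecast is compared with a run in which the agglomeration's own emissions are reduced, and the concentration drop is read as the indigenous share, the remainder as imported pollution.

        # Two-run source attribution over a 0-4 day hourly forecast window.
        import numpy as np

        hours = 96
        t = np.linspace(0.0, 8.0 * np.pi, hours)
        baseline = 18.0 + 6.0 * np.sin(t)                 # µg/m³ PM, made up
        local_part = np.clip(5.0 * np.sin(t), 0.0, None)  # responds to local measures
        reduced = baseline - local_part                   # run with local emissions cut

        indigenous = baseline - reduced                   # attributable to local sources
        imported = reduced                                # long-range transported share

        share_local = indigenous.mean() / baseline.mean()
        print(f"average indigenous share over the forecast: {share_local:.1%}")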

  16. Reactivity effects in VVER-1000 of the third unit of the kalinin nuclear power plant at physical start-up. Computations in ShIPR intellectual code system with library of two-group cross sections generated by UNK code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zizin, M. N.; Zimin, V. G.; Zizina, S. N., E-mail: zizin@adis.vver.kiae.ru

    2010-12-15

    The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.

  17. Reactivity effects in VVER-1000 of the third unit of the kalinin nuclear power plant at physical start-up. Computations in ShIPR intellectual code system with library of two-group cross sections generated by UNK code

    NASA Astrophysics Data System (ADS)

    Zizin, M. N.; Zimin, V. G.; Zizina, S. N.; Kryakvin, L. V.; Pitilimov, V. A.; Tereshonok, V. A.

    2010-12-01

    The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.

  18. Users manual and modeling improvements for axial turbine design and performance computer code TD2-2

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.

  19. An Object-Oriented Approach to Writing Computational Electromagnetics Codes

    NASA Technical Reports Server (NTRS)

    Zimmerman, Martin; Mallasch, Paul G.

    1996-01-01

    Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.
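
    The contrast between the two paradigms is easiest to see in miniature. The sketch below is a hedged illustration in Python rather than Fortran, purely for brevity; the class name Yee1D and its interface are invented for this summary, not taken from the paper. It shows how an object-oriented design bundles a time-domain solver's grid state and update scheme behind one small interface, where a structured Fortran code would pass arrays and loop bounds between subroutines.

    import numpy as np

    class Yee1D:
        """Minimal 1D FDTD solver illustrating object-oriented CEM design:
        the field state and the leapfrog update live behind one interface."""

        def __init__(self, n_cells, courant=0.5):
            self.ez = np.zeros(n_cells)       # electric field at cell nodes
            self.hy = np.zeros(n_cells - 1)   # magnetic field, staggered grid
            self.c = courant                  # normalized Courant number

        def step(self):
            # Interleaved (leapfrog) updates on the staggered Yee grid
            self.hy += self.c * np.diff(self.ez)
            self.ez[1:-1] += self.c * np.diff(self.hy)

        def inject(self, i, value):
            self.ez[i] += value               # soft source at node i

    sim = Yee1D(200)
    for t in range(100):
        sim.inject(100, np.exp(-((t - 30) / 10.0) ** 2))  # Gaussian pulse
        sim.step()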

  20. Buried Underwater Munitions and Clutter Discrimination

    DTIC Science & Technology

    2010-10-01

    ... closest point of approach of the cylinder. The k-space amplitude beam pattern, sin(Δ)/Δ, in Stanton's treatment is obtained from the Fourier ... simple modifications to be useful here. First, the amplitude of the incident plane wave P0 should be replaced by P1 r0/r, where P1 is the magnitude of ...

  1. Computer Description of the Field Artillery Ammunition Supply Vehicle

    DTIC Science & Technology

    1983-04-01

    Combinatorial Geometry (COM-GEOM); GIFT Computer Code; Computer Target Description. ... used as input to the GIFT computer code to generate target vulnerability data. ... Combinatorial Geometry (COM-GEOM) description. The Geometric Information for Targets (GIFT) computer code accepts the COM-GEOM description and ...

  2. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  3. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  4. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  5. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar data produced for...

  6. Antenna pattern study, task 2

    NASA Technical Reports Server (NTRS)

    Harper, Warren

    1989-01-01

    Two electromagnetic scattering codes, NEC-BSC and ESP3, were delivered and installed on a NASA VAX computer for use by Marshall Space Flight Center antenna design personnel. The work comprised updating the existing codes and certain supplementary software, installing the codes on a computer to be delivered to the customer, providing capability for graphic display of the data computed by the codes, and assisting the customer in the solution of specific problems that demonstrate the use of the codes. With the exception of one code revision, all of these tasks were performed.

  7. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... funds; (ii) Studies, analyses, test data, or similar data produced for this contract, when the study...

  8. Parallel Computation of the Jacobian Matrix for Nonlinear Equation Solvers Using MATLAB

    NASA Technical Reports Server (NTRS)

    Rose, Geoffrey K.; Nguyen, Duc T.; Newman, Brett A.

    2017-01-01

    Demonstrating speedup for parallel code on a multicore shared-memory PC can be challenging in MATLAB due to underlying parallel operations that are often opaque to the user. This can limit the potential for improvement of serial code, even for so-called embarrassingly parallel applications. One such application is the computation of the Jacobian matrix inherent to most nonlinear equation solvers. Computation of this matrix represents the primary bottleneck in nonlinear solver speed, so commercial finite element (FE) and multi-body dynamics (MBD) codes attempt to minimize such computations. A timing study using MATLAB's Parallel Computing Toolbox was performed for numerical computation of the Jacobian. Several approaches for implementing parallel code were investigated, but only the single program multiple data (spmd) method using composite objects produced positive results. Parallel code speedup is demonstrated, but the goal of linear speedup through the addition of processors was not achieved due to PC architecture.
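
    The Jacobian computation is embarrassingly parallel because each column is an independent finite-difference evaluation. The study itself used MATLAB's spmd construct; the sketch below illustrates the same column-wise decomposition in Python with a process pool (the residual function is a hypothetical two-equation test problem, not from the paper).

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def residual(x):
        # Hypothetical nonlinear system F(x) = 0, used only for illustration
        return np.array([x[0]**2 + x[1] - 3.0,
                         x[0] + x[1]**2 - 5.0])

    def fd_column(args):
        x, f0, j, h = args
        xp = x.copy()
        xp[j] += h                      # perturb one independent variable
        return (residual(xp) - f0) / h  # forward-difference column j

    def jacobian_parallel(x, h=1e-7, workers=4):
        f0 = residual(x)
        tasks = [(x, f0, j, h) for j in range(x.size)]
        with ProcessPoolExecutor(max_workers=workers) as ex:
            cols = list(ex.map(fd_column, tasks))
        return np.column_stack(cols)

    if __name__ == "__main__":
        print(jacobian_parallel(np.array([1.0, 2.0])))

    As in the paper's MATLAB experiments, whether this outperforms a serial loop depends on how expensive each residual evaluation is relative to the process-communication overhead.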

  9. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.

    2004-09-14

    This document contains detailed user instructions for the suite of utility codes developed for Rev. 1 of the Systems Assessment Capability, which together support the many functions of that capability.

  10. Development of a model and computer code to describe solar grade silicon production processes

    NASA Technical Reports Server (NTRS)

    Gould, R. K.; Srivastava, R.

    1979-01-01

    Two computer codes were developed for describing flow reactors in which high-purity, solar-grade silicon is produced via reduction of gaseous silicon halides. The first is the CHEMPART code, an axisymmetric, marching code which treats two-phase flows with models describing detailed gas-phase chemical kinetics, particle formation, and particle growth. It can be used to describe flow reactors in which reactants mix, react, and form a particulate phase. Detailed radial profiles of gas-phase composition, temperature, velocity, and particle size distribution are computed. Also, deposition of heat, momentum, and mass (either particulate or vapor) on the reactor walls is described. The second code is a modified version of the GENMIX boundary layer code, which is used to compute rates of heat, momentum, and mass transfer to the reactor walls. This code lacks the detailed chemical kinetics and particle handling features of the CHEMPART code but has the virtue of running much more rapidly than CHEMPART, while treating the phenomena occurring in the boundary layer in more detail.

  11. Top-down NOX Emissions of European Cities Derived from Modelled and Spaceborne Tropospheric NO2 Columns

    NASA Astrophysics Data System (ADS)

    Verstraeten, W. W.; Boersma, K. F.; Douros, J.; Williams, J. E.; Eskes, H.; Delcloo, A. W.

    2017-12-01

    High nitrogen oxides (NOX = NO + NO2) concentrations near the surface adversely affect humans and ecosystems, and play a key role in tropospheric chemistry. NO2 is an important precursor of tropospheric ozone (O3), which in turn affects the production of the hydroxyl radical controlling the chemical lifetime of key atmospheric pollutants and reactive greenhouse gases. Combustion from industrial, traffic, and household activities in large, densely populated urban areas results in high NOX emissions. Accurate mapping of these emissions is essential but difficult, since reported emission factors may differ from real-time emissions by an order of magnitude. Modelled NO2 levels and lifetimes also have large associated uncertainties, and overestimation of the chemical lifetime may mask missing NOX chemistry in current chemistry transport models (CTMs). Simultaneously estimating the NO2 lifetime and the concentrations by applying the Exponentially Modified Gaussian (EMG) method to tropospheric NO2 column line densities should improve surface NOX emission estimates. Here we evaluate whether the EMG methodology applied to the tropospheric NO2 columns simulated by the LOTOS-EUROS (Long Term Ozone Simulation-European Ozone Simulation) CTM can reproduce the NOX emissions used as model input. First we process the modelled tropospheric NO2 columns for the period April-September 2013 for 21 selected European urban areas under windy conditions (wind speeds averaged between the surface and 500 m, from ECMWF, > 2 m s-1), as well as the accompanying OMI (Ozone Monitoring Instrument) data, which provide real-time observation-based estimates of midday NO2 columns. Then we compare the top-down derived surface NOX emissions with the 2011 MACC-III emission inventory, which is used in the CTM as input to simulate the NO2 columns. For cities where NOX emissions can be assumed to originate from one large source, good agreement is found between the top-down NOX emissions derived from the CTM and OMI and the MACC-III inventory. For cities where multiple sources of NOX are observed (e.g. Brussels, London), an adapted methodology is required. For some cities, such as St. Petersburg and Moscow, the top-down NOX estimates from 2013 OMI data are biased low compared with the MACC-III inventory, which uses a 2011 NOX emissions update.
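
    A minimal sketch of the EMG fitting step, assuming the along-wind line densities have already been built by rotating the columns into the wind direction and integrating across it; the function names, starting values, and unit choices below are illustrative, not taken from the study.

    import numpy as np
    from scipy.stats import exponnorm
    from scipy.optimize import curve_fit

    def emg_line_density(x, a, K, mu, sigma, b):
        # Exponentially modified Gaussian plus a constant background
        return a * exponnorm.pdf(x, K, loc=mu, scale=sigma) + b

    def fit_emg(x_km, y, wind_kmh):
        """x_km: along-wind distance; y: NO2 line density; wind_kmh: mean wind."""
        p0 = [y.max() * (x_km.max() - x_km.min()), 2.0,
              x_km[np.argmax(y)], 20.0, y.min()]
        popt, _ = curve_fit(emg_line_density, x_km, y, p0=p0,
                            bounds=([0.0, 0.01, -np.inf, 0.01, -np.inf], np.inf))
        a, K, mu, sigma, b = popt
        x0 = K * sigma            # e-folding (decay) distance downwind [km]
        tau = x0 / wind_kmh       # effective NO2 lifetime [h]
        return tau, a / tau       # lifetime and emission estimate (up to scaling)

    The fitted decay distance divided by the wind speed gives the effective lifetime, and the fitted amplitude divided by that lifetime gives the emission estimate that is then compared against the MACC-III inventory.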

  12. Comparison of two computer codes for crack growth analysis: NASCRAC Versus NASA/FLAGRO

    NASA Technical Reports Server (NTRS)

    Stallworth, R.; Meyers, C. A.; Stinson, H. C.

    1989-01-01

    Results are presented from a comparison study of two computer codes for crack growth analysis: NASCRAC and NASA/FLAGRO. The two codes gave compatibly conservative results when the part-through-crack solutions were compared with experimental test data. The codes also correlated well for the through-crack-at-a-lug solution, for which NASA/FLAGRO gave the more conservative results.

  13. Computational Predictions of the Performance of Wright 'Bent End' Propellers

    NASA Technical Reports Server (NTRS)

    Wang, Xiang-Yu; Ash, Robert L.; Bobbitt, Percy J.; Prior, Edwin (Technical Monitor)

    2002-01-01

    Computational analyses of two reproductions of the Wright brothers' 1911 'Bent End' wooden propellers have been performed and compared with experimental test results from the Langley Full Scale Wind Tunnel. The purpose of the analysis was to check the consistency of the experimental results and to validate the reliability of the tests. This report is one part of a research project on the performance of the Wright 'Bent End' propellers, intended to document the Wright brothers' pioneering propeller design contributions. Two computer codes were used in the computational predictions. The FLO-MG code is a Computational Fluid Dynamics (CFD) code based on the Navier-Stokes equations; it is mainly used to compute the lift and drag coefficients at specified angles of attack at different radii. Those calculated data are intermediate results of the computation and part of the necessary input for the Propeller Design Analysis Code (based on the Adkins and Liebeck method), a propeller design code which computes the propeller thrust coefficient, power coefficient, and propulsive efficiency.

  14. Proceduracy: Computer Code Writing in the Continuum of Literacy

    ERIC Educational Resources Information Center

    Vee, Annette

    2010-01-01

    This dissertation looks at computer programming through the lens of literacy studies, building from the concept of code as a written text with expressive and rhetorical power. I focus on the intersecting technological and social factors of computer code writing as a literacy--a practice I call "proceduracy". Like literacy, proceduracy is a human…

  15. Computer Code Aids Design Of Wings

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Darden, Christine M.

    1993-01-01

    AERO2S computer code developed to aid design engineers in selection and evaluation of aerodynamically efficient wing/canard and wing/horizontal-tail configurations that includes simple hinged-flap systems. Code rapidly estimates longitudinal aerodynamic characteristics of conceptual airplane lifting-surface arrangements. Developed in FORTRAN V on CDC 6000 computer system, and ported to MS-DOS environment.

  16. Cloud Computing for Complex Performance Codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  17. APC: A New Code for Atmospheric Polarization Computations

    NASA Technical Reports Server (NTRS)

    Korkin, Sergey V.; Lyapustin, Alexei I.; Rozanov, Vladimir V.

    2014-01-01

    A new polarized radiative transfer code, Atmospheric Polarization Computations (APC), is described. The code is based on separation of the diffuse light field into anisotropic and smooth (regular) parts. The anisotropic part is computed analytically. The smooth regular part is computed numerically using the discrete ordinates method. Vertical stratification of the atmosphere, common types of bidirectional surface reflection, and scattering by spherical particles or spheroids are included. Particular consideration is given to computation of the bidirectional polarization distribution function (BPDF) of the waved ocean surface.

  18. GIRAFE, a campaign forecast tool for anthropogenic and biomass burning plumes

    NASA Astrophysics Data System (ADS)

    Fontaine, Alain; Mari, Céline; Drouin, Marc-Antoine; Lussac, Laure

    2015-04-01

    GIRAFE (reGIonal ReAl time Fire plumEs, http://girafe.pole-ether.fr, alain.fontaine@obs-mip.fr) is a forecast tool supported by the French atmospheric chemistry data centre Ether (CNES and CNRS), built on the Lagrangian particle dispersion model FLEXPART coupled with ECMWF meteorological fields and emission inventories. GIRAFE was used during the CHARMEX campaign (Chemistry-Aerosol Mediterranean Experiment, http://charmex.lsce.ipsl.fr) to provide daily 5-day plume trajectory forecasts over the Mediterranean Sea. For this field experiment, the Lagrangian model was used to mimic carbon monoxide pollution plumes emitted by either anthropogenic or biomass burning sources. Sources from major industrial areas such as Fos-Berre or the Po valley were extracted from the MACC-TNO inventory. Biomass burning sources were estimated based on MODIS fire detection. Comparison with the MACC and CHIMERE APIFLAME models revealed that GIRAFE followed pollution plumes from small, short-duration fires that were not captured by low-resolution models. GIRAFE was used as a decision-making tool to schedule field-campaign activities such as airborne operations or balloon launches. Thanks to recent features, GIRAFE is able to read the inventories of the ECCAD database (http://eccad.pole-ether.fr). Global inventories such as MACCITY and ECLIPSE will be used to predict CO plume trajectories from major urban and industrial sources over West Africa for the DACCIWA campaign (Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa).

  19. Variations of trace gases over the Bay of Bengal during the summer monsoon

    NASA Astrophysics Data System (ADS)

    Girach, I. A.; Ojha, Narendra; Nair, Prabha R.; Tiwari, Yogesh K.; Kumar, K. Ravi

    2018-02-01

    In situ measurements of near-surface ozone (O3), carbon monoxide (CO), and methane (CH4) were carried out over the Bay of Bengal (BoB) as part of the Continental Tropical Convergence Zone (CTCZ) campaign during the summer monsoon season of 2009. O3, CO, and CH4 mixing ratios varied in the ranges of 8-54 ppbv, 50-200 ppbv, and 1.57-2.15 ppmv, respectively, during 16 July-17 August 2009. The spatial distribution of mean tropospheric O3 from satellite retrievals is found to be similar to that of the surface O3 observations, with higher levels over coastal and northern BoB as compared with central BoB. Comparison of the in situ measurements with the Monitoring Atmospheric Composition and Climate (MACC) global reanalysis shows that the MACC simulations reproduce the observations with small mean biases of 1.6 ppbv, -2.6 ppbv, and 0.07 ppmv for O3, CO, and CH4, respectively. Analysis of the diurnal variation of O3, based on the observations and on simulations from the Weather Research and Forecasting model coupled with Chemistry (WRF-Chem) at a stationary point over the BoB, did not show a net photochemical build-up during daytime. Satellite retrievals show limitations in capturing the CH4 variations measured by in situ sample analysis, highlighting the need for more shipborne in situ measurements of trace gases over this region during the monsoon.

  20. Five-year outcomes of staged percutaneous coronary intervention in the SYNTAX study.

    PubMed

    Watkins, Stuart; Oldroyd, Keith G; Preda, Istvan; Holmes, David R; Colombo, Antonio; Morice, Marie-Claude; Leadley, Katrin; Dawkins, Keith D; Mohr, Friedrich W; Serruys, Patrick W; Feldman, Ted E

    2015-04-01

    The SYNTAX study compared PCI with TAXUS Express stents to CABG for the treatment of de novo 3-vessel and/or left main coronary disease. This study aimed to determine patient characteristics and five-year outcomes after a staged PCI strategy compared to single-session PCI. In the SYNTAX trial, staged procedures were discouraged but were allowed within 72 hours or, if renal insufficiency or contrast-induced nephropathy occurred, within 14 days (mean 9.8±18.1 days post initial procedure). A total of 125 (14%) patients underwent staged PCI. These patients had greater disease severity and/or required a more complex procedure. MACCE was significantly increased in staged patients (48.1% vs. 35.5%, p=0.004), as was the composite of death/stroke/MI (32.2% vs. 19%, p=0.0007). Individually, cardiac death and stroke occurred more frequently in the staged PCI group (p=0.03). Repeat revascularisation was significantly higher in staged patients (32.8% vs 24.8%, p=0.035), as was stent thrombosis (10.9% vs. 4.7%, p=0.005). There is a higher incidence of MACCE in patients undergoing staged compared to single-session PCI for 3-vessel and/or left main disease over the first five years of follow-up. However, these patients had more comorbidities and more diffuse disease.

  1. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

    A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized. The summary includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  2. Calculation of Water Drop Trajectories to and About Arbitrary Three-Dimensional Bodies in Potential Airflow

    NASA Technical Reports Server (NTRS)

    Norment, H. G.

    1980-01-01

    Calculations can be performed for any atmospheric conditions and for all water drop sizes, from the smallest cloud droplet to large raindrops. Any subsonic, external, non-lifting flow can be accommodated; flow into, but not through, inlets also can be simulated. Experimental water drop drag relations are used in the water drop equations of motion and effects of gravity settling are included. Seven codes are described: (1) a code used to debug and plot body surface description data; (2) a code that processes the body surface data to yield the potential flow field; (3) a code that computes flow velocities at arrays of points in space; (4) a code that computes water drop trajectories from an array of points in space; (5) a code that computes water drop trajectories and fluxes to arbitrary target points; (6) a code that computes water drop trajectories tangent to the body; and (7) a code that produces stereo pair plots which include both the body and trajectories. Code descriptions include operating instructions, card inputs and printouts for example problems, and listing of the FORTRAN codes. Accuracy of the calculations is discussed, and trajectory calculation results are compared with prior calculations and with experimental data.
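
    A minimal sketch of the kind of trajectory integration performed by codes (4)-(6), assuming a droplet diameter d and an airflow field supplied by the potential-flow codes; the Schiller-Naumann drag correlation below is a textbook stand-in for the experimental drag relations the report actually uses.

    import numpy as np
    from scipy.integrate import solve_ivp

    RHO_W, RHO_A, MU_A, G = 1000.0, 1.225, 1.81e-5, 9.81   # SI units

    def drag_coeff(re):
        # Schiller-Naumann correlation, reasonable for Re < ~1000
        re = max(re, 1e-12)
        return 24.0 / re * (1.0 + 0.15 * re**0.687)

    def drop_rhs(t, y, d, airflow):
        """y = [x, z, vx, vz]; airflow(x, z) returns local air velocity."""
        x, z, vx, vz = y
        ux, uz = airflow(x, z)
        rel = np.hypot(ux - vx, uz - vz)          # slip speed
        re = RHO_A * rel * d / MU_A               # droplet Reynolds number
        m = RHO_W * np.pi * d**3 / 6.0
        area = np.pi * d**2 / 4.0
        f = 0.5 * RHO_A * drag_coeff(re) * area * rel
        return [vx, vz, f * (ux - vx) / m, f * (uz - vz) / m - G]

    # Example: 20-micron droplet released into a uniform 80 m/s freestream
    sol = solve_ivp(drop_rhs, (0.0, 0.05), [0.0, 0.0, 0.0, 0.0],
                    args=(20e-6, lambda x, z: (80.0, 0.0)), max_step=1e-4)

    Gravity settling enters through the -G term, and the drag relation is re-evaluated from the local slip Reynolds number at every step, mirroring the structure described above.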

  3. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    NASA Technical Reports Server (NTRS)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements for making GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code were avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, and OpenACC directives were then used to mark parallel parts of the code. The use of OpenACC directives did not reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work done by any one portion of the APNASA code. It was determined that for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large share of the code's computation time.

  4. PASCO: Structural panel analysis and sizing code: Users manual - Revised

    NASA Technical Reports Server (NTRS)

    Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.

    1981-01-01

    A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.

  5. Computation of Reacting Flows in Combustion Processes

    NASA Technical Reports Server (NTRS)

    Keith, Theo G., Jr.; Chen, Kuo-Huey

    1997-01-01

    The main objective of this research was to develop an efficient three-dimensional computer code for chemically reacting flows. The main computer code developed is ALLSPD-3D, a program for the calculation of three-dimensional, chemically reacting flows with sprays. The ALLSPD code employs a coupled, strongly implicit solution procedure for turbulent spray combustion flows. A stochastic droplet model and an efficient method for treatment of the spray source terms in the gas-phase equations are used to calculate the evaporating liquid sprays. The chemistry treatment in the code is general enough that an arbitrary number of reactions and species can be defined by the user. The code is written in generalized curvilinear coordinates with both multi-block and flexible internal blockage capabilities to handle complex geometries, and for general industrial combustion applications it provides both dilution and transpiration cooling capabilities. The ALLSPD algorithm, which employs preconditioning and eigenvalue rescaling techniques, is capable of providing efficient solutions for flows over a wide range of Mach numbers. Although written for three-dimensional flows in general, the code can be used for two-dimensional and axisymmetric flow computations as well. The code can be run on various computer platforms (supercomputers, workstations, and parallel processors), and the GUI (Graphical User Interface) should provide a user-friendly tool for setting up and running the code.

  6. NASA Rotor 37 CFD Code Validation: Glenn-HT Code

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2010-01-01

    In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.

  7. Final report for the Tera Computer TTI CRADA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, G.S.; Pavlakos, C.; Silva, C.

    1997-01-01

    Tera Computer and Sandia National Laboratories have completed a CRADA, which examined the Tera Multi-Threaded Architecture (MTA) for use with large codes of importance to industry and DOE. The MTA is an innovative architecture that uses parallelism to mask latency between memories and processors. The physical implementation is a parallel computer with high cross-section bandwidth and GaAs processors designed by Tera, which support many small computation threads and fast, lightweight context switches between them. When any thread blocks while waiting for memory accesses to complete, another thread immediately begins execution so that high CPU utilization is maintained. The Tera MTA parallel computer has a single, global address space, which is appealing when porting existing applications to a parallel computer. This ease of porting is further enabled by compiler technology that helps break computations into parallel threads. DOE and Sandia National Laboratories were interested in working with Tera to further develop this computing concept. While Tera Computer would continue the hardware development and compiler research, Sandia National Laboratories would work with Tera to ensure that their compilers worked well with important Sandia codes, most particularly CTH, a shock physics code used for weapon safety computations. In addition to that important code, Sandia National Laboratories would complete research on a robotic path planning code, SANDROS, which is important in manufacturing applications, and would evaluate the MTA performance on this code. Finally, Sandia would work directly with Tera to develop 3D visualization codes, which would be appropriate for use with the MTA. Each of these tasks has been completed to the extent possible, given that Tera has just completed the MTA hardware. All of the CRADA work had to be done on simulators.

  8. Operations analysis (study 2.1). Program listing for the LOVES computer code

    NASA Technical Reports Server (NTRS)

    Wray, S. T., Jr.

    1974-01-01

    A listing of the LOVES computer program is presented. The program is coded partially in SIMSCRIPT and FORTRAN. This version of LOVES is compatible with both the CDC 7600 and the UNIVAC 1108 computers. The code has been compiled, loaded, and executed successfully on the EXEC 8 system for the UNIVAC 1108.

  9. Analysis of the Length of Braille Texts in English Braille American Edition, the Nemeth Code, and Computer Braille Code versus the Unified English Braille Code

    ERIC Educational Resources Information Center

    Knowlton, Marie; Wetzel, Robin

    2006-01-01

    This study compared the length of text in English Braille American Edition, the Nemeth code, and the computer braille code with the Unified English Braille Code (UEBC)--also known as Unified English Braille (UEB). The findings indicate that differences in the length of text are dependent on the type of material that is transcribed and the grade…

  10. A MATLAB based 3D modeling and inversion code for MT data

    NASA Astrophysics Data System (ADS)

    Singh, Arun; Dehiya, Rahul; Gupta, Pravin K.; Israil, M.

    2017-07-01

    The development of a MATLAB based computer code, AP3DMT, for modeling and inversion of 3D Magnetotelluric (MT) data is presented. The code comprises two independent components: grid generator code and modeling/inversion code. The grid generator code performs model discretization and acts as an interface by generating various I/O files. The inversion code performs core computations in modular form - forward modeling, data functionals, sensitivity computations and regularization. These modules can be readily extended to other similar inverse problems like Controlled-Source EM (CSEM). The modular structure of the code provides a framework useful for implementation of new applications and inversion algorithms. The use of MATLAB and its libraries makes it more compact and user friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities the results of inversion for two complex models are presented.
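
    The modular split the abstract describes (forward modeling, data functionals, sensitivities, regularization) maps onto a short generic inversion driver. The following is a hedged sketch of a damped Gauss-Newton iteration of that kind, written here in Python with an identity regularization operator; it is not AP3DMT's actual algorithm, and forward and jacobian stand for whatever forward-modeling and sensitivity modules a code supplies.

    import numpy as np

    def gauss_newton_tikhonov(forward, jacobian, d_obs, m0, lam, n_iter=10):
        """Minimize ||forward(m) - d_obs||^2 + lam * ||m||^2."""
        m = m0.copy()
        for _ in range(n_iter):
            r = forward(m) - d_obs          # data residual
            J = jacobian(m)                 # sensitivity matrix
            # Damped normal equations: (J^T J + lam I) dm = -(J^T r + lam m)
            A = J.T @ J + lam * np.eye(m.size)
            m -= np.linalg.solve(A, J.T @ r + lam * m)
        return m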

  11. Applications of automatic differentiation in computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.

    1994-01-01

    Automatic differentiation (AD) is a powerful computational method for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR, the user specifies sets of independent and dependent variables in an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.
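
    The chain-rule propagation that ADIFOR performs on Fortran source can be illustrated at runtime with forward-mode dual numbers. The sketch below (Python, invented for this summary; ADIFOR itself generates derivative Fortran code at pre-compile time rather than overloading operators) seeds a derivative of 1 on the independent variable and carries exact derivatives through each operation.

    import math
    from dataclasses import dataclass

    @dataclass
    class Dual:
        """Forward-mode AD value: val + eps * der, with eps**2 = 0."""
        val: float
        der: float = 0.0

        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.der + o.der)
        __radd__ = __add__

        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val,
                        self.der * o.val + self.val * o.der)  # product rule
        __rmul__ = __mul__

    def sin(x):
        return Dual(math.sin(x.val), math.cos(x.val) * x.der)  # chain rule

    x = Dual(1.0, 1.0)      # independent variable, seeded with der = 1
    y = x * sin(x) + 2 * x  # dependent variable
    print(y.val, y.der)     # exact derivative sin(x) + x*cos(x) + 2 at x = 1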

  12. Fundamentals, current state of the development of, and prospects for further improvement of the new-generation thermal-hydraulic computational HYDRA-IBRAE/LM code for simulation of fast reactor systems

    NASA Astrophysics Data System (ADS)

    Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.

    2016-02-01

    The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational code HYDRA-IBRAE/LM are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal cooled fast reactor systems under normal operation, under anticipated operational occurrences, and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and explains the need for the development of the new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and phenomena are singled out that require detailed analysis and model development for the system thermal-hydraulic code to describe them correctly. Information on the functionality of the computational code is provided: the thermal-hydraulic two-phase model, the properties of the sodium and lead coolants, the closure equations for simulation of heat and mass exchange processes, the models describing the processes that take place during a steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as the possibility of taking advantage of modern computing technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code, together with information on the principles of constructing and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, the introduction of new models, and enhancement of its usability. It is shown that the program of development and practical application of the code will make it possible in the near future to carry out computations for analyzing the safety of prospective NPP projects at a qualitatively higher level.

  13. Performance assessment of KORAT-3D on the ANL IBM-SP computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexeyev, A.V.; Zvenigorodskaya, O.A.; Shagaliev, R.M.

    1999-09-01

    The TENAR code is currently being developed at the Russian Federal Nuclear Center (VNIIEF) as a coupled dynamics code for the simulation of transients in VVER and RBMK systems and other nuclear systems. The neutronic module in this code system is KORAT-3D. This module is also one of the most computationally intensive components of the code system. A parallel version of KORAT-3D has been implemented to achieve the goal of obtaining transient solutions in reasonable computational time, particularly for RBMK calculations that involve the application of >100,000 nodes. An evaluation of the KORAT-3D code performance was recently undertaken on the Argonne National Laboratory (ANL) IBM ScalablePower (SP) parallel computer located in the Mathematics and Computer Science Division of ANL. At the time of the study, the ANL IBM-SP computer had 80 processors. This study was conducted under the auspices of a technical staff exchange program sponsored by the International Nuclear Safety Center (INSC).

  14. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is production code with a substantial user base. Furthermore, Exnihilo is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can setup and run problems with enough real-world features to be useful as proof-of-concept for actual production work.

  15. Fast H.264/AVC FRExt intra coding using belief propagation.

    PubMed

    Milani, Simone

    2011-01-01

    In the H.264/AVC FRExt coder, the performance of Intra coding significantly surpasses that of previous still-image coding standards, like JPEG2000, thanks to massive use of spatial prediction. Unfortunately, the adoption of an extensive set of predictors significantly increases the computational complexity required by the rate-distortion optimization routine. The paper presents a complexity reduction strategy that aims at reducing the computational load of Intra coding with a small loss in compression performance. The proposed algorithm relies on selecting a reduced set of prediction modes according to their probabilities, which are estimated by a belief-propagation procedure. Experimental results show that the proposed method saves up to 60% of the coding time required by an exhaustive rate-distortion optimization method, with a negligible loss in performance. Moreover, it permits accurate control of the computational complexity, unlike other methods where the computational complexity depends upon the coded sequence.
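
    The mode-selection idea reduces to keeping only the most probable predictors before running rate-distortion optimization. A hedged sketch of that pruning step in Python (the probability values and thresholds below are illustrative; the paper estimates the probabilities with a belief-propagation pass over neighboring blocks):

    import numpy as np

    def prune_intra_modes(mode_probs, coverage=0.9, max_modes=4):
        """Keep the smallest, most probable set of intra prediction modes
        whose cumulative probability reaches `coverage`; RD optimization
        then searches only this reduced set instead of all candidates."""
        order = np.argsort(mode_probs)[::-1]       # most probable first
        csum = np.cumsum(mode_probs[order])
        k = int(np.searchsorted(csum, coverage)) + 1
        return order[:min(k, max_modes)]

    # Hypothetical probabilities over the nine 4x4 Intra modes
    probs = np.array([0.30, 0.22, 0.18, 0.10, 0.07,
                      0.05, 0.04, 0.03, 0.01])
    print(prune_intra_modes(probs))                # -> [0 1 2 3]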

  16. 2,445 Hours of Code: What I Learned from Facilitating Hour of Code Events in High School Libraries

    ERIC Educational Resources Information Center

    Colby, Jennifer

    2015-01-01

    This article describes a school librarian's experience with initiating an Hour of Code event for her school's student body. Hadi Partovi of Code.org conceived the Hour of Code "to get ten million students to try one hour of computer science" (Partovi, 2013a), which is implemented during Computer Science Education Week with a goal of…

  17. Numerical algorithm comparison for the accurate and efficient computation of high-incidence vortical flow

    NASA Technical Reports Server (NTRS)

    Chaderjian, Neal M.

    1991-01-01

    Computations from two Navier-Stokes codes, NSS and F3D, are presented for a tangent-ogive-cylinder body at high angle of attack. Features of this steady flow include a pair of primary vortices on the leeward side of the body as well as secondary vortices. The topological and physical plausibility of this vortical structure is discussed. The accuracy of these codes is assessed by comparison of the numerical solutions with experimental data. The effects of turbulence model, numerical dissipation, and grid refinement are presented. The overall efficiency of these codes is also assessed by examining their convergence rates, computational time per time step, and maximum allowable time step for time-accurate computations. Overall, the numerical results from both codes compared equally well with experimental data; however, the NSS code was found to be significantly more efficient than the F3D code.

  18. User's Manual for FEMOM3DR. Version 1.0

    NASA Technical Reports Server (NTRS)

    Reddy, C. J.

    1998-01-01

    FEMOM3DR is a computer code written in FORTRAN 77 to compute the radiation characteristics of antennas on a 3D body using the combined Finite Element Method (FEM)/Method of Moments (MoM) technique. The code is written to handle different feeding structures such as coaxial lines, rectangular waveguides, and circular waveguides. It uses tetrahedral elements with vector edge basis functions for the FEM and triangular elements with roof-top basis functions for the MoM. By virtue of the FEM, this code can handle arbitrarily shaped three-dimensional bodies with inhomogeneous lossy materials, and thanks to the MoM the computational domain can be terminated in any arbitrary shape. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computers on which the code is intended to run.

  19. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGrail, B.P.; Mahoney, L.A.

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in each code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest-ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for evaluation of land disposal sites.

  20. User's manual for a material transport code on the Octopus Computer Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.; Mendez, G.D.

    1978-09-15

    A code to simulate material transport through porous media was developed at Oak Ridge National Laboratory. This code has been modified and adapted for use at Lawrence Livermore Laboratory. This manual, in conjunction with report ORNL-4928, explains the input, output, and execution of the code on the Octopus Computer Network.

  1. Performance analysis of three dimensional integral equation computations on a massively parallel computer. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Logan, Terry G.

    1994-01-01

    The purpose of this study is to investigate the performance of integral equation computations using a numerical source field-panel method in a massively parallel processing (MPP) environment. A comparative study of the computational performance of the MPP CM-5 computer and a conventional Cray Y-MP supercomputer for a three-dimensional flow problem is made. A serial FORTRAN code is converted into a parallel CM-FORTRAN code. Performance results are obtained on the CM-5 with 32, 64, and 128 nodes, along with results on the Cray Y-MP with a single processor. The comparison indicates that the parallel CM-FORTRAN code nearly matches or outperforms the equivalent serial FORTRAN code for some cases.

  2. Computer Description of the M561 Utility Truck

    DTIC Science & Technology

    1984-10-01

    GIFT Computer Code; Sustainability Predictions for Army Spare Components Requirements for Combat (SPARC). ... used as input to the GIFT computer code to generate target vulnerability data. ... analysis requires input from the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom) ...

  3. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyler, L L; Trent, D S; Budden, M J

    During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor application. 47 refs., 94 figs., 6 tabs.

  4. The influence of commenting validity, placement, and style on perceptions of computer code trustworthiness: A heuristic-systematic processing approach.

    PubMed

    Alarcon, Gene M; Gamble, Rose F; Ryan, Tyler J; Walter, Charles; Jessup, Sarah A; Wood, David W; Capiola, August

    2018-07-01

    Computer programs are a ubiquitous part of modern society, yet little is known about the psychological processes that underlie reviewing code. We applied the heuristic-systematic model (HSM) to investigate the influence of computer code comments on perceptions of code trustworthiness. The study explored the influence of validity, placement, and style of comments in code on trustworthiness perceptions and time spent on code. Results indicated valid comments led to higher trust assessments and more time spent on the code. Properly placed comments led to lower trust assessments and had a marginal effect on time spent on code; however, the effect was no longer significant after controlling for effects of the source code. Low style comments led to marginally higher trustworthiness assessments, but high style comments led to longer time spent on the code. Several interactions were also found. Our findings suggest the relationship between code comments and perceptions of code trustworthiness is not as straightforward as previously thought. Additionally, the current paper extends the HSM to the programming literature. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Adiabatic topological quantum computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cesare, Chris; Landahl, Andrew J.; Bacon, Dave

    Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev's surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.

  6. Adiabatic topological quantum computing

    DOE PAGES

    Cesare, Chris; Landahl, Andrew J.; Bacon, Dave; ...

    2015-07-31

    Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev's surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.

  7. Fast Computation of the Two-Point Correlation Function in the Age of Big Data

    NASA Astrophysics Data System (ADS)

    Pellegrino, Andrew; Timlin, John

    2018-01-01

    We present a new code which quickly computes the two-point correlation function for large sets of astronomical data. This code combines the ease of use of Python with the speed of parallel shared libraries written in C. We include the capability to compute auto- and cross-correlation statistics, and allow the user to calculate both the three-dimensional and the angular correlation functions. Additionally, the code automatically divides user-provided sky masks into contiguous subsamples of similar size, using the HEALPix pixelization scheme, for the purpose of resampling. Errors are computed using jackknife and bootstrap resampling in a way that adds negligible extra runtime, even with many subsamples. We demonstrate speed comparable to that of other clustering codes, and code accuracy by comparison with known analytic results.
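
    The core computation the abstract describes is binned pair counting combined with an estimator correction. A minimal sketch of that pipeline for the Landy-Szalay estimator, using a KD-tree in place of the paper's parallel C libraries and omitting masks and resampling:

    import numpy as np
    from scipy.spatial import cKDTree

    def xi_landy_szalay(data, rand, r_edges):
        """xi = (DD - 2*DR + RR) / RR from normalized pair counts.
        data, rand: (N, 3) positions; r_edges increasing with r_edges[0] > 0
        so zero-distance self-pairs cancel in the binned differences."""
        td, tr = cKDTree(data), cKDTree(rand)
        nd, nr = len(data), len(rand)

        def binned(a, b):
            # cumulative ordered-pair counts at each edge -> counts per bin
            return np.diff(a.count_neighbors(b, r_edges)).astype(float)

        dd = binned(td, td) / 2.0 / (nd * (nd - 1) / 2.0)  # auto pairs
        rr = binned(tr, tr) / 2.0 / (nr * (nr - 1) / 2.0)
        dr = binned(td, tr) / (nd * nr)                    # cross pairs
        return (dd - 2.0 * dr + rr) / rr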

  8. Orbital-Dependent Density Functionals for Chemical Catalysis

    DTIC Science & Technology

    2014-10-17

    ... noncollinear density functional theory to show that the low-spin state of Mn3 in a model of the oxygen-evolving complex of photosystem II avoids ... DK, which denotes the cc-pV5Z-DK basis set for 3d metals and hydrogen and the ma-cc-pV5Z-DK basis set for oxygen) and to nonrelativistic all-electron calculations ( ... cc-pV5Z basis set for oxygen). As compared to NCBS-DK results, all ECP calculations perform worse than def2-TZVP all-electron relativistic ...

  9. Design of convolutional tornado code

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in a burst-erasure environment, and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which improves burst-erasure protection by applying the convolution property to the tTN code, and reduces computational complexity by eliminating the multi-level structure. Simulation results show that the cTN code provides better packet-loss protection with lower computational complexity than the tTN code.
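
    The elementary operation underlying tornado-style erasure protection is XOR parity: any single erased packet equals the XOR of the surviving packets in its check group. A deliberately simplified Python sketch of that one ingredient (not an implementation of the cTN code itself):

    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    def encode_parity(packets):
        """Append one XOR parity packet over equal-length data packets."""
        parity = packets[0]
        for p in packets[1:]:
            parity = xor_bytes(parity, p)
        return packets + [parity]

    def recover(received, lost_index):
        """received: packet list with None at the single erased position."""
        acc = None
        for i, p in enumerate(received):
            if i == lost_index:
                continue
            acc = p if acc is None else xor_bytes(acc, p)
        return acc

    coded = encode_parity([b"abcd", b"efgh", b"ijkl"])
    coded[1] = None                  # simulate an erasure
    print(recover(coded, 1))         # -> b'efgh'

    Tornado and cTN codes layer many such sparse check constraints (and, in the convolutional case, spread them over time) so that bursts of losses remain recoverable.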

  10. Three-dimensional turbopump flowfield analysis

    NASA Technical Reports Server (NTRS)

    Sharma, O. P.; Belford, K. A.; Ni, R. H.

    1992-01-01

    A program was conducted to develop a flow prediction method applicable to rocket turbopumps. The complex nature of a flowfield in turbopumps is described and examples of flowfields are discussed to illustrate that physics based models and analytical calculation procedures based on computational fluid dynamics (CFD) are needed to develop reliable design procedures for turbopumps. A CFD code developed at NASA ARC was used as the base code. The turbulence model and boundary conditions in the base code were modified, respectively, to: (1) compute transitional flows and account for extra rates of strain, e.g., rotation; and (2) compute surface heat transfer coefficients and allow computation through multistage turbomachines. Benchmark quality data from two and three-dimensional cascades were used to verify the code. The predictive capabilities of the present CFD code were demonstrated by computing the flow through a radial impeller and a multistage axial flow turbine. Results of the program indicate that the present code operated in a two-dimensional mode is a cost effective alternative to full three-dimensional calculations, and that it permits realistic predictions of unsteady loadings and losses for multistage machines.

  11. High altitude chemically reacting gas particle mixtures. Volume 3: Computer code user's and applications manual. [rocket nozzle and orbital plume flow fields

    NASA Technical Reports Server (NTRS)

    Smith, S. D.

    1984-01-01

    A users manual for the RAMP2 computer code is provided. The RAMP2 code can be used to model the dominant phenomena which affect the prediction of liquid and solid rocket nozzle and orbital plume flow fields. The general structure and operation of RAMP2 are discussed. A user input/output guide for the modified TRAN72 computer code and the RAMP2F code is given. The application and use of the BLIMPJ module are considered. Sample problems involving the space shuttle main engine and motor are included.

  12. Development of numerical methods for overset grids with applications for the integrated Space Shuttle vehicle

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    1995-01-01

    Algorithms and computer code developments were performed for the overset grid approach to solving computational fluid dynamics problems. The techniques developed are applicable to compressible Navier-Stokes flow for any general complex configurations. The computer codes developed were tested on different complex configurations with the Space Shuttle launch vehicle configuration as the primary test bed. General, efficient and user-friendly codes were produced for grid generation, flow solution and force and moment computation.

  13. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improvements to the numerical linear algebra routines used in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR, are reported. A fast distance-calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  14. ISSYS: An integrated synergistic Synthesis System

    NASA Technical Reports Server (NTRS)

    Dovi, A. R.

    1980-01-01

    Integrated Synergistic Synthesis System (ISSYS), an integrated system of computer codes in which the sequence of program execution and data flow is controlled by the user, is discussed. The commands available to exert such control, the ISSYS major function and rules, and the computer codes currently available in the system are described. Computational sequences frequently used in the aircraft structural analysis and synthesis are defined. External computer codes utilized by the ISSYS system are documented. A bibliography on the programs is included.

  15. User's manual for a two-dimensional, ground-water flow code on the Octopus computer network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.

    1978-08-30

    A ground-water hydrology computer code, programmed by R.L. Taylor (in Proc. American Society of Civil Engineers, Journal of Hydraulics Division, 93(HY2), pp. 25-33 (1967)), has been adapted to the Octopus computer system at Lawrence Livermore Laboratory. Using an example problem, this manual details the input, output, and execution options of the code.

  16. Interactive Synthesis of Code Level Security Rules

    DTIC Science & Technology

    2017-04-01

    Interactive Synthesis of Code-Level Security Rules. A Thesis Presented by Leo St. Amour to The Department of Computer Science in partial fulfillment... of the requirements for the degree of Master of Science in Computer Science, Northeastern University, Boston, Massachusetts, April 2017. DISTRIBUTION... Abstract of the Thesis: Interactive Synthesis of Code-Level Security Rules, by Leo St. Amour, Master of Science in Computer Science, Northeastern University

  17. Agricultural Spraying

    NASA Technical Reports Server (NTRS)

    1986-01-01

    AGDISP, a computer code written for Langley by Continuum Dynamics, Inc., aids crop-dusting airplanes in targeting pesticides. The code is commercially available and can be run on a personal computer by an inexperienced operator. Called SWA+H, it is used by the Forest Service, the FAA, and DuPont, among others. DuPont uses the code to "test" equipment on the computer, using a laser system to measure particle characteristics of various spray compounds.

  18. The adaption and use of research codes for performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebetrau, A.M.

    1987-05-01

    Models of real-world phenomena are developed for many reasons. The models are usually, if not always, implemented in the form of a computer code. The characteristics of a code are determined largely by its intended use. Realizations or implementations of detailed mathematical models of complex physical and/or chemical processes are often referred to as research or scientific (RS) codes. Research codes typically require large amounts of computing time. One example of an RS code is a finite-element code for solving complex systems of differential equations that describe mass transfer through some geologic medium. Considerable computing time is required because computations are done at many points in time and/or space. Codes used to evaluate the overall performance of real-world physical systems are called performance assessment (PA) codes. Performance assessment codes are used to conduct simulated experiments involving systems that cannot be directly observed. Thus, PA codes usually involve repeated simulations of system performance in situations that preclude the use of conventional experimental and statistical methods. 3 figs.

  19. Topological color codes on Union Jack lattices: a stable implementation of the whole Clifford group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katzgraber, Helmut G.; Theoretische Physik, ETH Zurich, CH-8093 Zurich; Bombin, H.

    We study the error threshold of topological color codes on Union Jack lattices that allow for the full implementation of the whole Clifford group of quantum gates. After mapping the error-correction process onto a statistical mechanical random three-body Ising model on a Union Jack lattice, we compute its phase diagram in the temperature-disorder plane using Monte Carlo simulations. Surprisingly, topological color codes on Union Jack lattices have a similar error stability to color codes on triangular lattices, as well as to the Kitaev toric code. The enhanced computational capabilities of the topological color codes on Union Jack lattices with respect to triangular lattices and the toric code, combined with the inherent robustness of this implementation, show good prospects for future stable quantum computer implementations.

  20. Accurate Modeling of Ionospheric Electromagnetic Fields Generated by a Low-Altitude VLF Transmitter

    DTIC Science & Technology

    2007-08-31

    Figure-list excerpt: fields versus latitude for three different grid spacings; low- and high-altitude fields produced by 10-kHz and 20-kHz sources, computed using the FD and TD codes. The agreement between the two codes is excellent, validating the new FD code.

  1. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a highspeed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.

  2. Analysis of airborne antenna systems using geometrical theory of diffraction and moment method computer codes

    NASA Technical Reports Server (NTRS)

    Hartenstein, Richard G., Jr.

    1985-01-01

    Computer codes have been developed to analyze antennas on aircraft and in the presence of scatterers. The purpose of this study is to use these codes to develop accurate computer models of various aircraft and antenna systems. The antenna systems analyzed are a P-3B L-band antenna, an A-7E UHF relay pod antenna, and a traffic advisory antenna system installed on a Bell Long Ranger helicopter. Computed results are compared to measured ones, with good agreement. These codes can be used in the design stage of an antenna system to determine the optimum antenna location and save valuable time and costly flight hours.

  3. THC-MP: High performance numerical simulation of reactive transport and multiphase flow in porous media

    NASA Astrophysics Data System (ADS)

    Wei, Xiaohui; Li, Weishan; Tian, Hailong; Li, Hongliang; Xu, Haixiao; Xu, Tianfu

    2015-07-01

    The numerical simulation of multiphase flow and reactive transport in porous media for complex subsurface problems is a computationally intensive application. To meet these increasing computational requirements, this paper presents a parallel computing method and architecture. Derived from TOUGHREACT, a well-established code for simulating subsurface multiphase flow and reactive transport problems, we developed THC-MP, a high-performance code for massively parallel computers that greatly extends the computational capability of the original code. The domain decomposition method was applied to the coupled numerical computing procedure in THC-MP. We designed the distributed data structure, implemented the data initialization and exchange between the computing nodes, and implemented the core solving module using a hybrid parallel iterative and direct solver. Numerical accuracy of THC-MP was verified on a CO2 injection-induced reactive transport problem by comparing the results obtained from the parallel computation with those from sequential computation (the original code). Execution efficiency and code scalability were examined through field-scale carbon sequestration applications on a multicore cluster. The results demonstrate the enhanced performance achieved with THC-MP on parallel computing facilities.
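
    The domain-decomposition step described above reduces, at each iteration, to exchanging boundary (ghost) cells between neighboring subdomains. The toy mpi4py sketch below shows that communication pattern in one dimension; it illustrates the general technique only and is not code from THC-MP.

        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
        local = np.full(10, float(rank))          # interior cells of this subdomain
        left, right = np.empty(1), np.empty(1)    # ghost cells to fill

        reqs = []
        if rank > 0:                              # exchange with left neighbour
            reqs += [comm.Isend(local[:1], dest=rank - 1),
                     comm.Irecv(left, source=rank - 1)]
        if rank < size - 1:                       # exchange with right neighbour
            reqs += [comm.Isend(local[-1:], dest=rank + 1),
                     comm.Irecv(right, source=rank + 1)]
        MPI.Request.Waitall(reqs)                 # ghost cells now hold neighbour data

    Run with, for example, "mpirun -n 4 python halo.py"; a production solver would do this exchange in three dimensions every linear-solver iteration.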

  4. Code of Ethical Conduct for Computer-Using Educators: An ICCE Policy Statement.

    ERIC Educational Resources Information Center

    Computing Teacher, 1987

    1987-01-01

    Prepared by the International Council for Computers in Education's Ethics and Equity Committee, this code of ethics for educators using computers covers nine main areas: curriculum issues, issues relating to computer access, privacy/confidentiality issues, teacher-related issues, student issues, the community, school organizational issues,…

  5. Embedding Secure Coding Instruction into the IDE: Complementing Early and Intermediate CS Courses with ESIDE

    ERIC Educational Resources Information Center

    Whitney, Michael; Lipford, Heather Richter; Chu, Bill; Thomas, Tyler

    2018-01-01

    Many of the software security vulnerabilities that people face today can be remediated through secure coding practices. A critical step toward the practice of secure coding is ensuring that our computing students are educated on these practices. We argue that secure coding education needs to be included across a computing curriculum. We are…

  6. Calculation of water drop trajectories to and about arbitrary three-dimensional lifting and nonlifting bodies in potential airflow

    NASA Technical Reports Server (NTRS)

    Norment, H. G.

    1985-01-01

    Subsonic, external flow about nonlifting bodies, lifting bodies or combinations of lifting and nonlifting bodies is calculated by a modified version of the Hess lifting code. Trajectory calculations can be performed for any atmospheric conditions and for all water drop sizes, from the smallest cloud droplet to large raindrops. Experimental water drop drag relations are used in the water drop equations of motion and effects of gravity settling are included. Inlet flow can be accommodated, and high Mach number compressibility effects are corrected for approximately. Seven codes are described: (1) a code used to debug and plot body surface description data; (2) a code that processes the body surface data to yield the potential flow field; (3) a code that computes flow velocities at arrays of points in space; (4) a code that computes water drop trajectories from an array of points in space; (5) a code that computes water drop trajectories and fluxes to arbitrary target points; (6) a code that computes water drop trajectories tangent to the body; and (7) a code that produces stereo pair plots which include both the body and trajectories. Accuracy of the calculations is discussed, and trajectory calculation results are compared with prior calculations and with experimental data.
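
    The trajectory codes described above integrate a water drop equation of motion through the computed flow field. The sketch below does the same for a single drop under Stokes drag and gravity; the uniform free stream, drop size, and drag law are illustrative stand-ins for the report's panel-code flow field and experimental drag relations.

        import numpy as np
        from scipy.integrate import solve_ivp

        rho_w, g = 1000.0, 9.81            # water density (kg/m^3), gravity (m/s^2)
        d = 100e-6                         # drop diameter: 100 microns
        mass = rho_w * np.pi * d**3 / 6
        mu = 1.8e-5                        # dynamic viscosity of air (Pa s)

        def air_velocity(x):
            # stand-in for the potential-flow field about the body
            return np.array([30.0, 0.0])

        def rhs(t, y):
            x, v = y[:2], y[2:]
            drag = 3 * np.pi * mu * d * (air_velocity(x) - v)   # Stokes drag
            return np.concatenate([v, drag / mass + np.array([0.0, -g])])

        sol = solve_ivp(rhs, (0.0, 0.5), [0.0, 0.0, 30.0, -1.0], max_step=1e-3)
        print(sol.y[:2, -1])               # drop position after 0.5 s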

  7. Debugging Techniques Used by Experienced Programmers to Debug Their Own Code.

    DTIC Science & Technology

    1990-09-01

    Keywords: code debugging, computer programmers, debugging, programming. ...Davis, and Schultz (1987) also compared experts and novices, but focused on the way a computer program is represented cognitively and how that... of theories in the emerging computer programming domain (Fisher, 1987). In protocol analysis, subjects are asked to talk/think aloud as they solve

  8. A COTS-Based Replacement Strategy for Aging Avionics Computers

    DTIC Science & Technology

    2001-12-01

    Diagram-label excerpt (Communication Control Unit): a COTS microprocessor and real-time operating system host new native code objects and threads alongside legacy functions in a virtual component environment, with context-switch thunks bridging native and legacy code.

  9. PARAVT: Parallel Voronoi tessellation code

    NASA Astrophysics Data System (ADS)

    González, R. E.

    2016-10-01

    In this study, we present a new open source code for massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbor lists are widely used. Several serial Voronoi tessellation codes exist; however, no open source, parallel implementation has been available to handle the large numbers of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI, and the VT is computed using the Qhull library. The domain decomposition accounts for consistent boundary computation between tasks and includes periodic conditions. In addition, the code computes the neighbor list, Voronoi density, Voronoi cell volume, and density gradient for each particle, as well as densities on a regular grid. The code implementation and user guide are publicly available at https://github.com/regonzar/paravt.
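
    A serial analogue of the core computation is easy to write with scipy, whose Voronoi wrapper uses the same Qhull library; PARAVT's contribution is distributing this over MPI tasks with consistent boundaries. The cell-volume and density steps look roughly as follows (all names here are illustrative):

        import numpy as np
        from scipy.spatial import Voronoi, ConvexHull

        rng = np.random.default_rng(1)
        points = rng.random((500, 3))             # stand-in for particle positions
        vor = Voronoi(points)                     # Qhull under the hood

        def cell_volume(vor, i):
            region = vor.regions[vor.point_region[i]]
            if -1 in region or not region:        # cell extends to infinity
                return np.inf
            return ConvexHull(vor.vertices[region]).volume

        volumes = np.array([cell_volume(vor, i) for i in range(len(points))])
        density = 1.0 / volumes                   # Voronoi density per particle
        print(np.median(density[np.isfinite(density)]))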

  10. The STAGS computer code

    NASA Technical Reports Server (NTRS)

    Almroth, B. O.; Brogan, F. A.

    1978-01-01

    Basic information about the computer code STAGS (Structural Analysis of General Shells) is presented to describe to potential users the scope of the code and the solution procedures that are incorporated. Primarily, STAGS is intended for analysis of shell structures, although it has been extended to more complex shell configurations through the inclusion of springs and beam elements. The formulation is based on a variational approach in combination with local two dimensional power series representations of the displacement components. The computer code includes options for analysis of linear or nonlinear static stress, stability, vibrations, and transient response. Material as well as geometric nonlinearities are included. A few examples of applications of the code are presented for further illustration of its scope.

  11. Holonomic surface codes for fault-tolerant quantum computation

    NASA Astrophysics Data System (ADS)

    Zhang, Jiang; Devitt, Simon J.; You, J. Q.; Nori, Franco

    2018-02-01

    Surface codes can protect quantum information stored in qubits from local errors as long as the per-operation error rate is below a certain threshold. Here we propose holonomic surface codes by harnessing the quantum holonomy of the system. In our scheme, the holonomic gates are built via auxiliary qubits rather than the auxiliary levels in multilevel systems used in conventional holonomic quantum computation. The key advantage of our approach is that the auxiliary qubits are in their ground state before and after each gate operation, so they are not involved in the operation cycles of surface codes. This provides an advantageous way to implement surface codes for fault-tolerant quantum computation.

  12. Comparison of two- and three-dimensional flow computations with laser anemometer measurements in a transonic compressor rotor

    NASA Technical Reports Server (NTRS)

    Chima, R. V.; Strazisar, A. J.

    1982-01-01

    Two- and three-dimensional inviscid solutions for the flow in a transonic axial compressor rotor at design speed are compared with probe and laser anemometer measurements at near-stall and maximum-flow operating points. Experimental details of the laser anemometer system and computational details of the two-dimensional axisymmetric code and three-dimensional Euler code are described. Comparisons are made between relative Mach number and flow angle contours, shock location, and shock strength. A procedure for using an efficient axisymmetric code to generate downstream pressure input for computationally expensive Euler codes is discussed. A film supplement shows the calculations for the two operating points with the time-marching Euler code.

  13. Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution

    NASA Astrophysics Data System (ADS)

    Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi

    2015-05-01

    In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for simulation of neutron/gamma pulse height distributions is reported. Since liquid organic scintillators like NE-213 are well suited and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector was selected for simulation in the present study. The proposed simulation algorithm includes four main steps. The first step is the modeling of neutron/gamma particle transport and the particles' interactions with the materials in the environment and detector volume. In the second step, the number of scintillation photons due to charged particles such as electrons, alphas, protons and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and lightguide is simulated. Finally, the resolution corresponding to the experiment is applied in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed code is applicable to both neutron and gamma sources; hence, discrimination of neutrons and gammas in mixed fields may be performed using MCNPX-ESUT. The main feature of MCNPX-ESUT is that the neutron/gamma pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to a monoenergetic neutron/gamma source in an NE-213 detector are simulated using MCNPX-ESUT. The simulated neutron pulse height distributions are validated by comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 664 (2012) 304-309) and with results from similar computer codes such as SCINFUL, NRESP7 and Geant4. The simulated gamma pulse height distribution for a 137Cs source is also compared with experimental data.
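
    Of the four steps above, the final resolution step is the easiest to illustrate: ideal light output is smeared with an energy-dependent Gaussian. The sketch below uses a common dL/L parametrization; the constants and the exponential stand-in for the light-output spectrum are placeholders, not NE-213 values from the paper.

        import numpy as np

        rng = np.random.default_rng(2)
        light = rng.exponential(0.5, size=100_000)   # stand-in light output (MeVee)

        def fwhm(L, a=0.1, b=0.1, c=0.01):
            # common resolution model: dL/L = sqrt(a^2 + b^2/L + c^2/L^2)
            return L * np.sqrt(a**2 + b**2 / L + c**2 / L**2)

        pulse_height = rng.normal(light, fwhm(light) / 2.355)  # FWHM -> sigma
        hist, edges = np.histogram(pulse_height, bins=200, range=(0.0, 4.0))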

  14. EAC: A program for the error analysis of STAGS results for plates

    NASA Technical Reports Server (NTRS)

    Sistla, Rajaram; Thurston, Gaylen A.; Bains, Nancy Jane C.

    1989-01-01

    A computer code is now available for estimating the error in results from the STAGS finite element code for a shell unit consisting of a rectangular orthotropic plate. This memorandum contains basic information about the computer code EAC (Error Analysis and Correction) and describes the connection between the input data for the STAGS shell units and the input data necessary to run the error analysis code. The STAGS code returns a set of nodal displacements and a discrete set of stress resultants; the EAC code returns a continuous solution for displacements and stress resultants. The continuous solution is defined by a set of generalized coordinates computed in EAC. The theory and the assumptions that determine the continuous solution are also outlined in this memorandum. An example application of the code is presented, and instructions for its usage on the Cyber and VAX machines are provided.

  15. CFD Modeling of Free-Piston Stirling Engines

    NASA Technical Reports Server (NTRS)

    Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.

    2001-01-01

    NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.

  16. On the error statistics of Viterbi decoding and the performance of concatenated codes

    NASA Technical Reports Server (NTRS)

    Miller, R. L.; Deutsch, L. J.; Butman, S. A.

    1981-01-01

    Computer simulation results are presented on the performance of convolutional codes of constraint lengths 7 and 10 concatenated with the (255, 223) Reed-Solomon code (a proposed NASA standard). These results indicate that as much as 0.8 dB can be gained by concatenating this Reed-Solomon code with a (10, 1/3) convolutional code, instead of the (7, 1/2) code currently used by the DSN. A mathematical model of Viterbi decoder burst-error statistics is developed and is validated through additional computer simulations.

  17. New double-byte error-correcting codes for memory systems

    NASA Technical Reports Server (NTRS)

    Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.

    1996-01-01

    Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.

  18. SOURCELESS STARTUP. A MACHINE CODE FOR COMPUTING LOW-SOURCE REACTOR STARTUPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacMillan, D.B.

    1960-06-01

    A revision to the sourceless start-up code is presented. The code solves a system of differential equations encountered in computing the probability distribution of activity at an observed power level during reactor start-up from a very low source level. (J.R.D.)

  19. Computer-assisted coding and clinical documentation: first things first.

    PubMed

    Tully, Melinda; Carmichael, Angela

    2012-10-01

    Computer-assisted coding tools have the potential to drive improvements in seven areas: transparency of coding; productivity (generally by 20 to 25 percent for inpatient claims); accuracy (by improving specificity of documentation); cost containment (by reducing overtime expenses, audit fees, and denials); compliance; efficiency; and consistency.

  20. Climate Literacy Through Student-Teacher-Scientist Research Partnerships

    NASA Astrophysics Data System (ADS)

    Niepold, F.; Brooks, D.; Lefer, B.; Linsley, A.; Duckenfield, K.

    2006-12-01

    Expanding on the GLOBE Program's Atmosphere and Aerosol investigations, high school students can conduct Earth System scientific research that promotes scientific literacy in both content and the science process. Through the use of Student-Teacher-Scientist partnerships, Earth system scientific investigations can be conducted that serve the needs of the classroom as well as participating scientific investigators. During the proof-of-concept phase of this partnership model, teachers and their students developed science plans, through consultation with scientists, and began collecting atmospheric and aerosol data in support of the Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS) campaign in Houston, Texas. This effort uses some pre-existing GLOBE materials, but draws on a variety of other resources to tailor the teacher development activities and intended student participation in a way that addresses local and regional problems. Students and teachers have learned about best practices in scientific inquiry, and they have also helped to expand the pipeline of potential future scientists and researchers for industry, academia, and government. This work began with a Student-Teacher-Scientist partnership started in 2002 during a GLOBE Aerosol Protocol Cross-Ground Validation of AERONET with MODIS Satellite Aerosol Measurements. Several other GLOBE schools, both national and international, have contributed to this research. The current project supported the intensive GoMACCS air quality and atmospheric dynamics field campaign during September and October of 2006. This model will be evaluated for wider use in other project-focused partnerships led by NOAA's Climate Program Office.

  1. Performance evaluation of CESM in simulating the dust cycle

    NASA Astrophysics Data System (ADS)

    Parajuli, S. P.; Yang, Z. L.; Kocurek, G.; Lawrence, D. M.

    2014-12-01

    Mineral dust in the atmosphere has implications for Earth's radiation budget, biogeochemical cycles, hydrological cycles, human health and visibility. Mineral dust is injected into the atmosphere during dust storms, when surface winds are sufficiently strong and land surface conditions are favorable. Dust storms are very common in specific regions of the world, including the Middle East and North Africa (MENA) region, which contains more than 50% of the global dust sources. In this work, we present simulations of the dust cycle within the framework of CESM1.2.2 and evaluate how well the model captures the spatio-temporal characteristics of dust sources, transport and deposition at the global scale, especially in dust source regions. We conducted our simulations using two existing erodibility maps (geomorphic and topographic) and a new erodibility map based on the correlation between observed wind and dust. We compare the simulated results with MODIS satellite data, MACC reanalysis data, and AERONET station data. Comparison with the MODIS satellite data and MACC reanalysis data shows that all three erodibility maps generally reproduce the spatio-temporal characteristics of dust optical depth globally. However, comparison with the AERONET station data shows that the simulated dust optical depth is generally overestimated for all erodibility maps. Results vary greatly by region and by the scale of the observational data. Our results also show that simulations forced by reanalysis meteorology capture the overall dust cycle more realistically than simulations using online meteorology.

  2. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.

    1987-01-01

    The Hypercube Matrix Computation (Year 1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms would differ, thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency domain method of moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second code was a time domain finite difference solution of Maxwell's equations for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate the performance of the two codes. First, the problem size possible on the hypercube, with 128 megabytes of memory in a 32-node configuration, was compared with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed for the computational speedup attained by the parallel architecture.
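
    The speedup measure used in such evaluations is simple enough to state in a few lines; the timings below are invented for illustration only.

        def speedup(t_serial, t_parallel, nodes):
            s = t_serial / t_parallel
            return s, s / nodes                   # speedup, parallel efficiency

        t1 = 512.0                                # hypothetical 1-node runtime (s)
        for n, t in [(8, 70.2), (32, 19.6)]:      # hypothetical parallel runtimes
            s, e = speedup(t1, t, n)
            print(f"{n:3d} nodes: speedup {s:5.1f}, efficiency {e:5.1%}")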

  3. Bistatic radar cross section of a perfectly conducting rhombus-shaped flat plate

    NASA Astrophysics Data System (ADS)

    Fenn, Alan J.

    1990-05-01

    The bistatic radar cross section of a perfectly conducting flat plate that has a rhombus shape (equilateral parallelogram) is investigated. The Ohio State University electromagnetic surface patch code (ESP version 4) is used to compute the theoretical bistatic radar cross section of a 35- x 27-in rhombus plate at 1.3 GHz over the bistatic angles 15 deg to 142 deg. The ESP-4 computer code is a method of moments FORTRAN-77 program which can analyze general configurations of plates and wires. This code has been installed and modified at Lincoln Laboratory on a SUN 3 computer network. Details of the code modifications are described. Comparisons of the method of moments simulations and measurements of the rhombus plate are made. It is shown that the ESP-4 computer code provides a high degree of accuracy in the calculation of copolarized and cross-polarized bistatic radar cross section patterns.

  4. ASR4: A computer code for fitting and processing 4-gage anelastic strain recovery data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warpinski, N.R.

    A computer code for analyzing four-gage Anelastic Strain Recovery (ASR) data has been modified for use on a personal computer. This code fits the viscoelastic model of Warpinski and Teufel to measured ASR data, calculates the stress orientation directly, and computes stress magnitudes if sufficient input data are available. The code also calculates the stress orientation using strain-rosette equations, and it calculates stress magnitudes using Blanton's approach, again assuming sufficient input data are available. The program is written in FORTRAN, compiled with Ryan-McFarland Version 2.4. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 5 refs., 3 figs.
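
    The fitting step is a standard nonlinear least-squares problem. The sketch below fits a generic single-exponential strain-recovery curve with scipy; the actual Warpinski-Teufel viscoelastic model has a different functional form, so this is only a shape-level illustration with invented data.

        import numpy as np
        from scipy.optimize import curve_fit

        def recovery(t, eps_inf, tau):
            # generic anelastic recovery toward an asymptote eps_inf
            return eps_inf * (1.0 - np.exp(-t / tau))

        t = np.linspace(0.0, 48.0, 25)                  # hours since coring
        rng = np.random.default_rng(4)
        strain = recovery(t, 120.0, 10.0) + rng.normal(0.0, 2.0, t.size)  # microstrain

        (eps_inf, tau), _ = curve_fit(recovery, t, strain, p0=(100.0, 5.0))
        print(eps_inf, tau)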

  5. Navier-Stokes Simulation of Homogeneous Turbulence on the CYBER 205

    NASA Technical Reports Server (NTRS)

    Wu, C. T.; Ferziger, J. H.; Chapman, D. R.; Rogallo, R. S.

    1984-01-01

    A computer code which solves the Navier-Stokes equations for three-dimensional, time-dependent, homogeneous turbulence has been written for the CYBER 205. The code has options for both 64-bit and 32-bit arithmetic. With 32-bit computation, mesh sizes up to 64^3 are contained within core of a 2-million-word (64-bit) memory. Computer speed timing runs were made for various vector lengths up to 6144. With this code, speeds a little over 100 Mflops have been achieved on a 2-pipe CYBER 205. Several problems encountered in the coding are discussed.

  6. The investigation of tethered satellite system dynamics

    NASA Technical Reports Server (NTRS)

    Lorenzini, E.

    1985-01-01

    The tether control law for retrieving the satellite was modified to give a smooth retrieval trajectory that minimizes thruster activation. The satellite thrusters were added to the rotational dynamics computer code, and a preliminary control logic was implemented to simulate them during the retrieval maneuver. The high-resolution computer code for modelling the three-dimensional dynamics of an untensioned tether, SLACK3, was made fully operative, and a set of computer simulations of possible tether breakages was run. The distribution of the electric field around an electrodynamic tether in vacuo, severed at some length from the shuttle, was computed with a three-dimensional electrodynamic computer code.

  7. Experimental and computational surface and flow-field results for an all-body hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Lockman, William K.; Lawrence, Scott L.; Cleary, Joseph W.

    1990-01-01

    The objective of the present investigation is to establish a benchmark experimental data base for a generic hypersonic vehicle shape for validation and/or calibration of advanced computational fluid dynamics computer codes. This paper includes results from the comprehensive test program conducted in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel for a generic all-body hypersonic aircraft model. Experimental and computational results on flow visualization, surface pressures, surface convective heat transfer, and pitot-pressure flow-field surveys are presented. Comparisons of the experimental results with computational results from an upwind parabolized Navier-Stokes code developed at Ames demonstrate the capabilities of this code.

  8. Computer search for binary cyclic UEP codes of odd length up to 65

    NASA Technical Reports Server (NTRS)

    Lin, Mao-Chao; Lin, Chi-Chang; Lin, Shu

    1990-01-01

    Using an exhaustive computation, the unequal error protection capabilities of all binary cyclic codes of odd length up to 65 that have minimum distances at least 3 are found. For those codes that can only have upper bounds on their unequal error protection capabilities computed, an analytic method developed by Dynkin and Togonidze (1976) is used to show that the upper bounds meet the exact unequal error protection capabilities.

  9. A Combinatorial Geometry Computer Description of the MEP-021A Generator Set

    DTIC Science & Technology

    1979-02-01

    Keywords: generator computer description, gasoline generator, GIFT, MEP-021A. This... GIFT code is also stored on magnetic tape for future vulnerability analysis. ...the Geometric Information for Targets (GIFT) computer code. The GIFT code traces shotlines through a COM-GEOM description from any specified attack

  10. Optimizing a liquid propellant rocket engine with an automated combustor design code (AUTOCOM)

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Reichel, R. H.; Jones, R. T.; Glatt, C. R.

    1972-01-01

    A procedure for automatically designing a liquid propellant rocket engine combustion chamber in an optimal fashion is outlined. The procedure is contained in a digital computer code, AUTOCOM. The code is applied to an existing engine, and design modifications are generated which provide a substantial potential payload improvement over the existing design. Computer time requirements for this payload improvement were small, approximately four minutes on the CDC 6600 computer.

  11. Unaligned instruction relocation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.

    In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.

  12. Unaligned instruction relocation

    DOEpatents

    Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.; Sura, Zehra N.

    2018-01-23

    In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.
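
    At its core, each linking pass patches symbolic references once section addresses are known. The toy below shows that single patching step; it is a drastic simplification of the two-pass aligned/unaligned scheme the patent claims, with invented instruction and symbol names.

        def link(section, symbol_table):
            # resolve symbolic targets to addresses; numeric targets pass through
            return [(op, symbol_table.get(target, target)) for op, target in section]

        accel_code = [("call", "host_helper"), ("jump", 0x40)]
        symbols = {"host_helper": 0x1000}          # hypothetical section layout
        print(link(accel_code, symbols))           # [('call', 4096), ('jump', 64)]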

  13. Computer algorithm for coding gain

    NASA Technical Reports Server (NTRS)

    Dodd, E. E.

    1974-01-01

    Development of a computer algorithm for coding gain for use in an automated communications link design system. Using an empirical formula which defines coding gain as used in space communications engineering, an algorithm is constructed on the basis of available performance data for nonsystematic convolutional encoding with soft-decision (eight-level) Viterbi decoding.
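
    Coding gain here is the reduction in required Eb/N0, in dB, at a fixed bit-error rate. One simple way an automated link-design tool can evaluate it is to interpolate tabulated performance data, as sketched below; the BER tables are invented for illustration.

        import numpy as np

        ebn0 = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])    # dB
        ber_uncoded = np.array([3.8e-2, 2.3e-2, 1.3e-2, 6.0e-3, 2.4e-3, 7.7e-4])
        ber_coded = np.array([1.0e-2, 1.5e-3, 1.0e-4, 4.0e-6, 8.0e-8, 1.0e-9])

        def required_ebn0(target, ber):
            # interpolate Eb/N0 against log10(BER), which is nearly linear
            return np.interp(np.log10(target), np.log10(ber[::-1]), ebn0[::-1])

        target = 2e-3
        gain = required_ebn0(target, ber_uncoded) - required_ebn0(target, ber_coded)
        print(f"coding gain at BER {target:g}: {gain:.1f} dB")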

  14. Influence of Northeast Monsoon cold surges on air quality in Southeast Asia

    NASA Astrophysics Data System (ADS)

    Ashfold, M. J.; Latif, M. T.; Samah, A. A.; Mead, M. I.; Harris, N. R. P.

    2017-10-01

    Ozone (O3) is an important ground-level pollutant. O3 levels and emissions of O3 precursors have increased significantly over recent decades in East Asia, and export of this O3 eastward across the Pacific Ocean is well documented. Here we show that East Asian O3 is also transported southward to tropical Southeast (SE) Asia during the Northeast Monsoon (NEM) season (defined as November to February), and that this transport pathway is especially strong during 'cold surges'. Our analysis employs reanalysis data and measurements from surface sites in Peninsular Malaysia, both covering 2003-2012, along with trajectory calculations. Using a cold surge index (northerly winds at 925 hPa averaged over 105-110°E, 5°N) to define sub-seasonal strengthening of the NEM winds, we find the largest changes in a region covering much of the Indochinese Peninsula and surrounding seas. Here, the levels of O3 and another key pollutant, carbon monoxide, calculated by the Monitoring Atmospheric Composition and Climate (MACC) Reanalysis are on average elevated by, respectively, >40% (∼15 ppb) and >60% (∼80 ppb) during cold surges. Further, in the broader region of SE Asia, local afternoon exceedances of the World Health Organization's air quality guideline for O3 (100 μg m-3, or ∼50 ppb, averaged over 8 h) largely occur during these cold surges. Day-to-day variations in available O3 observations at surface sites on the east coast of Peninsular Malaysia and in corresponding parts of the MACC Reanalysis are similar, and are clearly linked to cold surges. However, observed O3 levels are typically ∼10-20 ppb lower than in the MACC Reanalysis. We show that these observations are also subject to influence from local urban pollution. In agreement with past work, we find year-to-year variations in cold surge activity related to the El Niño-Southern Oscillation (ENSO), but this does not appear to be the dominant influence of ENSO on atmospheric composition in this region. Overall, our study indicates that the influence of East Asian pollution on air quality in SE Asia during the NEM could be at least as large as the corresponding, well-studied springtime influence on North America. Both an enhanced regional observational capability and chemical modelling studies will be required to fully untangle the importance of this long-range influence relative to local processes.
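
    The cold surge index defined above is straightforward to compute from reanalysis winds. The sketch below assumes an ERA-style netCDF file with a meridional wind variable named "v"; the file name, variable names, and the 90th-percentile surge threshold are all assumptions for illustration.

        import xarray as xr

        ds = xr.open_dataset("era_winds.nc")           # hypothetical reanalysis file
        v925 = ds["v"].sel(level=925, latitude=5.0, method="nearest")
        # sign-flip so winds from the north (northerlies) are positive
        index = -v925.sel(longitude=slice(105, 110)).mean("longitude")
        surges = index.where(index > index.quantile(0.9), drop=True)  # strongest days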

  15. Postoperative Outcomes in Obstructive Sleep Apnea Patients Undergoing Cardiac Surgery: A Systematic Review and Meta-analysis of Comparative Studies.

    PubMed

    Nagappa, Mahesh; Ho, George; Patra, Jayadeep; Wong, Jean; Singh, Mandeep; Kaw, Roop; Cheng, Davy; Chung, Frances

    2017-12-01

    Obstructive sleep apnea (OSA) is a common comorbidity in patients undergoing cardiac surgery and may predispose patients to postoperative complications. The purpose of this meta-analysis is to determine the evidence for postoperative complications associated with OSA in patients undergoing cardiac surgery. A literature search of Cochrane Database of Systematic Reviews, Medline, Medline In-process, Web of Science, Scopus, EMBASE, Cochrane Central Register of Controlled Trials, and CINAHL until October 2016 was performed. The search was constrained to studies in adult cardiac surgical patients with diagnosed or suspected OSA. All included studies had to report at least 1 postoperative complication. The primary outcome is major adverse cardiac or cerebrovascular events (MACCEs) up to 30 days after surgery, which includes death from all-cause mortality, myocardial infarction, myocardial injury, nonfatal cardiac arrest, revascularization, pulmonary embolism, deep venous thrombosis, newly documented postoperative atrial fibrillation (POAF), stroke, and congestive heart failure. The secondary outcome is newly documented POAF. The other exploratory outcomes include the following: (1) postoperative tracheal intubation and mechanical ventilation; (2) infection and/or sepsis; (3) unplanned intensive care unit (ICU) admission; and (4) duration of stay in hospital and ICU. Meta-analysis and meta-regression were conducted using Cochrane Review Manager 5.3 (Cochrane, London, UK) and OpenBUGS v3.0, respectively. Eleven comparative studies were included (n = 1801 patients; OSA versus non-OSA: 688 vs 1113, respectively). The odds of MACCEs were higher in OSA versus non-OSA patients (31% vs 10.6%; odds ratio [OR], 2.4; 95% confidence interval [CI], 1.38-4.2; P = .002), as were the odds of newly documented POAF (31% vs 21%; OR, 1.94; 95% CI, 1.13-3.33; P = .02). Even though postoperative tracheal intubation and mechanical ventilation were significantly more frequent in OSA patients (13% vs 5.4%; OR, 2.67; 95% CI, 1.03-6.89; P = .04), the lengths of ICU and hospital stay were not significantly prolonged in patients with OSA compared to non-OSA. The majority of OSA patients were not treated with continuous positive airway pressure therapy. Meta-regression and sensitivity analysis of the subgroups did not alter the ORs of postoperative complications for OSA versus non-OSA groups. Our meta-analysis demonstrates that after cardiac surgery, the odds of MACCEs and newly documented POAF are substantially higher in OSA than in non-OSA patients.
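
    For readers who want the arithmetic behind such estimates, the sketch below computes a single study's odds ratio with a 95% Woolf (log-normal) confidence interval; the 2x2 counts are invented and will not reproduce the pooled meta-analytic OR of 2.4 reported above.

        import math

        def odds_ratio_ci(a, b, c, d, z=1.96):
            # a/b: events/non-events with OSA; c/d: events/non-events without
            or_ = (a * d) / (b * c)
            se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
            lo = math.exp(math.log(or_) - z * se)
            hi = math.exp(math.log(or_) + z * se)
            return or_, (lo, hi)

        print(odds_ratio_ci(213, 475, 118, 995))    # hypothetical 2x2 counts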

  16. Assimilation of atmospheric methane products into the MACC-II system: from SCIAMACHY to TANSO and IASI

    NASA Astrophysics Data System (ADS)

    Massart, S.; Agusti-Panareda, A.; Aben, I.; Butz, A.; Chevallier, F.; Crevosier, C.; Engelen, R.; Frankenberg, C.; Hasekamp, O.

    2014-06-01

    The Monitoring Atmospheric Composition and Climate Interim Implementation (MACC-II) delayed-mode (DM) system has been producing an atmospheric methane (CH4) analysis 6 months behind real time since June 2009. This analysis used to rely on the assimilation of the CH4 product from the SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY) instrument onboard Envisat. Recently, the Laboratoire de Météorologie Dynamique (LMD) CH4 products from the Infrared Atmospheric Sounding Interferometer (IASI) and the SRON Netherlands Institute for Space Research CH4 products from the Thermal And Near-infrared Sensor for carbon Observation (TANSO) were added to the DM system. With the loss of Envisat in April 2012, the DM system now has to rely on the assimilation of methane data from TANSO and IASI. This paper documents the impact of this change in the observing system on the methane tropospheric analysis. It is based on four experiments: one free run and three analyses from, respectively, the assimilation of SCIAMACHY, TANSO, and a combination of TANSO and IASI CH4 products in the MACC-II system. The period between December 2010 and April 2012 is studied. The SCIAMACHY experiment globally underestimates tropospheric methane by 35 parts per billion (ppb) compared to the HIAPER Pole-to-Pole Observations (HIPPO) data and by 28 ppb compared to the Total Carbon Column Observing Network (TCCON) data, while the free run presents underestimations of 5 ppb and 1 ppb against the same HIPPO and TCCON data, respectively. The assimilated TANSO product changed in October 2011 from version v.1 to version v.2.0. The analysis of version v.1 globally underestimates tropospheric methane by 18 ppb compared to the HIPPO data and by 15 ppb compared to the TCCON data. In contrast, the analysis of version v.2.0 globally overestimates the column by 3 ppb. When the high-density IASI data are added in the tropical region between 30° N and 30° S, their impact is mainly positive, but more pronounced and effective when combined with version v.2.0 of the TANSO products. The resulting analysis globally underestimates the column-averaged dry-air mole fraction of methane (xCH4) by just under 1 ppb on average compared to the TCCON data, whereas in the tropics it overestimates xCH4 by about 3 ppb. The random error is estimated to be less than 7 ppb when compared to TCCON data.

  17. Efficacy of multiple arterial coronary bypass grafting in patients with diabetes mellitus.

    PubMed

    Yamaguchi, Atsushi; Kimura, Naoyuki; Itoh, Satoshi; Adachi, Koichi; Yuri, Koichi; Okamura, Homare; Adachi, Hideo

    2016-09-01

    Use of the left internal mammary artery in patients with diabetes mellitus and multivessel coronary artery disease is known to improve survival after coronary artery bypass grafting (CABG); however, the survival benefit of multiple arterial grafts (MAGs) in diabetic patients is debated. We investigated the efficacy of CABG performed with MAGs in diabetic patients. The overall patient group comprised 2618 consecutive patients who underwent isolated CABG at our hospital between 1990 and 2014. Perioperative characteristics, in-hospital outcomes and long-term outcomes were compared between diabetic (n = 1110) and non-diabetic patients (n = 1508). The long-term outcomes of diabetic and non-diabetic patients were analysed between those who received a single arterial graft (SAG) and those who received MAGs. Both full unmatched patient population and propensity-matched patient population analyses (diabetic cohort = 431 pairs, non-diabetic cohort = 577 pairs) were performed. Preoperative comorbidities were much more common in the diabetic patients than in the non-diabetic patients; however, comorbidities were not associated with in-hospital outcomes (diabetes versus non-diabetes group, in-hospital mortality: 2.2 vs 1.5%; deep sternal wound infection: 2.2 vs 1.8%, P > 0.05). Although survival and freedom from major cardiac and cerebrovascular events (MACCEs) at 15 years were lower in the diabetes group than in the non-diabetes group (survival: 48.6 vs 55.0%, P = 0.019; MACCE-free survival: 40.8 vs 46.1%, P = 0.02), cardiac death-free survival at 15 years was similar (81.7 vs 83.9%, P = 0.24). Overall, 12-year survival was higher in both diabetic and non-diabetic patients treated with MAGs than in those treated with an SAG (64.9 vs 56.8%, P = 0.006, and 71.9 vs 60.5%, P < 0.001). Propensity-matched patient cohort analysis revealed improved 12-year survival with MAGs versus SAG in both the diabetes group (64.9 vs 58.8%, P = 0.041) and non-diabetes group (71.4 vs 63.8%, P = 0.014). Similarly, MACCE-free survival was improved in both groups. A long-term survival advantage, with no increase in perioperative morbidity, is conferred with the use of multiple arterial bypass grafts not only in non-diabetic patients but also in diabetic patients.

  18. Plasma CX3CL1 levels and long term outcomes of patients with atrial fibrillation: the West Birmingham Atrial Fibrillation Project.

    PubMed

    Guo, Yutao; Apostalakis, Stavros; Blann, Andrew D; Lip, Gregory Y H

    2014-01-01

    There is growing evidence that chemokines are potentially important mediators of the pathogenesis of atherosclerotic disease. Major atherothrombotic complications, such as stroke and myocardial infarction, are common among atrial fibrillation (AF) patients. This increase in risk of adverse events may be predicted by a score based on the presence of certain clinical features of chronic heart failure, hypertension, age 75 years or greater, diabetes and stroke (the CHADS2 score). Our objective was to assess the prognostic value of the plasma chemokines CCL2, CXCL4 and CX3CL1, and their relationship with the CHADS2 score, in AF patients. Plasma CCL2, CXCL4 and CX3CL1 were measured in 441 patients (59% male, mean age 75 years, 12% paroxysmal, 99% on warfarin) with AF. Baseline clinical and demographic factors were used to define each subject's CHADS2 score. Patients were followed up for a mean of 2.1 years, and major adverse cardiovascular and cerebrovascular events (MACCE) were sought, being the combination of cardiovascular death, acute coronary events, stroke and systemic embolism. Fifty-five of the AF patients suffered a MACCE (6% per year). Those in the lowest CX3CL1 quartile (≤ 0.24 ng/ml) had the fewest MACCE (p = 0.02). In the Cox regression analysis, CX3CL1 levels >0.24 ng/ml (hazard ratio 2.8, 95% CI 1.02-8.2, p = 0.045) and age (p = 0.042) were independently linked with adverse outcomes. CX3CL1 levels rose directly with the CHADS2 risk score (p = 0.009). The addition of CX3CL1 did not significantly increase the discriminatory ability of the CHADS2 clinical factor-based risk stratification (c-index 0.60 for CHADS2 alone versus 0.67 for CHADS2 plus CX3CL1 >0.24 ng/ml, p = 0.1). Aspirin use was associated with lower levels of CX3CL1 (p = 0.0002) and diabetes with higher levels (p = 0.031). There was no association between CXCL4 and CCL2 plasma levels and outcomes. There is an independent association between low plasma CX3CL1 levels and low risk of major cardiovascular events in AF patients, as well as a linear association between CX3CL1 plasma levels and CHADS2-defined cardiovascular risk. The potential for CX3CL1 in refining risk stratification in AF patients merits consideration.

  19. On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
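
    The metamodeling loop the paper reviews can be compressed into a few lines: sample the expensive analysis at a designed set of points, fit a cheap surrogate, and query the surrogate thereafter. The sketch below uses a radial-basis-function surrogate as a stand-in for the kriging and response-surface options discussed; the "expensive" function is invented.

        import numpy as np
        from scipy.interpolate import RBFInterpolator

        def expensive_analysis(x):                   # stand-in for a CFD/FEA code
            return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1] ** 2)

        rng = np.random.default_rng(3)
        X = rng.uniform(-1.0, 1.0, size=(40, 2))     # design of experiments
        surrogate = RBFInterpolator(X, expensive_analysis(X))

        X_test = rng.uniform(-1.0, 1.0, size=(5, 2))
        print(surrogate(X_test) - expensive_analysis(X_test))   # approximation error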

  20. Design and optimization of a portable LQCD Monte Carlo code using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Calore, Enrico; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele

    The present panorama of HPC architectures is extremely heterogeneous, ranging from traditional multi-core CPU processors, supporting a wide class of applications but delivering moderate computing performance, to many-core Graphics Processing Units (GPUs), exploiting aggressive data-parallelism and delivering higher performance for streaming computing applications. In this scenario, code portability (and performance portability) becomes necessary for easy maintainability of applications; this is very relevant in scientific computing, where code changes are very frequent, making it tedious and error-prone to keep different code versions aligned. In this work, we present the design and optimization of a state-of-the-art production-level LQCD Monte Carlo application, using the directive-based OpenACC programming model. OpenACC abstracts parallel programming to a descriptive level, relieving programmers from specifying how codes should be mapped onto the target architecture. We describe the implementation of a code fully written in OpenACC, and show that we are able to target several different architectures, including state-of-the-art traditional CPUs and GPUs, with the same code. We also measure performance, evaluating the computing efficiency of our OpenACC code on several architectures, comparing with GPU-specific implementations and showing that a good level of performance portability can be reached.

  1. Development of a thermal and structural analysis procedure for cooled radial turbines

    NASA Technical Reports Server (NTRS)

    Kumar, Ganesh N.; Deanna, Russell G.

    1988-01-01

    A procedure for computing the rotor temperature and stress distributions in a cooled radial turbine is considered. Existing codes for modeling the external mainstream flow and the internal cooling flow are used to compute boundary conditions for the heat transfer and stress analyses. An inviscid, quasi-three-dimensional code computes the external free-stream velocity. The external velocity is then used in a boundary layer analysis to compute the external heat transfer coefficients. Coolant temperatures are computed by a viscous, one-dimensional internal flow code that solves the momentum and energy equations. These boundary conditions are input to a three-dimensional heat conduction code for calculation of rotor temperatures. The rotor stress distribution may be determined for the given thermal, pressure and centrifugal loading. The procedure is applied to a cooled radial turbine which will be tested at the NASA Lewis Research Center. Representative results from this case are included.

  2. COMPUTATION OF GLOBAL PHOTOCHEMISTRY WITH SMVGEAR II (R823186)

    EPA Science Inventory

    A computer model was developed to simulate global gas-phase photochemistry. The model solves chemical equations with SMVGEAR II, a sparse-matrix, vectorized Gear-type code. To obtain SMVGEAR II, the original SMVGEAR code was modified to allow computation of different sets of chem...

  3. Computational strategies for three-dimensional flow simulations on distributed computer systems. Ph.D. Thesis Semiannual Status Report, 15 Aug. 1993 - 15 Feb. 1994

    NASA Technical Reports Server (NTRS)

    Weed, Richard Allen; Sankar, L. N.

    1994-01-01

    An increasing amount of research activity in computational fluid dynamics has been devoted to the development of efficient algorithms for parallel computing systems. The increasing performance-to-price ratio of engineering workstations has led to research into procedures for implementing a parallel computing system composed of distributed workstations. This thesis proposal outlines an ongoing research program to develop efficient strategies for performing three-dimensional flow analysis on distributed computing systems. The PVM parallel programming interface was used to modify an existing three-dimensional flow solver, the TEAM code developed by Lockheed for the Air Force, to function as a parallel flow solver on clusters of workstations. Steady flow solutions were generated for three different wing and body geometries to validate the code and evaluate code performance. The proposed research will extend the parallel code development to determine the most efficient strategies for unsteady flow simulations.

  4. Computerized systems analysis and optimization of aircraft engine performance, weight, and life cycle costs

    NASA Technical Reports Server (NTRS)

    Fishbach, L. H.

    1979-01-01

    The computational techniques utilized to determine the optimum propulsion systems for future aircraft applications and to identify system tradeoffs and technology requirements are described. The characteristics and use of the following computer codes are discussed: (1) NNEP - a very general cycle analysis code that can assemble an arbitrary matrix of fans, turbines, ducts, shafts, etc., into a complete gas turbine engine and compute on- and off-design thermodynamic performance; (2) WATE - a preliminary design procedure for calculating engine weight using the component characteristics determined by NNEP; (3) POD DRG - a table look-up program to calculate wave and friction drag of nacelles; (4) LIFCYC - a computer code developed to calculate life cycle costs of engines based on the output from WATE; and (5) INSTAL - a computer code developed to calculate installation effects, inlet performance and inlet weight. Examples are given to illustrate how these computer techniques can be applied to analyze and optimize propulsion system fuel consumption, weight, and cost for representative types of aircraft and missions.

  5. A Combinatorial Geometry Computer Description of the M9 ACE (Armored Combat Earthmover) Vehicle

    DTIC Science & Technology

    1984-12-01

    program requires as input the M9 target descriptions as processed by the Geometric Information for Targets (GIFT) computer code. The first step is...model of the target. This COM-GEOM target description is used as input to the Geometric Information For Targets (GIFT) computer code. Among other...things, the GIFT code traces shotlines through a COM-GEOM description from any specified aspect, listing pertinent information about each component hit

  6. Characterizing the Properties of a Woven SiC/SiC Composite Using W-CEMCAN Computer Code

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; DiCarlo, James A.

    1999-01-01

    A micromechanics-based computer code to predict the thermal and mechanical properties of woven ceramic matrix composites (CMC) is developed. This computer code, W-CEMCAN (Woven CEramic Matrix Composites ANalyzer), predicts the properties of two-dimensional woven CMC at any temperature and takes into account various constituent geometries and volume fractions. The code is used to predict the thermal and mechanical properties of an advanced CMC composed of 0/90 five-harness (5 HS) Sylramic fiber which had been chemically vapor infiltrated (CVI) with boron nitride (BN) and SiC interphase coatings and melt-infiltrated (MI) with SiC. The predictions, based on the bulk constituent properties from the literature, are compared with measured experimental data. Based on the comparison, improved or calibrated properties for the constituent materials are then developed for use by material developers/designers. The computer code is then used to predict the properties of a composite with the same constituents but with different fiber volume fractions. The predictions are compared with measured data and good agreement is achieved.

  7. Fault tolerant computing: A preamble for assuring viability of large computer systems

    NASA Technical Reports Server (NTRS)

    Lim, R. S.

    1977-01-01

    The need for fault-tolerant computing is addressed from the viewpoints of (1) why it is needed, (2) how to apply it in the current state of technology, and (3) what it means in the context of the Phoenix computer system and other related systems. To this end, the value of concurrent error detection and correction is described. User protection, program retry, and repair are among the factors considered. The technology of algebraic codes to protect memory systems and of arithmetic codes to protect arithmetic operations is discussed.
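
    As a generic illustration of an algebraic code protecting memory (a textbook example, not the Phoenix design), the Hamming(7,4) code below encodes four data bits with three parity bits; a nonzero syndrome locates and corrects any single-bit error.

      import numpy as np

      # Generator and parity-check matrices of the Hamming(7,4) code
      G = np.array([[1,0,0,0,1,1,0],
                    [0,1,0,0,1,0,1],
                    [0,0,1,0,0,1,1],
                    [0,0,0,1,1,1,1]])
      H = np.array([[1,1,0,1,1,0,0],
                    [1,0,1,1,0,1,0],
                    [0,1,1,1,0,0,1]])

      def encode(data4):                    # 4 data bits -> 7-bit codeword
          return (data4 @ G) % 2

      def correct(word7):                   # fix at most one flipped bit
          syndrome = (H @ word7) % 2
          if syndrome.any():                # syndrome matches a column of H
              pos = np.where((H.T == syndrome).all(axis=1))[0][0]
              word7 = word7.copy()
              word7[pos] ^= 1
          return word7

      cw = encode(np.array([1, 0, 1, 1]))
      cw[2] ^= 1                            # simulate a single memory fault
      assert (correct(cw) == encode(np.array([1, 0, 1, 1]))).all()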

  8. The Advanced Software Development and Commercialization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallopoulos, E.; Canfield, T.R.; Minkoff, M.

    1990-09-01

    This is the first of a series of reports pertaining to progress in the Advanced Software Development and Commercialization Project, a joint collaborative effort between the Center for Supercomputing Research and Development of the University of Illinois and the Computing and Telecommunications Division of Argonne National Laboratory. The purpose of this work is to apply techniques of parallel computing that were pioneered by University of Illinois researchers to mature computational fluid dynamics (CFD) and structural dynamics (SD) computer codes developed at Argonne. The collaboration in this project will bring this unique combination of expertise to bear, for the first time, on industrially important problems. By so doing, it will expose the strengths and weaknesses of existing techniques for parallelizing programs and will identify those problems that need to be solved in order to enable widespread production use of parallel computers. Secondly, the increased efficiency of the CFD and SD codes themselves will enable the simulation of larger, more accurate engineering models that involve fluid and structural dynamics. In order to realize the above two goals, we are considering two production codes that have been developed at ANL and are widely used by both industry and universities: COMMIX and WHAMS-3D. The first is a computational fluid dynamics code that is used for both nuclear reactor design and safety and as a design tool for the casting industry. The second is a three-dimensional structural dynamics code used in nuclear reactor safety as well as crashworthiness studies. These codes are currently available for sequential and vector computers only. Our main goal is to port and optimize these two codes on shared memory multiprocessors. In so doing, we shall establish a process that can be followed in optimizing other sequential or vector engineering codes for parallel processors.

  9. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  10. Poetry in Programs: A Brief Examination of Software Aesthetics, Including Observations on the History of Programming Styles and Speculations on Post-object Programming

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2004-01-01

    This viewgraph presentation provides samples of computer code which have characteristics of poetic verse, and addresses the theoretical underpinnings of artistic coding, as well as how computer language influences software style, and the possible style of future coding.

  11. Solution of the lossy nonlinear Tricomi equation with application to sonic boom focusing

    NASA Astrophysics Data System (ADS)

    Salamone, Joseph A., III

    Sonic boom focusing theory has been augmented with new terms that account for mean flow effects in the direction of propagation and for atmospheric absorption/dispersion due to molecular relaxation of oxygen and nitrogen. The newly derived model equation was implemented numerically in a computer code. The code was validated numerically using a spectral solution for nonlinear propagation of a sinusoid through a lossy homogeneous medium, and an additional numerical check was performed to verify the linear diffraction component of the calculations. The code was validated experimentally using measured sonic boom focusing data from the NASA-sponsored Superboom Caustic Analysis and Measurement Program (SCAMP) flight test. The computed results were in good agreement with both the numerical and experimental validation cases. The newly developed code was applied to examine the focusing of a NASA low-boom demonstration vehicle concept, and the resulting pressure field was calculated for several supersonic climb profiles. The shaping efforts designed into the signatures were still somewhat evident despite the effects of sonic boom focusing.

  12. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    NASA Astrophysics Data System (ADS)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.

    2017-02-01

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code, but this could require extensive investment, such as rewriting in modern languages and adopting new data constructs, and would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is the culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach, and up to a 50% improvement in total simulated time with the second, for the demonstration cases and target HPC systems employed.

  13. Visual Computing Environment Workshop

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles (Compiler)

    1998-01-01

    The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.

  14. Force user's manual: A portable, parallel FORTRAN

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.; Benten, Muhammad S.; Arenstorf, Norbert S.; Ramanan, Aruna V.

    1990-01-01

    The use of Force, a parallel, portable FORTRAN, on shared-memory parallel computers is described. Force simplifies writing code for parallel computers and, once the parallel code is written, it is easily ported to computers on which Force is installed. Although Force is nearly the same for all computers, specific details are included for the Cray-2, Cray Y-MP, Convex 220, Flex/32, Encore, Sequent, and Alliant computers on which it is installed.

  15. Monte Carlo simulation of Ising models by multispin coding on a vector computer

    NASA Astrophysics Data System (ADS)

    Wansleben, Stephan; Zabolitzky, John G.; Kalle, Claus

    1984-11-01

    Rebbi's efficient multispin coding algorithm for Ising models is combined with the use of the vector computer CDC Cyber 205. A speed of 21.2 million updates per second is reached. This is comparable to that obtained by special-purpose computers.
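
    The essence of multispin coding is packing one spin from each of many independent systems into a single machine word, so that one Boolean operation acts on all of them at once. A toy numpy sketch of the idea (ours, not the Cyber 205 code): each 64-bit word holds one site from 64 independent one-dimensional Ising chains, and a single XOR evaluates a bond in all 64 chains simultaneously.

      import numpy as np

      rng = np.random.default_rng(0)
      L = 128                               # sites per chain (periodic)
      # Bit k of spins[i] is the 0/1 spin at site i of chain k
      spins = rng.integers(0, 2**64, size=L, dtype=np.uint64)

      # Bit k of bonds[i] is 1 iff sites i and i+1 of chain k are
      # antiparallel -- one XOR evaluates the bond in all 64 chains
      bonds = spins ^ np.roll(spins, -1)

      # Unpack per-chain counts of antiparallel bonds
      shifts = np.arange(64, dtype=np.uint64)
      anti = np.zeros(64, dtype=np.uint64)
      for word in bonds:
          anti += (word >> shifts) & np.uint64(1)

      # Ising energy per chain (J = 1): E = -(parallel - antiparallel)
      energy = -(L - 2 * anti.astype(np.int64))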

  16. Thrust chamber performance using Navier-Stokes solution. [space shuttle main engine viscous nozzle calculation

    NASA Technical Reports Server (NTRS)

    Chan, J. S.; Freeman, J. A.

    1984-01-01

    The viscous, axisymmetric flow in the thrust chamber of the space shuttle main engine (SSME) was computed on the CRAY 205 computer using the general interpolants method (GIM) code. Results show that Navier-Stokes codes can be used for these flows to study trends and viscous effects as well as to determine flow patterns, but further research and development is needed before they can be used as production tools for nozzle performance calculations. The GIM formulation, numerical scheme, and computer code are described. The actual SSME nozzle computation, showing grid points, flow contours, and flow parameter plots, is discussed. The computer system and run times/costs are detailed.

  17. Finite difference time domain electromagnetic scattering from frequency-dependent lossy materials

    NASA Technical Reports Server (NTRS)

    Luebbers, Raymond J.; Beggs, John H.

    1991-01-01

    Four different FDTD computer codes and companion Radar Cross Section (RCS) conversion codes on magnetic media are submitted. A single three-dimensional dispersive FDTD code for both dispersive dielectric and magnetic materials was developed, along with a user's manual; FDTD was thereby extended to more complicated materials. The code is efficient and is capable of modeling interesting radar targets using a modest computer workstation platform. RCS results for two different plate geometries are reported. The FDTD method was also extended to computing far-zone time-domain results in two dimensions, and the capability to model nonlinear materials was incorporated into FDTD and validated.

  18. Multitasking the code ARC3D. [for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Barton, John T.; Hsiung, Christopher C.

    1986-01-01

    The CRAY multitasking system was developed in order to utilize all four processors and sharply reduce the wall-clock run time. This paper describes the techniques used to modify the computational fluid dynamics code ARC3D for this run and analyzes the achieved speedup. The ARC3D code solves either the Euler or thin-layer N-S equations using an implicit approximate-factorization scheme. Results indicate that multitask processing can be used to achieve wall-clock speedup factors of over three, depending on the nature of the program code being used. Multitasking appears to be particularly advantageous for large-memory problems running on multiple-CPU computers.

  19. The joint methane profiles retrieval approach from GOSAT TIR and SWIR spectra

    NASA Astrophysics Data System (ADS)

    Zadvornykh, Ilya V.; Gribanov, Konstantin G.; Zakharov, Vyacheslav I.; Imasu, Ryoichi

    2017-11-01

    In this paper we present a method, using methane as an example, which allows more accurate retrieval of greenhouse gases in the Earth's atmosphere. Using the new version of the FIRE-ARMS software, supplemented with the VLIDORT vector radiative transfer model, we carried out joint methane retrieval from TIR (Thermal Infrared Range) and SWIR (Short-Wavelength Infrared Range) GOSAT spectra using the optimal estimation method. MACC reanalysis data from the European Centre for Medium-Range Weather Forecasts (ECMWF), supplemented by data from aircraft measurements of the HIPPO experiment, were used as a statistical ensemble.

  20. Addressing the challenges of standalone multi-core simulations in molecular dynamics

    NASA Astrophysics Data System (ADS)

    Ocaya, R. O.; Terblans, J. J.

    2017-07-01

    Computational modelling in material science involves mathematical abstractions of force fields between particles, with the aim to postulate, develop and understand materials by simulation. The aggregated pairwise interactions of the material's particles lead to a deduction of its macroscopic behaviours. For practically meaningful macroscopic scales, a large amount of data is generated, leading to vast execution times. Simulation times of hours, days or weeks for moderately sized problems are not uncommon. The reduction of simulation times, improved result accuracy and the associated software and hardware engineering challenges are the main motivations for much of the ongoing research in the computational sciences. This contribution is concerned mainly with simulations that can be done on a "standalone" computer: Message Passing Interface (MPI) based parallel code running on hardware platforms with wide specifications, such as single/multi-processor, multi-core machines, with minimal reconfiguration for upward scaling of computational power. The widely available, documented and standardized MPI library provides this functionality through the MPI_Comm_size(), MPI_Comm_rank() and MPI_Reduce() functions. A survey of the literature shows that relatively little has been written on efficiently extracting the inherent computational power of a cluster. In this work, we discuss the main avenues available to tap into this extra power without compromising computational accuracy, and we present methods to overcome the high inertia encountered in single-node computational molecular dynamics. We begin by surveying the current state of the art and discuss what it takes to achieve parallelism, efficiency and enhanced computational accuracy through program threads and message passing interfaces. Several code illustrations are given. The pros and cons of writing raw code as opposed to using heuristic, third-party code are also discussed, as is the growing trend towards graphical processor units and virtual computing clouds for high-performance computing. Finally, we present comparative results of vacancy-formation-energy calculations using our own parallelized standalone code, Verlet-Stormer velocity (VSV), operating on 30,000 copper atoms. The code is based on the Sutton-Chen implementation of the Finnis-Sinclair pairwise embedded-atom potential. A link to the code is also given.
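
    The three MPI calls named above are the core of a simple work split: every rank computes a share of the pairwise interactions and the partial sums are combined on rank 0. A minimal mpi4py sketch of that pattern (our illustration, with a toy pair potential standing in for Sutton-Chen):

      # run with, e.g.: mpiexec -n 4 python pair_energy.py
      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()               # MPI_Comm_rank
      size = comm.Get_size()               # MPI_Comm_size

      # Every rank generates the same coordinates (same seed)
      rng = np.random.default_rng(42)
      pos = rng.random((1000, 3))

      # Cyclically assign the outer loop over atoms to ranks
      local = 0.0
      for i in range(rank, len(pos), size):
          r = np.linalg.norm(pos[i + 1:] - pos[i], axis=1)
          local += np.sum(r**-12 - r**-6)  # toy pair sum, each pair once

      # Combine the partial sums on rank 0 (MPI_Reduce)
      total = comm.reduce(local, op=MPI.SUM, root=0)
      if rank == 0:
          print("total pair energy:", total)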

  1. Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing

    DTIC Science & Technology

    2008-01-01

    complements of one another and the DNA duplex formed is a Watson-Crick (WC) duplex. However, there are many instances when the formation of non-WC...that the user's requirements for probe selection are met based on the Watson-Crick probe locality within a target. The second type, called...AFRL-RI-RS-TR-2007-288 Final Technical Report January 2008 SUPERIMPOSED CODE THEORETIC ANALYSIS OF DNA CODES AND DNA COMPUTING

  2. Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D

    NASA Technical Reports Server (NTRS)

    Carle, Alan; Fagan, Mike; Green, Lawrence L.

    1998-01-01

    This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.

  3. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

  4. Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes

    NASA Technical Reports Server (NTRS)

    DeWitt, Kenneth; Garg, Vijay; Ameri, Ali

    2005-01-01

    The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects; and validating the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable the design of more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.

  5. Development of a 3-D upwind PNS code for chemically reacting hypersonic flowfields

    NASA Technical Reports Server (NTRS)

    Tannehill, J. C.; Wadawadigi, G.

    1992-01-01

    Two new parabolized Navier-Stokes (PNS) codes were developed to compute the three-dimensional, viscous, chemically reacting flow of air around hypersonic vehicles such as the National Aero-Space Plane (NASP). The first code (TONIC) solves the gas dynamic and species conservation equations in a fully coupled manner using an implicit, approximately-factored, central-difference algorithm. This code was upgraded to include shock fitting and the capability of computing the flow around complex body shapes. The revised TONIC code was validated by computing the chemically-reacting (M(sub infinity) = 25.3) flow around a 10 deg half-angle cone at various angles of attack and the Ames All-Body model at 0 deg angle of attack. The results of these calculations were in good agreement with the results from the UPS code. One of the major drawbacks of the TONIC code is that the central-differencing of fluxes across interior flowfield discontinuities tends to introduce errors into the solution in the form of local flow property oscillations. The second code (UPS), originally developed for a perfect gas, has been extended to permit either perfect gas, equilibrium air, or nonequilibrium air computations. The code solves the PNS equations using a finite-volume, upwind TVD method based on Roe's approximate Riemann solver that was modified to account for real gas effects. The dissipation term associated with this algorithm is sufficiently adaptive to flow conditions that, even when attempting to capture very strong shock waves, no additional smoothing is required. For nonequilibrium calculations, the code solves the fluid dynamic and species continuity equations in a loosely-coupled manner. This code was used to calculate the hypersonic, laminar flow of chemically reacting air over cones at various angles of attack. In addition, the flow around the McDonnell Douglas generic option blended-wing-body was computed and comparisons were made between the perfect gas, equilibrium air, and nonequilibrium air results.

  6. Linear chirp phase perturbing approach for finding binary phased codes

    NASA Astrophysics Data System (ADS)

    Li, Bing C.

    2017-05-01

    Binary phased codes have many applications in communication and radar systems. These applications require binary phased codes with low sidelobes in order to reduce interference and false detections. Barker codes satisfy these requirements and have the lowest maximum sidelobes; however, Barker codes have very limited code lengths (equal to or less than 13), while many applications, including low-probability-of-intercept radar and spread-spectrum communication, require much longer codes. The conventional techniques for finding binary phased codes in the literature include exhaustive search, neural networks, and evolutionary methods, all of which require very expensive computation for large code lengths. These techniques are therefore limited to finding binary phased codes with small code lengths (less than 100). In this paper, by analyzing Barker code, linear chirp, and P3 phases, we propose a new approach to finding binary codes. Experiments show that the proposed method is able to find long, low-sidelobe binary phased codes (code length > 500) at reasonable computational cost.
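
    The figure of merit here is the peak aperiodic autocorrelation sidelobe. A short check (ours, not the paper's search code) confirms that the length-13 Barker code has a mainlobe of 13 and peak sidelobe magnitude of only 1:

      import numpy as np

      def sidelobes(code):
          """Aperiodic autocorrelation: return (mainlobe, peak sidelobe)."""
          c = np.asarray(code, dtype=float)
          acf = np.correlate(c, c, mode="full")
          main = acf[len(c) - 1]                       # zero-lag peak
          peak_sl = np.abs(np.delete(acf, len(c) - 1)).max()
          return main, peak_sl

      barker13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]
      print(sidelobes(barker13))                       # -> (13.0, 1.0)

    A search such as the one proposed in the paper scores each candidate with exactly this kind of sidelobe evaluation, which is why exhaustive search becomes prohibitive as the code length grows.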

  7. Development of Reduced-Order Models for Aeroelastic and Flutter Prediction Using the CFL3Dv6.0 Code

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Bartels, Robert E.

    2002-01-01

    A reduced-order model (ROM) is developed for aeroelastic analysis using the CFL3D version 6.0 computational fluid dynamics (CFD) code, recently developed at the NASA Langley Research Center. This latest version of the flow solver includes a deforming mesh capability, a modal structural definition for nonlinear aeroelastic analyses, and a parallelization capability that provides a significant increase in computational efficiency. Flutter results for the AGARD 445.6 Wing computed using CFL3D v6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are then computed using the CFL3Dv6 code and transformed into state-space form. Important numerical issues associated with the computation of the impulse responses are presented. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is used to rapidly compute aeroelastic transients including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly.
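
    One standard route from sampled impulse responses to a state-space model -- a sketch of the general technique, not necessarily the exact procedure used with CFL3Dv6.0 -- is the eigensystem realization algorithm (ERA): stack the Markov parameters into Hankel matrices, truncate the SVD, and read off (A, B, C).

      import numpy as np

      def era(markov, order, rows=20, cols=20):
          """Discrete-time (A, B, C) with markov[k] ~= C @ A^k @ B (SISO)."""
          H0 = np.array([[markov[i + j] for j in range(cols)] for i in range(rows)])
          H1 = np.array([[markov[i + j + 1] for j in range(cols)] for i in range(rows)])
          U, s, Vt = np.linalg.svd(H0)
          Ur, Vr = U[:, :order], Vt[:order, :]
          Sr = np.diag(np.sqrt(s[:order]))
          Si = np.linalg.inv(Sr)
          A = Si @ Ur.T @ H1 @ Vr.T @ Si   # shifted Hankel gives the dynamics
          B = (Sr @ Vr)[:, :1]
          C = (Ur @ Sr)[:1, :]
          return A, B, C

      # Toy impulse response: a damped oscillation (an exact order-2 system)
      k = np.arange(60)
      y = 0.9**k * np.sin(0.5 * k + 0.3)
      A, B, C = era(y, order=2)
      y_hat = [(C @ np.linalg.matrix_power(A, i) @ B).item() for i in range(40)]
      # y_hat reproduces y to numerical precision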

  8. Radiant Energy Measurements from a Scaled Jet Engine Axisymmetric Exhaust Nozzle for a Baseline Code Validation Case

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    1994-01-01

    A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.

  9. Manual for obscuration code with space station applications

    NASA Technical Reports Server (NTRS)

    Marhefka, R. J.; Takacs, L.

    1986-01-01

    The Obscuration Code, referred to as SHADOW, is a user-oriented computer code to determine the shadow cast by an antenna in a complex environment onto the far-zone sphere. The surrounding structure can be composed of multiple composite cone frustums and multiply-sided flat plates. These structural pieces are ideal for modeling space station configurations. The means of describing the geometry input is compatible with the NEC-BASIC Scattering Code. In addition, an interactive mode of operation has been provided for DEC VAX computers. The first part of this document is a user's manual designed to give a description of the method used to obtain the shadow map, to provide an overall view of the operation of the computer code, to instruct a user in how to model structures, and to give examples of inputs and outputs. The second part is a code manual that details how to set up the interactive and non-interactive modes of the code and provides a listing and brief description of each of the subroutines.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather-sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.

  11. Progressive fracture of fiber composites

    NASA Technical Reports Server (NTRS)

    Irvin, T. B.; Ginty, C. A.

    1983-01-01

    Refined models and procedures are described for determining progressive composite fracture in graphite/epoxy angleplied laminates. Lewis Research Center capabilities are utilized, including the Real Time Ultrasonic C Scan (RUSCAN) experimental facility and the Composite Durability Structural Analysis (CODSTRAN) computer code. The CODSTRAN computer code is used to predict the fracture progression based on composite mechanics, finite element stress analysis, and fracture criteria modules. The RUSCAN facility, CODSTRAN computer code, and scanning electron microscope are used to determine durability and identify failure mechanisms in graphite/epoxy composites.

  12. Modeling Improvements and Users Manual for Axial-flow Turbine Off-design Computer Code AXOD

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1994-01-01

    An axial-flow turbine off-design performance computer code used for preliminary studies of gas turbine systems was modified and calibrated based on the experimental performance of large aircraft-type turbines. The flow- and loss-model modifications and calibrations are presented in this report. Comparisons are made between computed performances and experimental data for seven turbines over wide ranges of speed and pressure ratio. This report also serves as the users manual for the revised code, which is named AXOD.

  13. Design geometry and design/off-design performance computer codes for compressors and turbines

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1995-01-01

    This report summarizes some NASA Lewis (i.e., government owned) computer codes capable of being used for airbreathing propulsion system studies to determine the design geometry and to predict the design/off-design performance of compressors and turbines. These are not CFD codes; velocity-diagram energy and continuity computations are performed fore and aft of the blade rows using meanline, spanline, or streamline analyses. Losses are provided by empirical methods. Both axial-flow and radial-flow configurations are included.

  14. PerSEUS: Ultra-Low-Power High Performance Computing for Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Doxas, I.; Andreou, A.; Lyon, J.; Angelopoulos, V.; Lu, S.; Pritchett, P. L.

    2017-12-01

    Peta-op SupErcomputing Unconventional System (PerSEUS) aims to explore the use of ultra-low-power, mixed-signal, unconventional computational elements developed by Johns Hopkins University (JHU) for High Performance Scientific Computing (HPC), and to demonstrate that capability on both fluid and particle plasma codes. We will describe the JHU Mixed-signal Unconventional Supercomputing Elements (MUSE), and report initial results for the Lyon-Fedder-Mobarry (LFM) global magnetospheric MHD code and a UCLA general-purpose relativistic Particle-In-Cell (PIC) code.

  15. Multiple grid problems on concurrent-processing computers

    NASA Technical Reports Server (NTRS)

    Eberhardt, D. S.; Baganoff, D.

    1986-01-01

    Three computer codes were studied which make use of concurrent processing computer architectures in computational fluid dynamics (CFD). The three parallel codes were tested on a two processor multiple-instruction/multiple-data (MIMD) facility at NASA Ames Research Center, and are suggested for efficient parallel computations. The first code is a well-known program which makes use of the Beam and Warming, implicit, approximate factored algorithm. This study demonstrates the parallelism found in a well-known scheme and it achieved speedups exceeding 1.9 on the two processor MIMD test facility. The second code studied made use of an embedded grid scheme which is used to solve problems having complex geometries. The particular application for this study considered an airfoil/flap geometry in an incompressible flow. The scheme eliminates some of the inherent difficulties found in adapting approximate factorization techniques onto MIMD machines and allows the use of chaotic relaxation and asynchronous iteration techniques. The third code studied is an application of overset grids to a supersonic blunt body problem. The code addresses the difficulties encountered when using embedded grids on a compressible, and therefore nonlinear, problem. The complex numerical boundary system associated with overset grids is discussed and several boundary schemes are suggested. A boundary scheme based on the method of characteristics achieved the best results.

  16. Binary weight distributions of some Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Arnold, S.

    1992-01-01

    The binary weight distributions of the (7,5) and (15,9) Reed-Solomon (RS) codes and their duals are computed using the MacWilliams identities. Several mappings of symbols to bits are considered and those offering the largest binary minimum distance are found. These results are then used to compute bounds on the soft-decoding performance of these codes in the presence of additive Gaussian noise. These bounds are useful for finding large binary block codes with good performance and for verifying the performance obtained by specific soft-decoding algorithms presently under development.
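
    For reference, the MacWilliams identity relates the weight enumerator of a linear [n, k] code C over GF(q) to that of its dual, so the dual's distribution is obtained without enumerating its codewords (in LaTeX form):

      W_{C^\perp}(x, y) = \frac{1}{|C|} \, W_C\bigl(x + (q-1)\,y, \; x - y\bigr),
      \qquad
      W_C(x, y) = \sum_{c \in C} x^{\,n - \mathrm{wt}(c)} \, y^{\mathrm{wt}(c)}.

    For the binary images of the RS codes considered above, the identity is applied with q = 2.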

  17. A high temperature fatigue life prediction computer code based on the total strain version of StrainRange Partitioning (SRP)

    NASA Technical Reports Server (NTRS)

    Mcgaw, Michael A.; Saltsman, James F.

    1993-01-01

    A recently developed high-temperature fatigue life prediction computer code is presented and an example of its usage given. The code discussed is based on the Total Strain version of Strainrange Partitioning (TS-SRP). Included in this code are procedures for characterizing the creep-fatigue durability behavior of an alloy according to TS-SRP guidelines and predicting cyclic life for complex cycle types for both isothermal and thermomechanical conditions. A reasonably extensive materials properties database is included with the code.

  18. Turbomachinery Heat Transfer and Loss Modeling for 3D Navier-Stokes Codes

    NASA Technical Reports Server (NTRS)

    DeWitt, Kenneth; Ameri, Ali

    2005-01-01

    This report's contents focus on making use of NASA Glenn on-site computational facilities to develop, validate, and apply models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes to enhance the capability to compute heat transfer and losses in turbomachinery.

  19. Real-time computer treatment of THz passive device images with the high image quality

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2012-06-01

    We demonstrate a real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not restricted to one passive THz device: it can be applied to any such device, and to active THz imaging systems as well. We applied our code to computer processing of images captured by four passive THz imaging devices manufactured by different companies; it should be stressed that processing images produced by different companies usually requires different spatial filters. The current version of the code processes more than one image per second for a THz image with more than 5000 pixels and 24-bit number representation, and processing a single THz image produces about 20 output images simultaneously, corresponding to the various spatial filters. The code allows the number of pixels in processed images to be increased without noticeable reduction of image quality, and its performance can be increased many times by using parallel algorithms for processing the image. We developed original spatial filters that allow one to see objects smaller than 2 cm in imagery produced by passive THz devices that captured objects hidden under opaque clothes. For images with high noise, we developed an approach that suppresses the noise during computer processing and yields a good-quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of liquid explosive, ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects and are very promising for security applications.
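
    The spatial filters described above are the authors' own; as a generic stand-in for the kind of noise-suppressing spatial filtering involved, a median filter followed by contrast stretching can be sketched as follows (a scipy-based illustration only, not the code described in the abstract):

      import numpy as np
      from scipy.ndimage import median_filter

      def denoise_thz(frame, kernel=3):
          """Suppress impulsive noise in a THz frame with a median filter,
          then stretch the contrast to the full 8-bit range."""
          smoothed = median_filter(frame.astype(float), size=kernel)
          lo, hi = smoothed.min(), smoothed.max()
          return ((smoothed - lo) / (hi - lo + 1e-12) * 255).astype(np.uint8)

      # frame = np.loadtxt("thz_frame.txt")   # e.g. a >5000-pixel image
      # out = denoise_thz(frame, kernel=5)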

  20. Fingerprinting Communication and Computation on HPC Machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean

    2010-06-02

    How do we identify what is actually running on high-performance computing systems? Names of binaries, dynamic libraries loaded, or other elements in a submission to a batch queue can give clues, but binary names can be changed, and libraries provide limited insight and resolution on the code being run. In this paper, we present a method for "fingerprinting" code running on HPC machines using elements of communication and computation. We then discuss how that fingerprint can be used to determine if the code is consistent with certain other types of codes, with what a user usually runs, or with what the user requested an allocation to do. In some cases, our techniques enable us to fingerprint HPC codes using runtime MPI data with a high degree of accuracy.

  1. Practices in source code sharing in astrophysics

    NASA Astrophysics Data System (ADS)

    Shamir, Lior; Wallin, John F.; Allen, Alice; Berriman, Bruce; Teuben, Peter; Nemiroff, Robert J.; Mink, Jessica; Hanisch, Robert J.; DuPrie, Kimberly

    2013-02-01

    While software and algorithms have become increasingly important in astronomy, the majority of authors who publish computational astronomy research do not share the source code they develop, making it difficult to replicate and reuse the work. In this paper we discuss the importance of sharing scientific source code with the entire astrophysics community, and propose that journals require authors to make their code publicly available when a paper is published. That is, we suggest that a paper that involves a computer program not be accepted for publication unless the source code becomes publicly available. The adoption of such a policy by editors, editorial boards, and reviewers will improve the ability to replicate scientific results, and will also make computational astronomy methods more available to other researchers who wish to apply them to their data.

  2. Development of V/STOL methodology based on a higher order panel method

    NASA Technical Reports Server (NTRS)

    Bhateley, I. C.; Howell, G. A.; Mann, H. W.

    1983-01-01

    The development of a computational technique to predict the complex flowfields of V/STOL aircraft was initiated, in which a number of modules and a potential flow aerodynamic code were combined in a comprehensive computer program. The modules were developed in a building-block approach to assist the user in preparing the geometric input and to compute parameters needed to simulate certain flow phenomena that cannot be handled directly within a potential flow code. The PAN AIR aerodynamic code, which is a higher-order panel method, forms the nucleus of this program. PAN AIR's extensive support for generalized boundary conditions allows the modules to interact with the aerodynamic code through the input and output files, thereby requiring no changes to the basic code and allowing easy replacement of updated modules.

  3. Lattice surgery on the Raussendorf lattice

    NASA Astrophysics Data System (ADS)

    Herr, Daniel; Paler, Alexandru; Devitt, Simon J.; Nori, Franco

    2018-07-01

    Lattice surgery is a method to perform quantum computation fault-tolerantly by using operations on boundary qubits between different patches of the planar code. This technique allows for universal planar code computation without eliminating the intrinsic two-dimensional nearest-neighbor properties of the surface code that eases physical hardware implementations. Lattice surgery approaches to algorithmic compilation and optimization have been demonstrated to be more resource efficient for resource-intensive components of a fault-tolerant algorithm, and consequently may be preferable over braid-based logic. Lattice surgery can be extended to the Raussendorf lattice, providing a measurement-based approach to the surface code. In this paper we describe how lattice surgery can be performed on the Raussendorf lattice and therefore give a viable alternative to computation using braiding in measurement-based implementations of topological codes.

  4. 40 CFR 1033.110 - Emission diagnostics-general requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... engine operation. (d) Record and store in computer memory any diagnostic trouble codes showing a... and understand the diagnostic trouble codes stored in the onboard computer with generic tools and...

  5. Airfoil Vibration Dampers program

    NASA Technical Reports Server (NTRS)

    Cook, Robert M.

    1991-01-01

    The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.

  6. Computer optimization of reactor-thermoelectric space power systems

    NASA Technical Reports Server (NTRS)

    Maag, W. L.; Finnegan, P. M.; Fishbach, L. H.

    1973-01-01

    A computer simulation and optimization code that has been developed for nuclear space power systems is described. The results of using this code to analyze two reactor-thermoelectric systems are presented.

  7. A 3D-CFD code for accurate prediction of fluid flows and fluid forces in seals

    NASA Technical Reports Server (NTRS)

    Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.

    1994-01-01

    Current and future turbomachinery requires advanced seal configurations to control leakage, inhibit mixing of incompatible fluids and to control the rotodynamic response. In recognition of a deficiency in the existing predictive methodology for seals, a seven-year effort was established in 1990 by NASA's Office of Aeronautics Exploration and Technology, under the Earth-to-Orbit Propulsion program, to develop validated Computational Fluid Dynamics (CFD) concepts, codes and analyses for seals. The effort will provide NASA and the U.S. Aerospace Industry with advanced CFD scientific codes and industrial codes for analyzing and designing turbomachinery seals. An advanced 3D CFD cylindrical seal code has been developed, incorporating state-of-the-art computational methodology for flow analysis in straight, tapered and stepped seals. Relevant computational features of the code include: stationary/rotating coordinates, cylindrical and general Body Fitted Coordinates (BFC) systems, high order differencing schemes, colocated variable arrangement, advanced turbulence models, incompressible/compressible flows, and moving grids. This paper presents the current status of code development, code demonstration for predicting rotordynamic coefficients, numerical parametric study of entrance loss coefficients for generic annular seals, and plans for code extensions to labyrinth, damping, and other seal configurations.

  8. Command History for 1989.

    DTIC Science & Technology

    1990-09-01

    13 Bart Kuhn, GM-14 Samantha K. Maddox, GS-04 Mike Nakada, GM-13 John Wolfe, GM-14 Reynaldo I. Monzon, GS-12 Jose G. Suarez, GS-11 19 Product...1410-09 GS-334-09 Janice Whiting Procurement Clerk Code 21 GS-1106-05 Separations Samantha Maddox Hoa T. Lu Supply Clerk Computer Specialist Code 21...Jennifer Thorp Royal S. Magnus Student Aide Personnel Research Psychologist Code 23 Code 12 GW-322-03 GS-180-11 Linda L. Turnmire Yvonne S. Baker Computer

  9. Ascent Aerodynamic Pressure Distributions on WB001

    NASA Technical Reports Server (NTRS)

    Vu, B.; Ruf, J.; Canabal, F.; Brunty, J.

    1996-01-01

    To support the reusable launch vehicle concept study, the aerodynamic data and surface pressures for WB001 were predicted using three computational fluid dynamic (CFD) codes at several flow conditions; comparisons were made code to code and code to the aerodynamic database, as well as with available experimental data. A set of particular solutions has been selected and recommended for use in preliminary conceptual designs. These CFD results have also been provided to the structures group for wing loading analysis.

  10. User's guide for vectorized code EQUIL for calculating equilibrium chemistry on Control Data STAR-100 computer

    NASA Technical Reports Server (NTRS)

    Kumar, A.; Graves, R. A., Jr.; Weilmuenster, K. J.

    1980-01-01

    A vectorized code, EQUIL, was developed for calculating the equilibrium chemistry of a reacting gas mixture on the Control Data STAR-100 computer. The code provides species mole fractions, mass fractions, and thermodynamic and transport properties of the mixture for given temperature, pressure, and elemental mass fractions. The code is set up for the electron, H, He, C, O, N system of elements. In all, 24 chemical species are included.

  11. Computer code for charge-exchange plasma propagation

    NASA Technical Reports Server (NTRS)

    Robinson, R. S.; Kaufman, H. R.

    1981-01-01

    The propagation of the charge-exchange plasma from an electrostatic ion thruster is crucial in determining the interaction of that plasma with the associated spacecraft. A model that describes this plasma and its propagation is described, together with a computer code based on this model. The structure and calling sequence of the code, named PLASIM, are described. An explanation of the program's input and output is included, together with samples of both. The code is written in ANSI Standard FORTRAN.

  12. Self-Scheduling Parallel Methods for Multiple Serial Codes with Application to WOPWOP

    NASA Technical Reports Server (NTRS)

    Long, Lyle N.; Brentner, Kenneth S.

    2000-01-01

    This paper presents a scheme for efficiently running a large number of serial jobs on parallel computers. Two examples are given of computer programs that run relatively quickly, but often they must be run numerous times to obtain all the results needed. It is very common in science and engineering to have codes that are not massive computing challenges in themselves, but due to the number of instances that must be run, they do become large-scale computing problems. The two examples given here represent common problems in aerospace engineering: aerodynamic panel methods and aeroacoustic integral methods. The first example simply solves many systems of linear equations. This is representative of an aerodynamic panel code where someone would like to solve for numerous angles of attack. The complete code for this first example is included in the appendix so that it can be readily used by others as a template. The second example is an aeroacoustics code (WOPWOP) that solves the Ffowcs Williams Hawkings equation to predict the far-field sound due to rotating blades. In this example, one quite often needs to compute the sound at numerous observer locations, hence parallelization is utilized to automate the noise computation for a large number of observers.
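
    A minimal self-scheduling (manager/worker) pattern of the kind the paper describes can be sketched with mpi4py: rank 0 hands out tasks one at a time, so faster workers automatically receive more of them. The task list and names below are illustrative, not taken from WOPWOP.

      # run with, e.g.: mpiexec -n 5 python selfsched.py
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()
      TASKS = [("observer", i) for i in range(100)]   # e.g. observer locations

      if rank == 0:                                   # manager
          workers, next_task, done = comm.Get_size() - 1, 0, 0
          while done < workers:
              status = MPI.Status()
              comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
              if next_task < len(TASKS):              # hand out the next task
                  comm.send(TASKS[next_task], dest=status.Get_source())
                  next_task += 1
              else:                                   # no work left
                  comm.send(None, dest=status.Get_source())
                  done += 1
      else:                                           # worker
          while True:
              comm.send(None, dest=0)                 # request work
              task = comm.recv(source=0)
              if task is None:
                  break
              # run_serial_code(task)  # e.g. one observer's noise computation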

  13. Computer Code for Transportation Network Design and Analysis

    DOT National Transportation Integrated Search

    1977-01-01

    This document describes the results of research into the application of the mathematical programming technique of decomposition to practical transportation network problems. A computer code called Catnap (for Control Analysis Transportation Network A...

  14. Current and anticipated uses of the thermal hydraulics codes at the NRC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caruso, R.

    1997-07-01

    The focus of Thermal-Hydraulic computer code usage in nuclear regulatory organizations has undergone a considerable shift since the codes were originally conceived. Less work is being done in the area of "Design Basis Accidents," and much more emphasis is being placed on analysis of operational events, probabilistic risk/safety assessment, and maintenance practices. All of these areas need support from Thermal-Hydraulic computer codes to model the behavior of plant fluid systems, and they all need the ability to perform large numbers of analyses quickly. It is therefore important for the T/H codes of the future to be able to support these needs by providing robust, easy-to-use tools that produce easy-to-understand results for a wider community of nuclear professionals. These tools need to take advantage of the great advances that have occurred recently in computer software by providing users with graphical user interfaces for both input and output. In addition, reduced costs of computer memory and other hardware have removed the need for excessively complex data structures and numerical schemes, which make the codes more difficult and expensive to modify, maintain, and debug, and which increase problem run times. Future versions of the T/H codes should also be structured in a modular fashion, to allow for the easy incorporation of new correlations, models, or features, and to simplify maintenance and testing. Finally, it is important that future T/H code developers work closely with the code user community, to ensure that the codes meet the needs of those users.

  15. Analyzing Pulse-Code Modulation On A Small Computer

    NASA Technical Reports Server (NTRS)

    Massey, David E.

    1988-01-01

    System for analysis of pulse-code modulation (PCM) comprises personal computer, computer program, and peripheral interface adapter on circuit board that plugs into expansion bus of computer. Functions essentially as "snapshot" PCM decommutator, which accepts and stores thousands of frames of PCM data, then sifts through them repeatedly to process them according to routines specified by operator. Enables faster testing and involves less equipment than older testing systems.

  16. A fast technique for computing syndromes of BCH and RS codes. [deep space network

    NASA Technical Reports Server (NTRS)

    Reed, I. S.; Truong, T. K.; Miller, R. L.

    1979-01-01

    A combination of the Chinese Remainder Theorem and Winograd's algorithm is used to compute transforms of odd length over GF(2^m). Such transforms are used to compute the syndromes needed for decoding BCH and RS codes. The present scheme requires substantially fewer multiplications and additions than the conventional method of computing the syndromes directly.
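
    For orientation, here is the conventional direct syndrome evaluation that the transform-based scheme accelerates: over GF(2^m), the j-th syndrome is simply the received polynomial evaluated at α^j. The field size, primitive polynomial, code parameters, and received word below are illustrative assumptions, not values taken from the paper.

    ```python
    # Direct syndrome computation for an RS code over GF(2^4) -- the baseline
    # method the CRT/Winograd transform approach is designed to beat.
    # Field, code parameters, and received word here are illustrative only.

    PRIM_POLY = 0b10011          # x^4 + x + 1, primitive over GF(2)
    Q = 16                       # field size of GF(2^4)

    # Build exp/log tables for GF(2^4) arithmetic.
    EXP = [0] * (2 * (Q - 1))
    LOG = [0] * Q
    x = 1
    for i in range(Q - 1):
        EXP[i] = EXP[i + Q - 1] = x
        LOG[x] = i
        x <<= 1
        if x & Q:                # reduce modulo the primitive polynomial
            x ^= PRIM_POLY

    def gf_mul(a, b):
        if a == 0 or b == 0:
            return 0
        return EXP[LOG[a] + LOG[b]]

    def syndromes(received, num_syndromes):
        """S_j = r(alpha^j), j = 1..2t, by Horner's rule; nonzero => errors."""
        out = []
        for j in range(1, num_syndromes + 1):
            alpha_j = EXP[j % (Q - 1)]
            s = 0
            for coeff in received:          # highest-degree coefficient first
                s = gf_mul(s, alpha_j) ^ coeff
            out.append(s)
        return out

    # A (15, 11) RS code corrects t = 2 errors and needs 2t = 4 syndromes.
    received = [1, 0, 5, 0, 0, 0, 9, 0, 0, 0, 0, 0, 0, 0, 3]
    print(syndromes(received, 4))           # all zeros only for a valid codeword
    ```

    The direct method costs on the order of n multiplications per syndrome; the paper's point is that transform techniques amortize this work across all syndromes at once.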

  17. Computational techniques for solar wind flows past terrestrial planets: Theory and computer programs

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Chaussee, D. S.; Trudinger, B. C.; Spreiter, J. R.

    1977-01-01

    The interaction of the solar wind with terrestrial planets can be predicted using a computer program based on a single fluid, steady, dissipationless, magnetohydrodynamic model to calculate the axisymmetric, supersonic, super-Alfvenic solar wind flow past both magnetic and nonmagnetic planets. The actual calculations are implemented by an assemblage of computer codes organized into one program. These include finite difference codes which determine the gas-dynamic solution, together with a variety of special purpose output codes for determining and automatically plotting both flow field and magnetic field results. Comparisons are made with previous results, and results are presented for a number of solar wind flows. The computational programs developed are documented and are presented in a general user's manual which is included.

  18. Numerical computation of space shuttle orbiter flow field

    NASA Technical Reports Server (NTRS)

    Tannehill, John C.

    1988-01-01

    A new parabolized Navier-Stokes (PNS) code has been developed to compute the hypersonic, viscous chemically reacting flow fields around 3-D bodies. The flow medium is assumed to be a multicomponent mixture of thermally perfect but calorically imperfect gases. The new PNS code solves the gas dynamic and species conservation equations in a coupled manner using a noniterative, implicit, approximately factored, finite difference algorithm. The space-marching method is made well-posed by special treatment of the streamwise pressure gradient term. The code has been used to compute hypersonic laminar flow of chemically reacting air over cones at angle of attack. The results of the computations are compared with the results of reacting boundary-layer computations and show excellent agreement.

  19. Determining mode excitations of vacuum electronics devices via three-dimensional simulations using the SOS code

    NASA Technical Reports Server (NTRS)

    Warren, Gary

    1988-01-01

    The SOS code is used to compute the resonance modes (frequency-domain information) of sample devices and, separately, to compute the transient behavior of the same devices. A code, DOT, is created to compute appropriate dot products of the time-domain and frequency-domain results. The transient behavior of individual modes in the device is then plotted. Modes excited by the beam in a coupled-cavity traveling-wave tube (CCTWT) section are analyzed in separate simulations. Mode energy vs. time and mode phase vs. time are computed, and it is determined for each case whether the transient waves are forward or backward waves. Finally, the hot-test mode frequencies of the CCTWT section are computed.
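
    The post-processing step the DOT code performs can be sketched compactly: project each time-domain snapshot onto the frequency-domain mode shapes and watch the modal amplitudes evolve. Everything below (the orthonormal synthetic modes, the toy transient) is an illustrative stand-in, not SOS output.

    ```python
    # Sketch of DOT-style post-processing: dot products of time-domain
    # snapshots with frequency-domain mode shapes give per-mode histories.
    # The "modes" and "transient" below are synthetic stand-ins for SOS output.
    import numpy as np

    npts, nmodes, nsteps, dt = 128, 3, 400, 1.0e-3

    # Orthonormal mode shapes (columns), e.g. from a frequency-domain solve.
    rng = np.random.default_rng(0)
    modes, _ = np.linalg.qr(rng.standard_normal((npts, nmodes)))

    # Toy transient: mode 0 grows linearly, mode 1 rings at fixed amplitude.
    t = np.arange(nsteps) * dt
    field = (np.outer(modes[:, 0], t * np.cos(2e3 * np.pi * t)) +
             np.outer(modes[:, 1], np.sin(5e3 * np.pi * t)))

    amps = modes.T @ field                  # (nmodes, nsteps) modal amplitudes
    energy = amps**2                        # mode energy vs. time
    for k in range(nmodes):
        print(f"mode {k}: final energy {energy[k, -1]:.3e}")
    ```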

  20. An evaluation of TRAC-PF1/MOD1 computer code performance during posttest simulations of Semiscale MOD-2C feedwater line break transients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, D.G.; Watkins, J.C.

    This report documents an evaluation of the TRAC-PF1/MOD1 reactor safety analysis computer code during computer simulations of feedwater line break transients. The experimental data base for the evaluation included the results of three bottom feedwater line break tests performed in the Semiscale Mod-2C test facility. The tests modeled 14.3% (S-FS-7), 50% (S-FS-11), and 100% (S-FS-6B) breaks. The test facility and the TRAC-PF1/MOD1 model used in the calculations are described. Evaluations of the accuracy of the calculations are presented in the form of comparisons of measured and calculated histories of selected parameters associated with the primary and secondary systems. In addition to evaluating the accuracy of the code calculations, the computational performance of the code during the simulations was assessed. A conclusion was reached that the code is capable of making feedwater line break transient calculations efficiently, but there is room for significant improvements in the simulations that were performed. Recommendations are made for follow-on investigations to determine how to improve future feedwater line break calculations and for code improvements to make the code easier to use.

  1. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostin, Mikhail; Mokhov, Nikolai; Niita, Koji

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90, or C. The module is largely independent of the radiation transport codes it can be used with and is connected to them by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes, such as PHITS, FLUKA, and MCNP, after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
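
    The checkpoint facility can be mimicked in a few lines: periodically serialize the running tallies and the random-number state, and restore them on restart. The file name, cadence, and toy "history" loop below are assumptions for illustration; the actual framework is a C++/MPI module with its own file format.

    ```python
    # Toy checkpoint/restart in the spirit of the framework's facility:
    # periodically save tallies + RNG state; on restart, resume where we left off.
    # File name, cadence, and the "history" loop are illustrative assumptions.
    import os
    import pickle
    import random

    CKPT = "transport.ckpt"
    N_HISTORIES, CKPT_EVERY = 1_000_000, 100_000

    if os.path.exists(CKPT):                       # restart path
        with open(CKPT, "rb") as f:
            start, tally, rng_state = pickle.load(f)
        random.setstate(rng_state)
    else:                                          # cold start
        start, tally = 0, 0.0
        random.seed(12345)

    for n in range(start, N_HISTORIES):
        tally += random.expovariate(1.0)           # stand-in for one particle history
        if (n + 1) % CKPT_EVERY == 0:              # write-then-rename keeps the
            with open(CKPT + ".tmp", "wb") as f:   # checkpoint file consistent
                pickle.dump((n + 1, tally, random.getstate()), f)
            os.replace(CKPT + ".tmp", CKPT)

    print(f"mean score = {tally / N_HISTORIES:.6f}")
    ```

    Saving the RNG state alongside the tallies is what makes the restarted run statistically identical to an uninterrupted one.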

  2. Prediction of sound radiated from different practical jet engine inlets

    NASA Technical Reports Server (NTRS)

    Zinn, B. T.; Meyer, W. L.

    1980-01-01

    Existing computer codes for calculating the far-field radiation patterns surrounding various practical jet engine inlet configurations under different excitation conditions were upgraded. The computer codes were refined and expanded so that they are now computationally more efficient by a factor of about three and capable of producing accurate results up to nondimensional wave numbers of twenty. Computer programs were also developed to help generate accurate geometrical representations of the inlets to be investigated; these data are required as input for the programs that calculate the sound fields. The new geometry-generating program considerably reduces the time required to generate the input data, which was one of the most time-consuming steps in the process. The results of sample runs using the NASA-Lewis QCSEE inlet are presented, and comparisons of run times and accuracy are made between the old and upgraded computer codes. The overall accuracy of the computations is determined by comparison of the results with simple source solutions.

  3. Error Suppression for Hamiltonian-Based Quantum Computation Using Subsystem Codes

    NASA Astrophysics Data System (ADS)

    Marvian, Milad; Lidar, Daniel A.

    2017-01-01

    We present general conditions for quantum error suppression for Hamiltonian-based quantum computation using subsystem codes. This involves encoding the Hamiltonian performing the computation using an error detecting subsystem code and the addition of a penalty term that commutes with the encoded Hamiltonian. The scheme is general and includes the stabilizer formalism of both subspace and subsystem codes as special cases. We derive performance bounds and show that complete error suppression results in the large penalty limit. To illustrate the power of subsystem-based error suppression, we introduce fully two-local constructions for protection against local errors of the swap gate of adiabatic gate teleportation and the Ising chain in a transverse field.
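
    Schematically (our notation, paraphrasing the abstract rather than quoting the paper), the construction is:

    ```latex
    % Penalty-based error suppression with a subsystem code (schematic):
    H(t) \;=\; \bar{H}_{\mathrm{comp}}(t) \;+\; E_P\, H_P ,
    \qquad
    \bigl[\bar{H}_{\mathrm{comp}}(t),\, H_P\bigr] \;=\; 0 ,
    ```

    where the computational Hamiltonian is encoded with an error-detecting subsystem code, the penalty term H_P is chosen so the encoded states lie in its ground space, and any error the code detects raises the energy by an amount that grows with the penalty strength E_P, giving complete suppression in the large-penalty limit.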

  4. Error Suppression for Hamiltonian-Based Quantum Computation Using Subsystem Codes.

    PubMed

    Marvian, Milad; Lidar, Daniel A

    2017-01-20

    We present general conditions for quantum error suppression for Hamiltonian-based quantum computation using subsystem codes. This involves encoding the Hamiltonian performing the computation using an error detecting subsystem code and the addition of a penalty term that commutes with the encoded Hamiltonian. The scheme is general and includes the stabilizer formalism of both subspace and subsystem codes as special cases. We derive performance bounds and show that complete error suppression results in the large penalty limit. To illustrate the power of subsystem-based error suppression, we introduce fully two-local constructions for protection against local errors of the swap gate of adiabatic gate teleportation and the Ising chain in a transverse field.

  5. VERAIn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan

    2015-02-16

    CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ), and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete an LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA input into an XML file that is used as input to the different VERA codes.
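
    The flavor of such a parser can be shown in miniature. This hypothetical sketch (the real VERAIn grammar, card names, and XML schema are far richer and are not reproduced here) turns a block-and-parameter input into an XML tree with the standard-library xml.etree module:

    ```python
    # Hypothetical miniature of an input-to-XML parser: "[BLOCK]" headers and
    # "key value" cards become nested XML. The real VERAIn grammar and schema
    # are much richer; nothing here is taken from the actual code.
    import xml.etree.ElementTree as ET

    SAMPLE = """\
    [CASEID]
      title demo_core
    [CORE]
      size 15
      apitch 21.5
    """

    def parse(text):
        root = ET.Element("ParameterList")
        block = root                          # fallback if a card precedes a block
        for line in text.splitlines():
            line = line.strip()
            if not line:
                continue
            if line.startswith("[") and line.endswith("]"):
                block = ET.SubElement(root, "ParameterList",
                                      name=line[1:-1].lower())
            else:
                key, _, value = line.partition(" ")
                ET.SubElement(block, "Parameter", name=key, value=value.strip())
        return root

    print(ET.tostring(parse(SAMPLE), encoding="unicode"))
    ```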

  6. Navier-Stokes and Comprehensive Analysis Performance Predictions of the NREL Phase VI Experiment

    NASA Technical Reports Server (NTRS)

    Duque, Earl P. N.; Burklund, Michael D.; Johnson, Wayne

    2003-01-01

    A vortex lattice code, CAMRAD II, and a Reynolds-averaged Navier-Stokes code, OVERFLOW-D2, were used to predict the aerodynamic performance of a two-bladed horizontal-axis wind turbine. All computations were compared with experimental data collected at the NASA Ames Research Center 80- by 120-Foot Wind Tunnel. Computations were performed for both axial and yawed operating conditions. Various stall delay models and dynamic stall models were used by the CAMRAD II code. Comparisons between the experimental data and computed aerodynamic loads show that the OVERFLOW-D2 code can accurately predict the power and spanwise loading of a wind turbine rotor.

  7. Fault-tolerance in Two-dimensional Topological Systems

    NASA Astrophysics Data System (ADS)

    Anderson, Jonas T.

    This thesis is a collection of ideas with the general goal of building, at least in the abstract, a local fault-tolerant quantum computer. The connection between quantum information and topology has proven to be an active area of research in several fields. The introduction of the toric code by Alexei Kitaev demonstrated the usefulness of topology for quantum memory and quantum computation. Many quantum codes used for quantum memory are modeled by spin systems on a lattice, with operators that extract syndrome information placed on vertices or faces of the lattice. It is natural to wonder whether the useful codes in such systems can be classified. This thesis presents work that leverages ideas from topology and graph theory to explore the space of such codes. Homological stabilizer codes are introduced and it is shown that, under a set of reasonable assumptions, any qubit homological stabilizer code is equivalent to either a toric code or a color code. Additionally, the toric code and the color code correspond to distinct classes of graphs. Many systems have been proposed as candidate quantum computers. It is very desirable to design quantum computing architectures with two-dimensional layouts and low complexity in parity-checking circuitry. Kitaev's surface codes provided the first example of codes satisfying this property. They provided a new route to fault tolerance with more modest overheads and thresholds approaching 1%. The recently discovered color codes share many properties with the surface codes, such as the ability to perform syndrome extraction locally in two dimensions. Some families of color codes admit a transversal implementation of the entire Clifford group. This work investigates color codes on the 4.8.8 lattice known as triangular codes. I develop a fault-tolerant error-correction strategy for these codes in which repeated syndrome measurements on this lattice generate a three-dimensional space-time combinatorial structure. I then develop an integer program that analyzes this structure and determines the most likely set of errors consistent with the observed syndrome values. I implement this integer program to find the threshold for depolarizing noise on small versions of these triangular codes. Because the threshold for magic-state distillation is likely to be higher than this value and because logical CNOT gates can be performed by code deformation in a single block instead of between pairs of blocks, the threshold for fault-tolerant quantum memory for these codes is also the threshold for fault-tolerant quantum computation with them. Since the advent of a threshold theorem for quantum computers much has been improved upon. Thresholds have increased, architectures have become more local, and gate sets have been simplified. The overhead for magic-state distillation has been studied, but not nearly to the extent of the aforementioned topics. A method for greatly reducing this overhead, known as reusable magic states, is studied here. While examples of reusable magic states exist for Clifford gates, I give strong reasons to believe they do not exist for non-Clifford gates.

  8. System, methods and apparatus for program optimization for multi-threaded processor architectures

    DOEpatents

    Bastoul, Cedric; Lethin, Richard A; Leung, Allen K; Meister, Benoit J; Szilagyi, Peter; Vasilache, Nicolas T; Wohlford, David E

    2015-01-06

    Methods, apparatus and computer software product for source code optimization are provided. In an exemplary embodiment, a first custom computing apparatus is used to optimize the execution of source code on a second computing apparatus. In this embodiment, the first custom computing apparatus contains a memory, a storage medium and at least one processor with at least one multi-stage execution unit. The second computing apparatus contains at least two multi-stage execution units that allow for parallel execution of tasks. The first custom computing apparatus optimizes the code for parallelism, locality of operations and contiguity of memory accesses on the second computing apparatus. This Abstract is provided for the sole purpose of complying with the Abstract requirement rules. This Abstract is submitted with the explicit understanding that it will not be used to interpret or to limit the scope or the meaning of the claims.

  9. An efficient method for computing unsteady transonic aerodynamics of swept wings with control surfaces

    NASA Technical Reports Server (NTRS)

    Liu, D. D.; Kao, Y. F.; Fung, K. Y.

    1989-01-01

    A transonic equivalent strip (TES) method was further developed for unsteady flow computations of arbitrary wing planforms. The TES method consists of two consecutive correction steps applied to a given nonlinear code such as LTRAN2: a chordwise mean flow correction and a spanwise phase correction. The computation procedure requires direct pressure input from other computed or measured data; otherwise, it does not require airfoil shape or grid generation for given planforms. To validate the computed results, four swept wings of various aspect ratios, including wings with control surfaces, were selected as computational examples. Overall trends in unsteady pressures are established by comparison with those obtained from the XTRAN3S code, Isogai's full-potential code, and data measured by NLR and RAE. In comparison with these methods, TES achieves considerable savings in computer time with reasonable accuracy, which suggests immediate industrial applications.

  10. Method and computer program product for maintenance and modernization backlogging

    DOEpatents

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
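
    The claimed relation is simple enough to state as one line of arithmetic; a minimal sketch with invented figures:

    ```python
    # The patent's stated relation: future facility condition = maintenance cost
    # + modernization factor + backlog factor, per time period. Numbers invented.
    def future_facility_conditions(maintenance_cost, modernization_factor,
                                   backlog_factor):
        return maintenance_cost + modernization_factor + backlog_factor

    # One hypothetical fiscal year:
    print(future_facility_conditions(maintenance_cost=2.4e6,
                                     modernization_factor=0.8e6,
                                     backlog_factor=1.1e6))   # -> 4300000.0
    ```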

  11. Development Of A Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Yoon, Seokkwan; Kwak, Dochan

    1993-01-01

    Report discusses aspects of development of CENS3D computer code, solving three-dimensional Navier-Stokes equations of compressible, viscous, unsteady flow. Implements implicit finite-difference or finite-volume numerical-integration scheme, called "lower-upper symmetric-Gauss-Seidel" (LU-SGS), offering potential for very low computer time per iteration and for fast convergence.

  12. A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics

    NASA Technical Reports Server (NTRS)

    Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela

    2015-01-01

    Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
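
    What MC/DC obligates can be made concrete with a brute-force check (purely illustrative; this is not the paper's partial-evaluation technique): for each condition, find a pair of test vectors that differ only in that condition and flip the decision outcome.

    ```python
    # Brute-force illustration of MC/DC (not the paper's algorithm): for each
    # condition, find two test vectors differing only in that condition whose
    # decision outcomes differ, showing the condition's independent effect.
    from itertools import product

    def mcdc_pairs(decision, n_conditions):
        vectors = list(product((False, True), repeat=n_conditions))
        pairs = {}
        for i in range(n_conditions):
            for v in vectors:
                w = list(v)
                w[i] = not w[i]            # flip only condition i
                w = tuple(w)
                if decision(*v) != decision(*w):
                    pairs[i] = (v, w)      # condition i shown independent
                    break
        return pairs

    # Example decision with three conditions: (a and b) or c
    demo = lambda a, b, c: (a and b) or c
    for cond, (v, w) in mcdc_pairs(demo, 3).items():
        print(f"condition {cond}: {v} vs {w}")
    ```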

  13. "SMART": A Compact and Handy FORTRAN Code for the Physics of Stellar Atmospheres

    NASA Astrophysics Data System (ADS)

    Sapar, A.; Poolamäe, R.

    2003-01-01

    A new computer code SMART (Spectra from Model Atmospheres by Radiative Transfer) for computing stellar spectra forming in plane-parallel atmospheres has been compiled by us and A. Aret. To guarantee wide compatibility of the code with the shell environment, we chose FORTRAN-77 as the programming language and tried to confine ourselves to the common part of its numerous versions in both WINDOWS and LINUX. SMART can be used for studies of several processes in stellar atmospheres. The current version of the programme is undergoing rapid changes due to our goal to elaborate a simple, handy, and compact code. Instead of linearisation (a mathematical method of recurrent approximations) we propose to use physical evolutionary changes, in other words the relaxation of quantum state populations from LTE to NLTE, which has been studied using a small number of NLTE states. This computational scheme is essentially simpler and more compact than linearisation. The relaxation scheme makes it possible to replace the Λ-iteration procedure with a physically changing emissivity (or source function) that incorporates changing Menzel coefficients for the NLTE quantum state populations. However, light scattering on free electrons is, in terms of Feynman graphs, a real second-order quantum process and cannot be reduced to consecutive processes of absorption and emission, as it can in the case of radiative transfer in spectral lines. With duly chosen input parameters, the code SMART enables computing the radiative acceleration of the matter of a stellar atmosphere in turbulence clumps. This also enables connecting the model atmosphere in more detail with the problem of stellar wind triggering. Another problem incorporated into the computer code SMART is the diffusion of chemical elements and their isotopes in the atmospheres of chemically peculiar (CP) stars, due to the usual radiative acceleration and the essential additional acceleration generated by the light-induced drift. As a special case, using duly chosen pixels on the stellar disk, the spectrum of a rotating star can be computed. No instrumental broadening has been incorporated in the code of SMART. To facilitate the study of stellar spectra, a GUI (Graphical User Interface) with selection of labels by ions has been compiled to study the spectral lines of different elements and ions in the computed emergent flux. An amazing feature of SMART is that its code is very short: it occupies only 4 two-sided, two-column A4 sheets in landscape format. In addition, being well commented, it is quite easily readable and understandable. We have used the tactic of writing the comments on the right-side margin (columns starting from 73). Such short code has been composed by widely using unified input physics (for example, the ionisation cross-sections for bound-free transitions and the electron and ion collision rates). A current restriction on the application area of the present version of SMART is that molecules are so far ignored; thus, it can be used only for lukewarm and hot stellar atmospheres. In the computer code we have tried to avoid bulky, often over-optimised methods primarily meant to spare computation time. For instance, we compute the continuous absorption coefficient at every wavelength. Nevertheless, within an hour on the personal computer at our disposal (AMD Athlon XP 1700+, 512 MB DDRAM), a stellar spectrum with spectral resolution λ/dλ = 100,000 for the spectral interval 700 -- 30,000 Å is computed.
    The model input data and the line data used by us are both computed and compiled by R. Kurucz. In order to follow the presence and representability of quantum states and to enumerate them for NLTE studies, a C++ code transforming the needed data into LATEX form has been compiled. Thus we have composed a quantum state list for all neutrals and ions in the Kurucz file 'gfhyperall.dat'. The list enables a more adequate composition of the concept of super-states, including partly correlating super-states. We are grateful to R. Kurucz for making available by CD-ROMs and Internet his computer codes ATLAS and SYNTHE, used by us as a starting point in composing the new computer code. We are also grateful to the Estonian Science Foundation for grant ESF-4701.

  14. Guide to AERO2S and WINGDES Computer Codes for Prediction and Minimization of Drag Due to Lift

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Chu, Julio; Ozoroski, Lori P.; McCullers, L. Arnold

    1997-01-01

    The computer codes, AERO2S and WINGDES, are now widely used for the analysis and design of airplane lifting surfaces under conditions that tend to induce flow separation. These codes have undergone continued development to provide additional capabilities since the introduction of the original versions over a decade ago. This code development has been reported in a variety of publications (NASA technical papers, NASA contractor reports, and society journals). Some modifications have not been publicized at all. Users of these codes have suggested the desirability of combining in a single document the descriptions of the code development, an outline of the features of each code, and suggestions for effective code usage. This report is intended to supply that need.

  15. Transferring ecosystem simulation codes to supercomputers

    NASA Technical Reports Server (NTRS)

    Skiles, J. W.; Schulbach, C. H.

    1995-01-01

    Many ecosystem simulation computer codes have been developed in the last twenty-five years. This development took place initially on mainframe computers, then minicomputers, and more recently on microcomputers and workstations. Supercomputing platforms (both parallel and distributed systems) have been largely unused, however, because of the perceived difficulty in accessing and using the machines. Also, significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers must be considered. We have transferred a grassland simulation model (developed on a VAX) to a Cray Y-MP/C90. We describe porting the model to the Cray and the changes we made to exploit the parallelism in the application and improve code execution. The Cray executed the model 30 times faster than the VAX and 10 times faster than a Unix workstation. We achieved an additional speedup of 30 percent by using the compiler's vectorizing and in-lining capabilities. The code still runs at only about 5 percent of the Cray's peak speed because it makes ineffective use of the vector and parallel processing capabilities of the Cray. We expect that by restructuring the code, it could execute an additional six to ten times faster.

  16. Duct flow nonuniformities for Space Shuttle Main Engine (SSME)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A three-duct Space Shuttle Main Engine (SSME) Hot Gas Manifold geometry code was developed. The methodology of the program is described, recommendations on its implementation are made, and an input guide, an input deck listing, and a source code listing are provided. The code listing includes extensive comments to assist the user in following its development and logic. A working source deck will be provided. A thorough analysis was made of the proper boundary conditions and chemistry kinetics necessary for an accurate computational analysis of the flow environment in the SSME fuel-side preburner chamber during the initial startup transient. Pertinent results are presented to facilitate incorporation of these findings into an appropriate CFD code. The computation must be a turbulent computation, since turbulent mixing in the flow field will have a profound effect on the chemistry. Because of the additional equations demanded by the chemistry model, it is recommended that for expediency a simple algebraic mixing-length model be adopted. Performing this computation for all or selected intervals of the startup time will require substantial computer CPU time regardless of the specific CFD code selected.

  17. War of Ontology Worlds: Mathematics, Computer Code, or Esperanto?

    PubMed Central

    Rzhetsky, Andrey; Evans, James A.

    2011-01-01

    The use of structured knowledge representations—ontologies and terminologies—has become standard in biomedicine. Definitions of ontologies vary widely, as do the values and philosophies that underlie them. In seeking to make these views explicit, we conducted and summarized interviews with a dozen leading ontologists. Their views clustered into three broad perspectives that we summarize as mathematics, computer code, and Esperanto. Ontology as mathematics puts the ultimate premium on rigor and logic, symmetry and consistency of representation across scientific subfields, and the inclusion of only established, non-contradictory knowledge. Ontology as computer code focuses on utility and cultivates diversity, fitting ontologies to their purpose. Like computer languages C++, Prolog, and HTML, the code perspective holds that diverse applications warrant custom designed ontologies. Ontology as Esperanto focuses on facilitating cross-disciplinary communication, knowledge cross-referencing, and computation across datasets from diverse communities. We show how these views align with classical divides in science and suggest how a synthesis of their concerns could strengthen the next generation of biomedical ontologies. PMID:21980276

  18. Verifying a computational method for predicting extreme ground motion

    USGS Publications Warehouse

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.

  19. An evaluation of four single element airfoil analytic methods

    NASA Technical Reports Server (NTRS)

    Freuler, R. J.; Gregorek, G. M.

    1979-01-01

    A comparison of four computer codes for the analysis of two-dimensional single element airfoil sections is presented for three classes of section geometries. Two of the computer codes utilize vortex singularities methods to obtain the potential flow solution. The other two codes solve the full inviscid potential flow equation using finite differencing techniques, allowing results to be obtained for transonic flow about an airfoil including weak shocks. Each program incorporates boundary layer routines for computing the boundary layer displacement thickness and boundary layer effects on aerodynamic coefficients. Computational results are given for a symmetrical section represented by an NACA 0012 profile, a conventional section illustrated by an NACA 65A413 profile, and a supercritical type section for general aviation applications typified by a NASA LS(1)-0413 section. The four codes are compared and contrasted in the areas of method of approach, range of applicability, agreement among each other and with experiment, individual advantages and disadvantages, computer run times and memory requirements, and operational idiosyncrasies.

  20. War of ontology worlds: mathematics, computer code, or Esperanto?

    PubMed

    Rzhetsky, Andrey; Evans, James A

    2011-09-01

    The use of structured knowledge representations-ontologies and terminologies-has become standard in biomedicine. Definitions of ontologies vary widely, as do the values and philosophies that underlie them. In seeking to make these views explicit, we conducted and summarized interviews with a dozen leading ontologists. Their views clustered into three broad perspectives that we summarize as mathematics, computer code, and Esperanto. Ontology as mathematics puts the ultimate premium on rigor and logic, symmetry and consistency of representation across scientific subfields, and the inclusion of only established, non-contradictory knowledge. Ontology as computer code focuses on utility and cultivates diversity, fitting ontologies to their purpose. Like computer languages C++, Prolog, and HTML, the code perspective holds that diverse applications warrant custom designed ontologies. Ontology as Esperanto focuses on facilitating cross-disciplinary communication, knowledge cross-referencing, and computation across datasets from diverse communities. We show how these views align with classical divides in science and suggest how a synthesis of their concerns could strengthen the next generation of biomedical ontologies.

  1. 48 CFR 1819.1005 - Applicability.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...

  2. 48 CFR 1819.1005 - Applicability.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...

  3. 48 CFR 1819.1005 - Applicability.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...

  4. 40 CFR 1048.110 - How must my engines diagnose malfunctions?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., the MIL may stay off during later engine operation. (d) Store trouble codes in computer memory. Record and store in computer memory any diagnostic trouble codes showing a malfunction that should illuminate...

  5. Recent applications of the transonic wing analysis computer code, TWING

    NASA Technical Reports Server (NTRS)

    Subramanian, N. R.; Holst, T. L.; Thomas, S. D.

    1982-01-01

    An evaluation of the transonic-wing-analysis computer code TWING is given. TWING utilizes a fully implicit approximate-factorization iteration scheme to solve the full potential equation in conservative form. A numerical elliptic-solver grid-generation scheme is used to generate the required finite-difference mesh. Several wing configurations were analyzed, and the limits of applicability of the code were evaluated. Comparisons of computed results were made with available experimental data. Results indicate that the code is robust, accurate (when significant viscous effects are not present), and efficient. TWING generally produces solutions an order of magnitude faster than other conservative full-potential codes using successive-line overrelaxation. The present method is applicable to a wide range of isolated wing configurations, including high-aspect-ratio transport wings and low-aspect-ratio, high-sweep fighter configurations.

  6. Response surface method in geotechnical/structural analysis, phase 1

    NASA Astrophysics Data System (ADS)

    Wong, F. S.

    1981-02-01

    In the response surface approach, an approximating function is fit to a long-running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of response surface development and the feasibility of the method are shown using a sample problem in slope stability, which is based on data from centrifuge experiments of model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
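
    The workflow is easy to sketch: run the expensive code a handful of times, fit a low-order polynomial surface to those runs, then do the repetitive statistical sampling on the surface instead. The "expensive code", the quadratic surface form, and the failure threshold below are all illustrative assumptions.

    ```python
    # Response-surface sketch: fit a polynomial surrogate to a few runs of an
    # "expensive" code, then Monte Carlo on the cheap surrogate. The code being
    # replaced, the surface form, and the threshold are illustrative only.
    import numpy as np

    def expensive_code(x1, x2):            # stand-in for a long-running analysis
        return 1.5 - 0.3 * x1 + 0.1 * x2 - 0.05 * x1 * x2

    # 1) A handful of code calculations at chosen design points.
    pts = np.array([(-1, -1), (-1, 1), (1, -1), (1, 1), (0, 0), (2, 0)], float)
    y = np.array([expensive_code(*p) for p in pts])

    # 2) Least-squares fit of the surface f ~ c0 + c1*x1 + c2*x2 + c3*x1*x2.
    X = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                         pts[:, 0] * pts[:, 1]])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    def surface(x1, x2):                   # cheap replacement for the code
        return coef[0] + coef[1] * x1 + coef[2] * x2 + coef[3] * x1 * x2

    # 3) Repetitive statistical analysis runs on the surrogate, not the code.
    rng = np.random.default_rng(1)
    x1, x2 = rng.normal(0, 1, 100_000), rng.normal(0, 1, 100_000)
    print("P(safety factor < 1) ~", np.mean(surface(x1, x2) < 1.0))
    ```

    One hundred thousand surrogate evaluations cost less than a single run of a typical long-running code, which is the whole economy of the method.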

  7. User's Manual for FEMOM3DS. Version 1.0

    NASA Technical Reports Server (NTRS)

    Reddy, C.J.; Deshpande, M. D.

    1997-01-01

    FEMOM3DS is a computer code written in FORTRAN 77 to compute electromagnetic(EM) scattering characteristics of a three dimensional object with complex materials using combined Finite Element Method (FEM)/Method of Moments (MoM) technique. This code uses the tetrahedral elements, with vector edge basis functions for FEM in the volume of the cavity and the triangular elements with the basis functions similar to that described for MoM at the outer boundary. By virtue of FEM, this code can handle any arbitrarily shaped three-dimensional cavities filled with inhomogeneous lossy materials. The User's Manual is written to make the user acquainted with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computers on which the code is intended to run.

  8. Performance measures for transform data coding.

    NASA Technical Reports Server (NTRS)

    Pearl, J.; Andrews, H. C.; Pratt, W. K.

    1972-01-01

    This paper develops performance criteria for evaluating transform data coding schemes under computational constraints. Computational constraints that conform with the proposed basis-restricted model give rise to suboptimal coding efficiency characterized by a rate-distortion relation R(D) similar in form to the theoretical rate-distortion function. Numerical examples of this performance measure are presented for Fourier, Walsh, Haar, and Karhunen-Loeve transforms.
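
    For reference (our addition, not an equation quoted from the paper), the theoretical benchmark that such a suboptimal R(D) is compared against is the classical rate-distortion function, e.g. for a memoryless Gaussian source of variance σ² under squared-error distortion:

    ```latex
    % Classical Gaussian rate-distortion function (benchmark, our notation):
    R(D) \;=\; \max\!\Bigl\{\,0,\; \tfrac{1}{2}\log_2\!\frac{\sigma^2}{D}\Bigr\},
    \qquad 0 < D \le \sigma^2 .
    ```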

  9. The Unified English Braille Code: Examination by Science, Mathematics, and Computer Science Technical Expert Braille Readers

    ERIC Educational Resources Information Center

    Holbrook, M. Cay; MacCuspie, P. Ann

    2010-01-01

    Braille-reading mathematicians, scientists, and computer scientists were asked to examine the usability of the Unified English Braille Code (UEB) for technical materials. They had little knowledge of the code prior to the study. The research included two reading tasks, a short tutorial about UEB, and a focus group. The results indicated that the…

  10. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    ERIC Educational Resources Information Center

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  11. Improvements to a method for the geometrically nonlinear analysis of compressively loaded stiffened composite panels

    NASA Technical Reports Server (NTRS)

    Stoll, Frederick

    1993-01-01

    The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.

  12. CAVE: A computer code for two-dimensional transient heating analysis of conceptual thermal protection systems for hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Rathjen, K. A.

    1977-01-01

    A digital computer code CAVE (Conduction Analysis Via Eigenvalues), which finds application in the analysis of two-dimensional transient heating of hypersonic vehicles, is described. CAVE is written in FORTRAN IV and is operational on both IBM 360-67 and CDC 6600 computers. The method of solution is a hybrid analytical-numerical technique that is inherently stable, permitting large time steps even for the best conductors with the finest mesh sizes. The aerodynamic-heating boundary conditions are calculated by the code based on the input flight trajectory, or they can optionally be calculated external to the code and entered as input data. The code computes the network conduction and convection links, as well as capacitance values, given basic geometrical and mesh sizes, for four geometries (leading edges, cooled panels, X-24C structure, and slabs). Input and output formats are presented and explained, and sample problems are included. A brief summary of the hybrid analytical-numerical technique, which utilizes eigenvalues (thermal frequencies) and eigenvectors (thermal mode vectors), is given, along with the aerodynamic heating equations that have been incorporated in the code and flow charts.
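
    The eigenvalue technique that makes such a code unconditionally stable can be sketched on a 1-D rod: diagonalize the conduction matrix once, then advance the temperatures analytically in time, so the step size is unrestricted by mesh fineness. The rod, material numbers, and boundary treatment here are illustrative assumptions, not CAVE's actual network model.

    ```python
    # Sketch of the eigenvalue (thermal-frequency) technique on a 1-D rod:
    # dT/dt = A T with A from a finite-difference Laplacian; diagonalizing A
    # gives T(t) = V exp(L t) V^-1 T0 exactly, so any step size is stable.
    # Geometry, properties, and boundaries are illustrative assumptions.
    import numpy as np

    n, alpha, dx = 20, 1.0e-5, 0.01          # nodes, diffusivity (m^2/s), spacing
    A = (np.diag(-2.0 * np.ones(n)) +
         np.diag(np.ones(n - 1), 1) +
         np.diag(np.ones(n - 1), -1)) * alpha / dx**2   # zero-temperature ends

    lam, V = np.linalg.eig(A)                # thermal frequencies + mode vectors
    Vinv = np.linalg.inv(V)

    def advance(T0, t):
        """Exact modal solution; one big step, no stability limit."""
        return (V @ (np.exp(lam * t) * (Vinv @ T0))).real

    T0 = np.sin(np.pi * np.arange(1, n + 1) / (n + 1))  # initial profile
    for t in (0.0, 50.0, 500.0):                        # seconds, arbitrary
        print(f"t = {t:6.1f} s   peak T = {advance(T0, t).max():.4f}")
    ```

    An explicit finite-difference march on the same mesh would be limited to steps of order dx²/(2α); the modal solution takes a 500-second step in one evaluation.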

  13. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    DOE PAGES

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...

    2017-03-20

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investment, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is the culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, ‘bottom-up’ and ‘top-down’, are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach and up to 50% improvement in total simulated time with the latter for the demonstration cases and target HPC systems employed.

  14. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investment, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is the culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, ‘bottom-up’ and ‘top-down’, are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach and up to 50% improvement in total simulated time with the latter for the demonstration cases and target HPC systems employed.

  15. Method for rapid high-frequency seismogram calculation

    NASA Astrophysics Data System (ADS)

    Stabile, Tony Alfredo; De Matteis, Raffaella; Zollo, Aldo

    2009-02-01

    We present a method for rapid, high-frequency seismogram calculation that makes use of an algorithm to automatically generate an exhaustive set of seismic phases with appreciable amplitude on the seismogram. The method uses a hierarchical order of ray and seismic-phase generation, taking into account existing constraints on ray paths and some physical constraints. To compute synthetic seismograms, the COMRAD code (from the Italian "COdice Multifase per il RAy-tracing Dinamico") uses a dynamic ray-tracing code as its core. To validate the code, we computed synthetic seismograms in a layered medium using both COMRAD and a code that computes the complete wave field by the discrete-wavenumber method. The seismograms are compared according to a time-frequency misfit criterion based on the continuous wavelet transform of the signals. Although the number of phases is considerably reduced by the selection criteria, the results show that the loss in amplitude on the whole seismogram is negligible. Moreover, the time for computing the synthetics using the COMRAD code (truncating the ray series at the 10th generation) is three- to four-fold less than that needed for the AXITRA code (up to a frequency of 25 Hz).

  16. Gender-Based Differences in Outcomes After Orbital Atherectomy for the Treatment of De Novo Severely Calcified Coronary Lesions.

    PubMed

    Lee, Michael S; Shlofmitz, Evan; Mansourian, Pejman; Sethi, Sanjum; Shlofmitz, Richard A

    2016-11-01

    We evaluated the relationship between gender and angiographic and clinical outcomes in patients with severely calcified lesions who underwent orbital atherectomy. Female gender is associated with increased risk of adverse clinical events after percutaneous coronary intervention (PCI). Severe coronary artery calcification increases the complexity of PCI and increases the risk of adverse cardiac events. Orbital atherectomy is effective in plaque modification, which facilitates stent delivery and expansion. Whether gender differences exist after orbital atherectomy is unclear. Our analysis retrospectively analyzed 458 consecutive real-world patients (314 males and 144 females) from three centers who underwent orbital atherectomy. The primary endpoint was the major adverse cardiac and cerebrovascular event (MACCE) rate, defined as the composite of death, myocardial infarction (MI), target-vessel revascularization (TVR), and stroke, at 30 days. The primary endpoint of MACCE was low and similar in females and males (0.7% vs 2.9%; P=.14). The individual endpoints of death (0.7% vs 1.6%; P=.43), MI (0.7% vs 1.3%; P=.58), TVR (0% vs 0%; P>.99), and stroke (0% vs 0.3%; P=.50) were low in both groups and did not differ. Angiographic complications were low: perforation (0.8% vs 0.7%; P>.90), dissection (0.8% vs 1.1%; P=.80), and no-reflow (0.8% vs 0.7%; P>.90). Plaque modification with orbital atherectomy was safe and provided similar angiographic and clinical outcomes between females and males. Randomized trials with longer-term follow-up are needed to support our results.

  17. An overview of cancer research in South African academic and research institutions, 2013 - 2014.

    PubMed

    Moodley, Jennifer; Stefan, D Cristina; Sewram, Vikash; Ruff, Paul; Freeman, Melvyn; Asante-Shongwe, Kwanele

    2016-05-10

    Cancer is emerging as a critical public health problem in South Africa (SA). Recognising the importance of research in addressing the cancer burden, the Ministerial Advisory Committee on the Prevention and Control of Cancer (MACC) research working group undertook a review of the current cancer research landscape in SA and related this to the cancer burden. Academic and research institutions in SA were contacted to provide information on the titles of all current and recently completed (2013/2014) cancer research projects. Three MACC research working group members used the project titles to independently classify the projects by type of research (basic, clinical and public health - projects could be classified in more than one category) and disease site. A more detailed classification of projects addressing the five most common cancers diagnosed in males and females in SA was conducted using an adapted Common Scientific Outline (CSO) categorisation. Information was available on 556 cancer research projects. Overall, 301 projects were classified as clinical, 254 as basic science and 71 as public health research. The most common cancers being researched were cancers of the breast (n=95 projects) and cervix (n=43), leukaemia (n=36), non-Hodgkin's lymphoma (n=35) and lung cancer (n=23). Classification of the five most common cancers in males and females in SA, using the adapted CSO categories, showed that the majority of projects related to treatment, with relatively few projects on prevention, survivorship and patient perspectives. Our findings established that there is a dearth of public health cancer research in SA.

  18. Guideline-adherence and perspectives in the acute management of unstable angina - Initial results from the German chest pain unit registry.

    PubMed

    Breuckmann, Frank; Hochadel, Matthias; Darius, Harald; Giannitsis, Evangelos; Münzel, Thomas; Maier, Lars S; Schmitt, Claus; Schumacher, Burghard; Heusch, Gerd; Voigtländer, Thomas; Mudra, Harald; Senges, Jochen

    2015-08-01

    We investigated the current management of unstable angina pectoris (UAP) in certified chest pain units (CPUs) in Germany and focused on the European Society of Cardiology (ESC) guideline-adherence in the timing of invasive strategies or choice of conservative treatment options. More specifically, we analyzed differences in clinical outcome with respect to guideline-adherence. Prospective data from 1400 UAP patients were collected. Analyses of high-risk criteria with indication for invasive management and 3-month clinical outcome data were performed. Guideline-adherence was tested for a primarily conservative strategy as well as for percutaneous coronary intervention (PCI) within <24 and <72h after admission. Overall guideline-conforming management was performed in 38.2%. In UAP patients at risk, undertreatment caused by an insufficient consideration of risk criteria was obvious in 78%. Reciprocally, overtreatment in the absence of adequate risk markers was performed in 27%, whereas a guideline-conforming primarily conservative strategy was chosen in 73% of the low-risk patients. Together, the 3-month major adverse coronary and cerebrovascular events (MACCE) were low (3.6%). Nonetheless, guideline-conforming treatment was even associated with significantly lower MACCE rates (1.6% vs. 4.0%, p<0.05). The data suggest an inadequate adherence to ESC guidelines in nearly two thirds of the patients, particularly in those patients at high to intermediate risk with secondary risk factors, emphasizing the need for further attention to consistent risk profiling in the CPU and its certification process. Copyright © 2014 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.

  19. Event-based criteria in GT-STAF information indices: theory, exploratory diversity analysis and QSPR applications.

    PubMed

    Barigye, S J; Marrero-Ponce, Y; Martínez López, Y; Martínez Santiago, O; Torrens, F; García Domenech, R; Galvez, J

    2013-01-01

    Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resultant F, Shannon's, mutual, conditional and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study is an extension of this notion, in which we introduce other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sach's subgraphs, MACCs, E-state and substructure fingerprints and, finally, Ghose and Crippen atom-types for hydrophobicity and refractivity. Moreover, we define magnitude-based IFIs, introducing the use of the magnitude criterion in the definition of mutual, conditional and joint entropy-based IFIs. We also discuss the use of information-theoretic parameters as a measure of the dissimilarity of codified structural information of molecules. Finally, a comparison of the statistics for QSPR models obtained with the proposed IFIs and DRAGON's molecular descriptors for two physicochemical properties log P and log K of 34 derivatives of 2-furylethylenes demonstrates similar to better predictive ability than the latter.

  20. Mixed Layer Heights Derived from the NASA Langley Research Center Airborne High Spectral Resolution Lidar

    NASA Technical Reports Server (NTRS)

    Scarino, Amy J.; Burton, Sharon P.; Ferrare, Rich A.; Hostetler, Chris A.; Hair, Johnathan W.; Obland, Michael D.; Rogers, Raymond R.; Cook, Anthony L.; Harper, David B.; Fast, Jerome; et al.

    2012-01-01

    The NASA airborne High Spectral Resolution Lidar (HSRL) has been deployed on board the NASA Langley Research Center's B200 aircraft to several locations in North America from 2006 to 2012 to aid in characterizing aerosol properties for over fourteen field missions. Measurements of aerosol extinction (532 nm), backscatter (532 and 1064 nm), and depolarization (532 and 1064 nm) during 349 science flights, many in coordination with other participating research aircraft, satellites, and ground sites, constitute a diverse data set for use in characterizing the spatial and temporal distribution of aerosols, as well as properties and variability of the Mixing Layer (ML) height. We describe the use of the HSRL data collected during these missions for computing ML heights and show how the HSRL data can be used to determine the fraction of aerosol optical thickness within and above the ML, which is important for air quality assessments. We describe the spatial and temporal variations in ML heights found in the diverse locations associated with these experiments. We also describe how the ML heights derived from HSRL have been used to help assess simulations of Planetary Boundary Layer (PBL) derived using various models, including the Weather Research and Forecasting Chemistry (WRF-Chem), NASA GEOS-5 model, and the ECMWF/MACC models.
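
    A common way to derive ML heights from backscatter profiles, and a reasonable stand-in for the retrievals referenced above, is to locate the strongest negative vertical gradient in aerosol backscatter. The sketch below is a simplified illustration of that gradient method, not the HSRL production algorithm.

        import numpy as np

        def ml_height(altitude_m, backscatter):
            """Estimate mixed-layer height as the altitude of the steepest
            decrease in aerosol backscatter (gradient method). Operational
            retrievals (e.g., wavelet covariance) add averaging and QC."""
            grad = np.gradient(backscatter, altitude_m)
            return altitude_m[np.argmin(grad)]   # most negative gradient

        # synthetic profile: well-mixed aerosol below ~1500 m, clean air above
        z = np.linspace(100.0, 4000.0, 200)
        profile = 1.0 / (1.0 + np.exp((z - 1500.0) / 80.0)) + 0.02
        print(f"ML height ~ {ml_height(z, profile):.0f} m")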

  1. Validation of the NCC Code for Staged Transverse Injection and Computations for a RBCC Combustor

    NASA Technical Reports Server (NTRS)

    Ajmani, Kumud; Liu, Nan-Suey

    2005-01-01

    The NCC code was validated for a case involving staged transverse injection into Mach 2 flow behind a rearward-facing step. Comparisons were made with experimental data and with solutions from the FPVortex code. The NCC code was then used to perform computations to study fuel-air mixing for the combustor of a candidate rocket-based combined-cycle engine geometry. Comparisons with a one-dimensional analysis and a three-dimensional code (VULCAN) were performed to assess the qualitative and quantitative performance of the NCC solver.

  2. A supersonic three-dimensional code for flow over blunt bodies: Program documentation and test cases

    NASA Technical Reports Server (NTRS)

    Chaussee, D. S.; Mcmillan, O. J.

    1980-01-01

    The use of a computer code for the calculation of steady, supersonic, three-dimensional, inviscid flow over blunt bodies is illustrated. Input and output are given and explained for two cases: a pointed cone of 20 deg half angle at 15 deg angle of attack in a free stream with M sub infinity = 7, and a cone-ogive-cylinder at 10 deg angle of attack with M sub infinity = 2.86. A source listing of the computer code is provided.

  3. PLASIM: A computer code for simulating charge exchange plasma propagation

    NASA Technical Reports Server (NTRS)

    Robinson, R. S.; Deininger, W. D.; Winder, D. R.; Kaufman, H. R.

    1982-01-01

    The propagation of the charge exchange plasma for an electrostatic ion thruster is crucial in determining the interaction of that plasma with the associated spacecraft. A model that describes this plasma and its propagation is described, together with a computer code based on this model. The structure and calling sequence of the code, named PLASIM, are described. An explanation of the program's input and output is included, together with samples of both. The code is written in ANSI Standard FORTRAN.

  4. Extension, validation and application of the NASCAP code

    NASA Technical Reports Server (NTRS)

    Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.

    1979-01-01

    Numerous extensions were made to the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two-dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite. Shadowing and charging calculations were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.

  5. Calculation of inviscid flow over shuttle-like vehicles at high angles of attack and comparisons with experimental data

    NASA Technical Reports Server (NTRS)

    Weilmuenster, K. J.; Hamilton, H. H., II

    1983-01-01

    A computer code HALIS, designed to compute the three-dimensional flow about shuttle-like configurations at angles of attack greater than 25 deg, is described. Results from HALIS are compared where possible with those from an existing flow field code; such comparisons show excellent agreement. HALIS results are also compared with experimental pressure distributions on shuttle models over a wide range of angles of attack, again with excellent agreement. It is demonstrated that the HALIS code can incorporate equilibrium air chemistry in flow field computations.

  6. Analysis of JSI TRIGA MARK II reactor physical parameters calculated with TRIPOLI and MCNP.

    PubMed

    Henry, R; Tiselj, I; Snoj, L

    2015-03-01

    A new computational model of the JSI TRIGA Mark II research reactor was built for the TRIPOLI computer code and compared with the existing MCNP model. The same modelling assumptions were used in order to isolate differences between the mathematical models of the two Monte Carlo codes. Differences between the TRIPOLI and MCNP predictions of keff were up to 100 pcm. Further validation was performed through analyses of the normalized reaction rates and computations of kinetic parameters for various core configurations. Copyright © 2014 Elsevier Ltd. All rights reserved.
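
    For reference, a keff discrepancy quoted in pcm (per cent mille) is simply the difference in multiplication factors scaled by 10^5. The snippet below shows the arithmetic with hypothetical keff values.

        # hypothetical multiplication factors from two Monte Carlo codes
        k_tripoli = 1.00125
        k_mcnp = 1.00035

        # 1 pcm = 1e-5, so the difference in pcm is the raw delta times 1e5
        delta_pcm = (k_tripoli - k_mcnp) * 1e5
        print(f"delta k = {delta_pcm:.0f} pcm")   # -> 90 pcm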

  7. A comparison of two central difference schemes for solving the Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Maksymiuk, C. M.; Swanson, R. C.; Pulliam, T. H.

    1990-01-01

    Five viscous transonic airfoil cases were computed by two significantly different computational fluid dynamics codes: an explicit finite-volume algorithm with multigrid, and an implicit finite-difference approximate-factorization method with eigenvector diagonalization. Both methods are described in detail, and their performance on the test cases is compared. The codes utilized the same grids, turbulence model, and computer to provide the truest test of the algorithms. The two approaches produce very similar results, which, for attached flows, also agree well with experimental results; however, the explicit code is considerably faster.

  8. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens.

    PubMed

    Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D; Volz, Kerstin

    2017-06-01

    We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Evaluation of the efficiency and fault density of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1993-01-01

    Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking: some check only the finished product, while others allow checking of individual modules and combined sets of modules as well. Considering NASA's reliability requirements, an in-house evaluation of how automatically generated code compares with manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed; in-house verification of this claim is warranted.

  10. Application of a personal computer for the uncoupled vibration analysis of wind turbine blade and counterweight assemblies

    NASA Technical Reports Server (NTRS)

    White, P. R.; Little, R. R.

    1985-01-01

    A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.
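
    The lumped-mass formulation described above leads to a generalized eigenvalue problem K*phi = omega^2*M*phi for the natural frequencies and mode shapes. The sketch below solves a tiny three-mass system of that form; the matrix values are hypothetical placeholders, not the blade or counterweight data.

        import numpy as np
        from scipy.linalg import eigh

        # hypothetical lumped-mass model: three masses (kg) along a blade
        M = np.diag([12.0, 8.0, 4.0])

        # hypothetical symmetric stiffness matrix (N/m) from beam flexibility
        K = np.array([[ 5.0e4, -2.0e4,  0.0   ],
                      [-2.0e4,  3.0e4, -1.0e4],
                      [ 0.0,   -1.0e4,  1.0e4]])

        # generalized eigenproblem: K phi = w^2 M phi
        w2, modes = eigh(K, M)
        freqs_hz = np.sqrt(w2) / (2.0 * np.pi)
        print("natural frequencies (Hz):", np.round(freqs_hz, 2))
        print("mode shapes (columns):")
        print(np.round(modes, 3))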

  11. TERRA: a computer code for simulating the transport of environmentally released radionuclides through agriculture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baes, C.F. III; Sharp, R.D.; Sjoreen, A.L.

    1984-11-01

    TERRA is a computer code which calculates concentrations of radionuclides and ingrowing daughters in surface and root-zone soil, produce and feed, beef, and milk from a given deposition rate at any location in the conterminous United States. The code is fully integrated with seven other computer codes which together comprise a Computerized Radiological Risk Investigation System, CRRIS. Output from either the long range (> 100 km) atmospheric dispersion code RETADD-II or the short range (<80 km) atmospheric dispersion code ANEMOS, in the form of radionuclide air concentrations and ground deposition rates by downwind location, serves as input to TERRA. User-defined deposition rates and air concentrations may also be provided as input to TERRA through use of the PRIMUS computer code. The environmental concentrations of radionuclides predicted by TERRA serve as input to the ANDROS computer code which calculates population and individual intakes, exposures, doses, and risks. TERRA incorporates models to calculate uptake from soil and atmospheric deposition on four groups of produce for human consumption and four groups of livestock feeds. During the environmental transport simulation, intermediate calculations of interception fraction for leafy vegetables, produce directly exposed to atmospherically depositing material, pasture, hay, and silage are made based on location-specific estimates of standing crop biomass. Pasture productivity is estimated by a model which considers the number and types of cattle and sheep, pasture area, and annual production of other forages (hay and silage) at a given location. Calculations are made of the fraction of grain imported from outside the assessment area. TERRA output includes the above calculations and estimated radionuclide concentrations in plant produce, milk, and a beef composite by location.
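
    As a rough illustration of the equilibrium transfer-factor chain that codes of this kind evaluate (deposition to soil, soil to plant, feed to beef and milk), the sketch below strings together generic factors. All parameter names and values are hypothetical placeholders, not TERRA's models or data.

        # hypothetical equilibrium food-chain transfer sketch (not TERRA's models)
        DEPOSITION = 1.0e3        # Bq/m^2, ground deposition of a radionuclide
        SOIL_MASS = 240.0         # kg/m^2, plow-layer soil mass per unit area
        BV = 0.05                 # soil-to-plant concentration factor
        FEED_INTAKE = 12.0        # kg/day, dry feed eaten by cattle
        FF = 0.02                 # feed-to-beef transfer coefficient, day/kg
        FM = 0.01                 # feed-to-milk transfer coefficient, day/L

        c_soil = DEPOSITION / SOIL_MASS      # Bq/kg soil
        c_plant = BV * c_soil                # Bq/kg plant (root uptake only)
        c_beef = FF * FEED_INTAKE * c_plant  # Bq/kg beef at equilibrium
        c_milk = FM * FEED_INTAKE * c_plant  # Bq/L milk at equilibrium
        print(c_soil, c_plant, c_beef, c_milk)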

  12. A survey to identify the clinical coding and classification systems currently in use across Europe.

    PubMed

    de Lusignan, S; Minmagh, C; Kennedy, J; Zeimet, M; Bommezijn, H; Bryant, J

    2001-01-01

    This is a survey to identify the clinical coding systems currently in use across the European Union and the states seeking membership to it. We sought to identify which systems are currently used and to what extent they are subject to local adaptation. Clinical coding should facilitate identifying key medical events in a computerised medical record and aggregating information across groups of records. The emerging new driver is its role as the enabler of the life-long computerised medical record. A prerequisite for this level of functionality is the transfer of information between different computer systems. This transfer can be facilitated either by working on the interoperability problems between disparate systems or by harmonising the underlying data. This paper examines the extent to which the latter has occurred across Europe. The methods used were literature and Internet searches and requests for information via electronic mail to pan-European mailing lists of health informatics professionals. Coding systems are now a de facto part of health information systems across Europe, and relatively few coding systems are in existence: ICD-9 and ICD-10, ICPC, and Read were the most established. However, the local adaptation of these classification systems, whether on a per-country or per-software-manufacturer basis, significantly reduces the ability of meaning coded within patients' computer records to be transferred easily from one medical record system to another. There is no longer any debate as to whether a coding or classification system should be used. Convergence of the different classification systems should be encouraged, and countries and computer manufacturers within the EU should be encouraged to stop making local modifications to coding and classification systems, as this practice risks significantly slowing progress towards easy transfer of records between computer systems.

  13. PIC codes for plasma accelerators on emerging computer architectures (GPUS, Multicore/Manycore CPUS)

    NASA Astrophysics Data System (ADS)

    Vincenti, Henri

    2016-03-01

    The advent of exascale computers will enable 3D simulations of new laser-plasma interaction regimes that were previously out of reach of current petascale computers. However, the paradigm used to write current PIC codes will have to change in order to fully exploit the potential of these new computing architectures. Indeed, achieving exascale computing facilities in the next decade will be a great challenge in terms of energy consumption and will imply hardware developments directly impacting our way of implementing PIC codes. As data movement (from die to network) is by far the most energy-consuming part of an algorithm, future computers will tend to increase memory locality at the hardware level and reduce energy consumption related to data movement by using more and more cores on each compute node (''fat nodes'') with a reduced clock speed to allow for efficient cooling. To compensate for the frequency decrease, CPU vendors are making use of long SIMD instruction registers that are able to process multiple data with one arithmetic operator in one clock cycle. SIMD register length is expected to double every four years. GPUs also have a reduced clock speed per core and can process Multiple Instructions on Multiple Data (MIMD). At the software level, Particle-In-Cell (PIC) codes will thus have to achieve both good memory locality and vectorization (for multicore/manycore CPUs) to take full advantage of these upcoming architectures. In this talk, we present the portable solutions we implemented in our high-performance skeleton PIC code PICSAR to achieve both good memory locality and cache reuse as well as good vectorization on SIMD architectures. We also present the portable solutions used to parallelize the pseudo-spectral quasi-cylindrical code FBPIC on GPUs using the Numba Python compiler.
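
    To make the vectorization point concrete, the sketch below contrasts a scalar per-particle loop with an array-at-a-time (SIMD-friendly) push. It is a generic NumPy illustration, not PICSAR's actual kernels.

        import numpy as np

        def push_scalar(x, v, E, q_over_m, dt):
            """Scalar push: one particle per loop iteration (hard to vectorize)."""
            for i in range(x.shape[0]):
                v[i] += q_over_m * E[i] * dt
                x[i] += v[i] * dt

        def push_vectorized(x, v, E, q_over_m, dt):
            """Array-at-a-time push: maps naturally onto SIMD registers."""
            v += q_over_m * E * dt
            x += v * dt

        n = 10_000
        E = np.full(n, 1.0e3)                  # uniform field, illustration only
        x1, v1 = np.zeros(n), np.zeros(n)
        x2, v2 = np.zeros(n), np.zeros(n)
        push_scalar(x1, v1, E, -1.76e11, 1.0e-12)
        push_vectorized(x2, v2, E, -1.76e11, 1.0e-12)
        assert np.allclose(x1, x2) and np.allclose(v1, v2)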

  14. Some Problems and Solutions in Transferring Ecosystem Simulation Codes to Supercomputers

    NASA Technical Reports Server (NTRS)

    Skiles, J. W.; Schulbach, C. H.

    1994-01-01

    Many computer codes for the simulation of ecological systems have been developed in the last twenty-five years. This development took place initially on main-frame computers, then mini-computers, and more recently, on micro-computers and workstations. Recent recognition of ecosystem science as a High Performance Computing and Communications Program Grand Challenge area emphasizes supercomputers (both parallel and distributed systems) as the next set of tools for ecological simulation. Transferring ecosystem simulation codes to such systems is not a matter of simply compiling and executing existing code on the supercomputer since there are significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers. To more appropriately match the application to the architecture (necessary to achieve reasonable performance), the parallelism (if it exists) of the original application must be exploited. We discuss our work in transferring a general grassland simulation model (developed on a VAX in the FORTRAN computer programming language) to a Cray Y-MP. We show the Cray shared-memory vector-architecture, and discuss our rationale for selecting the Cray. We describe porting the model to the Cray and executing and verifying a baseline version, and we discuss the changes we made to exploit the parallelism in the application and to improve code execution. As a result, the Cray executed the model 30 times faster than the VAX 11/785 and 10 times faster than a Sun 4 workstation. We achieved an additional speed-up of approximately 30 percent over the original Cray run by using the compiler's vectorizing capabilities and the machine's ability to put subroutines and functions "in-line" in the code. With the modifications, the code still runs at only about 5% of the Cray's peak speed because it makes ineffective use of the vector processing capabilities of the Cray. We conclude with a discussion and future plans.

  15. User's manual: Subsonic/supersonic advanced panel pilot code

    NASA Technical Reports Server (NTRS)

    Moran, J.; Tinoco, E. N.; Johnson, F. T.

    1978-01-01

    Sufficient instructions for running the subsonic/supersonic advanced panel pilot code were developed. This software was developed as a vehicle for numerical experimentation and should not be construed to represent a finished production program. The pilot code is based on a higher-order panel method using linearly varying source and quadratically varying doublet distributions for computing both linearized supersonic and subsonic flow over arbitrary wings and bodies. This user's manual contains complete input and output descriptions. A brief description of the method is given, as well as practical instructions for proper configuration modeling. Computed results are also included to demonstrate some of the capabilities of the pilot code. The computer program is written in FORTRAN IV for the SCOPE 3.4.4 operating system of the Ames CDC 7600 computer. The program uses overlay structure and thirteen disk files, and it requires approximately 132000 (octal) central memory words.

  16. Experimental aerothermodynamic research of hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Cleary, Joseph W.

    1987-01-01

    The 2-D and 3-D advanced computer codes being developed for use in the design of hypersonic aircraft such as the National Aero-Space Plane require comparison of the computational results with a broad spectrum of experimental data to fully assess the validity of the codes. This is particularly true for complex flow fields with control surfaces present and for flows with separation, such as leeside flow. Therefore, the objective is to provide the hypersonic experimental database required for validation of advanced computational fluid dynamics (CFD) computer codes and for development of a more thorough understanding of the flow physics necessary for these codes. This is being done by implementing a comprehensive test program for a generic all-body hypersonic aircraft model in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel over a broad range of test conditions to obtain pertinent surface and flowfield data. Results from the flow visualization portion of the investigation are presented.

  17. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.

  19. Additional extensions to the NASCAP computer code, volume 3

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as resistive, with an electron relaxation time comparable to the inverse plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  20. Global Magnetohydrodynamic Simulation Using High Performance FORTRAN on Parallel Computers

    NASA Astrophysics Data System (ADS)

    Ogino, T.

    High Performance Fortran (HPF) is one of the modern and common techniques used to achieve high-performance parallel computation. We have translated a three-dimensional magnetohydrodynamic (MHD) simulation code of the Earth's magnetosphere from VPP Fortran to HPF/JA on the Fujitsu VPP5000/56 vector-parallel supercomputer; the MHD code was fully vectorized and fully parallelized in VPP Fortran. The overall performance and capability of the HPF MHD code were shown to be almost comparable to those of the VPP Fortran version. A three-dimensional global MHD simulation of the Earth's magnetosphere was performed at a speed of over 400 Gflops, with an efficiency of 76.5% on the 56 processors of the VPP5000/56 in vector and parallel computation, permitting comparison with catalog values. We conclude that fluid and MHD codes that are fully vectorized and fully parallelized in VPP Fortran can be translated with relative ease to HPF/JA, and a code in HPF/JA may be expected to perform comparably to the same code written in VPP Fortran.

  1. 77 FR 37091 - Agency Information Collection Activities: Request for Comments for a New Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-20

    ... analysis and design, and computer software design and coding. Given the fact that over $500 million were... acoustic algorithms, computer architecture, and source code that dated to the 1970s. Since that time... 2012. Version 3.0 is an entirely new, state-of-the-art computer program used for predicting noise...

  2. Education:=Coding+Aesthetics; Aesthetic Understanding, Computer Science Education, and Computational Thinking

    ERIC Educational Resources Information Center

    Good, Jonathon; Keenan, Sarah; Mishra, Punya

    2016-01-01

    The popular press is rife with examples of how students in the United States and around the globe are learning to program, make, and tinker. The Hour of Code, maker-education, and similar efforts are advocating that more students be exposed to principles found within computer science. We propose an expansion beyond simply teaching computational…

  3. Numerical, Analytical, Experimental Study of Fluid Dynamic Forces in Seals Volume 6: Description of Scientific CFD Code SCISEAL

    NASA Technical Reports Server (NTRS)

    Athavale, Mahesh; Przekwas, Andrzej

    2004-01-01

    The objectives of the program were to develop computational fluid dynamics (CFD) codes and simpler industrial codes for analyzing and designing advanced seals for air-breathing and space propulsion engines. The CFD code SCISEAL is capable of producing full three-dimensional flow field information for a variety of cylindrical configurations. An implicit multidomain capability allows the division of complex flow domains for optimum use of computational cells. SCISEAL also has the unique capability to produce cross-coupled stiffness and damping coefficients for rotordynamic computations. The industrial codes consist of a series of separate stand-alone modules designed for expeditious parametric analyses and optimization of a wide variety of cylindrical and face seals. Coupled through a Knowledge-Based System (KBS) that provides a user-friendly Graphical User Interface (GUI), the industrial codes are PC-based, running under the OS/2 operating system. These codes were designed to treat film seals in which a clearance exists between the rotating and stationary components. Leakage is inhibited by surface roughness, small but stiff clearance films, and viscous pumping devices. The codes have proven to be a valuable resource for seal development for future air-breathing and space propulsion engines.

  4. Establishment of a Beta Test Center for the NPARC Code at Central State University

    NASA Technical Reports Server (NTRS)

    Okhio, Cyril B.

    1996-01-01

    Central State University has received a supplementary award to purchase computer workstations for the Beta Test Center for the NPARC (National Program for Applications-oriented Research in CFD) computational fluid dynamics code. The computational code has also been acquired for installation on the workstations. The acquisition of this code is an initial step for CSU in joining an alliance composed of NASA, AEDC, the aerospace industry, and academia. A post-doctoral research fellow from a neighboring university will assist the PI in preparing a template for tutorial documents for the Beta Test Center. The major objective of the alliance is to establish a national applications-oriented CFD capability, centered on the NPARC code. By joining the alliance, the Beta Test Center at CSU will allow the PI, as well as undergraduate and post-graduate students, to test the capability of the NPARC code in predicting the physics of aerodynamic/geometric configurations that are of interest to the alliance. Currently, CSU is developing an annual hands-on conference/workshop based upon the experience acquired from running other codes similar to the NPARC code in the first year of this grant.

  5. Microgravity computing codes. User's guide

    NASA Astrophysics Data System (ADS)

    1982-01-01

    Codes used in microgravity experiments to compute fluid parameters and to obtain data graphically are introduced. The computer programs are stored on two diskettes, compatible with the floppy disk drives of the Apple II. Two versions of both disks are available (DOS-2 and DOS-3). The codes are written in BASIC and are structured as interactive programs. Interaction takes place through the keyboard of any Apple II 48K standard system with a single floppy disk drive. The programs are protected against incorrect commands given by the operator. The programs are described step by step, in the same order as the instructions displayed on the monitor. Most of these instructions are shown, with samples of computation and of graphics.

  6. Computer access security code system

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of the subsets which complete the rectangle and/or the parallelepiped whose opposite corners were defined by the first group of code. Once used, subsets are not used again, to absolutely defeat unauthorized access by eavesdropping and the like.
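
    A toy version of the challenge-response geometry described above: the system stores a random character matrix, challenges with two cells sharing neither row nor column, and accepts the two characters that complete the rectangle. The sketch below is a hypothetical illustration of that scheme, not the patented system.

        import random
        import string

        SIZE = 5
        # random matrix of alphanumeric characters (the shared secret)
        matrix = [[random.choice(string.ascii_uppercase + string.digits)
                   for _ in range(SIZE)] for _ in range(SIZE)]

        def challenge():
            """Pick two cells in distinct rows and columns (rectangle corners)."""
            r1, r2 = random.sample(range(SIZE), 2)
            c1, c2 = random.sample(range(SIZE), 2)
            return (r1, c1), (r2, c2)

        def expected_response(cell_a, cell_b):
            """Characters at the opposite corners of the challenge rectangle."""
            (r1, c1), (r2, c2) = cell_a, cell_b
            return matrix[r1][c2], matrix[r2][c1]

        a, b = challenge()
        print("challenge:", matrix[a[0]][a[1]], matrix[b[0]][b[1]])
        print("expected response:", expected_response(a, b))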

  7. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
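
    One of the named techniques, replacing linear searches with binary versions, applies wherever the code scans a sorted table such as an energy grid. The sketch below shows the generic transformation in Python; it is an illustration of the idea, not the ITS FORTRAN.

        import bisect

        energies = [0.01, 0.1, 0.5, 1.0, 2.0, 5.0, 10.0]  # sorted grid (MeV)

        def find_bin_linear(grid, e):
            """O(n) scan, as in the original loops."""
            for i in range(len(grid) - 1):
                if grid[i] <= e < grid[i + 1]:
                    return i
            raise ValueError("energy outside grid")

        def find_bin_binary(grid, e):
            """O(log n) replacement using bisection."""
            i = bisect.bisect_right(grid, e) - 1
            if 0 <= i < len(grid) - 1:
                return i
            raise ValueError("energy outside grid")

        assert find_bin_linear(energies, 0.7) == find_bin_binary(energies, 0.7) == 2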

  8. Quantum computing with Majorana fermion codes

    NASA Astrophysics Data System (ADS)

    Litinski, Daniel; von Oppen, Felix

    2018-05-01

    We establish a unified framework for Majorana-based fault-tolerant quantum computation with Majorana surface codes and Majorana color codes. All logical Clifford gates are implemented with zero-time overhead. This is done by introducing a protocol for Pauli product measurements with tetrons and hexons which only requires local 4-Majorana parity measurements. An analogous protocol is used in the fault-tolerant setting, where tetrons and hexons are replaced by Majorana surface code patches, and parity measurements are replaced by lattice surgery, still only requiring local few-Majorana parity measurements. To this end, we discuss twist defects in Majorana fermion surface codes and adapt the technique of twist-based lattice surgery to fermionic codes. Moreover, we propose a family of codes that we refer to as Majorana color codes, which are obtained by concatenating Majorana surface codes with small Majorana fermion codes. Majorana surface and color codes can be used to decrease the space overhead and stabilizer weight compared to their bosonic counterparts.

  9. Computational Nuclear Physics and Post Hartree-Fock Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lietz, Justin; Sam, Novario; Hjorth-Jensen, M.

    We present a computational approach to infinite nuclear matter employing Hartree-Fock theory, many-body perturbation theory and coupled cluster theory. These lectures are closely linked with those of chapters 9, 10 and 11 and serve as input for the correlation functions employed in Monte Carlo calculations in chapter 9, the in-medium similarity renormalization group theory of dense fermionic systems of chapter 10 and the Green's function approach in chapter 11. We provide extensive code examples and benchmark calculations, thereby allowing the reader to start writing her/his own codes. We start with an object-oriented serial code and end with discussions on strategies for porting the code to present and planned high-performance computing facilities.

  10. Computer codes for thermal analysis of a solid rocket motor nozzle

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1988-01-01

    A number of computer codes are available for performing thermal analysis of solid rocket motor nozzles. The Aerotherm Chemical Equilibrium (ACE) computer program can be used to perform one-dimensional gas expansion to determine the state of the gas at each location of a nozzle. The ACE outputs can be used as input to a computer program called Momentum/Energy Integral Technique (MEIT) for predicting boundary layer development, shear, and heating on the surface of the nozzle. The output from MEIT can be used as input to another computer program called the Aerotherm Charring Material Thermal Response and Ablation Program (CMA). This program is used to calculate the ablation or decomposition response of the nozzle material. A code called Failure Analysis Nonlinear Thermal and Structural Integrated Code (FANTASTIC) is also likely to be used for performing thermal analysis of solid rocket motor nozzles after the program is duly verified. A part of the verification work on FANTASTIC was done by using one- and two-dimensional heat transfer examples with known answers. An attempt was made to prepare input for performing thermal analysis of the CCT nozzle using the FANTASTIC computer code. The CCT nozzle problem will first be solved by using ACE, MEIT, and CMA. The same problem will then be solved using FANTASTIC. The results will then be compared for verification of FANTASTIC.

  11. New Parallel computing framework for radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostin, M.A. (Michigan State U., NSCL); Mokhov, N.V.

    A new parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. The module is largely independent of the radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was integrated with the MARS15 code, and an effort is under way to deploy it in PHITS. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. Several checkpoint files can be merged into one, thus combining the results of several calculations. The framework also corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and networks of workstations, where interference from other users is possible.
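
    Merging Monte Carlo checkpoints amounts to combining tallies weighted by the number of histories in each run. The sketch below shows that arithmetic for mean estimates; it is a hypothetical illustration of such a merge facility, not the framework's C++ implementation.

        def merge_checkpoints(checkpoints):
            """Combine per-run tally means into one result, weighting each
            run by its history count. checkpoints: list of (n_histories,
            mean_tally) pairs. Illustration only."""
            total_n = sum(n for n, _ in checkpoints)
            merged = sum(n * mean for n, mean in checkpoints) / total_n
            return total_n, merged

        # three independent runs of the same problem
        runs = [(1_000_000, 3.42e-4), (500_000, 3.38e-4), (2_000_000, 3.45e-4)]
        print(merge_checkpoints(runs))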

  12. Verification and Validation: High Charge and Energy (HZE) Transport Codes and Future Development

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Mertens, Christopher J.; Blattnig, Steve R.; Clowdsley, Martha S.; Cucinotta, Francis A.; Tweed, John; Heinbockel, John H.; Walker, Steven A.; Nealy, John E.

    2005-01-01

    In the present paper, we give the formalism for further developing a fully three-dimensional HZETRN code using marching procedures, and the development of a new Green's function code is also discussed. The final Green's function code is capable of validation not only in the space environment but also in ground-based laboratories with directed beams of ions of specific energy, characterized with detailed diagnostic particle spectrometer devices. Special emphasis is given to verification of the computational procedures and validation of the resultant computational model using laboratory and spaceflight measurements. Due to historical requirements, two parallel development paths for computational model implementation, using marching procedures and Green's function techniques, are followed. A new version of the HZETRN code capable of simulating HZE ions with either laboratory or space boundary conditions is under development. Validation of computational models at this time is particularly important for President Bush's Initiative to develop infrastructure for human exploration, with the first target demonstration of the Crew Exploration Vehicle (CEV) in low Earth orbit in 2008.

  13. User's manual for CBS3DS, version 1.0

    NASA Astrophysics Data System (ADS)

    Reddy, C. J.; Deshpande, M. D.

    1995-10-01

    CBS3DS is a computer code written in FORTRAN 77 to compute the backscattering radar cross section of cavity backed apertures in infinite ground plane and slots in thick infinite ground plane. CBS3DS implements the hybrid Finite Element Method (FEM) and Method of Moments (MoM) techniques. This code uses the tetrahedral elements, with vector edge basis functions for FEM in the volume of the cavity/slot and the triangular elements with the basis functions for MoM at the apertures. By virtue of FEM, this code can handle any arbitrarily shaped three-dimensional cavities filled with inhomogeneous lossy materials; due to MoM, the apertures can be of any arbitrary shape. The User's Manual is written to make the user acquainted with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computer the code is intended to run.

  14. Program optimizations: The interplay between power, performance, and energy

    DOE PAGES

    Leon, Edgar A.; Karlin, Ian; Grant, Ryan E.; ...

    2016-05-16

    Practical considerations for future supercomputer designs will impose limits on both instantaneous power consumption and total energy consumption. Working within these constraints while providing the maximum possible performance, application developers will need to optimize their code for speed alongside power and energy concerns. This paper analyzes the effectiveness of several code optimizations including loop fusion, data structure transformations, and global allocations. A per-component measurement and analysis of different architectures is performed, enabling the examination of code optimizations on different compute subsystems. Using an explicit hydrodynamics proxy application from the U.S. Department of Energy, LULESH, we show how code optimizations impact different computational phases of the simulation. This provides insight for simulation developers into the best optimizations to use during particular simulation compute phases when optimizing code for future supercomputing platforms. Here, we examine and contrast both x86 and Blue Gene architectures with respect to these optimizations.
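
    Loop fusion, one of the optimizations studied, merges separate passes over the same arrays so each element is processed while it is still in cache. The sketch below shows the generic transformation; it is an illustration of the idea, not LULESH code.

        import numpy as np

        def unfused(x):
            """Two separate loops over the data (extra memory traffic)."""
            y = np.empty_like(x)
            z = np.empty_like(x)
            for i in range(x.size):
                y[i] = 2.0 * x[i]
            for i in range(x.size):
                z[i] = y[i] + 1.0
            return z

        def fused(x):
            """Fused loop: both operations applied per element in one pass."""
            z = np.empty_like(x)
            for i in range(x.size):
                z[i] = 2.0 * x[i] + 1.0
            return z

        x = np.random.rand(1000)
        assert np.allclose(unfused(x), fused(x))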

  15. Computational simulation of progressive fracture in fiber composites

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    Computational methods for simulating and predicting progressive fracture in fiber composite structures are presented. These methods are integrated into a computer code of modular form. The modules include composite mechanics, finite element analysis, and fracture criteria. The code is used to computationally simulate progressive fracture in composite laminates with and without defects. The simulation tracks the fracture progression in terms of modes initiating fracture, damage growth, and imminent global (catastrophic) laminate fracture.

  16. A computer-aided design system geared toward conceptual design in a research environment. [for hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Stack, S. H.

    1981-01-01

    A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.

  17. With or without you: predictive coding and Bayesian inference in the brain

    PubMed Central

    Aitchison, Laurence; Lengyel, Máté

    2018-01-01

    Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic / representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084

  18. Comparison of the LLNL ALE3D and AKTS Thermal Safety Computer Codes for Calculating Times to Explosion in ODTX and STEX Thermal Cookoff Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wemhoff, A P; Burnham, A K

    2006-04-05

    Cross-comparison of the results of two computer codes for the same problem provides a mutual validation of their computational methods. This cross-validation exercise was performed for LLNL's ALE3D code and AKTS's Thermal Safety code, using the thermal ignition of HMX in two standard LLNL cookoff experiments: the One-Dimensional Time to Explosion (ODTX) test and the Scaled Thermal Explosion (STEX) test. The chemical kinetics model used in both codes was the extended Prout-Tompkins model, a relatively new addition to ALE3D. This model was applied using ALE3D's new pseudospecies feature. In addition, an advanced isoconversional kinetic approach was used in the AKTS code. The mathematical constants in the Prout-Tompkins model were calibrated using DSC data from hermetically sealed vessels and the LLNL optimization code Kinetics05. The isoconversional kinetic parameters were optimized using the AKTS Thermokinetics code. We found that the Prout-Tompkins model calculations agree fairly well between the two codes, and the isoconversional kinetic model gives very similar results to the Prout-Tompkins model. We also found that an autocatalytic approach in the beta-delta phase transition model does affect the times to explosion for some conditions, especially STEX-like simulations at ramp rates above 100 C/hr, and further exploration of that effect is warranted.
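
    For orientation, the classic autocatalytic Prout-Tompkins rate law is d(alpha)/dt = k(T) * alpha^m * (1 - alpha)^n with an Arrhenius rate constant k(T). The sketch below integrates it for an isothermal case; the parameters are hypothetical, and this is the textbook form, not the extended model or calibrated constants used by either code.

        import numpy as np
        from scipy.integrate import solve_ivp

        R = 8.314        # J/mol/K
        A = 1.0e12       # 1/s, hypothetical pre-exponential factor
        EA = 1.5e5       # J/mol, hypothetical activation energy
        M, N = 0.5, 1.0  # autocatalysis exponents (hypothetical)

        def prout_tompkins(t, alpha, T):
            """Classic autocatalytic Prout-Tompkins rate at temperature T."""
            a = np.clip(alpha[0], 1.0e-6, 1.0)  # seed avoids the a = 0 fixed point
            k = A * np.exp(-EA / (R * T))
            return [k * a**M * (1.0 - a)**N]

        T_iso = 500.0  # K, hypothetical isothermal hold
        sol = solve_ivp(prout_tompkins, (0.0, 5.0e4), [1.0e-6],
                        args=(T_iso,), rtol=1e-8)
        # crude proxy for time to runaway: time to reach 99% conversion
        idx = np.searchsorted(sol.y[0], 0.99)
        print(f"t(alpha = 0.99) ~ {sol.t[min(idx, sol.t.size - 1)]:.0f} s")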

  19. Ducted-Fan Engine Acoustic Predictions using a Navier-Stokes Code

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Biedron, R. T.; Farassat, F.; Spence, P. L.

    1998-01-01

    A Navier-Stokes computer code is used to predict one of the ducted-fan engine acoustic modes that results from rotor-wake/stator-blade interaction. A patched sliding-zone interface is employed to pass information between the moving rotor row and the stationary stator row. The code produces averaged aerodynamic results downstream of the rotor that agree well with a widely used average-passage code. The acoustic mode of interest is generated successfully by the code and is propagated well upstream of the rotor; temporal and spatial numerical resolution are fine enough such that attenuation of the signal is small. Two acoustic codes are used to find the far-field noise. Near-field propagation is computed by using Eversman's wave envelope code, which is based on a finite-element model. Propagation to the far field is accomplished by using the Kirchhoff formula for moving surfaces with the results of the wave envelope code as input data. Comparisons of measured and computed far-field noise levels show fair agreement in the range of directivity angles where the peak radiation lobes from the inlet are observed. Although only a single acoustic mode is targeted in this study, the main conclusion is a proof-of-concept: Navier-Stokes codes can be used both to generate and propagate rotor/stator acoustic modes forward through an engine, where the results can be coupled to other far-field noise prediction codes.

  20. Relative efficiency and accuracy of two Navier-Stokes codes for simulating attached transonic flow over wings

    NASA Technical Reports Server (NTRS)

    Bonhaus, Daryl L.; Wornom, Stephen F.

    1991-01-01

    Two codes which solve the 3-D Thin Layer Navier-Stokes (TLNS) equations are used to compute the steady state flow for two test cases representing typical finite wings at transonic conditions. Several grids of C-O topology and varying point densities are used to determine the effects of grid refinement. After a description of each code and test case, standards for determining code efficiency and accuracy are defined and applied to determine the relative performance of the two codes in predicting turbulent transonic wing flows. Comparisons of computed surface pressure distributions with experimental data are made.

  1. Computer Simulation of the VASIMR Engine

    NASA Technical Reports Server (NTRS)

    Garrison, David

    2005-01-01

    The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamic simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.

  2. Procedures for the computation of unsteady transonic flows including viscous effects

    NASA Technical Reports Server (NTRS)

    Rizzetta, D. P.

    1982-01-01

    Modifications of the code LTRAN2, developed by Ballhaus and Goorjian, which account for viscous effects in the computation of planar unsteady transonic flows are presented. Two models are considered and their theoretical development and numerical implementation is discussed. Computational examples employing both models are compared with inviscid solutions and with experimental data. Use of the modified code is described.

  3. Expanding Capacity and Promoting Inclusion in Introductory Computer Science: A Focus on Near-Peer Mentor Preparation and Code Review

    ERIC Educational Resources Information Center

    Pon-Barry, Heather; Packard, Becky Wai-Ling; St. John, Audrey

    2017-01-01

    A dilemma within computer science departments is developing sustainable ways to expand capacity within introductory computer science courses while remaining committed to inclusive practices. Training near-peer mentors for peer code review is one solution. This paper describes the preparation of near-peer mentors for their role, with a focus on…

  4. 77 FR 37736 - Agency Information Collection Activities: Request for Comments for a New Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-22

    ... analysis and design, and computer software design and coding. Given the fact that over $500 million were... acoustic algorithms, computer architecture, and source code that dated to the 1970s. Since that time... towards the end of 2012. Version 3.0 is an entirely new, state-of-the-art computer program used for...

  5. 28 CFR 802.4 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Administration COURT SERVICES AND OFFENDER SUPERVISION AGENCY FOR THE DISTRICT OF COLUMBIA DISCLOSURE OF RECORDS... proprietary interest in the information. (e) Computer software means tools by which records are created, stored, and retrieved. Normally, computer software, including source code, object code, and listings of...

  6. Error threshold for color codes and random three-body Ising models.

    PubMed

    Katzgraber, Helmut G; Bombin, H; Martin-Delgado, M A

    2009-08-28

    We study the error threshold of color codes, a class of topological quantum codes that allow a direct implementation of quantum Clifford gates suitable for entanglement distillation, teleportation, and fault-tolerant quantum computation. We map the error-correction process onto a statistical mechanical random three-body Ising model and study its phase diagram via Monte Carlo simulations. The obtained error threshold of p_c = 0.109(2) is very close to that of Kitaev's toric code, showing that enhanced computational capabilities do not necessarily imply lower resistance to noise.

  7. Photoionization and High Density Gas

    NASA Technical Reports Server (NTRS)

    Kallman, T.; Bautista, M.; White, Nicholas E. (Technical Monitor)

    2002-01-01

    We present results of calculations using the XSTAR version 2 computer code. This code is loosely based on the XSTAR v.1 code, which has been available for public use for some time. However, it represents an improvement and update in several major respects, including atomic data, code structure, user interface, and an improved physical description of ionization/excitation. In particular, it is now applicable to high-density situations in which significant excited atomic level populations are likely to occur. We describe the computational techniques and assumptions, and present sample runs with particular emphasis on high-density situations.

  8. The mathematical theory of signal processing and compression-designs

    NASA Astrophysics Data System (ADS)

    Feria, Erlan H.

    2006-05-01

    The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational-time dual of Shannon's mathematical theory of communication, which is also known as source coding. Source coding is concerned with signal source memory space compression, while processor coding deals with signal processor computational time compression. Their combination is named compression-designs, referred to as Conde for short. A compelling and pedagogically appealing diagram will be discussed, highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.

  9. Solution of 3-dimensional time-dependent viscous flows. Part 2: Development of the computer code

    NASA Technical Reports Server (NTRS)

    Weinberg, B. C.; Mcdonald, H.

    1980-01-01

    There is considerable interest in developing a numerical scheme for solving the time-dependent viscous compressible three-dimensional flow equations to aid in the design of helicopter rotors. The development of a computer code to solve a three-dimensional unsteady approximate form of the Navier-Stokes equations employing a linearized block implicit technique in conjunction with a QR operator scheme is described. Results of calculations of several Cartesian test cases are presented. The computer code can be applied to more complex flow fields, such as those encountered on rotating airfoils.

  10. Computer program for prediction of the deposition of material released from fixed and rotary wing aircraft

    NASA Technical Reports Server (NTRS)

    Teske, M. E.

    1984-01-01

    This is a user manual for the computer code AGDISP (AGricultural DISPersal), which has been developed to predict the deposition of material released from fixed and rotary wing aircraft in a single-pass, computationally efficient manner. The formulation of the code is novel in that the mean particle trajectory and the variance about the mean resulting from turbulent fluid fluctuations are simultaneously predicted. The code presently includes the capability of assessing the influence of neutral atmospheric conditions, inviscid wake vortices, particle evaporation, plant canopy, and terrain on the deposition pattern.

  11. HOMAR: A computer code for generating homotopic grids using algebraic relations: User's manual

    NASA Technical Reports Server (NTRS)

    Moitra, Anutosh

    1989-01-01

    A computer code for fast automatic generation of quasi-three-dimensional grid systems for aerospace configurations is described. The code employs a homotopic method to algebraically generate two-dimensional grids in cross-sectional planes, which are stacked to produce a three-dimensional grid system. Implementation of the algebraic equivalents of the homotopic relations for generating body geometries and grids are explained. Procedures for controlling grid orthogonality and distortion are described. Test cases with description and specification of inputs are presented in detail. The FORTRAN computer program and notes on implementation and use are included.
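
    The core algebraic idea in homotopic grid generation is to blend an inner (body) curve into an outer boundary curve with a parameter-controlled interpolation. The sketch below shows a minimal linear homotopy between two closed curves as a generic illustration, not HOMAR's formulation.

        import numpy as np

        def homotopic_grid(inner, outer, n_layers):
            """Blend two closed curves (arrays of shape [n_points, 2]) into a
            2-D grid: layer k is (1 - t_k)*inner + t_k*outer. Codes like HOMAR
            add control of spacing, orthogonality, and distortion."""
            t = np.linspace(0.0, 1.0, n_layers)[:, None, None]
            return (1.0 - t) * inner[None] + t * outer[None]

        theta = np.linspace(0.0, 2.0 * np.pi, 73)
        body = np.stack([0.5 * np.cos(theta), 0.2 * np.sin(theta)], axis=1)
        farfield = np.stack([3.0 * np.cos(theta), 3.0 * np.sin(theta)], axis=1)
        grid = homotopic_grid(body, farfield, n_layers=25)  # shape (25, 73, 2)
        print(grid.shape)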

  12. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

    The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  13. Top-down NOX emissions over European cities from LOTOS-EUROS simulated and OMI observed tropospheric NO2 columns using the Exponentially Modified Gaussian approach

    NASA Astrophysics Data System (ADS)

    Verstraeten, Willem W.; Folkert Boersma, K.; Douros, John; Williams, Jason E.; Eskes, Henk H.; Delcloo, Andy

    2017-04-01

    High nitrogen oxides concentrations (NOX = NO + NO2) at the surface adversely affect human health and ecosystems and play a key role in tropospheric chemistry. Surface NOX emissions drive major processes in regional and global chemistry transport models (CTMs). NOX contributes to the formation of acid rain, acts as an aerosol precursor, and is an important trace gas for the formation of tropospheric ozone (O3). Via tropospheric O3, NOX indirectly affects the production of the hydroxyl radical, which controls the chemical lifetime of key atmospheric pollutants and reactive greenhouse gases. High NOX emissions are mainly observed in polluted regions and are produced by anthropogenic combustion from industrial, traffic, and household activities, typically in large and densely populated urban areas. Accurate NOX inventories are essential, but state-of-the-art emission databases may vary substantially, and uncertainties are high since reported emission factors may differ by an order of magnitude or more. To date, modelled NO2 concentrations and lifetimes have large associated uncertainties due to the highly non-linear small-scale chemistry that occurs in urban areas, uncertainties in reaction rate data, missing nitrogen (N) species and volatile organic compound (VOC) emissions, and incomplete knowledge of nitrogen oxides chemistry. Any overestimation of the chemical lifetime may mask missing NOX chemistry in current CTMs. By simultaneously estimating both the NO2 lifetime and concentrations, for instance by using the Exponentially Modified Gaussian (EMG) approach, a better surface NOX emission flux estimate can be obtained. Here we evaluate whether the EMG methodology can reproduce the emissions input from the tropospheric NO2 columns simulated by the LOTOS-EUROS (Long Term Ozone Simulation-European Ozone Simulation) CTM. We apply the EMG methodology to LOTOS-EUROS-simulated tropospheric NO2 columns for the period April-September 2013 for 21 selected European urban areas under windy conditions (surface wind speeds > 3 m s-1). We then compare the top-down derived surface NOX emissions with the 2011 MACC-III emission inventory, which is used in the LOTOS-EUROS model as input to simulate the NO2 columns. We also apply the EMG methodology to OMI (Ozone Monitoring Instrument) tropospheric NO2 column data, providing observation-based estimates of midday NO2 lifetime and NOX emissions over the 21 European cities in 2013. Results indicate that the top-down derived NOX emissions from LOTOS-EUROS (respectively OMI) are comparable with the MACC-III inventory, with R2 = 0.99 (respectively R2 = 0.79). For St. Petersburg and Moscow, the top-down NOX estimates from 2013 OMI data are biased low compared to the MACC-III inventory, which uses a 2011 NOX emissions update.
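    The EMG fit itself is compact: the along-wind line density of NO2 is modeled as a Gaussian-smoothed exponential decay plus a background, and the fitted e-folding distance divided by the wind speed gives the lifetime. The sketch below illustrates the commonly used functional form on synthetic data; it is not the authors' implementation, and the parameter values, noise level, and wind speed are invented.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

def emg_line_density(x, E, x0, mu, sigma, b):
    """Exponentially modified Gaussian line density.

    E     -- total plume burden (integral of the EMG part)
    x0    -- e-folding decay distance downwind (m)
    mu    -- source location along the wind axis (m)
    sigma -- Gaussian smoothing width (m)
    b     -- constant background line density
    """
    arg = (mu + sigma**2 / x0 - x) / (np.sqrt(2.0) * sigma)
    return (E / (2.0 * x0)) * np.exp((mu - x) / x0
                                     + sigma**2 / (2.0 * x0**2)) \
        * erfc(arg) + b

# Synthetic example: fit a line density, then recover lifetime/emissions.
rng = np.random.default_rng(0)
x = np.linspace(-50e3, 200e3, 120)            # distance along wind (m)
truth = (5e5, 60e3, 0.0, 15e3, 1.0)
y = emg_line_density(x, *truth) + rng.normal(0.0, 0.05, x.size)

p0 = (1e5, 30e3, 0.0, 10e3, 0.0)              # rough initial guess
popt, _ = curve_fit(emg_line_density, x, y, p0=p0)
E_fit, x0_fit = popt[0], popt[1]
wind = 5.0                                    # mean wind speed (m/s), assumed
tau = x0_fit / wind                           # NO2 lifetime (s)
emission = E_fit / tau                        # source strength (burden/s)
```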

  14. Development and application of computational aerothermodynamics flowfield computer codes

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj

    1994-01-01

    Research was performed in the area of computational modeling and application of hypersonic, high-enthalpy, thermo-chemical nonequilibrium flow (aerothermodynamics) problems. A number of computational fluid dynamic (CFD) codes were developed and applied to simulate high-altitude rocket plumes, the Aeroassist Flight Experiment (AFE), hypersonic base flow for planetary probes, the single expansion ramp nozzle (SERN) connected with the National Aerospace Plane, hypersonic drag devices, hypersonic ramp flows, ballistic range models, shock tunnel facility nozzles, transient and steady flows in the shock tunnel facility, arc-jet flows, thermochemical nonequilibrium flows around simple and complex bodies, axisymmetric ionized flows of interest to re-entry, unsteady shock-induced combustion phenomena, high-enthalpy pulsed facility simulations, and unsteady shock boundary layer interactions in shock tunnels. Computational modeling involved developing appropriate numerical schemes for the flows of interest and developing, applying, and validating appropriate thermochemical process models. As part of improving the accuracy of the numerical predictions, adaptive grid algorithms were explored, and a user-friendly, self-adaptive code (SAGE) was developed. Aerothermodynamic flows of interest included energy transfer due to strong radiation, and a significant level of effort was spent developing computational codes for calculating radiation and for radiation modeling. In addition, computational tools were developed and applied to predict the radiative heat flux and spectra that reach the model surface.

  15. GCKP84-general chemical kinetics code for gas-phase flow and batch processes including heat transfer effects

    NASA Technical Reports Server (NTRS)

    Bittker, D. A.; Scullin, V. J.

    1984-01-01

    A general chemical kinetics code is described for complex, homogeneous ideal-gas reactions in any chemical system. The main features of the GCKP84 code are flexibility, convenience, and speed of computation for many different reaction conditions. The code, which replaces the previously published GCKP code, numerically solves the differential equations for complex reactions in a batch system or one-dimensional inviscid flow. It also numerically solves the nonlinear algebraic equations describing the well-stirred reactor. A new state-of-the-art numerical integration method is used for greatly increased speed in handling systems of stiff differential equations. The theory and the computer program, including details of input preparation and a guide to using the code, are given.
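    The core numerical task such a code performs is integration of stiff kinetic rate equations. GCKP84 itself is Fortran; the sketch below shows the same kind of computation in Python on the classic Robertson test problem, with an implicit (BDF) integrator standing in for the code's stiff solver.

```python
from scipy.integrate import solve_ivp

# Robertson's chemical kinetics problem: a classic stiff test system,
# with rate constants spanning nine orders of magnitude.
def robertson(t, y):
    a, b, c = y
    return [-0.04 * a + 1.0e4 * b * c,
            0.04 * a - 1.0e4 * b * c - 3.0e7 * b * b,
            3.0e7 * b * b]

# An implicit stiff method such as BDF handles the widely separated
# time scales; an explicit solver would need prohibitively small steps.
sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0], method="BDF",
                rtol=1e-8, atol=1e-10)
print(sol.y[:, -1])   # species fractions at t = 1e5 s
```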

  16. A Multiple Sphere T-Matrix Fortran Code for Use on Parallel Computer Clusters

    NASA Technical Reports Server (NTRS)

    Mackowski, D. W.; Mishchenko, M. I.

    2011-01-01

    A general-purpose Fortran-90 code for calculation of the electromagnetic scattering and absorption properties of multiple sphere clusters is described. The code can calculate the efficiency factors and scattering matrix elements of the cluster for either fixed or random orientation with respect to the incident beam and for plane wave or localized-approximation Gaussian incident fields. In addition, the code can calculate maps of the electric field both interior and exterior to the spheres. The code is written with message passing interface instructions to enable use on distributed-memory compute clusters, and for such platforms the code can make feasible the calculation of absorption, scattering, and general EM characteristics of systems containing several thousand spheres.
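    The multiple-sphere solution is assembled from single-sphere (Mie) expansions coupled through translation operators. As a self-contained building block, the sketch below computes single-sphere Mie efficiencies using the Bohren-Huffman form of the coefficients; it is a Python illustration, not part of the Fortran-90 code described here, and it assumes a real (non-absorbing) refractive index so all Bessel arguments stay real.

```python
import numpy as np
from scipy.special import spherical_jn, spherical_yn

def mie_efficiencies(x, m):
    """Extinction/scattering efficiencies of a homogeneous sphere.

    x -- size parameter 2*pi*a/lambda; m -- real refractive index.
    """
    N = int(x + 4.0 * x ** (1.0 / 3.0) + 2.0)   # series truncation
    n = np.arange(1, N + 1)
    mx = m * x

    def psi(n, z):   # Riccati-Bessel psi_n(z) = z j_n(z), and derivative
        return z * spherical_jn(n, z), \
               spherical_jn(n, z) + z * spherical_jn(n, z, derivative=True)

    def chi(n, z):   # chi_n(z) = -z y_n(z), and derivative
        return -z * spherical_yn(n, z), \
               -spherical_yn(n, z) - z * spherical_yn(n, z, derivative=True)

    px, dpx = psi(n, x)
    pmx, dpmx = psi(n, mx)
    cx, dcx = chi(n, x)
    xi, dxi = px - 1j * cx, dpx - 1j * dcx      # xi_n = psi_n - i chi_n

    a = (m * pmx * dpx - px * dpmx) / (m * pmx * dxi - xi * dpmx)
    b = (pmx * dpx - m * px * dpmx) / (pmx * dxi - m * xi * dpmx)

    qext = (2.0 / x**2) * np.sum((2 * n + 1) * (a + b).real)
    qsca = (2.0 / x**2) * np.sum((2 * n + 1) * (abs(a)**2 + abs(b)**2))
    return qext, qsca

print(mie_efficiencies(x=5.0, m=1.33))   # e.g. a small water droplet
```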

  17. Computation of neutron fluxes in clusters of fuel pins arranged in hexagonal assemblies (2D and 3D)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prabha, H.; Marleau, G.

    2012-07-01

    For computations of fluxes, we have used Carlvik's method of collision probabilities. This method requires tracking algorithms. An algorithm to compute tracks (in 2D and 3D) has been developed for seven hexagonal geometries with clusters of fuel pins and has been implemented in the NXT module of the code DRAGON. The flux distribution in clusters of pins has been computed by using this code. For testing, the results are compared, when possible, with those of the EXCELT module of the code DRAGON. Tracks generated by the NXT module are plotted using MATLAB; these plots are also presented here. Results are presented with an increasing number of lines to show their convergence. We have numerically computed volumes, surface areas, and the percentage errors in these computations. These results show that 2D results converge faster than 3D results. Accuracy in the computed fluxes up to the second decimal is achieved with fewer lines. (authors)

  18. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Summarized here is the technical effort and computer code developed during the five-year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.
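    A minimal way to convey what "probabilistic buckling" means is direct Monte Carlo sampling of uncertain inputs through a deterministic buckling formula. The sketch below does this for an Euler column; all distributions and loads are invented for illustration, and the actual PSAM codes use more sophisticated probabilistic integration than brute-force sampling.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative uncertain inputs for an Euler column buckling load
# P_cr = pi^2 E I / (K L)^2  (all parameter values are made up).
E = rng.lognormal(mean=np.log(200e9), sigma=0.05, size=n)   # modulus (Pa)
I = rng.lognormal(mean=np.log(8.0e-7), sigma=0.08, size=n)  # moment (m^4)
L = rng.normal(2.0, 0.01, size=n)                           # length (m)
K = 1.0                                                     # pinned-pinned

P_cr = np.pi**2 * E * I / (K * L) ** 2

applied = 350e3                        # applied load (N), assumed
p_fail = np.mean(P_cr < applied)       # probability the column buckles
print(f"median P_cr = {np.median(P_cr)/1e3:.0f} kN, "
      f"P(failure) = {p_fail:.4f}")
```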

  19. ASTEC—the Aarhus STellar Evolution Code

    NASA Astrophysics Data System (ADS)

    Christensen-Dalsgaard, Jørgen

    2008-08-01

    The Aarhus code is the result of a long development, starting in 1974, and still ongoing. A novel feature is the integration of the computation of adiabatic oscillations for specified models as part of the code. It offers substantial flexibility in terms of microphysics and has been carefully tested for the computation of solar models. However, considerable development is still required in the treatment of nuclear reactions, diffusion and convective mixing.

  20. Instrumentation for Verification of Bomb Damage Repair Computer Code.

    DTIC Science & Technology

    1981-09-01

    record the data, a conventional 14-track FM analog tape recorder was retained. The unknown factors of signal duration, test duration, and signal ...Kirtland Air Force Base computer centers for more detailed analyses. In addition to the analog recorder, signal conditioning equipment and amplifiers were...necessary to allow high quality data to be recorded. An Interrange Instrumentation Group (IRIG) code generator/reader placed a coded signal on the tape

  1. Feasibility of a computer-assisted feedback system between dispatch centre and ambulances.

    PubMed

    Lindström, Veronica; Karlsten, Rolf; Falk, Ann-Charlotte; Castrèn, Maaret

    2011-06-01

    The aim of the study was to evaluate the feasibility of a newly developed computer-assisted feedback system between the dispatch centre and ambulances in Stockholm, Sweden. A computer-assisted feedback system based on a Finnish model was designed to fit the Swedish emergency medical system. Feedback codes were identified and divided into three categories: assessment of the patient's primary condition when the ambulance arrives at the scene, no transport by the ambulance, and level of priority. Two ambulances and one emergency medical communication centre (EMCC) in Stockholm participated in the study. A sample of 530 feedback codes sent through the computer-assisted feedback system was reviewed. The information on the ambulance medical records was compared with the feedback codes used, and 240 assignments were further analyzed. The feedback codes sent from ambulance to EMCC were correct in 92% of the assignments. The feedback code most commonly sent to the emergency medical dispatchers was 'agree with the dispatchers' assessment'. In addition, in 160 assignments there was a mismatch between the emergency medical dispatchers' and ambulance nurses' assessments. Our results have shown a high agreement between medical dispatcher and ambulance nurse assessments. The feasibility of the feedback codes seems to be acceptable based on the small margin of error. The computer-assisted feedback system may, when used on a daily basis, make it possible for the medical dispatchers to receive feedback in a structured way. The EMCC organization can directly evaluate any changes in the assessment protocol by structured feedback sent from the ambulance.

  2. A Computer Code for Swirling Turbulent Axisymmetric Recirculating Flows in Practical Isothermal Combustor Geometries

    NASA Technical Reports Server (NTRS)

    Lilley, D. G.; Rhode, D. L.

    1982-01-01

    A primitive-variable (pressure-velocity) finite-difference computer code was developed to predict swirling recirculating inert turbulent flows in axisymmetric combustors in general, and for application to a specific idealized combustion chamber with sudden or gradual expansion. The technique involves a staggered grid system for axial and radial velocities, a line relaxation procedure for efficient solution of the equations, a two-equation k-epsilon turbulence model, a stairstep boundary representation of the expansion flow, and realistic accommodation of swirl effects. A user's manual is presented, dealing with the computational problem and showing how the mathematical basis and computational scheme may be translated into a computer program. A flow chart, FORTRAN IV listing, notes about various subroutines and a user's guide are supplied as an aid to prospective users of the code.
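    A line relaxation procedure sweeps the grid line by line, and along each line the implicit coupling reduces to a tridiagonal system. The standard kernel for that solve is the Thomas algorithm, sketched below in Python on a 1-D diffusion test; this illustrates the kernel only, not the authors' FORTRAN IV implementation.

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system a[i]x[i-1] + b[i]x[i] + c[i]x[i+1] = d[i].

    a[0] and c[-1] are unused. This is the kernel a line-relaxation
    sweep applies along each grid line of the discretized equations.
    """
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1-D diffusion test: -u'' = 1 on a uniform grid, zero Dirichlet ends.
n, h = 50, 1.0 / 51
a = np.full(n, 1.0); b = np.full(n, -2.0); c = np.full(n, 1.0)
d = np.full(n, -h * h)
x = thomas(a, b, c, d)   # parabolic profile, max ~0.125 at midpoint
```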

  3. Parallelization of Finite Element Analysis Codes Using Heterogeneous Distributed Computing

    NASA Technical Reports Server (NTRS)

    Ozguner, Fusun

    1996-01-01

    Performance gains in computer design are quickly consumed as users seek to analyze larger problems to a higher degree of accuracy. Innovative computational methods, such as parallel and distributed computing, seek to multiply the power of existing hardware technology to satisfy the computational demands of large applications. In the early stages of this project, experiments were performed using two large, coarse-grained applications, CSTEM and METCAN. These applications were parallelized on an Intel iPSC/860 hypercube. It was found that the overall speedup was very low, due to large, inherently sequential code segments present in the applications. The overall parallel execution time, T_par, of the application is dominated by these sequential segments. If these segments make up a significant fraction of the overall code, the application will have a poor speedup measure.
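    The speedup ceiling described here is Amdahl's law: if a fraction of the code is inherently sequential, speedup is bounded no matter how many processors are added. A quick illustration (the 90% parallel fraction is an assumed figure, not measured from CSTEM or METCAN):

```python
def amdahl_speedup(parallel_fraction: float, n_procs: int) -> float:
    """Amdahl's law: speedup with a fixed serial fraction of the code."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_procs)

# With 10% inherently sequential code, even unlimited processors
# cannot push the speedup past 10x.
for n in (8, 64, 1024):
    print(n, round(amdahl_speedup(0.90, n), 2))   # 4.71, 8.77, 9.91
```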

  4. Computing Legacy Software Behavior to Understand Functionality and Security Properties: An IBM/370 Demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linger, Richard C; Pleszkoch, Mark G; Prowell, Stacy J

    Organizations maintaining mainframe legacy software can benefit from code modernization and incorporation of security capabilities to address the current threat environment. Oak Ridge National Laboratory is developing the Hyperion system to compute the behavior of software as a means to gain understanding of software functionality and security properties. Computation of functionality is critical to revealing security attributes, which are in fact specialized functional behaviors of software. Oak Ridge is collaborating with MITRE Corporation to conduct a demonstration project to compute the behavior of legacy IBM Assembly Language code for a federal agency. The ultimate goal is to understand functionality and security vulnerabilities as a basis for code modernization. This paper reports on the first phase, to define functional semantics for IBM Assembly instructions and conduct behavior computation experiments.

  5. Parametric bicubic spline and CAD tools for complex targets shape modelling in physical optics radar cross section prediction

    NASA Astrophysics Data System (ADS)

    Delogu, A.; Furini, F.

    1991-09-01

    Increasing interest in radar cross section (RCS) reduction is placing new demands on theoretical, computational, and graphic techniques for calculating the scattering properties of complex targets. In particular, computer codes capable of predicting the RCS of an entire aircraft at high frequency and of achieving RCS control with modest structural changes are becoming of paramount importance in stealth design. A computer code evaluating the RCS of arbitrarily shaped metallic objects that are computer-aided design (CAD) generated, and its validation with measurements carried out using ALENIA RCS test facilities, are presented. The code, based on the physical optics method, is characterized by an efficient integration algorithm with error control, in order to contain the computer time within acceptable limits, and by an accurate parametric representation of the target surface in terms of bicubic splines.

  6. Los Alamos radiation transport code system on desktop computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desk-top radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss the hardware systems on which the codes run and present code performance comparisons for various machines.

  7. Assessment of the impact of the change from manual to automated coding on mortality statistics in Australia.

    PubMed

    McKenzie, Kirsten; Walker, Sue; Tong, Shilu

    It remains unclear whether the change from a manual to an automated coding system (ACS) for deaths has significantly affected the consistency of Australian mortality data. The underlying causes of 34,000 deaths registered in 1997 in Australia were dual coded in ICD-9, manually and by using an automated computer coding program. The diseases most affected by the change from manual coding to ACS were senile/presenile dementia and pneumonia. The ACS code most commonly assigned to deaths given a manual underlying cause of senile dementia was unspecified psychoses (37.2%). Only 12.5% of the deaths coded by ACS as senile dementia were coded the same by manual coders. This study indicates some important differences in mortality rates when comparing mortality data that have been coded manually with those coded using an automated computer coding program. These differences may be related both to different interpretations of ICD coding rules between manual and automated coding and to different co-morbidities or co-existing conditions among demographic groups.

  8. Adaptive neural coding: from biological to behavioral decision-making

    PubMed Central

    Louie, Kenway; Glimcher, Paul W.; Webb, Ryan

    2015-01-01

    Empirical decision-making in diverse species deviates from the predictions of normative choice theory, but why such suboptimal behavior occurs is unknown. Here, we propose that deviations from optimality arise from biological decision mechanisms that have evolved to maximize choice performance within intrinsic biophysical constraints. Sensory processing utilizes specific computations such as divisive normalization to maximize information coding in constrained neural circuits, and recent evidence suggests that analogous computations operate in decision-related brain areas. These adaptive computations implement a relative value code that may explain the characteristic context-dependent nature of behavioral violations of classical normative theory. Examining decision-making at the computational level thus provides a crucial link between the architecture of biological decision circuits and the form of empirical choice behavior. PMID:26722666
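    Divisive normalization has a simple computational form: each response is divided by a term that grows with the summed activity of the pool, yielding the relative value code described above. The sketch below illustrates the canonical form; the gain and semi-saturation values are arbitrary illustrative choices.

```python
import numpy as np

def divisive_normalization(v, sigma=1.0, g=10.0):
    """Relative value code: each input is scaled by the pooled activity.

    v     -- vector of raw input values (e.g., option values)
    sigma -- semi-saturation constant; g -- response gain (illustrative)
    """
    return g * v / (sigma + v.sum())

values = np.array([4.0, 2.0, 1.0])
print(divisive_normalization(values))
# Adding or removing a low-value third option changes the coded gap
# between the two best options: the context effect discussed above.
print(divisive_normalization(values[:2]))
```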

  9. Star adaptation for two algorithms used on serial computers

    NASA Technical Reports Server (NTRS)

    Howser, L. M.; Lambiotte, J. J., Jr.

    1974-01-01

    Two representative algorithms used on a serial computer and presently executed on the Control Data Corporation 6000 computer were adapted to execute efficiently on the Control Data STAR-100 computer. Gaussian elimination for the solution of simultaneous linear equations and the Gauss-Legendre quadrature formula for the approximation of an integral are the two algorithms discussed. A description is given of how the programs were adapted for STAR and why these adaptations were necessary to obtain an efficient STAR program. Some points to consider when adapting an algorithm for STAR are discussed. Program listings of the 6000 version coded in 6000 FORTRAN, the adapted STAR version coded in 6000 FORTRAN, and the STAR version coded in STAR FORTRAN are presented in the appendices.

  10. Variation in clinical coding lists in UK general practice: a barrier to consistent data entry?

    PubMed

    Tai, Tracy Waize; Anandarajah, Sobanna; Dhoul, Neil; de Lusignan, Simon

    2007-01-01

    Routinely collected general practice computer data are used for quality improvement; poor data quality including inconsistent coding can reduce their usefulness. To document the diversity of data entry systems currently in use in UK general practice and highlight possible implications for data quality. General practice volunteers provided screen shots of the clinical coding screen they would use to code a diagnosis or problem title in the clinical consultation. The six clinical conditions examined were: depression, cystitis, type 2 diabetes mellitus, sore throat, tired all the time, and myocardial infarction. We looked at the picking lists generated for these problem titles in EMIS, IPS, GPASS and iSOFT general practice clinical computer systems, using the Triset browser as a gold standard for comparison. A mean of 19.3 codes is offered in the picking list after entering a diagnosis or problem title. EMIS produced the longest picking lists and GPASS the shortest, with a mean number of choices of 35.2 and 12.7, respectively. Approximately three-quarters (73.5%) of codes are diagnoses, one-eighth (12.5%) symptom codes, and the remainder come from a range of Read chapters. There was no readily detectable consistent order in which codes were displayed. Velocity coding, whereby commonly-used codes are placed higher in the picking list, results in variation between practices even where they have the same brand of computer system. Current systems for clinical coding promote diversity rather than consistency of clinical coding. As the UK moves towards an integrated health IT system consistency of coding will become more important. A standardised, limited list of codes for primary care might help address this need.

  11. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    NASA Astrophysics Data System (ADS)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as the CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the changing computer hardware platforms in order to provide fast, accurate and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.

  12. ACON: a multipurpose production controller for plasma physics codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snell, C.

    1983-01-01

    ACON is a BCON controller designed to run large production codes on the CTSS Cray-1 or the LTSS 7600 computers. ACON can also be operated interactively, with input from the user's terminal. The controller can run one code or a sequence of up to ten codes during the same job. Options are available to get and save mass storage files, to perform Historian file updating operations, to compile and load source files, and to send out print and film files. Special features include the ability to retry after mass storage failures, backup options for saving files, startup messages for the various codes, and the ability to reserve specified amounts of computer time after successive code runs. ACON's flexibility and power make it useful for running a number of different production codes.

  13. Quantum Error Correction

    NASA Astrophysics Data System (ADS)

    Lidar, Daniel A.; Brun, Todd A.

    2013-09-01

    Prologue; Preface; Part I. Background: 1. Introduction to decoherence and noise in open quantum systems Daniel Lidar and Todd Brun; 2. Introduction to quantum error correction Dave Bacon; 3. Introduction to decoherence-free subspaces and noiseless subsystems Daniel Lidar; 4. Introduction to quantum dynamical decoupling Lorenza Viola; 5. Introduction to quantum fault tolerance Panos Aliferis; Part II. Generalized Approaches to Quantum Error Correction: 6. Operator quantum error correction David Kribs and David Poulin; 7. Entanglement-assisted quantum error-correcting codes Todd Brun and Min-Hsiu Hsieh; 8. Continuous-time quantum error correction Ognyan Oreshkov; Part III. Advanced Quantum Codes: 9. Quantum convolutional codes Mark Wilde; 10. Non-additive quantum codes Markus Grassl and Martin Rötteler; 11. Iterative quantum coding systems David Poulin; 12. Algebraic quantum coding theory Andreas Klappenecker; 13. Optimization-based quantum error correction Andrew Fletcher; Part IV. Advanced Dynamical Decoupling: 14. High order dynamical decoupling Zhen-Yu Wang and Ren-Bao Liu; 15. Combinatorial approaches to dynamical decoupling Martin Rötteler and Pawel Wocjan; Part V. Alternative Quantum Computation Approaches: 16. Holonomic quantum computation Paolo Zanardi; 17. Fault tolerance for holonomic quantum computation Ognyan Oreshkov, Todd Brun and Daniel Lidar; 18. Fault tolerant measurement-based quantum computing Debbie Leung; Part VI. Topological Methods: 19. Topological codes Héctor Bombín; 20. Fault tolerant topological cluster state quantum computing Austin Fowler and Kovid Goyal; Part VII. Applications and Implementations: 21. Experimental quantum error correction Dave Bacon; 22. Experimental dynamical decoupling Lorenza Viola; 23. Architectures Jacob Taylor; 24. Error correction in quantum communication Mark Wilde; Part VIII. Critical Evaluation of Fault Tolerance: 25. Hamiltonian methods in QEC and fault tolerance Eduardo Novais, Eduardo Mucciolo and Harold Baranger; 26. Critique of fault-tolerant quantum information processing Robert Alicki; References; Index.

  14. Visual Computing Environment

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Putt, Charles W.

    1997-01-01

    The Visual Computing Environment (VCE) is a NASA Lewis Research Center project to develop a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis. The objectives of VCE are to (1) develop a visual computing environment for controlling the execution of individual simulation codes that are running in parallel and are distributed on heterogeneous host machines in a networked environment, (2) develop numerical coupling algorithms for interchanging boundary conditions between codes with arbitrary grid matching and different levels of dimensionality, (3) provide a graphical interface for simulation setup and control, and (4) provide tools for online visualization and plotting. VCE was designed to provide a distributed, object-oriented environment. Mechanisms are provided for creating and manipulating objects, such as grids, boundary conditions, and solution data. This environment includes parallel virtual machine (PVM) for distributed processing. Users can interactively select and couple any set of codes that have been modified to run in a parallel distributed fashion on a cluster of heterogeneous workstations. A scripting facility allows users to dictate the sequence of events that make up the particular simulation.

  15. Efficient Helicopter Aerodynamic and Aeroacoustic Predictions on Parallel Computers

    NASA Technical Reports Server (NTRS)

    Wissink, Andrew M.; Lyrintzis, Anastasios S.; Strawn, Roger C.; Oliker, Leonid; Biswas, Rupak

    1996-01-01

    This paper presents parallel implementations of two codes used in a combined CFD/Kirchhoff methodology to predict the aerodynamic and aeroacoustic properties of helicopters. The rotorcraft Navier-Stokes code, TURNS, computes the aerodynamic flowfield near the helicopter blades, and the Kirchhoff acoustics code computes the noise in the far field, using the TURNS solution as input. The overall parallel strategy adds MPI message passing calls to the existing serial codes to allow for communication between processors. As a result, the total code modifications required for parallel execution are relatively small. The biggest bottleneck in running the TURNS code in parallel comes from the LU-SGS algorithm that solves the implicit system of equations. We use a new hybrid domain decomposition implementation of LU-SGS to obtain good parallel performance on the SP-2. TURNS demonstrates excellent parallel speedups for quasi-steady and unsteady three-dimensional calculations of a helicopter blade in forward flight. The execution rate attained by the code on 114 processors is six times faster than the same cases run on one processor of the Cray C-90. The parallel Kirchhoff code also shows excellent parallel speedups and fast execution rates. As a performance demonstration, unsteady acoustic pressures are computed at 1886 far-field observer locations for a sample acoustics problem. The calculation requires over two hundred hours of CPU time on one C-90 processor but takes only a few hours on 80 processors of the SP2. The resultant far-field acoustic field is analyzed with state-of-the-art audio and video rendering of the propagating acoustic signals.

  16. Computer code for the optimization of performance parameters of mixed explosive formulations.

    PubMed

    Muthurajan, H; Sivabalan, R; Talawar, M B; Venugopalan, S; Gandhe, B R

    2006-08-25

    LOTUSES is a novel computer code which has been developed for the prediction of various thermodynamic properties such as heat of formation, heat of explosion, volume of explosion gaseous products and other related performance parameters. In this paper, we report the LOTUSES (Version 1.4) code, which has been utilized for the optimization of various high explosives in different combinations to obtain the maximum possible velocity of detonation. The LOTUSES (Version 1.4) code varies the composition of mixed explosives automatically in the range of 1-100% and computes the oxygen balance as well as the velocity of detonation for the various compositions in preset steps. Further, the code suggests the compositions for which the least oxygen balance and the higher velocity of detonation could be achieved. Presently, the code can be applied to two-component explosive compositions. The code has been validated with well-known explosives like TNT, HNS, HNF, TATB, RDX, HMX, AN, DNA, CL-20 and TNAZ in different combinations. The new algorithm incorporated in LOTUSES (Version 1.4) enhances the efficiency and makes it a more powerful tool for scientists/researchers working in the field of high-energy materials/hazardous materials.
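    The oxygen balance step such a code automates is a closed-form calculation, and a two-component scan reduces to a mass-weighted average evaluated in preset steps. The sketch below illustrates both; it is not the LOTUSES implementation, and the component values are illustrative.

```python
def oxygen_balance(c, h, n, o, mol_wt):
    """Oxygen balance (%) of a CHNO explosive C_c H_h N_n O_o.

    OB% = -1600/MW * (2c + h/2 - o); negative means oxygen-deficient.
    """
    return -1600.0 / mol_wt * (2 * c + h / 2.0 - o)

# TNT, C7H5N3O6 (MW ~ 227.13 g/mol) -> about -74%.
print(round(oxygen_balance(7, 5, 3, 6, 227.13), 1))

def blend_ob(ob1, ob2, w1):
    """Oxygen balance of a two-component mix; w1 = mass fraction of 1."""
    return w1 * ob1 + (1.0 - w1) * ob2

# Scan blend ratios in 1% steps, e.g. TNT (-74%) with an oxidizer (+20%),
# looking for the ratio whose oxygen balance is closest to zero.
best = min((abs(blend_ob(-74.0, 20.0, w / 100.0)), w) for w in range(101))
print(best)   # (smallest |OB|, mass-percent of component 1)
```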

  17. Upgrades of Two Computer Codes for Analysis of Turbomachinery

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; Liou, Meng-Sing

    2005-01-01

    Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery". The affected programs are: Swift -- a code for three-dimensional (3D) multiblock analysis; and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by the addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include the addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.

  18. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  19. Computer programs for predicting supersonic and hypersonic interference flow fields and heating

    NASA Technical Reports Server (NTRS)

    Morris, D. J.; Keyes, J. W.

    1973-01-01

    This report describes computer codes which calculate two-dimensional shock interference patterns. These codes compute the six types of interference flows as defined by Edney (Aeronaut. Res. Inst. of Sweden FAA Rep. 115). Results include properties of the inviscid flow field and the inviscid-viscous interaction at the surface along with peak pressure and peak heating at the impingement point.

  20. A Model Code of Ethics for the Use of Computers in Education.

    ERIC Educational Resources Information Center

    Shere, Daniel T.; Cannings, Terence R.

    Two Delphi studies were conducted by the Ethics and Equity Committee of the International Council for Computers in Education (ICCE) to obtain the opinions of experts on areas that should be covered by ethical guides for the use of computers in education and for software development, and to develop a model code of ethics for each of these areas.…

  1. Nonlinear Computational Aeroelasticity: Formulations and Solution Algorithms

    DTIC Science & Technology

    2003-03-01

    problem is proposed. Fluid-structure coupling algorithms are then discussed with some emphasis on distributed computing strategies. Numerical results...the structure and the exchange of structure motion to the fluid. The computational fluid dynamics code PFES is our finite element code for the numerical ...unstructured meshes). It was numerically demonstrated [1-3] that EBS can be less diffusive than SUPG [4-6] and the standard Finite Volume schemes

  2. MIADS2 ... an alphanumeric map information assembly and display system for a large computer

    Treesearch

    Elliot L. Amidon

    1966-01-01

    A major improvement and extension of the Map Information Assembly and Display System (MIADS) developed in 1964 is described. Basic principles remain unchanged, but the computer programs have been expanded and rewritten for a large computer, in Fortran IV and MAP languages. The code system is extended from 99 integers to about 2,200 alphanumeric 2-character codes. Hand-...

  3. Parallelisation study of a three-dimensional environmental flow model

    NASA Astrophysics Data System (ADS)

    O'Donncha, Fearghal; Ragnoli, Emanuele; Suits, Frank

    2014-03-01

    There are many simulation codes in the geosciences that are serial and cannot take advantage of the parallel computational resources commonly available today. One model important for our work in coastal ocean current modelling is EFDC, a Fortran 77 code configured for optimal deployment on vector computers. In order to take advantage of our cache-based blade computing system we restructured EFDC from serial to parallel, thereby allowing us to run existing models more quickly, and to simulate larger and more detailed models that were previously impractical. Since the source code for EFDC is extensive and involves detailed computation, it is important to do such a port in a manner that limits changes to the files, while achieving the desired speedup. We describe a parallelisation strategy involving surgical changes to the source files to minimise error-prone alteration of the underlying computations, while allowing load-balanced domain decomposition for efficient execution on a commodity cluster. The use of conjugate gradient posed particular challenges because its implicit, non-local communication hinders standard domain partitioning schemes; a number of techniques are discussed to address this in a feasible, computationally efficient manner. The parallel implementation demonstrates good scalability in combination with a novel domain partitioning scheme that specifically handles the mixed water/land regions commonly found in coastal simulations. The approach presented here represents a practical methodology to rejuvenate legacy code on a commodity blade cluster with reasonable effort; our solution has direct application to other similar codes in the geosciences.
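    The essence of such a port is domain decomposition with ghost (halo) cells refreshed each iteration. The sketch below mimics that pattern serially in Python for a 1-D Jacobi sweep split into two subdomains; it is a schematic of the communication pattern only, not the EFDC parallelisation itself.

```python
import numpy as np

# 1-D Laplace relaxation split into two subdomains, each holding one
# ghost cell at the shared interface. The ghost refresh stands in for
# the per-sweep MPI halo exchange a parallel port performs.
n = 32
u = np.zeros(n + 2)                 # global field incl. boundary cells
u[0], u[-1] = 1.0, 0.0              # fixed Dirichlet end values

left = u[: n // 2 + 2].copy()       # owns cells 1..16, ghost at right end
right = u[n // 2 :].copy()          # owns cells 17..32, ghost at left end

for _ in range(2000):
    left[-1] = right[1]             # "halo exchange": the neighbour's
    right[0] = left[-2]             # owned boundary cell fills my ghost
    left[1:-1] = 0.5 * (left[:-2] + left[2:])      # local Jacobi update
    right[1:-1] = 0.5 * (right[:-2] + right[2:])

# Converges to the straight line u(i) = 1 - i/(n+1) across both halves.
print(left[1], right[-2])           # ~0.970 and ~0.030
```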

  4. Analysis of internal flows relative to the space shuttle main engine

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Cooperative efforts between the Lockheed-Huntsville Computational Mechanics Group and the NASA-MSFC Computational Fluid Dynamics staff have resulted in improved capabilities for numerically simulating incompressible flows generic to the Space Shuttle Main Engine (SSME). A well-established and documented CFD code was obtained, modified, and applied to laminar and turbulent flows of the type occurring in the SSME Hot Gas Manifold. The INS3D code was installed on the NASA-MSFC CRAY-XMP computer system and is currently being used by NASA engineers. Studies to perform a transient analysis of the fuel preburner (FPB) were conducted. The COBRA/TRAC code is recommended for simulating the transient flow of oxygen into the LOX manifold. Property data for modifying the code to represent LOX/GOX flow were collected. The ALFA code was developed and recommended for representing the transient combustion in the preburner. These two codes will couple through the transient boundary conditions to simulate the startup and/or shutdown of the fuel preburner. A study, NAS8-37461, is currently being conducted to implement this modeling effort.

  5. Near Zone: Basic scattering code user's manual with space station applications

    NASA Technical Reports Server (NTRS)

    Marhefka, R. J.; Silvestro, J. W.

    1989-01-01

    The Electromagnetic Code - Basic Scattering Code, Version 3, is a user-oriented computer code to analyze near and far zone patterns of antennas in the presence of scattering structures, to provide coupling between antennas in a complex environment, and to determine radiation hazard calculations at UHF and above. The analysis is based on uniform asymptotic techniques formulated in terms of the Uniform Geometrical Theory of Diffraction (UTD). Complicated structures can be simulated by arbitrarily oriented flat plates and an infinite ground plane that can be perfectly conducting or dielectric. Also, perfectly conducting finite elliptic cylinders, elliptic cone frustum sections, and finite composite ellipsoids can be used to model the superstructure of a ship, the body of a truck, an airplane, a satellite, etc. This manual gives special consideration to space station modeling applications. It is a user manual designed to give an overall view of the operation of the computer code, to instruct a user in how to model structures, and to show the validity of the code by comparing various computed results against measured results and alternative calculations, such as method of moments, whenever available.

  6. ASHMET: A computer code for estimating insolation incident on tilted surfaces

    NASA Technical Reports Server (NTRS)

    Elkin, R. F.; Toelle, R. G.

    1980-01-01

    A computer code, ASHMET, was developed by MSFC to estimate the amount of solar insolation incident on the surfaces of solar collectors. Both tracking and fixed-position collectors were included. Climatological data for 248 U.S. locations are built into the code. The basic methodology used by ASHMET is the ASHRAE clear-day insolation relationships, modified by a clearness index derived from SOLMET-measured solar radiation data for a horizontal surface.
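    The ASHRAE clear-day relationships at the core of such a code are compact. The sketch below evaluates a simplified form for a tilted surface; the A, B, C coefficients are illustrative mid-January values, ground reflection and the tilt view factor are omitted, and ASHMET's SOLMET-derived clearness-index correction is not applied.

```python
import numpy as np

def ashrae_clear_day(beta_deg, theta_deg, A=1230.0, B=0.142, C=0.058):
    """Simplified ASHRAE clear-day irradiance on a tilted surface (W/m^2).

    beta_deg  -- solar altitude angle above the horizon
    theta_deg -- incidence angle between sun and surface normal
    A, B, C   -- monthly coefficients (values here ~ mid-January)
    Returns (direct_normal, incident_on_surface).
    """
    beta = np.radians(beta_deg)
    I_dn = A * np.exp(-B / np.sin(beta))          # direct normal irradiance
    direct = I_dn * max(np.cos(np.radians(theta_deg)), 0.0)
    diffuse = C * I_dn                            # simple isotropic-sky term
    return I_dn, direct + diffuse

# Noon-ish winter sun, collector tilted to face it nearly head-on.
print(ashrae_clear_day(beta_deg=30.0, theta_deg=15.0))   # ~(926, 948)
```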

  7. Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code

    DTIC Science & Technology

    1979-06-01

    dose rate was then integrated to give a number that could be compared with measurements made using thermal luminescent dosimeters (TLD's). Since... A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine. This device was

  8. Computer-Access-Code Matrices

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr.

    1990-01-01

    Authorized users respond to changing challenges with changing passwords. Scheme for controlling access to computers defeats eavesdroppers and "hackers". Based on password system of challenge and password or sign, challenge, and countersign correlated with random alphanumeric codes in matrices of two or more dimensions. Codes stored on floppy disk or plug-in card and changed frequently. For even higher security, matrices of four or more dimensions used, just as cubes compounded into hypercubes in concurrent processing.
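    A toy sketch of the scheme: host and user share a matrix of random codes; the host challenges with a random cell address, and the user countersigns with the code stored there, so the effective password changes on every login. All names and sizes below are invented for illustration.

```python
import secrets
import string

ALPHABET = string.ascii_uppercase + string.digits

def make_matrix(rows=8, cols=8, code_len=4):
    """Shared secret: a matrix of random alphanumeric codes."""
    return [[''.join(secrets.choice(ALPHABET) for _ in range(code_len))
             for _ in range(cols)] for _ in range(rows)]

matrix = make_matrix()     # stored on the floppy disk or plug-in card

# Host side: the challenge is a random cell address, new every session.
row, col = secrets.randbelow(8), secrets.randbelow(8)
print(f"challenge: row {row}, column {col}")

# User side: the countersign is the code stored at the challenged cell.
countersign = matrix[row][col]

# Host verifies against its own copy. An eavesdropper who captures one
# exchange learns a single cell of the matrix, not the whole secret.
assert countersign == matrix[row][col]
```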

  9. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  10. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  11. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  12. Computational methods for coupling microstructural and micromechanical materials response simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  13. A proposed study of multiple scattering through clouds up to 1 THz

    NASA Technical Reports Server (NTRS)

    Gerace, G. C.; Smith, E. K.

    1992-01-01

    A rigorous computation of the electromagnetic field scattered from an atmospheric liquid water cloud is proposed. The recent development of a fast recursive algorithm (Chew algorithm) for computing the fields scattered from numerous scatterers now makes a rigorous computation feasible. A method is presented for adapting this algorithm to a general case where there are an extremely large number of scatterers. It is also proposed to extend a new binary PAM channel coding technique (El-Khamy coding) to multiple levels with non-square pulse shapes. The Chew algorithm can be used to compute the transfer function of a cloud channel. Then the transfer function can be used to design an optimum El-Khamy code. In principle, these concepts can be applied directly to the realistic case of a time-varying cloud (adaptive channel coding and adaptive equalization). A brief review is included of some preliminary work on cloud dispersive effects on digital communication signals and on cloud liquid water spectra and correlations.

  14. An analytical procedure and automated computer code used to design model nozzles which meet MSFC base pressure similarity parameter criteria. [space shuttle

    NASA Technical Reports Server (NTRS)

    Sulyma, P. R.

    1980-01-01

    Fundamental equations and similarity definition and application are described as well as the computational steps of a computer program developed to design model nozzles for wind tunnel tests conducted to define power-on aerodynamic characteristics of the space shuttle over a range of ascent trajectory conditions. The computer code capabilities, a user's guide for the model nozzle design program, and the output format are examined. A program listing is included.

  15. Application of computer generated color graphic techniques to the processing and display of three dimensional fluid dynamic data

    NASA Technical Reports Server (NTRS)

    Anderson, B. H.; Putt, C. W.; Giamati, C. C.

    1981-01-01

    Color coding techniques used in the processing of remote sensing imagery were adapted and applied to the fluid dynamics problems associated with turbofan mixer nozzles. The computer generated color graphics were found to be useful in reconstructing the measured flow field from low resolution experimental data to give more physical meaning to this information and in scanning and interpreting the large volume of computer generated data from the three dimensional viscous computer code used in the analysis.

  16. CARES/LIFE Software Commercialization

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The NASA Lewis Research Center has entered into a letter agreement with BIOSYM Technologies Inc. (now merged with Molecular Simulations Inc. (MSI)). Under this agreement, NASA will provide a developmental copy of the CARES/LIFE computer program to BIOSYM for evaluation. This computer code predicts the time-dependent reliability of a thermomechanically loaded component. BIOSYM will become familiar with CARES/LIFE, provide results of computations useful in validating the code, evaluate it for potential commercialization, and submit suggestions for improvements or extensions to the code or its documentation. If BIOSYM/Molecular Simulations reaches a favorable evaluation of CARES/LIFE, NASA will enter into negotiations for a cooperative agreement with BIOSYM/Molecular Simulations to further develop the code--adding features such as a user-friendly interface and other improvements. This agreement would give BIOSYM intellectual property rights in the modified codes, which they could protect and then commercialize. NASA would provide BIOSYM with the NASA-developed source codes and would agree to cooperate with BIOSYM in further developing the code. In return, NASA would receive certain use rights in the modified CARES/LIFE program. Presently BIOSYM Technologies Inc. has been involved with integration issues concerning its merger with Molecular Simulations Inc., since both companies used to compete in the computational chemistry market, and to some degree, in the materials market. Consequently, evaluation of the CARES/LIFE software is on hold for a month or two while the merger is finalized. Their interest in CARES continues, however, and they expect to get back to the evaluation by early November 1995.

  17. Pretest aerosol code comparisons for LWR aerosol containment tests LA1 and LA2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, A.L.; Wilson, J.H.; Arwood, P.C.

    The Light-Water-Reactor (LWR) Aerosol Containment Experiments (LACE) are being performed in Richland, Washington, at the Hanford Engineering Development Laboratory (HEDL) under the leadership of an international project board and the Electric Power Research Institute. These tests have two objectives: (1) to investigate, at large scale, the inherent aerosol retention behavior in LWR containments under simulated severe accident conditions, and (2) to provide an experimental data base for validating aerosol behavior and thermal-hydraulic computer codes. Aerosol computer-code comparison activities are being coordinated at the Oak Ridge National Laboratory. For each of the six LACE tests, "pretest" calculations (for code-to-code comparisons) and "posttest" calculations (for code-to-test-data comparisons) are being performed. The overall goals of the comparison effort are (1) to provide code users with experience in applying their codes to LWR accident-sequence conditions and (2) to evaluate and improve the code models.

  18. Using concatenated quantum codes for universal fault-tolerant quantum gates.

    PubMed

    Jochym-O'Connor, Tomas; Laflamme, Raymond

    2014-01-10

    We propose a method for universal fault-tolerant quantum computation using concatenated quantum error correcting codes. The concatenation scheme exploits the transversal properties of two different codes, combining them to provide a means to protect against low-weight arbitrary errors. We give the required properties of the error correcting codes to ensure universal fault tolerance and discuss a particular example using the 7-qubit Steane and 15-qubit Reed-Muller codes. Namely, other than computational basis state preparation as required by the DiVincenzo criteria, our scheme requires no special ancillary state preparation to achieve universality, as opposed to schemes such as magic state distillation. We believe that optimizing the codes used in such a scheme could provide a useful alternative to state distillation schemes that exhibit high overhead costs.

  19. Proceedings of the 14th International Conference on the Numerical Simulation of Plasmas

    NASA Astrophysics Data System (ADS)

    Partial Contents are as follows: Numerical Simulations of the Vlasov-Maxwell Equations by Coupled Particle-Finite Element Methods on Unstructured Meshes; Electromagnetic PIC Simulations Using Finite Elements on Unstructured Grids; Modelling Travelling Wave Output Structures with the Particle-in-Cell Code CONDOR; SST--A Single-Slice Particle Simulation Code; Graphical Display and Animation of Data Produced by Electromagnetic, Particle-in-Cell Codes; A Post-Processor for the PEST Code; Gray Scale Rendering of Beam Profile Data; A 2D Electromagnetic PIC Code for Distributed Memory Parallel Computers; 3-D Electromagnetic PIC Simulation on the NRL Connection Machine; Plasma PIC Simulations on MIMD Computers; Vlasov-Maxwell Algorithm for Electromagnetic Plasma Simulation on Distributed Architectures; MHD Boundary Layer Calculation Using the Vortex Method; and Eulerian Codes for Plasma Simulations.

  20. Lean coding machine. Facilities target productivity and job satisfaction with coding automation.

    PubMed

    Rollins, Genna

    2010-07-01

    Facilities are turning to coding automation to help manage the volume of electronic documentation, streamlining workflow, boosting productivity, and increasing job satisfaction. As EHR adoption increases, computer-assisted coding may become a necessity, not an option.

  1. Response Surface Modeling (RSM) Tool Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Andrew; Lawrence, Earl

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object with a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
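
    To make the workflow concrete, the following sketch builds a small drag-coefficient surrogate along the same outline: a Latin Hypercube Sample of the parameter space, an expensive calculation evaluated at each ensemble member, and a Gaussian-process interpolant fit to the results. It is a minimal illustration, not the released tool suite: the stand-in response function, sample size, and hyperparameters are invented, and the MCMC evaluation step is omitted.

      # Sketch: LHS design + Gaussian-process surrogate for a drag coefficient.
      # The TPMC simulation is replaced by a cheap analytic stand-in.
      import numpy as np

      rng = np.random.default_rng(0)

      def latin_hypercube(n, d):
          """n samples in [0,1]^d, one per stratum in each dimension."""
          strata = (np.arange(n) + rng.random((d, n))) / n      # jittered strata
          return np.array([rng.permutation(row) for row in strata]).T

      def tpmc_stand_in(x):
          """Hypothetical smooth response standing in for a TPMC drag calculation."""
          return 2.2 + 0.5 * np.sin(2 * np.pi * x[:, 0]) + 0.3 * x[:, 1] ** 2

      def rbf(a, b, length=0.2):
          """Squared-exponential covariance between two point sets."""
          d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
          return np.exp(-0.5 * d2 / length ** 2)

      # 1) design of experiments and "simulations" (the paper uses 1,000 members)
      X = latin_hypercube(200, 2)
      y = tpmc_stand_in(X)

      # 2) zero-mean Gaussian-process fit with fixed hyperparameters
      K = rbf(X, X) + 1e-8 * np.eye(len(X))
      alpha = np.linalg.solve(K, y - y.mean())

      def predict(x_new):
          """Interpolated drag coefficient at new parameter settings."""
          return y.mean() + rbf(x_new, X) @ alpha

      print(predict(np.array([[0.25, 0.5]])))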

  2. Scheduling Operations for Massive Heterogeneous Clusters

    NASA Technical Reports Server (NTRS)

    Humphrey, John; Spagnoli, Kyle

    2013-01-01

    High-performance computing (HPC) programming has become increasingly difficult with the advent of hybrid supercomputers consisting of multicore CPUs and accelerator boards such as the GPU. Manual tuning of software to achieve high performance on this type of machine has been performed by programmers. This is needlessly difficult and prone to being invalidated by new hardware, new software, or changes in the underlying code. A system was developed for task-based representation of programs, which when coupled with a scheduler and runtime system, allows for many benefits, including higher performance and utilization of computational resources, easier programming and porting, and adaptations of code during runtime. The system consists of a method of representing computer algorithms as a series of data-dependent tasks. The series forms a graph, which can be scheduled for execution on many nodes of a supercomputer efficiently by a computer algorithm. The schedule is executed by a dispatch component, which is tailored to understand all of the hardware types that may be available within the system. The scheduler is informed by a cluster mapping tool, which generates a topology of available resources and their strengths and communication costs. Software is decoupled from its hardware, which aids in porting to future architectures. A computer algorithm schedules all operations, which for systems of high complexity (i.e., most NASA codes), cannot be performed optimally by a human. The system aids in reducing repetitive code, such as communication code, and aids in the reduction of redundant code across projects. It adds new features to code automatically, such as recovering from a lost node or the ability to modify the code while running. In this project, the innovators at the time of this reporting intend to develop two distinct technologies that build upon each other and both of which serve as building blocks for more efficient HPC usage. First is the scheduling and dynamic execution framework, and the second is scalable linear algebra libraries that are built directly on the former.
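
    The core idea, a graph of data-dependent tasks dispatched to heterogeneous workers, can be sketched briefly. The tasks, costs, and greedy earliest-finish policy below are invented for illustration and are far simpler than the system described above:

      # Sketch: dispatch a task DAG to CPU/GPU workers, earliest-finish-first.
      # task -> (cost on CPU, cost on GPU), and its dependencies
      tasks = {"load": (2.0, 2.0), "fft": (8.0, 1.5), "filter": (3.0, 0.8), "store": (1.0, 1.0)}
      deps = {"load": [], "fft": ["load"], "filter": ["load"], "store": ["fft", "filter"]}

      workers = [("cpu0", 0), ("gpu0", 1)]           # (name, index into cost tuple)
      free_at = {name: 0.0 for name, _ in workers}   # time each worker becomes idle
      done_at = {}

      pending = dict(deps)
      while pending:
          # tasks whose dependencies have all completed
          ready = [t for t, d in pending.items() if all(x in done_at for x in d)]
          for t in sorted(ready):
              earliest = max((done_at[x] for x in pending[t]), default=0.0)
              # greedy choice: the worker that would finish this task soonest
              name, kind = min(workers, key=lambda w: max(free_at[w[0]], earliest) + tasks[t][w[1]])
              start = max(free_at[name], earliest)
              done_at[t] = start + tasks[t][kind]
              free_at[name] = done_at[t]
              print(f"{t:7s} on {name}: {start:5.1f} -> {done_at[t]:5.1f}")
              del pending[t]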

  3. Computer code for single-point thermodynamic analysis of hydrogen/oxygen expander-cycle rocket engines

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.; Jones, Scott M.

    1991-01-01

    This analysis and this computer code apply to full, split, and dual expander cycles. Heat regeneration from the turbine exhaust to the pump exhaust is allowed. The combustion process is modeled as one of chemical equilibrium in an infinite-area or a finite-area combustor. Gas composition in the nozzle may be either equilibrium or frozen during expansion. This report, which serves as a user's guide for the computer code, describes the system, the analysis methodology, and the program input and output. Sample calculations are included to show the effects of key variables such as nozzle area ratio and oxidizer-to-fuel mass ratio.
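
    As a minimal, hedged illustration of such a single-point calculation, the sketch below replaces the report's equilibrium-chemistry model with a frozen-composition, constant-gamma isentropic expansion: it solves the area-Mach relation for a given nozzle area ratio and reports an ideal specific impulse. All input values are illustrative, not outputs of the NASA code.

      # Sketch: isentropic nozzle expansion at fixed gamma (frozen composition).
      import math

      gamma = 1.20        # assumed frozen ratio of specific heats
      R = 520.0           # J/(kg*K), assumed gas constant of the combustion products
      Tc = 3400.0         # K, chamber temperature (illustrative)
      area_ratio = 100.0  # nozzle exit-to-throat area ratio
      g0 = 9.80665        # m/s^2

      def area_mach(M):
          """A/A* for isentropic flow at Mach number M."""
          t = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * M * M)
          return t ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))) / M

      # bisection for the supersonic root of area_mach(M) = area_ratio
      lo, hi = 1.0001, 50.0
      for _ in range(100):
          mid = 0.5 * (lo + hi)
          if area_mach(mid) < area_ratio:
              lo = mid
          else:
              hi = mid
      Me = 0.5 * (lo + hi)

      Te = Tc / (1.0 + 0.5 * (gamma - 1.0) * Me * Me)   # exit static temperature
      ve = Me * math.sqrt(gamma * R * Te)               # exit velocity
      print(f"exit Mach {Me:.2f}, exit T {Te:.0f} K, ideal Isp (momentum term) {ve / g0:.0f} s")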

  4. Development of a cryogenic mixed fluid J-T cooling computer code, 'JTMIX'

    NASA Technical Reports Server (NTRS)

    Jones, Jack A.

    1991-01-01

    An initial study was performed for analyzing and predicting the temperatures and cooling capacities when mixtures of fluids are used in Joule-Thomson coolers and in heat pipes. A computer code, JTMIX, was developed for mixed gas J-T analysis for any fluid combination of neon, nitrogen, various hydrocarbons, argon, oxygen, carbon monoxide, carbon dioxide, and hydrogen sulfide. When used in conjunction with the NIST computer code, DDMIX, it has accurately predicted order-of-magnitude increases in J-T cooling capacities when various hydrocarbons are added to nitrogen, and it predicts nitrogen normal boiling point depressions to as low as 60 K when neon is added.
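
    The physics JTMIX exploits can be illustrated with a far cruder property model than DDMIX: the van der Waals estimate of the Joule-Thomson coefficient, mu_JT ~ (2a/(R*T) - b)/cp, which already captures cooling below an inversion temperature. The constants below are for nitrogen, and the model is a rough sketch, not a substitute for the NIST property code.

      # Sketch: van der Waals Joule-Thomson coefficient for nitrogen.
      R = 8.314        # J/(mol*K)
      a = 0.1370       # Pa*m^6/mol^2, van der Waals 'a' for N2
      b = 3.87e-5      # m^3/mol, van der Waals 'b' for N2
      cp = 29.1        # J/(mol*K), ideal-gas heat capacity of N2

      def mu_jt(T):
          """Approximate Joule-Thomson coefficient in K/Pa (positive means cooling)."""
          return (2.0 * a / (R * T) - b) / cp

      for T in (100.0, 300.0, 600.0, 900.0):
          print(f"T = {T:5.0f} K   mu_JT = {mu_jt(T) * 1e5:+.3f} K/bar")

      # cooling is only possible below the (van der Waals) inversion temperature
      print(f"inversion temperature ~ {2.0 * a / (R * b):.0f} K")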

  5. Particle Hydrodynamics with Material Strength for Multi-Layer Orbital Debris Shield Design

    NASA Technical Reports Server (NTRS)

    Fahrenthold, Eric P.

    1999-01-01

    Three dimensional simulation of oblique hypervelocity impact on orbital debris shielding places extreme demands on computer resources. Research to date has shown that particle models provide the most accurate and efficient means for computer simulation of shield design problems. In order to employ a particle based modeling approach to the wall plate impact portion of the shield design problem, it is essential that particle codes be augmented to represent strength effects. This report describes augmentation of a Lagrangian particle hydrodynamics code developed by the principal investigator, to include strength effects, allowing for the entire shield impact problem to be represented using a single computer code.

  6. Laser Signature Prediction Using The VALUE Computer Program

    NASA Astrophysics Data System (ADS)

    Akerman, Alexander; Hoffman, George A.; Patton, Ronald

    1989-09-01

    A variety of enhancements are being made to the 1976-vintage LASERX computer code. These include:
      - Surface characterization with BRDF tabular data
      - Specular reflection from transparent surfaces
      - Generation of glint direction maps
      - Generation of relative range imagery
      - Interface to the LOWTRAN atmospheric transmission code
      - Interface to the LEOPS laser sensor code
      - User-friendly menu prompting for easy setup
    Versions of VALUE have been written for both VAX/VMS and PC/DOS computer environments. Outputs have also been revised to be user friendly and include tables, plots, and images for (1) intensity, (2) cross section, (3) reflectance, (4) relative range, (5) region type, and (6) silhouette.

  7. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the national HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high-performance computing platforms on which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure that numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  8. Experimental and analytical comparison of flowfields in a 110 N (25 lbf) H2/O2 rocket

    NASA Technical Reports Server (NTRS)

    Reed, Brian D.; Penko, Paul F.; Schneider, Steven J.; Kim, Suk C.

    1991-01-01

    A gaseous hydrogen/gaseous oxygen 110 N (25 lbf) rocket was examined through the RPLUS code using the full Navier-Stokes equations with finite rate chemistry. Performance tests were conducted on the rocket in an altitude test facility. Preliminary parametric analyses were performed for a range of mixture ratios and fuel-film-cooling percentages. It is shown that the computed values of specific impulse and characteristic exhaust velocity follow the trend of the experimental data. Specific impulse computed by the code is lower than the comparable test values by about two to three percent. The computed characteristic exhaust velocity values are lower than the comparable test values by three to four percent. Thrust coefficients computed by the code are found to be within two percent of the measured values. It is concluded that the discrepancy between computed and experimental performance values could not be attributed to experimental uncertainty.

  9. Reliability model of a monopropellant auxiliary propulsion system

    NASA Technical Reports Server (NTRS)

    Greenberg, J. S.

    1971-01-01

    A mathematical model and an associated computer code have been developed to compute the reliability of a monopropellant blowdown hydrazine spacecraft auxiliary propulsion system as a function of time. The propulsion system is used to adjust or modify the spacecraft orbit over an extended period of time. The multiple orbit corrections are the objectives that the auxiliary propulsion system is designed to achieve, so the reliability model computes the probability of successfully accomplishing each of the desired orbit corrections. To accomplish this, the reliability model interfaces with a computer code that models the performance of a blowdown (unregulated) monopropellant auxiliary propulsion system. That code acts as a performance model and as such gives an accurate time history of the system operating parameters. The basic timing and status information is passed to and utilized by the reliability model, which establishes the probability of successfully accomplishing the orbit corrections.
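
    As a hedged sketch of the bookkeeping such a model performs (not the actual NASA model), the series components can be assigned constant failure rates, giving a system reliability R(t) = exp(-lambda*t), with an additional per-actuation success probability applied at each burn. All rates and burn times below are assumed for illustration.

      # Sketch: probability of completing each orbit correction in sequence.
      import math

      lam = {"thruster": 2.0e-6, "tank": 0.5e-6, "valve": 1.0e-6}  # failures/hour (assumed)
      lam_total = sum(lam.values())          # series system: rates add
      p_cycle = 1.0e-4                       # assumed per-actuation failure probability

      burn_times_h = [500, 2000, 5000, 9000]  # hypothetical orbit-correction epochs

      for k, t in enumerate(burn_times_h, start=1):
          r = math.exp(-lam_total * t) * (1.0 - p_cycle) ** k
          print(f"P(correction {k} achieved at t = {t:5d} h) = {r:.4f}")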

  10. A Computational Study of an Oscillating VR-12 Airfoil with a Gurney Flap

    NASA Technical Reports Server (NTRS)

    Rhee, Myung

    2004-01-01

    Computations of the flow over an oscillating airfoil with a Gurney flap are performed using a Reynolds-averaged Navier-Stokes code and compared with recent experimental data. The experimental results were generated for several sizes of Gurney flap; the computations focus mainly on one configuration. The baseline airfoil without a Gurney flap is computed and compared with the experiments in both steady and unsteady cases for initial testing of the code's performance. The calculations are carried out with different turbulence models, and the effects of grid refinement are examined in both steady and unsteady cases, in addition to an assessment of solver effects. Comparisons of steady lift and drag indicate that the code is reasonably accurate for attached flow at steady conditions but largely overpredicts the lift and underpredicts the drag in higher-angle steady flow.

  11. Development of a thermal and structural analysis procedure for cooled radial turbines

    NASA Technical Reports Server (NTRS)

    Kumar, Ganesh N.; Deanna, Russell G.

    1988-01-01

    A procedure for computing the rotor temperature and stress distributions in a cooled radial turbine is considered. Existing codes for modeling the external mainstream flow and the internal cooling flow are used to compute boundary conditions for the heat transfer and stress analyses. An inviscid, quasi-three-dimensional code computes the external free-stream velocity. The external velocity is then used in a boundary layer analysis to compute the external heat transfer coefficients. Coolant temperatures are computed by a viscous three-dimensional internal flow code that solves the momentum and energy equations. These boundary conditions are input to a three-dimensional heat conduction code for the calculation of rotor temperatures. The rotor stress distribution may then be determined for the given thermal, pressure, and centrifugal loading. The procedure is applied to a cooled radial turbine that will be tested at the NASA Lewis Research Center. Representative results are given.
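
    The final conduction step can be illustrated with a one-dimensional stand-in: a steady conduction solve across the blade wall, with the hot-gas and coolant heat transfer coefficients entering as convective boundary conditions in the role of the quantities supplied by the external and internal flow codes. The geometry and coefficients below are assumed, not taken from the report.

      # Sketch: steady 1-D conduction through a wall with convective boundaries.
      import numpy as np

      k = 20.0         # W/(m*K), metal conductivity
      L = 2.0e-3       # m, wall thickness
      n = 51           # grid points
      h_gas, T_gas = 3000.0, 1400.0    # external coefficient and gas temperature
      h_cool, T_cool = 5000.0, 600.0   # internal coefficient and coolant temperature

      dx = L / (n - 1)
      A = np.zeros((n, n)); rhs = np.zeros(n)
      for i in range(1, n - 1):        # interior nodes: d2T/dx2 = 0
          A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
      # flux balance at each surface: h*(T_fluid - T_wall) = -k*dT/dx
      A[0, 0] = k / dx + h_gas;    A[0, 1] = -k / dx;    rhs[0] = h_gas * T_gas
      A[-1, -1] = k / dx + h_cool; A[-1, -2] = -k / dx;  rhs[-1] = h_cool * T_cool

      T = np.linalg.solve(A, rhs)
      print(f"gas-side wall T = {T[0]:.0f} K, coolant-side wall T = {T[-1]:.0f} K")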

  12. A FORTRAN code for the calculation of probe volume geometry changes in a laser anemometry system caused by window refraction

    NASA Technical Reports Server (NTRS)

    Owen, Albert K.

    1987-01-01

    A computer code was written which utilizes ray tracing techniques to predict the changes in position and geometry of a laser Doppler velocimeter probe volume resulting from refraction effects. The code predicts the position change, changes in beam crossing angle, and the amount of uncrossing that occur when the beams traverse a region with a changed index of refraction, such as a glass window. The code calculates the changes for flat plate, cylinder, general axisymmetric and general surface windows and is currently operational on a VAX 8600 computer system.
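
    For a flat window the underlying geometry reduces to Snell's law and the classical flat-plate displacement d = t*sin(theta_i - theta_r)/cos(theta_r). The planar sketch below (not the original FORTRAN) shows how a tilted window shifts the two beams of a crossing pair by different amounts, which is what displaces, and can uncross, the probe volume. All numbers are illustrative.

      # Sketch: lateral beam displacement through a flat window, via Snell's law.
      import math

      n_air, n_glass = 1.0, 1.5
      t = 0.010                        # m, window thickness
      half_angle = math.radians(2.5)   # beam half-crossing angle

      def lateral_shift(theta_i):
          """Displacement of a beam after a flat plate, perpendicular to the beam."""
          theta_r = math.asin(n_air * math.sin(theta_i) / n_glass)
          return t * math.sin(theta_i - theta_r) / math.cos(theta_r)

      for tilt_deg in (0.0, 5.0, 10.0):
          tilt = math.radians(tilt_deg)
          s1 = lateral_shift(tilt + half_angle)   # the two beams meet the window
          s2 = lateral_shift(tilt - half_angle)   # at tilt +/- half_angle
          # equal-and-opposite shifts at zero tilt move the crossing point along
          # the axis; unequal magnitudes at nonzero tilt distort and uncross it
          print(f"tilt {tilt_deg:4.1f} deg: beam shifts {s1 * 1e3:+.3f} mm and {s2 * 1e3:+.3f} mm")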

  13. Predicting radiative heat transfer in thermochemical nonequilibrium flow fields. Theory and user's manual for the LORAN code

    NASA Technical Reports Server (NTRS)

    Chambers, Lin Hartung

    1994-01-01

    The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.

  14. BRYNTRN: A baryon transport model

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Townsend, Lawrence W.; Nealy, John E.; Chun, Sang Y.; Hong, B. S.; Buck, Warren W.; Lamkin, S. L.; Ganapol, Barry D.; Khan, Ferdous; Cucinotta, Francis A.

    1989-01-01

    The development of an interaction data base and a numerical solution to the transport of baryons through an arbitrary shield material based on a straight ahead approximation of the Boltzmann equation are described. The code is most accurate for continuous energy boundary values, but gives reasonable results for discrete spectra at the boundary using even a relatively coarse energy grid (30 points) and large spatial increments (1 cm in H2O). The resulting computer code is self-contained, efficient and ready to use. The code requires only a very small fraction of the computer resources required for Monte Carlo codes.
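
    A toy version of the straight-ahead approximation is a one-dimensional marching solve in which a primary flux attenuates and feeds a secondary field. The single-group cross sections and production fraction below are invented for the sketch; the real code transports coupled baryon spectra over an energy grid.

      # Sketch: straight-ahead attenuation with secondary-particle build-up.
      import math

      sigma_p = 0.08    # 1/cm, assumed removal cross section for primaries
      sigma_s = 0.05    # 1/cm, assumed removal cross section for secondaries
      frac = 0.6        # assumed fraction of removed primaries yielding a secondary

      dx, depth = 1.0, 30.0            # 1-cm steps, as in the abstract's H2O example
      phi_p, phi_s = 1.0, 0.0          # boundary values at the surface

      for i in range(int(depth / dx)):
          removed = phi_p * (1.0 - math.exp(-sigma_p * dx))
          phi_p -= removed                                          # primary attenuation
          phi_s = phi_s * math.exp(-sigma_s * dx) + frac * removed  # secondary build-up
          if (i + 1) % 10 == 0:
              print(f"x = {(i + 1) * dx:4.0f} cm  primaries {phi_p:.3f}  secondaries {phi_s:.3f}")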

  15. Applying graphics user interfaces to group technology classification and coding at the Boeing Aerospace Company

    NASA Astrophysics Data System (ADS)

    Ness, P. H.; Jacobson, H.

    1984-10-01

    The thrust of 'group technology' is the exploitation of similarities in component design and manufacturing process plans to achieve assembly-line-flow cost efficiencies for small-batch production. The systematic method devised for identifying similarities in component geometry and processing steps is a coding and classification scheme implemented on interactive CAD/CAM systems. The scheme takes advantage of significant increases in computer processing power, allowing rapid searches and retrievals on the basis of a 30-digit code together with user-friendly computer graphics.
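
    The retrieval idea is simple to sketch: each part carries a fixed-length digit code whose leading digits encode the geometry family, so similar parts fall out of a prefix search. The 5-digit codes and part names below are invented stand-ins for the 30-digit scheme.

      # Sketch: group-technology retrieval as a classification-code prefix search.
      parts = {
          "bracket-07": "11203",
          "bracket-12": "11219",
          "shaft-03":   "20418",
          "housing-44": "11477",
      }

      def similar(prefix):
          """Part names whose classification code starts with the given digits."""
          return sorted(name for name, code in parts.items() if code.startswith(prefix))

      print(similar("112"))   # same geometry family: ['bracket-07', 'bracket-12']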

  16. Safety and effectiveness of the INVATEC MO.MA proximal cerebral protection device during carotid artery stenting: results from the ARMOUR pivotal trial.

    PubMed

    Ansel, Gary M; Hopkins, L Nelson; Jaff, Michael R; Rubino, Paolo; Bacharach, J Michael; Scheinert, Dierk; Myla, Subbarao; Das, Tony; Cremonesi, Alberto

    2010-07-01

    The multicenter ARMOUR (ProximAl PRotection with the MO.MA Device DUring CaRotid Stenting) trial evaluated the 30-day safety and effectiveness of the MO.MA Proximal Cerebral Protection Device (Invatec, Roncadelle, Italy) utilized to treat high surgical risk patients undergoing carotid artery stenting (CAS). Distal embolic protection devices (EPDs) have traditionally been utilized during CAS. The MO.MA device acts as a balloon occlusion "endovascular clamping" system to achieve cerebral protection prior to crossing the carotid stenosis. This prospective registry enrolled 262 subjects, 37 roll-in and 225 pivotal subjects evaluated with intention to treat (ITT), from September 2007 to February 2009. Subjects underwent CAS using the MO.MA device. The primary endpoint, myocardial infarction, stroke, or death through 30 days (30-day major adverse cardiac and cerebrovascular events [MACCE]), was compared to a performance goal of 13% derived from trials utilizing distal EPDs. For the ITT population, the mean age was 74.7 years, with 66.7% of the cohort being male. Symptomatic patients comprised 15.1% and 28.9% were octogenarians. Device success was 98.2% and procedural success was 93.2%. The 30-day MACCE rate was 2.7% [95% CI (1.0-5.8%)] with a 30-day major stroke rate of 0.9%. No symptomatic patient suffered a stroke during this trial. The ARMOUR trial demonstrated that the MO.MA Proximal Cerebral Protection Device is safe and effective for high surgical risk patients undergoing CAS. The absence of stroke in symptomatic patients is the lowest rate reported in any independently adjudicated prospective multicenter registry trial to date.

  17. Off-pump compared to minimal extracorporeal circulation surgery in coronary artery bypass grafting.

    PubMed

    Reuthebuch, Oliver; Koechlin, Luca; Gahl, Brigitta; Matt, Peter; Schurr, Ulrich; Grapow, Martin; Eckstein, Friedrich

    2014-01-01

    Coronary artery bypass grafting (CABG) using extracorporeal circulation (ECC) is still the gold standard. However, alternative techniques have been developed to avoid ECC and its potential adverse effects; these encompass minimal extracorporeal circulation (MECC) and off-pump coronary artery bypass grafting (OPCABG). The relative benefits of MECC and OPCABG, however, are not yet clearly established. In this retrospective study we investigated the potential benefits of MECC and OPCABG in 697 patients undergoing CABG. Of these, 555 patients had been operated on with MECC and 142 off-pump. The primary endpoint was Troponin T level as an indicator of myocardial damage. The study groups were in general not significantly different, although patients undergoing OPCABG were significantly older (65.01 ± 9.5 vs. 69.39 ± 9.5 years; p < 0.001) and had a higher logistic EuroSCORE I (4.92% ± 6.5 vs. 5.88% ± 6.8; p = 0.017). Operating off-pump significantly reduced the need for intra-operative blood products (0.7% vs. 8.6%; p < 0.001) and the length of stay in the intensive care unit (ICU) (2.04 ± 2.63 vs. 2.76 ± 2.79 days; p < 0.001). For other blood values, no significant difference was found in the adjusted calculations, and the combined secondary endpoint, major cardiac or cerebrovascular events (MACCE), was equal in both groups. Coronary artery bypass grafting using MECC or OPCABG are two comparable techniques, with advantages for OPCABG in the reduced need for intra-operative blood products and a shorter ICU stay; however, serological values and the combined endpoint MACCE did not differ significantly between the groups.

  18. Measurements of volatile organic compounds during the 2006 TexAQS/GoMACCS campaign: Industrial influences, regional characteristics, and diurnal dependencies of the OH reactivity

    NASA Astrophysics Data System (ADS)

    Gilman, Jessica B.; Kuster, William C.; Goldan, Paul D.; Herndon, Scott C.; Zahniser, Mark S.; Tucker, Sara C.; Brewer, W. Alan; Lerner, Brian M.; Williams, Eric J.; Harley, Robert A.; Fehsenfeld, Fred C.; Warneke, Carsten; de Gouw, Joost A.

    2009-04-01

    An extensive set of volatile organic compounds (VOCs) and other gas phase species were measured in situ aboard the NOAA R/V Ronald H. Brown as the ship sailed in the Gulf of Mexico and the Houston and Galveston Bay (HGB) area as part of the Texas Air Quality (TexAQS)/Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS) conducted from July to September 2006. The magnitudes of the reactivities of CH4, CO, VOCs, and NO2 with the hydroxyl radical, OH, were determined in order to quantify the contributions of these compounds to potential ozone formation. The average total OH reactivity (R_OH,TOTAL) increased from 1.01 s^-1 in the central gulf to 10.1 s^-1 in the HGB area as a result of the substantial increase in the contribution from VOCs and NO2. The increase in the measured concentrations of reactive VOCs in the HGB area compared to the central gulf was explained by the impact of industrial emissions, the regional distribution of VOCs, and the effects of local meteorology. By compensating for the effects of boundary layer mixing, the diurnal profiles of the OH reactivity were used to characterize the source signatures and relative magnitudes of biogenic, anthropogenic (urban + industrial), and oxygenated VOCs as a function of the time of day. The source of reactive oxygenated VOCs (e.g., formaldehyde) was determined to be almost entirely from secondary production. The secondary formation of oxygenated VOCs, in addition to the continued emissions of reactive anthropogenic VOCs, served to sustain elevated levels of OH reactivity throughout the time of peak ozone production.
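
    The reactivity figures quoted above follow from the definition R_OH = sum over species of k_(OH+X) * [X]. The sketch below evaluates this sum for a handful of species; the rate constants are approximate literature values near 298 K, and the mixing ratios are invented for illustration, not GoMACCS data.

      # Sketch: total OH reactivity as the sum of k_OH * concentration.
      K_OH = {                 # cm^3 molecule^-1 s^-1, approximate, ~298 K, 1 atm
          "CO":       2.4e-13,
          "CH4":      6.4e-15,
          "isoprene": 1.0e-10,
          "propene":  2.6e-11,
          "NO2":      1.1e-11,
      }
      ppb = {"CO": 150.0, "CH4": 1800.0, "isoprene": 0.5, "propene": 2.0, "NO2": 8.0}

      M = 2.46e10              # molecules cm^-3 per ppb at 298 K and 1 atm

      total = 0.0
      for species, k in K_OH.items():
          r = k * ppb[species] * M          # this species' OH reactivity, s^-1
          total += r
          print(f"{species:9s} {r:6.3f} s^-1")
      print(f"total OH reactivity = {total:.2f} s^-1")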

  19. Sex-related differences after contemporary primary percutaneous coronary intervention for ST-segment elevation myocardial infarction.

    PubMed

    Barthélémy, Olivier; Degrell, Philippe; Berman, Emmanuel; Kerneis, Mathieu; Petroni, Thibaut; Silvain, Johanne; Payot, Laurent; Choussat, Remi; Collet, Jean-Philippe; Helft, Gerard; Montalescot, Gilles; Le Feuvre, Claude

    2015-01-01

    Whether outcomes differ for women and men after percutaneous coronary intervention (PCI) for ST-segment elevation myocardial infarction (STEMI) remains controversial. To compare 1-year outcomes after primary PCI in women and men with STEMI, matched for age and diabetes, consecutive women with STEMI of <24 hours' duration referred (August 2007 to January 2011) for primary PCI were compared with men matched for age and diabetes. Rates of all-cause mortality, target vessel revascularization (TVR) and major cardiovascular and cerebrovascular events (MACCE) (death/myocardial infarction/stroke) were assessed at 1 year. Among 775 consecutive patients, 182 (23.5%) women were compared with 182 matched men. Mean age was 69 ± 15 years and 18% had diabetes. Patient characteristics were similar, except for lower creatinine clearance (73 ± 41 vs 82 ± 38 mL/min; P=0.041), more cardiogenic shock (14.8% vs 6.6%; P=0.017) and less radial PCI (81.3% vs 90.1%; P=0.024) in women. Rates of 1-year death (22.7% vs 18.1%), TVR (8.3% vs 6.0%) and MACCE (24.3% vs 20.9%) were not statistically different in women (P>0.05 for all). After exclusion of patients with shock (10.7%) and out-of-hospital cardiac arrest (6.6%), death rates were even more similar (11.3% vs 11.8%; P=0.10). Female sex was not independently associated with death (odds ratio 1.01, 95% confidence interval 0.55-1.87; P=0.97). In this consecutive, unselected patient population, women had 1-year outcomes similar to those of men matched for age and diabetes after contemporary primary PCI for STEMI, despite a higher baseline risk profile.

  20. Marine aerosol distribution and variability over the pristine Southern Indian Ocean

    NASA Astrophysics Data System (ADS)

    Mallet, Paul-Étienne; Pujol, Olivier; Brioude, Jérôme; Evan, Stéphanie; Jensen, Andrew

    2018-06-01

    This paper presents an 8-year (2005-2012 inclusive) study of the marine aerosol distribution and variability over the Southern Indian Ocean, specifically the area {10°S-40°S; 50°E-110°E}, which has been identified as one of the most pristine regions of the globe. A large dataset consisting of satellite data (POLDER, CALIOP), AERONET measurements at Saint-Denis (French Réunion Island) and model reanalysis (MACC) has been used. In spite of a positive bias of about 0.05 between the AOD (aerosol optical depth) given by POLDER and MACC on one hand and the AOD measured by AERONET on the other, consistent results for aerosol distribution and variability over the area considered have been obtained. First, aerosols are mainly confined below 2 km asl (above sea level) and are dominated by sea salt, especially in the center of the area of interest, with AOD ≤ 0.1. This zone is the most pristine and is associated with the position of the Mascarene anticyclone. There, the direct radiative effect is assessed at around -9 W m^-2 at the top of the atmosphere, and probability density functions of the AODs are leptokurtic lognormal functions without any significant seasonal variation. It is also suggested that the Madden-Julian oscillation impacts sea salt emissions in the northern part of the area considered by modifying the state of the ocean surface. Finally, this area is surrounded to the northeast and the southwest by seasonal Australian and South African intrusions (AOD > 0.1); throughout the year, the ITCZ seems to limit continental contamination from Asia. Due to the long period of time considered (almost a decade), this paper completes and strengthens the results of studies based on observations from previous specific field campaigns.
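
    The distributional claim can be checked mechanically: fit a lognormal by the mean and standard deviation of log(AOD), then verify that the sample has positive excess kurtosis (leptokurtic). The synthetic sample below stands in for the POLDER/MACC retrievals used in the paper.

      # Sketch: lognormal fit and kurtosis check for a set of AOD values.
      import numpy as np

      rng = np.random.default_rng(1)
      aod = rng.lognormal(mean=np.log(0.07), sigma=0.45, size=5000)  # synthetic AODs

      mu, sigma = np.log(aod).mean(), np.log(aod).std()
      z = (aod - aod.mean()) / aod.std()
      excess_kurtosis = (z ** 4).mean() - 3.0

      print(f"lognormal fit: median AOD = {np.exp(mu):.3f}, sigma_log = {sigma:.2f}")
      print(f"excess kurtosis = {excess_kurtosis:.2f}  (> 0 means leptokurtic)")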
