DOE Office of Scientific and Technical Information (OSTI.GOV)
Dobromir Panayotov; Andrew Grief; Brad J. Merrill
'Fusion for Energy' (F4E) develops, designs and implements the European Test Blanket Systems (TBS) in ITER - Helium-Cooled Lithium-Lead (HCLL) and Helium-Cooled Pebble-Bed (HCPB). Safety demonstration is an essential element for the integration of the TBS in ITER, and accident analyses are one of its critical segments. A systematic approach to the accident analyses was established under the F4E contract on TBS safety analyses. F4E technical requirements, together with AMEC and INL efforts, resulted in a comprehensive methodology for fusion breeding blanket accident analyses. It addresses the specifics of breeding blanket design, materials and phenomena, and at the same time is consistent with the methodology already applied to ITER accident analyses. The methodology consists of several phases. First, the reference scenarios are selected on the basis of FMEA studies. Second, in elaborating the accident analysis specifications, phenomena identification and ranking tables are used to identify the requirements to be met by the code(s) and TBS models. The limitations of the codes are thus identified, and possible solutions to be built into the models are proposed; these include, among others, the loose coupling of different codes or code versions in order to simulate multi-fluid flows and phenomena. Code selection and issue of the accident analysis specifications conclude this second step. Next, the breeding blanket and ancillary systems models are built. This work shares the challenges met and solutions used in developing both the MELCOR and RELAP5 models of the HCLL and HCPB TBSs. The developed models are then qualified by comparison with finite element analyses, by code-to-code comparison and by sensitivity studies. Finally, the qualified models are used to execute the accident analyses of specific scenarios. Where possible, the methodology phases are illustrated in the paper by a limited number of tables and figures. A detailed description of each phase and its results, as well as the methodology's application to the EU HCLL and HCPB TBSs, will be published in separate papers. The developed methodology is applicable to accident analyses of the other TBSs to be tested in ITER, as well as to DEMO breeding blankets.
Seismic Safety Of Simple Masonry Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guadagnuolo, Mariateresa; Faella, Giuseppe
2008-07-08
Several masonry buildings comply with the rules for simple buildings provided by seismic codes. For these buildings, explicit safety verifications are not compulsory if specific code rules are fulfilled; their fulfilment is assumed to ensure suitable seismic behaviour and thus adequate safety under earthquakes. Italian and European seismic codes differ in their requirements for simple masonry buildings, mostly concerning building typology, building geometry and the acceleration at the site. Clearly, a wide percentage of the buildings deemed simple by the codes should satisfy the numerical safety verification, so that designers who must use the codes are not left with confusion and uncertainty. This paper aims at evaluating the seismic response of some simple unreinforced masonry buildings that comply with the provisions of the new Italian seismic code. Two-story buildings with different geometries are analysed, and results from nonlinear static analyses performed by varying the acceleration at the site are presented and discussed. Indications of the congruence between the code rules and the results of numerical analyses performed according to the code itself are supplied; in this context, the obtained results can contribute to improving the seismic code requirements.
A study of transonic aerodynamic analysis methods for use with a hypersonic aircraft synthesis code
NASA Technical Reports Server (NTRS)
Sandlin, Doral R.; Davis, Paul Christopher
1992-01-01
A means of performing routine transonic lift, drag, and moment analyses on hypersonic all-body and wing-body configurations was studied. The analysis method is to be used in conjunction with the Hypersonic Vehicle Optimization Code (HAVOC). A review of existing techniques is presented, after which three methods, chosen to represent a spectrum of capabilities, are tested and the results are compared with experimental data. The three methods consist of a wave drag code, a full potential code, and a Navier-Stokes code. The wave drag code, representing the empirical approach, has very fast CPU times but very limited and sporadic results. The full potential code provides results that compare favorably with the wind tunnel data, but with a dramatic increase in computational time. Even more extreme is the Navier-Stokes code, which provides the most favorable and complete results, but with a very large turnaround time. The full potential code, TRANAIR, is used for additional analyses because of the superior results it can provide over empirical and semi-empirical methods, and because of its automated grid generation. The TRANAIR analyses include an all-body hypersonic cruise configuration and an oblique flying wing supersonic transport.
Multiphysics Code Demonstrated for Propulsion Applications
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Melis, Matthew E.
1998-01-01
The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.
Finding Resolution for the Responsible Transparency of Economic Models in Health and Medicine.
Padula, William V; McQueen, Robert Brett; Pronovost, Peter J
2017-11-01
The Second Panel on Cost-Effectiveness in Health and Medicine's recommendations for the conduct, methodological practices, and reporting of cost-effectiveness analyses leave a number of questions unanswered with respect to the implementation of a transparent, open source code interface for economic models. Making economic model source code open could be positive and progressive for the field; however, several unintended consequences of this system should first be considered before its complete implementation. First, there is the concern regarding the intellectual property rights that modelers have to their analyses. Second, open source code could make analyses more accessible to inexperienced modelers, leading to inaccurate or misinterpreted results. We propose several resolutions to these concerns. The field should establish a licensing system for open source code such that the model originators maintain control of the code's use and grant permissions to other investigators who wish to use it. The field should also be more forthcoming in teaching cost-effectiveness analysis in medical and health services education, so that providers and other professionals are familiar with economic modeling and able to conduct analyses with open source code. These unintended consequences need to be fully considered before the field can move forward into an era of model transparency with open source code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arai, Kenji; Ebata, Shigeo
1997-07-01
This paper summarizes the current and anticipated use of thermal-hydraulic and neutronic codes for BWR transient and accident analyses in Japan. The codes may be categorized into licensing codes and best estimate codes. Most of the licensing codes were originally developed by General Electric; some have been updated based on the technical knowledge obtained from thermal-hydraulic studies in Japan and in response to BWR design changes. The best estimate codes have been used to support the licensing calculations and to obtain a phenomenological understanding of the thermal-hydraulic phenomena during a BWR transient or accident. The best estimate codes can also be applied to design studies for a next-generation BWR, to which the current licensing models may not be directly applicable. In order to rationalize the margin included in the current BWR design and to develop a next-generation reactor with appropriate design margin, the accuracy of the thermal-hydraulic and neutronic models will need to be improved. In addition, the current best estimate codes will need improvements in their user interfaces and numerics.
Ishikawa, Sohta A; Inagaki, Yuji; Hashimoto, Tetsuo
2012-01-01
In phylogenetic analyses of nucleotide sequences, 'homogeneous' substitution models, which assume stationarity of base composition across a tree, are widely used, although individual sequences may bear distinctive base frequencies. In the worst-case scenario, a homogeneous model-based analysis can yield an artifactual union of two distantly related sequences that achieved similar base frequencies in parallel. This potential difficulty can be countered by two approaches, 'RY-coding' and 'non-homogeneous' models. The former approach converts the four bases into purine and pyrimidine to normalize base frequencies across a tree, while the latter explicitly incorporates the heterogeneity in base frequency. The two approaches have been applied to real-world sequence data; however, their basic properties have not been fully examined by simulation studies. Here, we assessed the performance of maximum-likelihood analyses incorporating RY-coding and a non-homogeneous model (RY-coding and non-homogeneous analyses) on simulated data with parallel convergence to similar base composition. Both RY-coding and non-homogeneous analyses showed superior performance compared with homogeneous model-based analyses. Curiously, the performance of the RY-coding analysis appeared to be significantly affected by the settings of the substitution process used for sequence simulation, relative to that of the non-homogeneous analysis. The performance of the non-homogeneous analysis was also validated by analyzing a real-world sequence data set with significant base heterogeneity.
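The RY recoding step described in this abstract is mechanical enough to show concretely. Below is a minimal Python sketch, not the authors' software, that collapses the four nucleotides into purine (R) and pyrimidine (Y) symbols before a phylogenetic analysis; passing gaps and ambiguity codes through unchanged is an assumption made here for illustration.

```python
# Minimal sketch of RY-coding: collapse A/G to purine (R) and C/T/U to
# pyrimidine (Y) to normalize base frequencies across a tree.
RY_MAP = {"A": "R", "G": "R", "C": "Y", "T": "Y", "U": "Y"}

def ry_code(seq: str) -> str:
    """Recode a nucleotide sequence into R/Y symbols; gaps and
    ambiguity codes are passed through unchanged (an assumption)."""
    return "".join(RY_MAP.get(base, base) for base in seq.upper())

print(ry_code("ATGCGT-ACN"))  # -> RYRYRY-RYN
```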
Biological significance of long non-coding RNA FTX expression in human colorectal cancer
Guo, Xiao-Bo; Hua, Zhu; Li, Chen; Peng, Li-Pan; Wang, Jing-Shen; Wang, Bo; Zhi, Qiao-Ming
2015-01-01
The purpose of this study was to determine the expression of long non-coding RNA (lncRNA) FTX and analyze its prognostic and biological significance in colorectal cancer (CRC). Quantitative reverse transcription PCR was performed to detect the expression of lncRNA FTX in 35 pairs of colorectal cancer and corresponding noncancerous tissues. The expression of lncRNA FTX was then detected in 187 colorectal cancer tissues, and its correlations with clinicopathological factors of patients were examined. Univariate and multivariate analyses were performed to analyze the prognostic significance of lncRNA FTX expression. The effects of lncRNA FTX expression on malignant phenotypes of colorectal cancer cells and its possible biological significance were further determined. Long non-coding RNA FTX was significantly upregulated in colorectal cancer tissues, and low lncRNA FTX expression was significantly correlated with differentiation grade, lymphovascular invasion, and clinical stage. Patients with high lncRNA FTX showed poorer overall survival than those with low lncRNA FTX. Multivariate analyses indicated that the status of lncRNA FTX was an independent prognostic factor for patients. Functional analyses showed that upregulation of lncRNA FTX significantly promoted growth, migration, and invasion, and increased colony formation in colorectal cancer cells. Therefore, lncRNA FTX may be a potential biomarker for predicting the survival of colorectal cancer patients and might be a molecular target for the treatment of human colorectal cancer. PMID:26629053
Current and anticipated uses of thermal-hydraulic codes in Germany
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teschendorff, V.; Sommer, F.; Depisch, F.
1997-07-01
In Germany, one third of the electrical power is generated by nuclear plants. ATHLET and S-RELAP5 are successfully applied for safety analyses of the existing PWR and BWR reactors and possible future reactors, e.g. EPR. Continuous development and assessment of thermal-hydraulic codes are necessary in order to meet present and future needs of licensing organizations, utilities, and vendors. Desired improvements include thermal-hydraulic models, multi-dimensional simulation, computational speed, interfaces to coupled codes, and code architecture. Real-time capability will be essential for application in full-scope simulators. Comprehensive code validation and quantification of uncertainties are prerequisites for future best-estimate analyses.
A simple code for use in shielding and radiation dosage analyses
NASA Technical Reports Server (NTRS)
Wan, C. C.
1972-01-01
A simple code for use in analyses of gamma radiation effects in laminated materials is described. Simple, good geometry is assumed, so that all multiple-collision and scattering events are excluded from consideration. The code is capable of handling laminates of up to six layers. For laminates of more than six layers, the same code may be used to incorporate two additional layers at a time, making use of punched-tape outputs from previous computations on all preceding layers. Spectra of the attenuated radiation are obtained as both printed output and punched-tape output, as desired.
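Because the abstract assumes simple, good geometry, with all multiple-collision and scattering events excluded, the transmitted intensity reduces to a layer-by-layer Beer-Lambert product. The Python sketch below illustrates that calculation under this assumption; the attenuation coefficients, thicknesses, and six-layer laminate are invented placeholders, not values from the original code.

```python
import math

# Good-geometry (narrow-beam) attenuation through a laminate: only the
# uncollided component is tracked, so each layer multiplies the
# intensity by exp(-mu_i * t_i).
def attenuate(intensity0: float, layers: list[tuple[float, float]]) -> float:
    """layers: (mu [1/cm], thickness [cm]) pairs, one per lamina."""
    optical_depth = sum(mu * t for mu, t in layers)
    return intensity0 * math.exp(-optical_depth)

# Hypothetical six-layer laminate (coefficients are illustrative only).
laminate = [(0.15, 2.0), (0.45, 1.0), (0.06, 5.0),
            (0.30, 0.5), (0.10, 3.0), (0.20, 1.5)]
print(attenuate(1.0e6, laminate))  # transmitted uncollided intensity
```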
Fernández-Lansac, Violeta; Crespo, María
2017-07-26
This study introduces a new coding system, the Coding and Assessment System for Narratives of Trauma (CASNOT), to analyse several language domains in narratives of autobiographical memories, especially in trauma narratives. The development of the coding system is described. It was applied to assess positive and traumatic/negative narratives in 50 battered women (trauma-exposed group) and 50 nontrauma-exposed women (control group). Three blind raters coded each narrative. Inter-rater reliability analyses were conducted for the CASNOT language categories (multirater Kfree coefficients) and dimensions (intraclass correlation coefficients). High levels of inter-rater agreement were found for most of the language domains. Categories that did not reach the expected reliability were mainly those related to cognitive processes, which reflects difficulties in operationalizing constructs such as lack of control or helplessness, control or planning, and rationalization or memory elaboration. Applications and limitations of the CASNOT are discussed to enhance narrative measures for autobiographical memories.
Hoare, Karen J; Mills, Jane; Francis, Karen
2012-12-01
The terminology used to analyse data in a grounded theory study can be confusing. Different grounded theorists use a variety of terms which all have similar meanings. In the following study, we use terms adopted by Charmaz, including initial, focused and axial coding. Initial codes are used to analyse data with an emphasis on identifying gerunds, a verb acting as a noun. If initial codes are relevant to the developing theory, they are grouped with similar codes into categories. Categories become saturated when no new codes are identified in the data. Axial codes are used to link categories together into a grounded theory process. Memo writing accompanies this data sifting and sorting. The following article explains how one initial code became a category, providing a worked example of the grounded theory method of constant comparative analysis. The interplay between coding and categorization is facilitated by the constant comparative method. © 2012 Wiley Publishing Asia Pty Ltd.
Accuracy of clinical coding for procedures in oral and maxillofacial surgery.
Khurram, S A; Warner, C; Henry, A M; Kumar, A; Mohammed-Ali, R I
2016-10-01
Clinical coding has important financial implications, and discrepancies in the assigned codes can directly affect the funding of a department and hospital. Over the last few years, numerous oversights have been noticed in the coding of oral and maxillofacial (OMF) procedures. To establish the accuracy and completeness of coding, we retrospectively analysed the records of patients during two time periods: March to May 2009 (324 patients) and January to March 2014 (200 patients). Two investigators independently collected and analysed the data to ensure accuracy and remove bias. A large proportion of operations were not assigned all the relevant codes, and only 32%-33% were correct in both cycles. To our knowledge, this is the first reported audit of clinical coding in OMFS, and it highlights serious shortcomings that have substantial financial implications. Better input by the surgical team and improved communication between the surgical and coding departments will improve accuracy. Copyright © 2016 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory
Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J
2007-01-01
Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625
Qualitative data analysis for health services research: developing taxonomy, themes, and theory.
Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J
2007-08-01
To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.
Seligmann, Hervé
2018-05-01
Genetic codes mainly evolve by reassigning punctuation codons, starts and stops. Previous analyses assuming that undefined amino acids translate stops showed greater divergence between nuclear and mitochondrial genetic codes. Here, three independent methods converge on which amino acids translated stops at the split between nuclear and mitochondrial genetic codes: (a) alignment-free genetic code comparisons inserting different amino acids at stops; (b) alignment-based BLAST analyses of hypothetical peptides translated from non-coding mitochondrial sequences, inserting different amino acids at stops; (c) biases in amino acid insertions at stops in proteomic data. Hence short-term protein evolution models reconstruct long-term genetic code evolution. Mitochondria reassign stops to amino acids otherwise inserted at stops by codon-anticodon mismatches (near-cognate tRNAs). Hence dual function (translation termination and translation by codon-anticodon mismatch) precedes mitochondrial reassignments of stops to amino acids. Stop ambiguity increases coded information and compensates for endocellular mitogenome reduction. Mitochondrial codon reassignments might prevent viral infections. Copyright © 2018 Elsevier B.V. All rights reserved.
Shielding Analyses for VISION Beam Line at SNS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Popova, Irina; Gallmeier, Franz X
2014-01-01
Full-scale neutron and gamma transport analyses were performed to design shielding around the VISION beam line, including the instrument shielding enclosure, beam stop, and secondary shutter, as well as a temporary beam stop for the still-closed neighboring beam line, to meet the requirement of dose rates below 0.25 mrem/h at 30 cm from the shielding surface. The beam stop and temporary beam stop analyses were performed with the discrete ordinates code DORT in addition to Monte Carlo analyses with the MCNPX code. A comparison of the results is presented.
A Coding Scheme for Analysing Problem-Solving Processes of First-Year Engineering Students
ERIC Educational Resources Information Center
Grigg, Sarah J.; Benson, Lisa C.
2014-01-01
This study describes the development and structure of a coding scheme for analysing solutions to well-structured problems in terms of cognitive processes and problem-solving deficiencies for first-year engineering students. A task analysis approach was used to assess students' problem solutions using the hierarchical structure from a…
Measurement of neutron spectra in the AWE workplace using a Bonner sphere spectrometer.
Danyluk, Peter
2010-12-01
A Bonner sphere spectrometer has been used to measure the neutron spectra in eight different workplace areas at AWE (Atomic Weapons Establishment). The spectra were analysed by the National Physical Laboratory using their principal unfolding code STAY'SL, and the results were also analysed by AWE using a bespoke parametrised unfolding code. The bespoke code was designed specifically for the AWE workplace and is very simple to use. Both codes gave results in good agreement. It was found that the measured fluence rate varied from 2 to 70 neutrons cm⁻² s⁻¹ (± 10%) and the ambient dose equivalent H*(10) varied from 0.5 to 57 µSv h⁻¹ (± 20%). A detailed description of the development and use of the bespoke code is presented.
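Unfolding of this kind recovers a group-wise spectrum phi from measured sphere counts C = R·phi, where R holds the sphere response functions; STAY'SL treats this as a least-squares problem. The Python sketch below illustrates the idea with a synthetic response matrix and counts; the non-negativity constraint and all the numbers are assumptions for illustration, not features of either code named above.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic unfolding demo: counts = R @ phi, solve for phi >= 0.
rng = np.random.default_rng(0)
n_spheres, n_groups = 8, 6
R = rng.uniform(0.1, 1.0, size=(n_spheres, n_groups))  # response matrix
phi_true = rng.uniform(0.0, 5.0, size=n_groups)        # "true" spectrum
counts = R @ phi_true + rng.normal(0.0, 0.05, n_spheres)  # noisy readings

phi_est, residual = nnls(R, counts)  # non-negative least squares
print("unfolded group fluences:", np.round(phi_est, 2))
print("residual norm:", round(residual, 4))
```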
Posttest analysis of the FFTF inherent safety tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Padilla, A. Jr.; Claybrook, S.W.
Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (a Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.
Accuracy of clinical coding from 1210 appendicectomies in a British district general hospital.
Bhangu, Aneel; Nepogodiev, Dmitri; Taylor, Caroline; Durkin, Natalie; Patel, Rajan
2012-01-01
The primary aim of this study was to assess the accuracy of clinical coding in identifying negative appendicectomies. The secondary aim was to analyse trends over time in the rates of simple, complex (gangrenous or perforated) and negative appendicectomies. Retrospective review of 1210 patients undergoing emergency appendicectomy during a five-year period (2006-2010). Histopathology reports were taken as the gold standard for diagnosis and compared to clinical coding lists. Clinical coding is the process by which non-medical administrators apply standardised diagnostic codes to patients, based upon clinical notes at discharge. These codes then contribute to national databases. Statistical analysis included correlation studies and regression analyses. Clinical coding had only moderate correlation with histopathology, with an overall kappa of 0.421. Annual kappa values varied between 0.378 and 0.500. Overall, 14% of patients were incorrectly coded as having had appendicitis when in fact they had a histopathologically normal appendix (153/1107), whereas 4% were falsely coded as having received a negative appendicectomy when they had appendicitis (48/1107). There was an overall significant fall and then rise in the rate of simple appendicitis (B coefficient -0.239 (95% confidence interval -0.426, -0.051), p = 0.014) but no change in the rate of complex appendicitis (B coefficient 0.008 (-0.015, 0.031), p = 0.476). Clinical coding for negative appendicectomy was unreliable. Negative rates may be higher than suspected. This has implications for the validity of national database analyses. Using this form of data as a quality indicator for appendicitis should be reconsidered until its quality is improved. Copyright © 2012 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
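The "overall kappa of 0.421" quoted above is Cohen's kappa: observed agreement between the clinical coding and the histopathology gold standard, corrected for chance agreement. A minimal Python sketch of the statistic, with invented labels rather than the study's data:

```python
# Cohen's kappa: observed agreement corrected for chance agreement.
def cohens_kappa(a: list[str], b: list[str]) -> float:
    n = len(a)
    labels = set(a) | set(b)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    p_exp = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1.0 - p_exp)

coding    = ["appendicitis", "appendicitis", "normal", "appendicitis", "normal"]
histology = ["appendicitis", "normal", "normal", "appendicitis", "normal"]
print(round(cohens_kappa(coding, histology), 3))  # 0.615 on this toy data
```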
Current and anticipated uses of thermal-hydraulic codes in NFI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsuda, K.; Takayasu, M.
1997-07-01
This paper presents the thermal-hydraulic codes currently used in NFI for LWR fuel development and licensing applications, including transient and design basis accident analyses of LWR plants. The current status of the codes is described in terms of code capability, modeling features, and experience of code application related to fuel development and licensing. Finally, the anticipated use of future thermal-hydraulic codes in NFI is briefly described.
General review of the MOSTAS computer code for wind turbines
NASA Technical Reports Server (NTRS)
Dungundji, J.; Wendell, J. H.
1981-01-01
The MOSTAS computer code for wind turbine analysis is reviewed, and the techniques and methods used in its analyses are described. Impressions of its strengths and weaknesses, and recommendations for its application, modification, and further development are made. Basic techniques used in wind turbine stability and response analyses for systems with constant and periodic coefficients are also reviewed.
Improvements and applications of COBRA-TF for stand-alone and coupled LWR safety analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, M.; Cuervo, D.; Ivanov, K.
2006-07-01
The advanced thermal-hydraulic subchannel code COBRA-TF has recently been improved and applied to stand-alone and coupled LWR core calculations at the Pennsylvania State Univ. in cooperation with AREVA NP GmbH (Germany) and the Technical Univ. of Madrid. To prepare COBRA-TF for academic and industrial applications, including safety margin evaluations and LWR core design analyses, the code programming, numerics, and basic models were revised and substantially improved. The code has undergone an extensive validation, verification, and qualification program. (authors)
Model-Driven Engineering of Machine Executable Code
NASA Astrophysics Data System (ADS)
Eichberg, Michael; Monperrus, Martin; Kloppenburg, Sven; Mezini, Mira
Implementing static analyses of machine-level executable code is labor intensive and complex. We show how to leverage model-driven engineering to facilitate the design and implementation of programs performing static analyses. Further, we report important lessons learned on the benefits and drawbacks of the following technologies: using the Scala programming language as the target of code generation, using XML Schema to express a metamodel, and using XSLT to implement (a) transformations and (b) a lint-like tool. Finally, we report on the use of Prolog for writing model transformations.
Boyd, Andrew D; Li, Jianrong John; Kenost, Colleen; Joese, Binoy; Yang, Young Min; Kalagidis, Olympia A; Zenku, Ilir; Saner, Donald; Bahroos, Neil; Lussier, Yves A
2015-05-01
In the United States, International Classification of Disease Clinical Modification (ICD-9-CM, the ninth revision) diagnosis codes are commonly used to identify patient cohorts and to conduct financial analyses related to disease. In October 2015, the healthcare system of the United States will transition to ICD-10-CM (the tenth revision) diagnosis codes. One challenge posed to clinical researchers and other analysts is conducting diagnosis-related queries across datasets containing both coding schemes. Further, healthcare administrators will manage growth, trends, and strategic planning with these dually-coded datasets. The majority of the ICD-9-CM to ICD-10-CM translations are complex and nonreciprocal, creating convoluted representations and meanings. Similarly, mapping back from ICD-10-CM to ICD-9-CM is equally complex, yet different from mapping forward, as relationships are likewise nonreciprocal. Indeed, 10 of the 21 top clinical categories are complex as 78% of their diagnosis codes are labeled as "convoluted" by our analyses. Analysis and research related to external causes of morbidity, injury, and poisoning will face the greatest challenges due to 41 745 (90%) convolutions and a decrease in the number of codes. We created a web portal tool and translation tables to list all ICD-9-CM diagnosis codes related to the specific input of ICD-10-CM diagnosis codes and their level of complexity: "identity" (reciprocal), "class-to-subclass," "subclass-to-class," "convoluted," or "no mapping." These tools provide guidance on ambiguous and complex translations to reveal where reports or analyses may be challenging to impossible. Web portal: http://www.lussierlab.org/transition-to-ICD9CM/ Tables annotated with levels of translation complexity: http://www.lussierlab.org/publications/ICD10to9. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
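The five complexity labels lend themselves to a small illustrative classifier: look at a code's forward targets and at the reverse mappings of those targets. The Python sketch below is a hypothetical reconstruction with a toy mapping table; it is not the logic of the authors' web portal.

```python
from collections import defaultdict

# Toy forward mapping: ICD-9-CM code -> set of ICD-10-CM codes.
# These entries are hypothetical, for illustration only.
forward = {
    "003.1":  {"A02.1"},           # one-to-one both ways
    "250.00": {"E11.9", "E11.8"},  # one ICD-9 splits into several ICD-10
}

# Derive the backward (ICD-10 -> ICD-9) mapping from the forward one.
backward = defaultdict(set)
for icd9, targets in forward.items():
    for icd10 in targets:
        backward[icd10].add(icd9)

def complexity(icd9: str) -> str:
    """Label a forward translation with the paper's complexity classes."""
    targets = forward.get(icd9, set())
    if not targets:
        return "no mapping"
    sources = set().union(*(backward[t] for t in targets))
    if len(targets) == 1 and sources == {icd9}:
        return "identity"            # reciprocal one-to-one
    if len(targets) > 1 and sources == {icd9}:
        return "class-to-subclass"   # one code splits into subcodes
    if len(targets) == 1 and len(sources) > 1:
        return "subclass-to-class"   # several codes merge into one
    return "convoluted"              # many-to-many network

for code in list(forward) + ["999.9"]:
    print(code, "->", complexity(code))
```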
NASA Technical Reports Server (NTRS)
Tatchell, D. G.
1979-01-01
A code, CATHY3/M, was prepared and demonstrated by application to a sample case. The preparation is reviewed, a summary of the capabilities and main features of the code is given, and the sample case results are discussed. Recommendations for future use and development of the code are provided.
A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics
NASA Technical Reports Server (NTRS)
Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela
2015-01-01
Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
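As a concrete illustration of why MC/DC is an expensive metric to track, the sketch below derives, for a toy decision, one unique-cause MC/DC pair per condition: two test vectors that differ only in that condition and flip the decision outcome. This illustrates the coverage obligation itself, not the paper's partial-evaluation technique.

```python
from itertools import product

# For each condition, find a pair of test vectors differing only in
# that condition whose decision outcomes differ (unique-cause MC/DC).
def mcdc_pairs(decision, n_conditions):
    vectors = list(product([False, True], repeat=n_conditions))
    pairs = {}
    for i in range(n_conditions):
        for v in vectors:
            w = list(v)
            w[i] = not w[i]
            w = tuple(w)
            if decision(*v) != decision(*w):
                pairs[i] = (v, w)  # this pair demonstrates condition i
                break
    return pairs

# Toy decision: (a and b) or c
for cond, (v, w) in mcdc_pairs(lambda a, b, c: (a and b) or c, 3).items():
    print(f"condition {cond}: {v} vs {w}")
```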
Three-dimensional pin-to-pin analyses of VVER-440 cores by the MOBY-DICK code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehmann, M.; Mikolas, P.
1994-12-31
Nuclear design for the Dukovany (EDU) VVER-440s nuclear power plant is routinely performed with the MOBY-DICK system. After its implementation on Hewlett Packard series 700 workstations, it is able to routinely perform three-dimensional pin-to-pin core analyses. For purposes of code validation, a benchmark prepared from EDU operational data was solved.
Iterative categorization (IC): a systematic technique for analysing qualitative data.
Neale, Joanne
2016-06-01
The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
Development of Safety Analysis Code System of Beam Transport and Core for Accelerator Driven System
NASA Astrophysics Data System (ADS)
Aizawa, Naoto; Iwasaki, Tomohiko
2014-06-01
A safety analysis code system for the beam transport and core of an accelerator driven system (ADS) has been developed for analyses of beam transients such as changes in the shape and position of the incident beam. The code system consists of a beam transport analysis part and a core analysis part. TRACE 3-D is employed in the beam transport analysis part to calculate the shape and incident position of the beam at the target. In the core analysis part, the neutronics, thermal-hydraulics, and cladding failure analyses are performed with the ADS dynamic calculation code ADSE, on the basis of an external source database calculated by PHITS and a cross-section database calculated by SRAC, together with programs for thermoelastic and creep cladding failure analysis. Using the code system, beam transient analyses were performed for the ADS proposed by the Japan Atomic Energy Agency. As a result, the cladding temperature rises rapidly and plastic deformation occurs within several seconds. In addition, the cladding is evaluated to fail by creep within a hundred seconds. These results show that beam transients can cause cladding failure.
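The creep failure conclusion can be illustrated with a life-fraction (time-fraction) damage rule, a common screening approach in cladding analyses. This is an assumption made here for illustration only: the abstract does not state which creep model the code system uses, and the rupture-time correlation below is a made-up placeholder.

```python
# Life-fraction rule: sum dt / t_rupture(stress, T); failure at 1.0.
def rupture_time(stress_mpa: float, temp_k: float) -> float:
    """Placeholder Larson-Miller-style correlation (illustrative only,
    not the ADSE cladding model)."""
    lmp = 20000.0 - 4000.0 * (stress_mpa / 100.0)
    return 10.0 ** (lmp / temp_k - 15.0)  # rupture time in seconds

def creep_damage(history) -> float:
    """history: (dt [s], stress [MPa], temperature [K]) steps."""
    return sum(dt / rupture_time(s, t) for dt, s, t in history)

steps = [(1.0, 120.0, 900.0)] * 60  # 60 s of a hypothetical transient
print("damage fraction:", round(creep_damage(steps), 3))  # ~0.78
```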
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2011 CFR
2011-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2012 CFR
2012-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2014 CFR
2014-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2010 CFR
2010-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar data produced for...
SQA of finite element method (FEM) codes used for analyses of pit storage/transport packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Russel, E.
1997-11-01
This report contains viewgraphs on the software quality assurance of finite element method codes used for analyses of pit storage and transport projects. The methodology utilizes ISO 9000-3, the guideline for applying ISO 9001 to the development, supply, and maintenance of software, to establish well-defined software engineering processes that consistently maintain high-quality management approaches.
ERIC Educational Resources Information Center
Brandon, Paul R.; Harrison, George M.; Lawton, Brian E.
2013-01-01
When evaluators plan site-randomized experiments, they must conduct the appropriate statistical power analyses. These analyses are most likely to be valid when they are based on data from the jurisdictions in which the studies are to be conducted. In this method note, we provide software code, in the form of a SAS macro, for producing statistical…
Wilson, Reda J; O'Neil, M E; Ntekop, E; Zhang, Kevin; Ren, Y
2014-01-01
Calculating accurate estimates of cancer survival is important for various analyses of cancer patient care and prognosis. Current US survival rates are estimated based on data from the National Cancer Institute's (NCI's) Surveillance, Epidemiology, and End Results (SEER) program, covering approximately 28 percent of the US population. The National Program of Cancer Registries (NPCR) covers about 96 percent of the US population. Using a population-based database with greater US population coverage to calculate survival rates at the national, state, and regional levels can further enhance the effective monitoring of cancer patient care and prognosis in the United States. The first step is to establish the coding completeness and coding quality of the NPCR data needed for calculating survival rates and conducting related validation analyses. Using data from the NPCR-Cancer Surveillance System (CSS) from 1995 through 2008, we assessed coding completeness and quality for 26 data elements that are needed to calculate cancer relative survival estimates and conduct related analyses. The data elements evaluated consisted of demographic, follow-up, prognostic, and cancer identification variables. Analyses were performed showing trends in these variables by diagnosis year, state of residence at diagnosis, and cancer site. Mean overall percent coding completeness by each NPCR central cancer registry, averaged across all data elements and diagnosis years, ranged from 92.3 percent to 100 percent. Results showing the mean percent coding completeness for the relative survival-related variables in NPCR data are presented. All data elements but one had a mean coding completeness greater than 90 percent, as did the mean completeness by data item group type. Statistically significant differences in coding completeness were found in the ICD revision number, cause of death, vital status, and date of last contact variables when comparing diagnosis years. The majority of data items had a coding quality greater than 90 percent, with exceptions found in cause of death, follow-up source, SEER Summary Stage 1977, and SEER Summary Stage 2000. Percent coding completeness and quality are very high for the variables in the NPCR-CSS that are covariates in calculating relative survival. NPCR provides the opportunity to calculate relative survival estimates that may be more generalizable to the US population.
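Percent coding completeness of this kind is, at bottom, a tabulation of non-missing values per data element by diagnosis year. A minimal pandas sketch with invented records (not NPCR-CSS data):

```python
import pandas as pd

# Percent of non-missing values per data element, by diagnosis year.
records = pd.DataFrame({
    "dx_year":        [1995, 1995, 1996, 1996],
    "vital_status":   ["alive", None, "dead", "alive"],
    "cause_of_death": [None, None, "C18", None],
})

completeness = (records.drop(columns="dx_year")
                       .notna()
                       .groupby(records["dx_year"])
                       .mean() * 100)
print(completeness.round(1))
```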
Dickinson, Dwight; Ramsey, Mary E; Gold, James M
2007-05-01
In focusing on potentially localizable cognitive impairments, the schizophrenia meta-analytic literature has overlooked the largest single impairment: on digit symbol coding tasks. To compare the magnitude of the schizophrenia impairment on coding tasks with impairments on other traditional neuropsychological instruments. MEDLINE and PsycINFO electronic databases and reference lists from identified articles. English-language studies from 1990 to present, comparing performance of patients with schizophrenia and healthy controls on coding tasks and cognitive measures representing at least 2 other cognitive domains. Of 182 studies identified, 40 met all criteria for inclusion in the meta-analysis. Means, standard deviations, and sample sizes were extracted for digit symbol coding and 36 other cognitive variables. In addition, we recorded potential clinical moderator variables, including chronicity/severity, medication status, age, and education, and potential study design moderators, including coding task variant, matching, and study publication date. Main analyses synthesized data from 37 studies comprising 1961 patients with schizophrenia and 1444 comparison subjects. Combination of mean effect sizes across studies by means of a random effects model yielded a weighted mean effect for digit symbol coding of g = -1.57 (95% confidence interval, -1.66 to -1.48). This effect compared with a grand mean effect of g = -0.98 and was significantly larger than effects for widely used measures of episodic memory, executive functioning, and working memory. Moderator variable analyses indicated that clinical and study design differences between studies had little effect on the coding task effect. Comparison with previous meta-analyses suggested that current results were representative of the broader literature. Subsidiary analysis of data from relatives of patients with schizophrenia also suggested prominent coding task impairments in this group. The 5-minute digit symbol coding task, reliable and easy to administer, taps an information processing inefficiency that is a central feature of the cognitive deficit in schizophrenia and deserves systematic investigation.
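The pooled g = -1.57 above comes from a random-effects combination of per-study effect sizes. The sketch below implements the standard DerSimonian-Laird estimator as an illustration; the effect sizes and variances are invented, not the meta-analysis data.

```python
import numpy as np

# DerSimonian-Laird random-effects pooling of per-study effects g
# with within-study variances v.
def random_effects(g: np.ndarray, v: np.ndarray):
    w = 1.0 / v                                # fixed-effect weights
    g_fixed = np.sum(w * g) / np.sum(w)
    q = np.sum(w * (g - g_fixed) ** 2)         # heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(g) - 1)) / c)    # between-study variance
    w_star = 1.0 / (v + tau2)                  # random-effects weights
    g_bar = np.sum(w_star * g) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return g_bar, (g_bar - 1.96 * se, g_bar + 1.96 * se)

g = np.array([-1.4, -1.7, -1.5, -1.8])  # per-study Hedges g (invented)
v = np.array([0.04, 0.06, 0.05, 0.08])  # per-study variances (invented)
print(random_effects(g, v))
```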
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2013 CFR
2013-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... funds; (ii) Studies, analyses, test data, or similar data produced for this contract, when the study...
Patient complaints in healthcare systems: a systematic review and coding taxonomy
Reader, Tom W; Gillespie, Alex; Roberts, Jane
2014-01-01
Background Patient complaints have been identified as a valuable resource for monitoring and improving patient safety. This article critically reviews the literature on patient complaints, and synthesises the research findings to develop a coding taxonomy for analysing patient complaints. Methods The PubMed, Science Direct and Medline databases were systematically investigated to identify patient complaint research studies. Publications were included if they reported primary quantitative data on the content of patient-initiated complaints. Data were extracted and synthesised on (1) basic study characteristics; (2) methodological details; and (3) the issues patients complained about. Results 59 studies, reporting 88 069 patient complaints, were included. Patient complaint coding methodologies varied considerably (eg, in attributing single or multiple causes to complaints). In total, 113 551 issues were found to underlie the patient complaints. These were analysed using 205 different analytical codes which when combined represented 29 subcategories of complaint issue. The most common issues complained about were ‘treatment’ (15.6%) and ‘communication’ (13.7%). To develop a patient complaint coding taxonomy, the subcategories were thematically grouped into seven categories, and then three conceptually distinct domains. The first domain related to complaints on the safety and quality of clinical care (representing 33.7% of complaint issues), the second to the management of healthcare organisations (35.1%) and the third to problems in healthcare staff–patient relationships (29.1%). Conclusions Rigorous analyses of patient complaints will help to identify problems in patient safety. To achieve this, it is necessary to standardise how patient complaints are analysed and interpreted. Through synthesising data from 59 patient complaint studies, we propose a coding taxonomy for supporting future research and practice in the analysis of patient complaint data. PMID:24876289
Transport and stability analyses supporting disruption prediction in high beta KSTAR plasmas
NASA Astrophysics Data System (ADS)
Ahn, J.-H.; Sabbagh, S. A.; Park, Y. S.; Berkery, J. W.; Jiang, Y.; Riquezes, J.; Lee, H. H.; Terzolo, L.; Scott, S. D.; Wang, Z.; Glasser, A. H.
2017-10-01
KSTAR plasmas have reached high stability parameters in dedicated experiments, with normalized beta βN exceeding 4.3 at relatively low plasma internal inductance li (βN/li > 6). Transport and stability analyses have begun on these plasmas to best understand a disruption-free path toward the design target of βN = 5 while aiming to maximize the non-inductive fraction of these plasmas. Initial analysis using the TRANSP code indicates that the non-inductive current fraction in these plasmas has exceeded 50 percent. The advent of KSTAR kinetic equilibrium reconstructions now allows more accurate computation of the MHD stability of these plasmas. Attention is placed on code validation of mode stability using the PEST-3 and resistive DCON codes. Initial evaluation of these analyses for disruption prediction is made using the disruption event characterization and forecasting (DECAF) code. The present global mode kinetic stability model in DECAF developed for low aspect ratio plasmas is evaluated to determine modifications required for successful disruption prediction of KSTAR plasmas. Work supported by U.S. DoE under contract DE-SC0016614.
Bhattacharya, D; Steinkötter, J; Melkonian, M
1993-12-01
Centrin (= caltractin) is a ubiquitous cytoskeletal protein which is a member of the EF-hand superfamily of calcium-binding proteins. A centrin-coding cDNA was isolated and characterized from the prasinophyte green alga Scherffelia dubia. Centrin PCR amplification primers were used to isolate partial, homologous cDNA sequences from the green algae Tetraselmis striata and Spermatozopsis similis. Annealing analyses suggested that centrin is a single-copy coding region in T. striata, S. similis and the other green algae studied. Centrin-coding regions from S. dubia, S. similis and T. striata encode four colinear EF-hand domains which putatively bind calcium. Phylogenetic analyses, including homologous sequences from Chlamydomonas reinhardtii and the land plant Atriplex nummularia, demonstrate that the domains of centrins are congruent and arose from the two-fold duplication of an ancestral EF hand, with Domains 1+3 and Domains 2+4 clustering. The domains of centrins are also congruent with those of calmodulins, demonstrating that, like calmodulin, centrin is an ancient protein which arose within the ancestor of all eukaryotes via gene duplication. Phylogenetic relationships inferred from centrin-coding region comparisons mirror the results of small subunit ribosomal RNA sequence analyses, suggesting that centrin-coding regions are useful evolutionary markers within the green algae.
An Efficient Method for Verifying Gyrokinetic Microstability Codes
NASA Astrophysics Data System (ADS)
Bravenec, R.; Candy, J.; Dorland, W.; Holland, C.
2009-11-01
Benchmarks for gyrokinetic microstability codes can be developed through successful "apples-to-apples" comparisons among them. Unlike previous efforts, we perform the comparisons for actual discharges, rendering the verification efforts relevant to existing experiments and future devices (ITER). The process requires i) assembling the experimental analyses at multiple times, radii, discharges, and devices, ii) creating the input files, ensuring that the input parameters are faithfully translated code-to-code, iii) running the codes, and iv) comparing the results, all in an organized fashion. The purpose of this work is to automate this process as much as possible: At present, a Python routine is used to generate and organize GYRO input files from TRANSP or ONETWO analyses. Another routine translates the GYRO input files into GS2 input files. (Translation software for other codes has not yet been written.) Other Python routines submit the multiple GYRO and GS2 jobs, organize the results, and collect them into a table suitable for plotting. (These separate Python routines could easily be consolidated.) An example of the process -- a linear comparison between GYRO and GS2 for a DIII-D discharge at multiple radii -- will be presented.
SSC San Diego Command History Calendar Year 2005
2006-03-01
Thermal finite-element analysis of space shuttle main engine turbine blade
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Tong, Michael T.; Kaufman, Albert
1987-01-01
Finite-element, transient heat transfer analyses were performed for the first-stage blades of the space shuttle main engine (SSME) high-pressure fuel turbopump. The analyses were based on test engine data provided by Rocketdyne. Heat transfer coefficients were predicted by performing a boundary-layer analysis at steady-state conditions with the STAN5 boundary-layer code. Two different peak-temperature overshoots were evaluated for the startup transient. Cutoff transient conditions were also analyzed. A reduced gas temperature profile based on actual thermocouple data was also considered. Transient heat transfer analyses were conducted with the MARC finite-element computer code.
Potential Effects of Leak-Before-Break on Light Water Reactor Design.
1985-08-26
Boiler and Pressure Vessel Code. In fact, Section III of that code was created for nuclear applications. This... Boiler and Pressure Vessel Code. The only major change which leak-before-break would require in these analyses would be that all piping to be considered... XI of the ASME Boiler and Pressure Vessel Code, and is already required for all Class I piping systems in the plant. Class I systems are those
A novel concatenated code based on the improved SCG-LDPC code for optical transmission systems
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Xie, Ya; Wang, Lin; Huang, Sheng; Wang, Yong
2013-01-01
Based on optimization and improvement of the construction method for the systematically constructed Gallager (SCG) (4, k) code, a novel SCG low-density parity-check (SCG-LDPC) (3969, 3720) code suitable for optical transmission systems is constructed. A novel SCG-LDPC (6561, 6240) code with a code rate of 95.1% is then constructed by increasing the length of the SCG-LDPC (3969, 3720) code, so that the code rate of LDPC codes can better meet the high requirements of optical transmission systems. A novel concatenated code is then constructed by concatenating the SCG-LDPC (6561, 6240) code with the BCH (127, 120) code, whose code rate is 94.5%. The simulation results and analyses show that the net coding gain (NCG) of the BCH(127, 120) + SCG-LDPC(6561, 6240) concatenated code is 2.28 dB and 0.48 dB more than those of the classic RS(255, 239) code and the SCG-LDPC(6561, 6240) code, respectively, at a bit error rate (BER) of 10⁻⁷.
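As a quick check of the code-rate figures quoted above, assuming the usual definition R = k/n for an (n, k) block code (the rate of a serial concatenation being the product of the component rates):

```python
# Verify the code rates quoted in the abstract (R = k/n assumed).
def rate(n, k):
    return k / n

r_ldpc = rate(6561, 6240)   # SCG-LDPC(6561, 6240) -> ~0.951
r_bch = rate(127, 120)      # BCH(127, 120)        -> ~0.945
print(f"SCG-LDPC rate: {r_ldpc:.3f}")
print(f"BCH rate:      {r_bch:.3f}")
# Overall rate of the serial concatenation is the product of the two.
print(f"Concatenated:  {r_ldpc * r_bch:.3f}")
```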
Numerical modelling of gravel unconstrained flow experiments with the DAN3D and RASH3D codes
NASA Astrophysics Data System (ADS)
Sauthier, Claire; Pirulli, Marina; Pisani, Gabriele; Scavia, Claudio; Labiouse, Vincent
2015-12-01
Landslide continuum dynamic models have improved considerably in the last years, but a consensus on the best method of calibrating the input resistance parameter values for predictive analyses has not yet emerged. In the present paper, numerical simulations of a series of laboratory experiments performed at the Laboratory for Rock Mechanics of the EPF Lausanne were undertaken with the RASH3D and DAN3D numerical codes. They aimed at analysing the possibility of using calibrated ranges of parameters (1) in a code different from the one they were obtained with and (2) to simulate potential events made of a material with the same characteristics as back-analysed past events, but involving a different volume and propagation path. For this purpose, one of the four benchmark laboratory tests was used as the past event to calibrate the dynamic basal friction angle, assuming a Coulomb-type behaviour of the sliding mass, and this back-analysed value was then used to simulate the three other experiments, treated as potential events. The computational findings show good correspondence with the experimental results in terms of the characteristics of the final deposits (i.e., runout, length and width). Furthermore, the obtained best-fit values of the dynamic basal friction angle for the two codes turn out to be close to each other and within the range of values measured with pseudo-dynamic tilting tests.
Nonlinear wave vacillation in the atmosphere
NASA Technical Reports Server (NTRS)
Antar, Basil N.
1987-01-01
The problem of vacillation in a baroclinically unstable flow field is studied through the time evolution of a single nonlinearly unstable wave. To this end a computer code is being developed to solve numerically for the time evolution of the amplitude of such a wave. The final working code will be the end product resulting from the development of a hierarchy of codes with increasing complexity. The first code in this series was completed and is undergoing several diagnostic analyses to verify its validity. The development of this code is detailed.
Chumney, Elinor C G; Biddle, Andrea K; Simpson, Kit N; Weinberger, Morris; Magruder, Kathryn M; Zelman, William N
2004-01-01
As cost-effectiveness analyses (CEAs) are increasingly used to inform policy decisions, there is a need for more information on how different cost determination methods affect cost estimates and the degree to which the resulting cost-effectiveness ratios (CERs) may be affected. The lack of specificity of diagnosis-related groups (DRGs) could mean that they are ill-suited for costing applications in CEAs. Yet, the implications of using International Classification of Diseases-9th edition (ICD-9) codes or a form of disease-specific risk group stratification instead of DRGs have yet to be clearly documented. To demonstrate the implications of different disease coding mechanisms on costs and the magnitude of error that could be introduced in head-to-head comparisons of resulting CERs, we based our analyses on a previously published Markov model for HIV/AIDS therapies. We used the Healthcare Cost and Utilisation Project Nationwide Inpatient Sample (HCUP-NIS) data release 6, which contains all-payer data on hospital inpatient stays from selected states. We added costs for the mean number of hospitalisations, derived from analyses based on either DRG or ICD-9 codes or risk group stratification cost weights, to the standard outpatient and prescription drug costs to yield an estimate of total charges for each AIDS-defining illness (ADI). Finally, we estimated the Markov model three times with the appropriate ADI cost weights to obtain CERs specific to the use of either DRG or ICD-9 codes or risk group. Contrary to expectations, we found that the choice of disease-specific coding/grouping assumptions, whether by DRG codes, ICD-9 codes or risk group, resulted in very similar CER estimates for highly active antiretroviral therapy. The large variations in the specific ADI cost weights across the three different coding approaches were especially interesting. However, because no one approach produced consistently higher estimates than the others, the Markov model's weighted cost per event and resulting CERs were remarkably close in value to one another. Although DRG codes are based on broader categories and contain less information than ICD-9 codes, in practice the choice of whether to use DRGs or ICD-9 codes may have little effect on the CEA results in heterogeneous conditions such as HIV/AIDS.
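A minimal sketch, not the authors' model, of why the CERs can converge: the Markov model's cost per cycle is a probability-weighted sum of event cost weights, so offsetting differences between coding schemes can wash out. All numbers below are fabricated for illustration.

```python
# Toy illustration of a probability-weighted cost per Markov cycle.
def weighted_cost(probs, costs):
    """Expected cost per cycle given event probabilities and cost weights."""
    return sum(p * c for p, c in zip(probs, costs))

event_probs = [0.05, 0.10, 0.02]          # hypothetical ADI event probabilities
drg_costs = [12000.0, 8000.0, 20000.0]    # hypothetical DRG-based cost weights
icd9_costs = [14000.0, 7000.0, 19000.0]   # hypothetical ICD-9-based cost weights

# Individual weights differ, but the expected costs can end up close.
print(f"DRG-weighted cost:   {weighted_cost(event_probs, drg_costs):.0f}")
print(f"ICD-9-weighted cost: {weighted_cost(event_probs, icd9_costs):.0f}")
```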
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.
PASCO: Structural panel analysis and sizing code: Users manual - Revised
NASA Technical Reports Server (NTRS)
Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.
1981-01-01
A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.
Nimptsch, Ulrike
2016-06-01
To investigate changes in comorbidity coding after the introduction of diagnosis-related group (DRG)-based prospective payment and whether trends differ regarding specific comorbidities. Nationwide administrative data (DRG statistics) from German acute care hospitals from 2005 to 2012. Observational study to analyze trends in comorbidity coding in patients hospitalized for common primary diseases and the effects on comorbidity-related risk of in-hospital death. Comorbidity coding was operationalized by Elixhauser diagnosis groups. The analyses focused on adult patients hospitalized for the primary diseases of heart failure, stroke, and pneumonia, as well as hip fracture. When focusing on the total frequency of diagnosis groups per record, an increase in depth of coding was observed. Between-hospital variations in depth of coding were present throughout the observation period. Specific comorbidity increases were observed in 15 of the 31 diagnosis groups, and decreases were observed for 11 groups. In patients hospitalized for heart failure, shifts of comorbidity-related risk of in-hospital death occurred in nine diagnosis groups, of which eight were directed toward the null. Comorbidity-adjusted outcomes in longitudinal administrative data analyses may be biased by nonconstant risk over time, changes in completeness of coding, and between-hospital variations in coding. Accounting for such issues is important when the respective observation period coincides with changes in the reimbursement system or other conditions that are likely to alter clinical coding practice. © Health Research and Educational Trust.
Schütz, U; Reichel, H; Dreinhöfer, K
2007-01-01
We introduce a grouping system for clinical practice which allows the separation of DRG coding in specific orthopaedic groups based on anatomic regions, operative procedures, therapeutic interventions and morbidity equivalent diagnosis groups. With this, a differentiated aim-oriented analysis of illustrated internal DRG data becomes possible. The group-specific difference of the coding quality between the DRG groups following primary coding by the orthopaedic surgeon and final coding by the medical controlling is analysed. In a consecutive series of 1600 patients parallel documentation and group-specific comparison of the relevant DRG parameters were carried out in every case after primary and final coding. Analysing the group-specific share in the additional CaseMix coding, the group "spine surgery" dominated, closely followed by the groups "arthroplasty" and "surgery due to infection, tumours, diabetes". Altogether, additional cost-weight-relevant coding was necessary most frequently in the latter group (84%), followed by group "spine surgery" (65%). In DRGs representing conservative orthopaedic treatment documented procedures had nearly no influence on the cost weight. The introduced system of case group analysis in internal DRG documentation can lead to the detection of specific problems in primary coding and cost-weight relevant changes of the case mix. As an instrument for internal process control in the orthopaedic field, it can serve as a communicative interface between an economically oriented classification of the hospital performance and a specific problem solution of the medical staff involved in the department management.
Preliminary Analysis of the Transient Reactor Test Facility (TREAT) with PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connaway, H. M.; Lee, C. H.
The neutron transport code PROTEUS has been used to perform preliminary simulations of the Transient Reactor Test Facility (TREAT). TREAT is an experimental reactor designed for the testing of nuclear fuels and other materials under transient conditions. It operated from 1959 to 1994, when it was placed on non-operational standby. The restart of TREAT to support the U.S. Department of Energy’s resumption of transient testing is currently underway. Both single assembly and assembly-homogenized full core models have been evaluated. Simulations were performed using a historic set of WIMS-ANL-generated cross-sections as well as a new set of Serpent-generated cross-sections. To support this work, further analyses were also performed using additional codes in order to investigate particular aspects of TREAT modeling. DIF3D and the Monte-Carlo codes MCNP and Serpent were utilized in these studies. MCNP and Serpent were used to evaluate the effect of geometry homogenization on the simulation results and to support code-to-code comparisons. New meshes for the PROTEUS simulations were created using the CUBIT toolkit, with additional meshes generated via conversion of selected DIF3D models to support code-to-code verifications. All current analyses have focused on code-to-code verifications, with additional verification and validation studies planned. The analysis of TREAT with PROTEUS-SN is an ongoing project. This report documents the studies that have been performed thus far, and highlights key challenges to address in future work.
ERIC Educational Resources Information Center
Mayhew, Matthew J.; Simonoff, Jeffrey S.
2015-01-01
The purpose of this paper is to describe effect coding as an alternative quantitative practice for analyzing and interpreting categorical, multi-raced independent variables in higher education research. Not only may effect coding enable researchers to get closer to respondents' original intentions, it allows for more accurate analyses of all race…
Overview and Current Status of Analyses of Potential LEU Design Concepts for TREAT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connaway, H. M.; Kontogeorgakos, D. C.; Papadias, D. D.
2015-10-01
Neutronic and thermal-hydraulic analyses have been performed to evaluate the performance of different low-enriched uranium (LEU) fuel design concepts for the conversion of the Transient Reactor Test Facility (TREAT) from its current high-enriched uranium (HEU) fuel. TREAT is an experimental reactor developed to generate high neutron flux transients for the testing of nuclear fuels. The goal of this work was to identify an LEU design which can maintain the performance of the existing HEU core while continuing to operate safely. A wide variety of design options were considered, with a focus on minimizing peak fuel temperatures and optimizing the power coupling between the TREAT core and test samples. Designs were also evaluated to ensure that they provide sufficient reactivity and shutdown margin for each control rod bank. Analyses were performed using the core loading and experiment configuration of historic M8 Power Calibration experiments (M8CAL). The Monte Carlo code MCNP was utilized for steady-state analyses, and transient calculations were performed with the point kinetics code TREKIN. Thermal analyses were performed with the COMSOL multi-physics code. Using the results of this study, a new LEU Baseline design concept is being established, which will be evaluated in detail in a future report.
Calculations vs. measurements of remnant dose rates for SNS spent structures
NASA Astrophysics Data System (ADS)
Popova, I. I.; Gallmeier, F. X.; Trotter, S.; Dayton, M.
2018-06-01
Residual dose rate measurements were conducted on target vessel #13 and proton beam window #5 after extraction from their service locations. These measurements were used to verify the calculation methods of radionuclide inventory assessment that are typically performed for nuclear waste characterization and transportation of these structures. Neutronics analyses for predicting residual dose rates were carried out using the transport code MCNPX and the transmutation code CINDER90. For the transport analyses, a detailed and rigorous geometry model of the structures and their surroundings was applied. The neutronics analyses were carried out using the Bertini and CEM high-energy physics models for simulating particle interactions. The preliminary calculated results were analysed and compared to the measured dose rates, showing overall good agreement within 40% on average.
Parzeller, Markus; Zedler, Barbara
2013-01-01
The article deals with the new regulations in the German Civil Code (BGB) which came into effect in Germany on 26 Feb 2013 as the Patient Rights Act (PatRG). In Part I, the legislative procedure, the treatment contract and the contracting parties (Section 630a Civil Code), the applicable regulations (Section 630b Civil Code) and the obligations to cooperate and inform (Section 630c Civil Code) are discussed and critically analysed.
Implementation of a Blowing Boundary Condition in the LAURA Code
NASA Technical Reports Server (NTRS)
Thompson, Richard A.; Gnoffo, Peter A.
2008-01-01
Preliminary steps toward modeling a coupled ablation problem using a finite-volume Navier-Stokes code (LAURA) are presented in this paper. Implementation of a surface boundary condition with mass transfer (blowing) is described followed by verification and validation through comparisons with analytic results and experimental data. Application of the code to a carbon-nosetip ablation problem is demonstrated and the results are compared with previously published data. It is concluded that the code and coupled procedure are suitable to support further ablation analyses and studies.
NASA Technical Reports Server (NTRS)
Vetter, A. A.; Maxwell, C. D.; Swean, T. F., Jr.; Demetriades, S. T.; Oliver, D. A.; Bangerter, C. D.
1981-01-01
Data from sufficiently well-instrumented, short-duration experiments at AEDC/HPDE, Reynolds Metal Co., and Hercules, Inc., are compared to multidimensional, time-dependent simulations performed with the STD/MHD computer codes. These analyses reveal detailed features of major transient events, severe loss mechanisms, and anomalous MHD behavior. In particular, these analyses predicted higher-than-design voltage drops, Hall voltage overshoots, and asymmetric voltage drops before the experimental data were available. The predictions obtained with these analyses are in excellent agreement with the experimental data, and the failure predictions are consistent with the experiments. The design of large, high-interaction or advanced MHD experiments will require application of sophisticated, detailed and comprehensive computational procedures in order to account for the critical mechanisms which led to the observed behavior in these experiments.
Capabilities overview of the MORET 5 Monte Carlo code
NASA Astrophysics Data System (ADS)
Cochet, B.; Jinaphanh, A.; Heulers, L.; Jacquet, O.
2014-06-01
The MORET code is a simulation tool that solves the transport equation for neutrons using the Monte Carlo method. It allows users to model complex three-dimensional geometrical configurations, describe the materials, and define their own tallies in order to analyse the results. The MORET code was initially designed to perform calculations for criticality safety assessments. New features have been introduced in the MORET 5 code to expand its use to reactor applications. This paper presents an overview of the MORET 5 code capabilities, going through the description of materials, the geometry modelling, the transport simulation and the definition of the outputs.
Redundant Coding in Visual Search Displays: Effects of Shape and Colour.
1997-02-01
results for refining color selection algorithms and for color coding in situations where the gamut of available colors is limited. In a secondary set of analyses, we note large performance differences as a function of target shape.
Status of VICTORIA: NRC peer review and recent code applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bixler, N.E.; Schaperow, J.H.
1997-12-01
VICTORIA is a mechanistic computer code designed to analyze fission product behavior within a nuclear reactor coolant system (RCS) during a severe accident. It provides detailed predictions of the release of radioactive and nonradioactive materials from the reactor core and the transport and deposition of these materials within the RCS. A summary of the results and recommendations of an independent peer review of VICTORIA by the US Nuclear Regulatory Commission (NRC) is presented, along with recent applications of the code. The latter include analyses of a temperature-induced steam generator tube rupture sequence and post-test analyses of the Phebus FPT-1 test. The next planned Phebus test, FPT-4, will focus on fission product releases from a rubble bed, especially those of the less-volatile elements, and on the speciation of the released elements. Pretest analyses using VICTORIA to estimate the magnitude and timing of releases are presented. The predicted release of uranium is a matter of particular importance because of concern about filter plugging during the test.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William
The purpose of the verification project is to (1) establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); (2) evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and (3) develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Harp, D.
2010-12-01
The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as (1) measurement and computational errors, (2) uncertainties in the conceptual model and model-parameter estimates, and (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly developed optimization technique based on the coupling of Particle Swarm and Levenberg-Marquardt optimization methods, which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
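The global-plus-local coupling described above can be sketched as follows; this is not MADS itself, just an illustration in Python/SciPy with a toy model and a crude random sample standing in for a full Particle Swarm search.

```python
# Sketch of a global/local hybrid: a coarse global candidate search seeds a
# Levenberg-Marquardt-style local refinement (SciPy's least_squares, method="lm").
import numpy as np
from scipy.optimize import least_squares

def residuals(theta, x, y):
    a, b = theta
    return y - a * np.exp(-b * x)   # toy exponential-decay model

rng = np.random.default_rng(0)
x = np.linspace(0, 5, 50)
y = 2.0 * np.exp(-0.7 * x) + rng.normal(0, 0.05, x.size)

# Crude "swarm": random candidates; keep the best as the LM starting point.
swarm = rng.uniform([0.1, 0.1], [5.0, 5.0], size=(200, 2))
best = min(swarm, key=lambda t: np.sum(residuals(t, x, y) ** 2))

# Local refinement with a Levenberg-Marquardt-type least-squares solver.
fit = least_squares(residuals, best, args=(x, y), method="lm")
print("refined parameters:", fit.x)  # should approach (2.0, 0.7)
```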
[Death certificate data in France: Production process and main types of analyses].
Rey, G
2016-10-01
Mortality data, by the unambiguity of their definition and understanding by all stakeholders, and the completeness of registration, are a cornerstone of public health statistics in France and in most industrialized countries. This article describes the data production process and the main types of possible analyses. Data production comprises different stages: death certification by a medical doctor on paper or in electronic format (using a web application), data transmission to Inserm, and capture and coding of information. The coding of the information follows the WHO recommendations of the International Classification of Diseases ([ICD], 10th revision, used since 2000). It is carried out using automatic coding software, called Iris, developed by an international consortium. The coding aims, first, at assigning an ICD code to all nosologic entities encountered on the certificate, and then at selecting the underlying cause of death. The latter is the main information used for statistical analyses. Three main types of analysis emerge in the literature: the exploitation of data from the death certificate only, ecological analyses (studies of associations between variables measured across groups) and analyses of data individually linked to other databases. Many public health issues can be addressed with these various analyses. Several developments in the production process are being implemented: the deployment of electronic certification, increased automation of the processing of death certificate information, and durable and complete record linkage with health insurance and hospitalisation data. These developments could soon substantially expand the scope of possible uses of cause-of-death data. Copyright © 2016 Société Nationale Française de Médecine Interne (SNFMI). Published by Elsevier SAS. All rights reserved.
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most updated information and newly added models.
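The SOCR tools are Java applets; for readers who want script-level counterparts of the same comparisons, a Python/SciPy sketch (with made-up data) follows.

```python
# Illustrative counterparts of the parametric and non-parametric
# comparisons SOCR Analyses implements (data fabricated for the example).
from scipy import stats

a = [5.1, 4.9, 6.0, 5.5, 5.7]
b = [4.2, 4.8, 5.0, 4.4, 4.6]
c = [6.1, 5.9, 6.4, 6.0, 6.2]

t, p = stats.ttest_ind(a, b)      # two-sample t-test (parametric)
z, pz = stats.ranksums(a, b)      # Wilcoxon rank-sum (non-parametric)
h, ph = stats.kruskal(a, b, c)    # Kruskal-Wallis across three groups

print(f"t-test:         t={t:.2f}, p={p:.3f}")
print(f"Wilcoxon:       z={z:.2f}, p={pz:.3f}")
print(f"Kruskal-Wallis: H={h:.2f}, p={ph:.3f}")
```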
Ex-Vessel Core Melt Modeling Comparison between MELTSPREAD-CORQUENCH and MELCOR 2.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robb, Kevin R.; Farmer, Mitchell; Francis, Matthew W.
System-level code analyses by both United States and international researchers predict major core melting, bottom head failure, and corium-concrete interaction for Fukushima Daiichi Unit 1 (1F1). Although system codes such as MELCOR and MAAP are capable of capturing a wide range of accident phenomena, they currently do not contain detailed models for evaluating some ex-vessel core melt behavior. However, specialized codes containing more detailed modeling are available for melt spreading, such as MELTSPREAD, as well as for long-term molten corium-concrete interaction (MCCI) and debris coolability, such as CORQUENCH. In a preceding study, Enhanced Ex-Vessel Analysis for Fukushima Daiichi Unit 1: Melt Spreading and Core-Concrete Interaction Analyses with MELTSPREAD and CORQUENCH, the MELTSPREAD-CORQUENCH codes predicted that the 1F1 core melt readily cooled, in contrast to predictions by MELCOR. The user community has taken notice and is in the process of updating their system codes, specifically MAAP and MELCOR, to improve and reduce conservatism in their ex-vessel core melt models. This report investigates why the MELCOR v2.1 code, compared to the MELTSPREAD and CORQUENCH 3.03 codes, yields differing predictions of ex-vessel melt progression. To accomplish this, the differences in the treatment of the ex-vessel melt with respect to melt spreading and long-term coolability are examined. The differences in modeling approaches are summarized, and a comparison of example code predictions is provided.
Benchmarking the Multidimensional Stellar Implicit Code MUSIC
NASA Astrophysics Data System (ADS)
Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.
2017-04-01
We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton-Krylov method. A physics-based preconditioning technique, which can be adjusted to target varying physics, is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities and the decay of the Taylor-Green vortex. Additionally, we show a test of hydrostatic equilibrium in a stellar environment that is dominated by radiative effects. In this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed and analysed. Each of these test cases was analysed with a simple scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes including ATHENA and the PENCIL code. MUSIC is able to reproduce both the behaviour of established and widely used codes and the results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.
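A sketch of the kind of simple scalar diagnostic such a comparison can use: the volume-averaged kinetic energy of the Taylor-Green vortex, whose decay can be tracked and compared between codes. This example only evaluates the analytic initial condition.

```python
# Volume-averaged kinetic energy as a scalar code-comparison diagnostic.
import numpy as np

def mean_kinetic_energy(u, v, w, rho=1.0):
    """Volume-averaged kinetic energy on a uniform periodic grid."""
    return 0.5 * rho * np.mean(u**2 + v**2 + w**2)

# Initial Taylor-Green velocity field on a small uniform grid.
n = 32
x = y = z = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y, Z = np.meshgrid(x, y, z, indexing="ij")
u = np.sin(X) * np.cos(Y) * np.cos(Z)
v = -np.cos(X) * np.sin(Y) * np.cos(Z)
w = np.zeros_like(u)

print(f"E_kin(t=0) = {mean_kinetic_energy(u, v, w):.4f}")  # ~0.125 analytically
```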
Comparison of Space Shuttle Hot Gas Manifold analysis to air flow data
NASA Technical Reports Server (NTRS)
Mcconnaughey, P. K.
1988-01-01
This paper summarizes several recent analyses of the Space Shuttle Main Engine Hot Gas Manifold and compares predicted flow environments to air flow data. Codes used in these analyses include INS3D, PAGE, PHOENICS, and VAST. Both laminar (Re = 250, M = 0.30) and turbulent (Re = 1.9 million, M = 0.30) results are discussed, with the latter being compared to data for system losses, outer wall static pressures, and manifold exit Mach number profiles. Comparison of predicted results for the turbulent case to air flow data shows that the analysis using INS3D predicted system losses within 1 percent error, while the PHOENICS, PAGE, and VAST codes erred by 31, 35, and 47 percent, respectively. The INS3D, PHOENICS, and PAGE codes did a reasonable job of predicting outer wall static pressure, while the PHOENICS code predicted exit Mach number profiles with acceptable accuracy. INS3D was approximately an order of magnitude more efficient than the other codes in terms of code speed and memory requirements. In general, it is seen that complex internal flows in manifold-like geometries can be predicted with a limited degree of confidence, and further development is necessary to improve both efficiency and accuracy of codes if they are to be used as design tools for complex three-dimensional geometries.
Crash Outcome Data Evaluation System (CODES) Project Safety Belt and Helmet Analyses
DOT National Transportation Integrated Search
1996-02-15
Analyses of the benefits of safety belt and helmet use were undertaken for a report that was sent to Congress in February 1996. NHTSA awarded grants to link crash and injury state data and perform the analyses upon which the report was based. The...
Introduction of the ASP3D Computer Program for Unsteady Aerodynamic and Aeroelastic Analyses
NASA Technical Reports Server (NTRS)
Batina, John T.
2005-01-01
A new computer program has been developed called ASP3D (Advanced Small Perturbation 3D), which solves the small perturbation potential flow equation in an advanced form including mass-consistent surface and trailing wake boundary conditions, and entropy, vorticity, and viscous effects. The purpose of the program is for unsteady aerodynamic and aeroelastic analyses, especially in the nonlinear transonic flight regime. The program exploits the simplicity of stationary Cartesian meshes with the movement or deformation of the configuration under consideration incorporated into the solution algorithm through a planar surface boundary condition. The new ASP3D code is the result of a decade of developmental work on improvements to the small perturbation formulation, performed while the author was employed as a Senior Research Scientist in the Configuration Aerodynamics Branch at the NASA Langley Research Center. The ASP3D code is a significant improvement to the state-of-the-art for transonic aeroelastic analyses over the CAP-TSD code (Computational Aeroelasticity Program Transonic Small Disturbance), which was developed principally by the author in the mid-1980s. The author is in a unique position as the developer of both computer programs to compare, contrast, and ultimately make conclusions regarding the underlying formulations and utility of each code. The paper describes the salient features of the ASP3D code including the rationale for improvements in comparison with CAP-TSD. Numerous results are presented to demonstrate the ASP3D capability. The general conclusion is that the new ASP3D capability is superior to the older CAP-TSD code because of the myriad improvements developed and incorporated.
NASA Technical Reports Server (NTRS)
1991-01-01
In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.
Instrumentation for Verification of Bomb Damage Repair Computer Code.
1981-09-01
record the data, a conventional 14-track FM analog tape recorder was retained. The unknown factors of signal duration, test duration, and signal... Kirtland Air Force Base computer centers for more detailed analyses. In addition to the analog recorder, signal conditioning equipment and amplifiers were... necessary to allow high quality data to be recorded. An Interrange Instrumentation Group (IRIG) code generator/reader placed a coded signal on the tape
Dynamics of face and annular seals with two-phase flow
NASA Technical Reports Server (NTRS)
Hughes, William F.; Basu, Prithwish; Beatty, Paul A.; Beeler, Richard M.; Lau, Stephen
1988-01-01
A detailed study was made of face and annular seals under conditions where boiling, i.e., phase change of the leaking fluid, occurs within the seal. Many seals operate in this mode because of flashing due to pressure drop and/or heat input from frictional heating. Some of the distinctive behavior characteristics of two-phase seals are discussed, particularly their axial stability. The main conclusion is that seals with two-phase flow may be unstable if improperly balanced. Detailed theoretical analyses of low-leakage (laminar) and high-leakage (turbulent) seals are presented along with computer codes, parametric studies, and in particular a simplified PC-based code that allows for rapid performance prediction: calculations of stiffness coefficients, temperature and pressure distributions, and leakage rates for parallel and coned face seals. A simplified combined computer code for performance prediction over the laminar and turbulent ranges of a two-phase flow is described and documented. The analyses, results, and computer codes are summarized.
Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine
2014-03-01
Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A hazard and operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.
Posttest calculation of the PBF LOC-11B and LOC-11C experiments using RELAP4/MOD6 [PWR]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendrix, C.E.
Comparisons between RELAP4/MOD6, Update 4 code calculations and measured experimental data are presented for the PBF LOC-11C and LOC-11B experiments. Independent code verification techniques are now being developed, and this study represents a preliminary effort applying structured criteria for developing computer models, selecting code input, and performing base-run analyses. Where deficiencies are indicated in the base-case representation of the experiment, methods of code and criteria improvement are developed and appropriate recommendations are made.
Radiation Transport Tools for Space Applications: A Review
NASA Technical Reports Server (NTRS)
Jun, Insoo; Evans, Robin; Cherng, Michael; Kang, Shawn
2008-01-01
This slide presentation contains a brief discussion of the nuclear transport codes widely used in the space radiation community for shielding and scientific analyses. Seven radiation transport codes are addressed. The two general methods (i.e., the Monte Carlo method and the deterministic method) are briefly reviewed.
Three-Dimensional Numerical Analyses of Earth Penetration Dynamics
1979-01-31
Lagrangian formulation based on the HEMP method and has been adapted and validated for treatment of normal-incidence (axisymmetric) impact and... code, is a detailed analysis of the structural response of the EPW. This analysis is generated using a nonlinear dynamic, elastic-plastic finite element... based on the HEMP scheme. Thus, the code has the same material modeling capabilities and abilities to track large scale motion found in the WAVE-L code
ERIC Educational Resources Information Center
Hau, Goh Bak; Siraj, Saedah; Alias, Norlidah; Rauf, Rose Amnah Abd.; Zakaria, Abd. Razak; Darusalam, Ghazali
2013-01-01
This study provides a content analysis of selected articles in the field of QR code and its application in educational context that were published in journals and proceedings of international conferences and workshops from 2006 to 2011. These articles were cross analysed by published years, journal, and research topics. Further analysis was…
Guo, Jun; Zhou, Yuan; Cheng, Yafen; Fang, Weiwei; Hu, Gang; Wei, Jie; Lin, Yajun; Man, Yong; Guo, Lixin; Sun, Mingxiao; Cui, Qinghua; Li, Jian
2018-01-01
Recent studies have suggested that changes in non-coding RNA play a key role in the progression of non-alcoholic fatty liver disease (NAFLD). Metformin is now recommended and effective for the treatment of NAFLD. We hope the current analyses of the non-coding RNA transcriptome will provide a better presentation of the potential roles of mRNAs and long non-coding RNAs (lncRNAs) that underlie NAFLD and metformin intervention. The present study mainly analysed changes in the coding transcriptome and non-coding RNAs after the application of a five-week metformin intervention. Liver samples from three groups of mice were harvested for transcriptome profiling, which covered mRNA, lncRNA, microRNA (miRNA) and circular RNA (circRNA), using a microarray technique. A systematic alleviation of high-fat diet (HFD)-induced transcriptome alterations by metformin was observed. The metformin treatment largely reversed the correlations with diabetes-related pathways. Our analysis also suggested interaction networks between differentially expressed lncRNAs and known hepatic disease genes, and interactions between circRNAs and their disease-related miRNA partners. Eight HFD-responsive lncRNAs and three metformin-responsive lncRNAs were noted due to their widespread associations with disease genes. Moreover, seven miRNAs that interacted with multiple differentially expressed circRNAs were highlighted because they were likely to be associated with metabolic or liver diseases. The present study identified novel changes in the coding transcriptome and non-coding RNAs in the livers of NAFLD mice after metformin treatment that might shed light on the underlying mechanism by which metformin impedes the progression of NAFLD. © 2018 The Author(s). Published by S. Karger AG, Basel.
Psychometric Properties of the System for Coding Couples’ Interactions in Therapy - Alcohol
Owens, Mandy D.; McCrady, Barbara S.; Borders, Adrienne Z.; Brovko, Julie M.; Pearson, Matthew R.
2014-01-01
Few systems are available for coding in-session behaviors for couples in therapy. Alcohol Behavior Couples Therapy (ABCT) is an empirically supported treatment, but little is known about its mechanisms of behavior change. In the current study, an adapted version of the Motivational Interviewing for Significant Others coding system was developed into the System for Coding Couples’ Interactions in Therapy – Alcohol (SCCIT-A), which was used to code couples’ interactions and behaviors during ABCT. Results showed good inter-rater reliability of the SCCIT-A and provided evidence that the SCCIT-A may be a promising measure for understanding couples in therapy. A three-factor model of the SCCIT-A (Positive, Negative, and Change Talk/Counter-Change Talk) was examined using a confirmatory factor analysis, but model fit was poor. Due to the poor model fit, ratios were computed for Positive/Negative ratings and for Change Talk/Counter-Change Talk codes based on previous research in the couples and Motivational Interviewing literature. Post-hoc analyses examined correlations between specific SCCIT-A codes and baseline characteristics and indicated some concurrent validity. Correlations were run between ratios and baseline characteristics; ratios may be an alternative to using the factors from the SCCIT-A. Reliability and validity analyses suggest that the SCCIT-A has the potential to be a useful measure for coding in-session behaviors of both partners in couples therapy and could be used to identify mechanisms of behavior change for ABCT. Additional research is needed to improve the reliability of some codes and to further develop the SCCIT-A and other measures of couples’ interactions in therapy. PMID:25528049
Discourse Matrix in Filipino-English Code-Switching: Students' Attitudes and Feelings
ERIC Educational Resources Information Center
dela Rosa, Rona
2016-01-01
Undeniably, one language may be considered more valuable than other languages. Hence, most bilingual communities suffer from language imbalances. The present study attempts to identify the factors of code-switching during classroom presentations. Its functions were identified through analysing conversational contexts in which it occurs. Through…
A Coding Scheme to Analyse the Online Asynchronous Discussion Forums of University Students
ERIC Educational Resources Information Center
Biasutti, Michele
2017-01-01
The current study describes the development of a content analysis coding scheme to examine transcripts of online asynchronous discussion groups in higher education. The theoretical framework comprises the theories regarding knowledge construction in computer-supported collaborative learning (CSCL) based on a sociocultural perspective. The coding…
Langner, Ingo; Mikolajczyk, Rafael; Garbe, Edeltraut
2011-08-17
Health insurance claims data are increasingly used for health services research in Germany. Hospital diagnoses in these data are coded according to the International Classification of Diseases, German modification (ICD-10-GM). Due to the historical division into West and East Germany, different coding practices might persist in both former parts. Additionally, the introduction of Diagnosis Related Groups (DRGs) in Germany in 2003/2004 might have changed the coding. The aim of this study was to investigate regional and temporal variations in the coding of hospitalisation diagnoses in Germany. We analysed hospitalisation diagnoses for oesophageal bleeding (OB) and upper gastrointestinal bleeding (UGIB) from the official German Hospital Statistics provided by the Federal Statistical Office. Bleeding diagnoses were classified as "specific" (origin of bleeding provided) or "unspecific" (origin of bleeding not provided) coding. We studied regional (former East versus West Germany) differences in the incidence of hospitalisations with specific or unspecific coding for OB and UGIB and temporal variations between 2000 and 2005. For each year, incidence ratios of hospitalisations for former East versus West Germany were estimated with log-linear regression models adjusting for age, gender and population density. Significant differences in specific and unspecific coding between East and West Germany and over time were found for both OB and UGIB hospitalisation diagnoses. For example, in 2002, incidence ratios of hospitalisations for East versus West Germany were 1.24 (95% CI 1.16-1.32) for specific and 0.67 (95% CI 0.60-0.74) for unspecific OB diagnoses, and 1.43 (95% CI 1.36-1.51) for specific and 0.83 (95% CI 0.80-0.87) for unspecific UGIB diagnoses. Regional differences nearly disappeared and time trends were less marked when using combined specific and unspecific diagnoses of OB or UGIB, respectively. During the study period, there were substantial regional and temporal variations in the coding of OB and UGIB diagnoses in hospitalised patients. Possible explanations for the observed regional variations are different coding preferences, further influenced by changes in coding and reimbursement rules. Analysing groups of diagnoses including specific and unspecific codes reduces the influence of varying coding practices.
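The incidence-ratio estimation described above can be sketched with a log-linear (Poisson) model; the data below are synthetic, not the German Hospital Statistics, and the covariates are reduced to a single age-group factor with a population offset.

```python
# Synthetic-data sketch of an East-vs-West incidence ratio from a
# log-linear (Poisson) model with a population offset, via statsmodels.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "east": np.repeat([0, 1], 50),
    "age_group": rng.integers(0, 5, 100),
    "population": rng.integers(50_000, 500_000, 100),
})
base_rate = 1e-4 * np.exp(0.2 * df["east"])          # true IR = exp(0.2) ~ 1.22
df["cases"] = rng.poisson(base_rate * df["population"])

model = smf.glm("cases ~ east + C(age_group)", data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["population"])).fit()
print(f"Incidence ratio (East vs West): {np.exp(model.params['east']):.2f}")
```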
NASA Astrophysics Data System (ADS)
Uwaba, Tomoyuki; Ito, Masahiro; Nemoto, Junichi; Ichikawa, Shoichi; Katsuyama, Kozo
2014-09-01
The BAMBOO computer code was verified against results of an out-of-pile bundle compression test on large diameter pin bundles deformed under the bundle-duct interaction (BDI) condition. The pin diameters of the examined test bundles were 8.5 mm and 10.4 mm, which are targeted as preliminary fuel pin diameters for the upgraded core of the prototype fast breeder reactor (FBR) and for the demonstration and commercial FBRs studied in the FaCT project. In the bundle compression test, bundle cross-sectional views were obtained from X-ray computer tomography (CT) images, and local parameters of bundle deformation such as pin-to-duct and pin-to-pin clearances were measured by CT image analyses. In the verification, calculation results of bundle deformation obtained by the BAMBOO code analyses were compared with the experimental results from the CT image analyses. The comparison showed that the BAMBOO code reasonably predicts the deformation of large diameter pin bundles under the BDI condition by assuming that pin bowing and cladding oval distortion are the major deformation mechanisms, the same as in the case of small diameter pin bundles. In addition, the BAMBOO analysis results confirmed that cladding oval distortion effectively suppresses BDI in large diameter pin bundles as well as in small diameter pin bundles.
Continuous Codes and Standards Improvement (CCSI)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rivkin, Carl H; Burgess, Robert M; Buttner, William J
2015-10-21
As of 2014, the majority of the codes and standards required to initially deploy hydrogen technologies infrastructure in the United States have been promulgated. These codes and standards will be field tested through their application to actual hydrogen technologies projects. Continuous codes and standards improvement (CCSI) is a process of identifying code issues that arise during project deployment and then developing code solutions to these issues. These solutions would typically be proposed amendments to codes and standards. The process is continuous because, as technology and the state of safety knowledge develop, there will be a need to monitor the application of codes and standards and improve them based on information gathered during their application. This paper will discuss code issues that have surfaced through hydrogen technologies infrastructure project deployment and potential code changes that would address these issues. The issues that this paper will address include (1) setback distances for bulk hydrogen storage, (2) code-mandated hazard analyses, (3) sensor placement and communication, (4) the use of approved equipment, and (5) system monitoring and maintenance requirements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feltus, M.A.
1989-11-01
The operation of a nuclear power plant must be regularly supported by various reactor dynamics and thermal-hydraulic analyses, which may include final safety analysis report (FSAR) design-basis calculations, and conservative and best-estimate analyses. The development and improvement of computer codes and analysis methodologies provide many advantages, including the ability to evaluate the effect of modeling simplifications and assumptions made in previous reactor kinetics and thermal-hydraulic calculations. This paper describes the results of using the RETRAN, MCPWR, and STAR codes in a tandem, predictive-corrective manner for three pressurized water reactor (PWR) transients: (a) loss of feedwater (LOF) anticipated transient without scram (ATWS), (b) station blackout ATWS, and (c) loss of total reactor coolant system (RCS) flow with a scram.
NASA Technical Reports Server (NTRS)
Stricker, L. T.
1975-01-01
The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses, and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. It is indicated that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu
2016-03-01
A novel lower-complexity construction scheme for quasi-cyclic low-density parity-check (QC-LDPC) codes for optical transmission systems is proposed, based on the structure of the parity-check matrix for the Richardson-Urbanke (RU) algorithm. Furthermore, a novel irregular QC-LDPC(4288, 4020) code with a high code rate of 0.937 is constructed by this scheme. Simulation analyses show that the net coding gain (NCG) of the novel irregular QC-LDPC(4288, 4020) code is 2.08 dB, 1.25 dB, and 0.29 dB higher than those of the classic RS(255, 239) code, the LDPC(32640, 30592) code, and the irregular QC-LDPC(3843, 3603) code, respectively, at a bit error rate (BER) of 10^-6. The irregular QC-LDPC(4288, 4020) code also has lower encoding/decoding complexity than the LDPC(32640, 30592) code and the irregular QC-LDPC(3843, 3603) code, making it well suited to the increasing development requirements of high-speed optical transmission systems.
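For readers unfamiliar with the quasi-cyclic structure, the sketch below illustrates the general QC-LDPC idea: a small base matrix of shift values is expanded into a full parity-check matrix built from circulant permutation blocks. The base matrix, block size, and helper names are hypothetical illustrations, not the authors' specific RU-based construction.

    import numpy as np

    def circulant_permutation(size, shift):
        # Identity matrix cyclically shifted by `shift` columns.
        return np.roll(np.eye(size, dtype=int), shift, axis=1)

    def qc_ldpc_parity_check(base, block):
        # Expand a base matrix of shifts into H; -1 marks an all-zero block.
        rows, cols = base.shape
        H = np.zeros((rows * block, cols * block), dtype=int)
        for i in range(rows):
            for j in range(cols):
                if base[i, j] >= 0:
                    H[i*block:(i+1)*block, j*block:(j+1)*block] = \
                        circulant_permutation(block, base[i, j])
        return H

    B = np.array([[0, 1, -1, 2],
                  [2, -1, 0, 1]])      # toy 2x4 base matrix
    H = qc_ldpc_parity_check(B, 4)
    print(H.shape)                     # (8, 16): rate 1/2 if H is full rank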
Validation of ICD-9-CM coding algorithm for improved identification of hypoglycemia visits.
Ginde, Adit A; Blanc, Phillip G; Lieberman, Rebecca M; Camargo, Carlos A
2008-04-01
Accurate identification of hypoglycemia cases by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes will help to describe epidemiology, monitor trends, and propose interventions for this important complication in patients with diabetes. Prior hypoglycemia studies utilized incomplete search strategies and may be methodologically flawed. We sought to validate a new ICD-9-CM coding algorithm for accurate identification of hypoglycemia visits. This was a multicenter, retrospective cohort study using a structured medical record review at three academic emergency departments from July 1, 2005 to June 30, 2006. We prospectively derived a coding algorithm to identify hypoglycemia visits using ICD-9-CM codes (250.3, 250.8, 251.0, 251.1, 251.2, 270.3, 775.0, 775.6, and 962.3). We confirmed, by chart review, hypoglycemia cases identified by the candidate ICD-9-CM codes during the study period. The case definition for hypoglycemia was a documented blood glucose <3.9 mmol/l or an emergency physician charted diagnosis of hypoglycemia. We evaluated individual components and calculated the positive predictive value. We reviewed 636 charts identified by the candidate ICD-9-CM codes and confirmed 436 (64%) cases of hypoglycemia by chart review. Diabetes with other specified manifestations (250.8), often excluded in prior hypoglycemia analyses, identified 83% of hypoglycemia visits, and unspecified hypoglycemia (251.2) identified 13% of hypoglycemia visits. The absence of any predetermined co-diagnosis codes improved the positive predictive value of code 250.8 from 62% to 92%, while excluding only 10 (2%) true hypoglycemia visits. Although prior analyses included only the first-listed ICD-9 code, more than one-quarter of identified hypoglycemia visits were outside this primary diagnosis field. Overall, the proposed algorithm had 89% positive predictive value (95% confidence interval, 86-92) for detecting hypoglycemia visits. The proposed algorithm improves on prior strategies to identify hypoglycemia visits in administrative data sets and will enhance the ability to study the epidemiology and design interventions for this important complication of diabetes care.
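As a concrete illustration of the algorithm's structure, the sketch below computes the positive predictive value of a candidate-code screen against chart-review confirmation. The data and function names are hypothetical; the code list is taken from the abstract.

    # Candidate ICD-9-CM codes from the proposed algorithm
    CANDIDATE_CODES = {"250.3", "250.8", "251.0", "251.1", "251.2",
                       "270.3", "775.0", "775.6", "962.3"}

    def positive_predictive_value(visits):
        # visits: list of (assigned_codes, confirmed_by_chart_review) pairs.
        # PPV = confirmed cases / all visits flagged by the candidate codes.
        flagged = [confirmed for codes, confirmed in visits
                   if codes & CANDIDATE_CODES]
        return sum(flagged) / len(flagged) if flagged else float("nan")

    # Hypothetical example: three flagged visits, two confirmed by review
    example = [({"250.8"}, True), ({"251.2"}, True),
               ({"250.8", "428.0"}, False)]
    print(f"PPV = {positive_predictive_value(example):.2f}")  # PPV = 0.67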
A Coding System for Analysing a Spoken Text Database.
ERIC Educational Resources Information Center
Cutting, Joan
1994-01-01
This paper describes a coding system devised to analyze conversations of graduate students in applied linguistics at Edinburgh University. The system was devised to test the hypothesis that, as shared knowledge among conversation participants grows, the text of in-group members becomes denser in cues than that of strangers. The informal…
Code-Switching and Vernacular Support: An Early Middle English Case Study
ERIC Educational Resources Information Center
Skaffari, Janne
2016-01-01
In the multilingual history of England, the period following the Norman Conquest in 1066 is a particularly intriguing phase, but its code-switching patterns have so far received little attention. The present article describes and analyses the multilingual practices evinced in London, British Library, MS Stowe 34, containing one instructional prose…
Planned Comparisons as Better Alternatives to ANOVA Omnibus Tests.
ERIC Educational Resources Information Center
Benton, Roberta L.
Analyses of data are presented to illustrate the advantages of using a priori or planned comparisons rather than omnibus analysis of variance (ANOVA) tests followed by post hoc or posteriori testing. The two types of planned comparisons considered are planned orthogonal non-trend coding contrasts and orthogonal polynomial or trend contrast coding.…
Crashdynamics with DYNA3D: Capabilities and research directions
NASA Technical Reports Server (NTRS)
Whirley, Robert G.; Engelmann, Bruce E.
1993-01-01
The application of the explicit nonlinear finite element analysis code DYNA3D to crashworthiness problems is discussed. Emphasized in the first part of this work are the most important capabilities of an explicit code for crashworthiness analyses. The areas with significant research promise for the computational simulation of crash events are then addressed.
English-Afrikaans Intrasentential Code Switching: Testing a Feature Checking Account
ERIC Educational Resources Information Center
van Dulm, Ondene
2009-01-01
The work presented here aims to account for the structure of intrasentential code switching between English and Afrikaans within the framework of feature checking theory, a theory associated with minimalist syntax. Six constructions in which verb position differs between English and Afrikaans were analysed in terms of differences in the strength…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, B.E.
1979-10-01
GRAPE is a display program for three-dimensional polygon and polyhedral models. It can produce line-drawing and continuous-tone black-and-white or color images in still-frame or movie mode. The code was written specifically to be a post-processor for finite element and finite difference analyses. It runs on the CDC 7600 computer and is compiled with the LLL FTN system. The allocation of storage is dynamic. There are presently three data paths into the code. The first is the binary interface from the analysis codes; this interface, along with the other databases, is described. The second data path is the SAMPP format, and the last is the MOVIE format. The code structure is described first; then the commands are discussed in general terms to try to give the user some feel for what they do. The next section deals with the exact format of the commands by overlay. Then examples are given and discussed. Next, the various output options are covered. 57 figures. (RWR)
Common radiation analysis model for 75,000 pound thrust NERVA engine (1137400E)
NASA Technical Reports Server (NTRS)
Warman, E. A.; Lindsey, B. A.
1972-01-01
The mathematical model and sources of radiation used for the radiation analysis and shielding activities in support of the design of the 1137400E version of the 75,000-lb-thrust NERVA engine are presented. The nuclear subsystem (NSS) and non-nuclear components are discussed. The geometrical model for the NSS is two-dimensional, as required for the DOT discrete ordinates computer code or for an azimuthally symmetrical three-dimensional Point Kernel or Monte Carlo code. The geometrical model for the non-nuclear components is three-dimensional in the FASTER geometry format. This geometry routine is inherent in the ANSC versions of the QAD and GGG Point Kernel programs and the COHORT Monte Carlo program. Data are included pertaining to a pressure vessel surface radiation source data tape which has been used as the basis for starting ANSC analyses with the DASH code to bridge into the COHORT Monte Carlo code using the WANL-supplied DOT angular flux leakage data. In addition to the model descriptions and sources of radiation, the methods of analyses are briefly described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Petrie, L.M.; Westfall, R.M.
SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1 for the control module documentation; Volume 2 for functional module documentation; and Volume 3 for documentation of the data libraries and subroutine libraries.
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (simple linear regression, multiple linear regression, one-way and two-way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank-sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis-test models, such as contingency tables, Friedman's test, and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), in the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model, as well as general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models. PMID:21546994
Sweeney, Angela; Greenwood, Kathryn E; Williams, Sally; Wykes, Til; Rose, Diana S
2013-12-01
Health research is frequently conducted in multi-disciplinary teams, with these teams increasingly including service user researchers. Whilst it is common for service user researchers to be involved in data collection--most typically interviewing other service users--it is less common for them to be involved in data analysis and interpretation. This means that a unique and significant perspective on the data is absent. This study uses an empirical report of a study on Cognitive Behavioural Therapy for psychosis (CBTp) to demonstrate the value of multiple coding in enabling service users' voices to be heard in team-based qualitative data analysis. The CBTp study employed multiple coding to analyse service users' discussions of CBTp from the perspectives of a service user researcher, a clinical researcher, and a psychology assistant. Multiple coding was selected to enable multiple perspectives to analyse and interpret data, to understand and explore differences, and to build multi-disciplinary consensus. Multiple coding enabled the team to understand where our views were commensurate and incommensurate and to discuss and debate differences. Through the process of multiple coding, we were able to build strong consensus about the data from multiple perspectives, including that of the service user researcher. Multiple coding is an important method for understanding and exploring multiple perspectives on data and building team consensus. This can be contrasted with inter-rater reliability, which is only appropriate in limited circumstances. We conclude that multiple coding is an appropriate and important means of hearing service users' voices in qualitative data analysis. © 2012 John Wiley & Sons Ltd.
Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W.; Imel, Zac E.; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C.
2014-01-01
The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. PMID:25242192
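To make the contrast between utterance-level and tally-based reliability concrete, the sketch below computes utterance-level agreement with Cohen's kappa and shows how session tallies can mask utterance-level disagreement. The rater data and code labels are hypothetical, not drawn from the MISC 2.1 itself.

    from collections import Counter
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical behavior codes from two raters for the same utterances
    rater_a = ["REF", "Q", "REF", "GI", "MIA", "Q", "REF", "GI"]
    rater_b = ["REF", "Q", "GI",  "GI", "MIA", "Q", "REF", "MIA"]

    print(cohen_kappa_score(rater_a, rater_b))  # utterance-level agreement

    # Session-level tallies (counts per code) can look similar even when
    # individual utterances disagree, which is why tally-based reliability
    # estimates tend to come out higher.
    print(Counter(rater_a))
    print(Counter(rater_b))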
Propellant Chemistry for CFD Applications
NASA Technical Reports Server (NTRS)
Farmer, R. C.; Anderson, P. G.; Cheng, Gary C.
1996-01-01
Current concepts for reusable launch vehicle design have created renewed interest in the use of RP-1 fuels for high pressure and tri-propellant propulsion systems. Such designs require the use of an analytical technology that accurately accounts for the effects of real fluid properties, combustion of large hydrocarbon fuel molecules, and the possibility of soot formation. These effects are inadequately treated in current computational fluid dynamic (CFD) codes used for propulsion system analyses. The objective of this investigation is to provide an accurate analytical description of hydrocarbon combustion thermodynamics and kinetics that is sufficiently computationally efficient to be a practical design tool when used with CFD codes such as the FDNS code. A rigorous description of real fluid properties for RP-1 and its combustion products will be derived from the literature and from experiments conducted in this investigation. Upon the establishment of such a description, the fluid description will be simplified by using the minimum of empiricism necessary to maintain accurate combustion analyses, and such empirical models will be included in an appropriate CFD code. An additional benefit of this approach is that the real fluid properties analysis simplifies the introduction of the effects of droplet sprays into the combustion model. Typical species compositions of RP-1 have been identified, surrogate fuels have been established for analyses, and combustion and sooting reaction kinetics models have been developed. Methods for predicting the necessary real fluid properties have been developed and essential experiments have been designed. Verification studies are in progress, and preliminary results from these studies will be presented. The approach has been determined to be feasible, and upon its completion the required methodology for accurate performance and heat transfer CFD analyses for high pressure, tri-propellant propulsion systems will be available.
SPIDERMAN: Fast code to simulate secondary transits and phase curves
NASA Astrophysics Data System (ADS)
Louden, Tom; Kreidberg, Laura
2017-11-01
SPIDERMAN calculates exoplanet phase curves and secondary eclipses with arbitrary surface brightness distributions in two dimensions. The code uses a geometrical algorithm to solve exactly for the area of sections of the disc of the planet that are occulted by the star. Approximately 1000 models can be generated per second in typical use, which makes Markov Chain Monte Carlo analyses practicable. The code is modular and allows comparison of the effects of multiple different brightness distributions for a dataset.
Design geometry and design/off-design performance computer codes for compressors and turbines
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1995-01-01
This report summarizes some NASA Lewis (i.e., government owned) computer codes capable of being used for airbreathing propulsion system studies to determine the design geometry and to predict the design/off-design performance of compressors and turbines. These are not CFD codes; velocity-diagram energy and continuity computations are performed fore and aft of the blade rows using meanline, spanline, or streamline analyses. Losses are provided by empirical methods. Both axial-flow and radial-flow configurations are included.
1982-11-01
Service code exceeded operational code in the ratio of 10:1. No redundant information was required. It was modular. Internal parts of the program … to NASA's analyses. We were to try to find an existing finite element program of a quality that would be worth recommending to all NASA Centers. We … Distinct manuals were published for users, programmers, theory, and demonstration problems. It abounded with service code to provide user conveniences.
Sierra/Solid Mechanics 4.48 User's Guide.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merewether, Mark Thomas; Crane, Nathan K; de Frias, Gabriel Jose
Sierra/SolidMechanics (Sierra/SM) is a Lagrangian, three-dimensional code for finite element analysis of solids and structures. It provides capabilities for explicit dynamic, implicit quasistatic and dynamic analyses. The explicit dynamics capabilities allow for the efficient and robust solution of models with extensive contact subjected to large, suddenly applied loads. For implicit problems, Sierra/SM uses a multi-level iterative solver, which enables it to effectively solve problems with large deformations, nonlinear material behavior, and contact. Sierra/SM has a versatile library of continuum and structural elements, and a large library of material models. The code is written for parallel computing environments, enabling scalable solutions of extremely large problems for both implicit and explicit analyses. It is built on the SIERRA Framework, which facilitates coupling with other SIERRA mechanics codes. This document describes the functionality and input syntax for Sierra/SM.
Analysis of the new code stroke protocol in Asturias after one year. Experience at one hospital.
García-Cabo, C; Benavente, L; Martínez-Ramos, J; Pérez-Álvarez, Á; Trigo, A; Calleja, S
2018-03-01
Prehospital code stroke (CS) systems have proven effective for improving access to specialised medical care in acute stroke cases. They also improve the prognosis of this disease, which is one of the leading causes of death and disability in our setting. The aim of this study is to analyse results one year after implementation of the new code stroke protocol at one hospital in Asturias. We prospectively included patients who were admitted to our tertiary care centre under the code stroke protocol over a period of one year. We analysed 363 patients. Mean age was 69 years, and 54% of the cases were men. During the same period in the previous year, there were 236 non-hospital CS activations. One hundred forty-seven recanalisation treatments were performed (66 fibrinolysis and 81 mechanical thrombectomies or combined treatments), representing a 25% increase over the previous year. Recent advances in the management of acute stroke call for coordinated code stroke protocols adapted to the needs of each specific region. This may result in an increased number of patients receiving early care, as well as revascularisation treatments. Copyright © 2016 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.
On Flowfield Periodicity in the NASA Transonic Flutter Cascade. Part 2; Numerical Study
NASA Technical Reports Server (NTRS)
Chima, Rodrick V.; McFarland, Eric R.; Wood, Jerry R.; Lepicovsky, Jan
2000-01-01
The transonic flutter cascade facility at NASA Glenn Research Center was redesigned based on a combined program of experimental measurements and numerical analyses. The objectives of the redesign were to improve the periodicity of the cascade in steady operation, and to better quantify the inlet and exit flow conditions needed for CFD predictions. Part I of this paper describes the experimental measurements, which included static pressure measurements on the blade and endwalls made using both static taps and pressure sensitive paints, cobra probe measurements of the endwall boundary layers and blade wakes, and shadowgraphs of the wave structure. Part II of this paper describes three CFD codes used to analyze the facility, including a multibody panel code, a quasi-three-dimensional viscous code, and a fully three-dimensional viscous code. The measurements and analyses both showed that the operation of the cascade was heavily dependent on the configuration of the sidewalls. Four configurations of the sidewalls were studied and the results are described. For the final configuration, the quasi-three-dimensional viscous code was used to predict the location of mid-passage streamlines for a perfectly periodic cascade. By arranging the tunnel sidewalls to approximate these streamlines, sidewall interference was minimized and excellent periodicity was obtained.
Manifestations of poverty and birthrates among young teenagers in California zip code areas.
Kirby, D; Coyle, K; Gould, J B
2001-01-01
Given that many communities are implementing community-wide initiatives to reduce teenage pregnancy or childbearing, it is important to understand the effects of a community's characteristics on adolescent birthrates. Data from the 1990 census and from California birth certificates were obtained for zip codes in California. Regression analyses were conducted on data from zip code areas with at least 200 females aged 15-17 between 1991 and 1996, to predict the effects of race and ethnicity, marital status, education, employment, income and poverty, and housing on birthrates among young teenagers. In bivariate analyses, the proportion of families living below poverty level within a zip code was highly related to the birthrate among young teenagers in that zip code (r=.80, p<.001). In multivariate analyses, which controlled for some of the correlates of family poverty level, the proportion of families living below poverty level remained by far the most important predictor of the birthrate among young teenagers (b=1.54), followed by the proportion of adults aged 25 or older who have a college education (b=-0.80). Race and ethnicity were only weakly related to birthrate. In all three racial and ethnic groups, poverty and education were significantly related to birthrate, but the effect of college education was greater among Hispanics (b=-2.98) than among either non-Hispanic whites (b=-0.53) or blacks (b=-1.12). Male employment and unemployment and female unemployment were highly related to the birthrate among young teenagers in some racial or ethnic groups, but not in others. Multiple manifestations of poverty, including poverty itself, low levels of education and employment, and high levels of unemployment, may have a large impact on birthrates among young teenagers. Addressing some of these issues could substantially reduce childbearing among young adolescents.
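The multivariate step described above amounts to an ordinary least-squares regression of zip-code birthrates on community characteristics. The sketch below, using simulated data and hypothetical variable names, shows the form such an analysis takes; the coefficients are not the study's results.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200  # hypothetical zip code areas
    df = pd.DataFrame({
        "pct_below_poverty": rng.uniform(0, 40, n),
        "pct_college_educated": rng.uniform(5, 60, n),
    })
    # Simulated birthrate per 1,000 females aged 15-17
    df["birthrate"] = (10 + 1.5 * df["pct_below_poverty"]
                       - 0.8 * df["pct_college_educated"]
                       + rng.normal(0, 5, n))

    X = sm.add_constant(df[["pct_below_poverty", "pct_college_educated"]])
    fit = sm.OLS(df["birthrate"], X).fit()
    print(fit.params)  # slopes play the role of the study's b coefficients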
Fast-Running Aeroelastic Code Based on Unsteady Linearized Aerodynamic Solver Developed
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Bakhle, Milind A.; Keith, T., Jr.
2003-01-01
The NASA Glenn Research Center has been developing aeroelastic analyses for turbomachines for use by NASA and industry. An aeroelastic analysis consists of a structural dynamic model, an unsteady aerodynamic model, and a procedure to couple the two models. The structural models are well developed. Hence, most of the development for the aeroelastic analysis of turbomachines has involved adapting and using unsteady aerodynamic models. Two methods are used in developing unsteady aerodynamic analysis procedures for the flutter and forced response of turbomachines: (1) the time domain method and (2) the frequency domain method. Codes based on time domain methods require considerable computational time and, hence, cannot be used during the design process. Frequency domain methods eliminate the time dependence by assuming harmonic motion and, hence, require less computational time. Early frequency domain analyses methods neglected the important physics of steady loading on the analyses for simplicity. A fast-running unsteady aerodynamic code, LINFLUX, which includes steady loading and is based on the frequency domain method, has been modified for flutter and response calculations. LINFLUX, solves unsteady linearized Euler equations for calculating the unsteady aerodynamic forces on the blades, starting from a steady nonlinear aerodynamic solution. First, we obtained a steady aerodynamic solution for a given flow condition using the nonlinear unsteady aerodynamic code TURBO. A blade vibration analysis was done to determine the frequencies and mode shapes of the vibrating blades, and an interface code was used to convert the steady aerodynamic solution to a form required by LINFLUX. A preprocessor was used to interpolate the mode shapes from the structural dynamic mesh onto the computational dynamics mesh. Then, we used LINFLUX to calculate the unsteady aerodynamic forces for a given mode, frequency, and phase angle. A postprocessor read these unsteady pressures and calculated the generalized aerodynamic forces, eigenvalues, and response amplitudes. The eigenvalues determine the flutter frequency and damping. As a test case, the flutter of a helical fan was calculated with LINFLUX and compared with calculations from TURBO-AE, a nonlinear time domain code, and from ASTROP2, a code based on linear unsteady aerodynamics.
Reddy, Sushma; Kimball, Rebecca T; Pandey, Akanksha; Hosner, Peter A; Braun, Michael J; Hackett, Shannon J; Han, Kin-Lan; Harshman, John; Huddleston, Christopher J; Kingston, Sarah; Marks, Ben D; Miglia, Kathleen J; Moore, William S; Sheldon, Frederick H; Witt, Christopher C; Yuri, Tamaki; Braun, Edward L
2017-09-01
Phylogenomics, the use of large-scale data matrices in phylogenetic analyses, has been viewed as the ultimate solution to the problem of resolving difficult nodes in the tree of life. However, it has become clear that analyses of these large genomic data sets can also result in conflicting estimates of phylogeny. Here, we use the early divergences in Neoaves, the largest clade of extant birds, as a "model system" to understand the basis for incongruence among phylogenomic trees. We were motivated by the observation that trees from two recent avian phylogenomic studies exhibit conflicts. Those studies used different strategies: 1) collecting many characters [~42 megabase pairs (Mbp) of sequence data] from 48 birds, sometimes including only one taxon for each major clade; and 2) collecting fewer characters (~0.4 Mbp) from 198 birds, selected to subdivide long branches. However, the studies also used different data types: the taxon-poor data matrix comprised 68% non-coding sequences whereas coding exons dominated the taxon-rich data matrix. This difference raises the question of whether the primary reason for incongruence is the number of sites, the number of taxa, or the data type. To test among these alternative hypotheses we assembled a novel, large-scale data matrix comprising 90% non-coding sequences from 235 bird species. Although increased taxon sampling appeared to have a positive impact on phylogenetic analyses, the most important variable was data type. Indeed, by analyzing different subsets of the taxa in our data matrix we found that increased taxon sampling actually resulted in increased congruence with the tree from the previous taxon-poor study (which had a majority of non-coding data) instead of the taxon-rich study (which largely used coding data). We suggest that the observed differences in the estimates of topology for these studies reflect data-type effects due to violations of the models used in phylogenetic analyses, some of which may be difficult to detect. If incongruence among trees estimated using phylogenomic methods largely reflects problems with model fit, developing more "biologically-realistic" models is likely to be critical for efforts to reconstruct the tree of life. [Birds; coding exons; GTR model; model fit; Neoaves; non-coding DNA; phylogenomics; taxon sampling.] © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Rui; Sumner, Tyler S.
2016-04-17
An advanced system analysis tool, SAM, is being developed for fast-running, improved-fidelity, whole-plant transient analyses at Argonne National Laboratory under DOE-NE's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. As an important part of code development, companion validation activities are being conducted to ensure the performance and validity of the SAM code. This paper presents benchmark simulations of two EBR-II tests, SHRT-45R and BOP-302R, whose data are available through the support of DOE-NE's Advanced Reactor Technology (ART) program. The code predictions of major primary coolant system parameters are compared with the test results. Additionally, the SAS4A/SASSYS-1 code simulation results are also included for a code-to-code comparison.
NASA Astrophysics Data System (ADS)
York, B. J.; Sinha, N.; Dash, S. M.; Hosangadi, A.; Kenzakowski, D. C.; Lee, R. A.
1992-07-01
The analysis of steady and transient aerodynamic/propulsive/plume flowfield interactions utilizing several state-of-the-art computer codes (PARCH, CRAFT, and SCHAFT) is discussed. These codes have been extended to include advanced turbulence models, generalized thermochemistry, and multiphase nonequilibrium capabilities. Several specialized versions of these codes have been developed for specific applications. This paper presents a brief overview of these codes, followed by selected cases demonstrating steady and transient analyses of conventional as well as advanced missile systems. Areas requiring upgrades include turbulence modeling in a highly compressible environment and the treatment of particulates in general. Recent progress in these areas is highlighted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexandrov, Boian S.; Lliev, Filip L.; Stanev, Valentin G.
This code is a toy (short) version of CODE-2016-83. From a general perspective, the code implements an unsupervised adaptive machine learning algorithm that allows efficient, high-performance de-mixing and feature extraction of a multitude of non-negative signals mixed and recorded by a network of uncorrelated sensor arrays. The code identifies the number of mixed original signals and their locations. Further, the code also allows deciphering of signals that have been delayed relative to the mixing process at each sensor. The code is highly customizable and can be used efficiently for fast macro-analyses of data. It is applicable to a plethora of distinct problems: chemical decomposition, pressure transient decomposition, unknown source/signal allocation, and EM signal decomposition. An additional procedure for allocation of the unknown sources is incorporated in the code.
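The de-mixing task described is closely related to non-negative matrix factorization (NMF). The sketch below is not the released code; it merely illustrates the underlying idea with scikit-learn's generic NMF on simulated sensor records.

    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 500)

    # Two hypothetical non-negative source signals...
    sources = np.vstack([np.abs(np.sin(8 * np.pi * t)),
                         np.exp(-3 * t)])
    # ...mixed by an unknown non-negative matrix into 6 sensor records
    mixing = rng.uniform(0.1, 1.0, size=(6, 2))
    observed = mixing @ sources

    model = NMF(n_components=2, init="nndsvda", max_iter=1000,
                random_state=0)
    W = model.fit_transform(observed)  # estimated mixing matrix (6 x 2)
    H = model.components_              # estimated source signals (2 x 500)
    print(W.shape, H.shape)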
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heard, F.J.; Harris, R.A.; Padilla, A.
The SASSYS/SAS4A systems analysis code was used to simulate a series of unprotected loss of flow (ULOF) tests planned at the Fast Flux Test Facility (FFTF). The subject tests were designed to investigate the transient performance of the FFTF during various ULOF scenarios for two different loading patterns designed to produce extremes in the assembly load pad clearance and the direction of the initial assembly bows. The tests are part of an international program designed to extend the existing database on the performance of liquid metal reactors (LMRs). The analyses demonstrate that a wide range of power-to-flow ratios can be reached during the transients and, therefore, will yield valuable data on the dynamic character of the structural feedbacks in LMRs. These analyses will be repeated once the actual FFTF core loadings for the tests are available. These predictions, similar ones obtained by other international participants in the FFTF program, and post-test analyses will be used to upgrade and further verify the computer codes used to predict the behavior of LMRs.
Making Visible the Coding Process: Using Qualitative Data Software in a Post-Structural Study
ERIC Educational Resources Information Center
Ryan, Mary
2009-01-01
Qualitative research methods require transparency to ensure the "trustworthiness" of the data analysis. The intricate processes of organising, coding and analysing the data are often rendered invisible in the presentation of the research findings, which requires a "leap of faith" for the reader. Computer assisted data analysis software can be used…
Coastal Benthic Boundary Layer Special Research Program: A Review of the First Year. Volume 1.
1994-04-06
… also indebted to Dr. LeBlanc for his supervision of the relaxation-time numerical analyses, and Lachlan Munro, a graduate ONR AASERT student, for coding … R. Smith and E. Besancon, Code 7174, Naval Research Laboratory, Stennis Space Center, MS 39529-5004. INTRODUCTION: This brief report outlines the …
ERIC Educational Resources Information Center
Hammer, David; Berland, Leema K.
2014-01-01
We question widely accepted practices of publishing articles that present quantified analyses of qualitative data. First, articles are often published that provide only very brief excerpts of the qualitative data themselves to illustrate the coding scheme, tacitly or explicitly treating the coding results as data. Second, articles are often…
Using DEWIS and R for Multi-Staged Statistics e-Assessments
ERIC Educational Resources Information Center
Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.
2016-01-01
We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.…
ERIC Educational Resources Information Center
Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.
2012-01-01
Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…
MEMOPS: data modelling and automatic code generation.
Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D
2010-03-25
In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.
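To illustrate what model-driven code generation buys, the toy sketch below emits a Python class, with type-validated attributes, from an abstract attribute specification. The model contents and function names are invented for illustration and do not reflect MEMOPS' actual UML model or generated APIs.

    # Hypothetical abstract model: class name -> {attribute: type}
    MODEL = {"Molecule": {"name": "str", "residue_count": "int"}}

    def generate_class(name, attributes):
        # Emit source for a class whose constructor validates attribute
        # types, sparing the project from hand-writing validity checks.
        lines = [f"class {name}:"]
        args = ", ".join(attributes)
        lines.append(f"    def __init__(self, {args}):")
        for attr, typ in attributes.items():
            lines.append(f"        if not isinstance({attr}, {typ}):")
            lines.append(f"            raise TypeError('{attr} must be {typ}')")
            lines.append(f"        self.{attr} = {attr}")
        return "\n".join(lines)

    exec(generate_class("Molecule", MODEL["Molecule"]))  # defines Molecule
    m = Molecule(name="lysozyme", residue_count=129)
    print(m.name, m.residue_count)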
DOT National Transportation Integrated Search
1972-11-01
Analyses are made of waveforms, parameters, codes, error rates, and multi-access noise for proposed communications and surveillance subsystems intended to be useful for air traffic control in the 1990-2000 time period. The systems represented in these analyse...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dokhane, A.; Canepa, S.; Ferroukhi, H.
For stability analyses of the Swiss operating Boiling Water Reactors (BWRs), the methodology employed and validated so far at the Paul Scherrer Institute (PSI) was based on the RAMONA-3 code with a hybrid upstream static lattice/core analysis approach using CASMO-4 and PRESTO-2. More recently, steps were undertaken towards a new methodology based on the SIMULATE-3K (S3K) code for the dynamical analyses, combined with the CMSYS system, which relies on the CASMO/SIMULATE-3 suite of codes and was established at PSI to serve as a framework for the development and validation of reference core models of all the Swiss reactors and operated cycles. This paper presents a first validation of the new methodology on the basis of a benchmark recently organised by a Swiss utility and including the participation of several international organisations with various codes/methods. In parallel, a transition from CASMO-4E (C4E) to CASMO-5M (C5M) as the basis for the CMSYS core models was also recently initiated at PSI. Consequently, it was considered adequate to address the impact of this transition both for the steady-state core analyses and for the stability calculations, thereby achieving an integral approach for the validation of the new S3K methodology. Therefore, a comparative assessment of C4E versus C5M is also presented in this paper, with particular emphasis on the void coefficients and their impact on the downstream stability analysis results. (authors)
Haiman, Christopher A; Garcia, Rachel R; Hsu, Chris; Xia, Lucy; Ha, Helen; Sheng, Xin; Le Marchand, Loic; Kolonel, Laurence N; Henderson, Brian E; Stallcup, Michael R; Greene, Geoffrey L; Press, Michael F
2009-01-30
Only a limited number of studies have performed comprehensive investigations of coding variation in relation to breast cancer risk. Given the established role of estrogens in breast cancer, we hypothesized that coding variation in steroid receptor coactivator and corepressor genes may alter inter-individual response to estrogen and serve as markers of breast cancer risk. We sequenced the coding exons of 17 genes (EP300, CCND1, NME1, NCOA1, NCOA2, NCOA3, SMARCA4, SMARCA2, CARM1, FOXA1, MPG, NCOR1, NCOR2, CALCOCO1, PRMT1, PPARBP and CREBBP) suggested to influence transcriptional activation by steroid hormone receptors in a multiethnic panel of women with advanced breast cancer (n = 95): African Americans, Latinos, Japanese, Native Hawaiians and European Americans. Association testing of validated coding variants was conducted in a breast cancer case-control study (1,612 invasive cases and 1,961 controls) nested in the Multiethnic Cohort. We used logistic regression to estimate odds ratios for allelic effects in ethnic-pooled analyses as well as in subgroups defined by disease stage and steroid hormone receptor status. We also investigated effect modification by established breast cancer risk factors that are associated with steroid hormone exposure. We identified 45 coding variants with frequencies ≥1% in any one ethnic group (43 non-synonymous variants). We observed nominally significant positive associations with two coding variants in ethnic-pooled analyses (NCOR2: His52Arg, OR = 1.79; 95% CI, 1.05-3.05; CALCOCO1: Arg12His, OR = 2.29; 95% CI, 1.00-5.26). A small number of variants were associated with risk in disease subgroup analyses, and we observed no strong evidence of effect modification by breast cancer risk factors. Based on the large number of statistical tests conducted in this study, the nominally significant associations that we observed may be due to chance and will need to be confirmed in other studies. Our findings suggest that common coding variation in these candidate genes does not make a substantial contribution to breast cancer risk in the general population. Cataloging and testing of coding variants in coactivator and corepressor genes should continue and may serve as a valuable resource for investigations of other hormone-related phenotypes, such as inter-individual response to hormonal therapies used for cancer treatment and prevention.
Accuracy of external cause-of-injury coding in VA polytrauma patient discharge records.
Carlson, Kathleen F; Nugent, Sean M; Grill, Joseph; Sayer, Nina A
2010-01-01
Valid and efficient methods of identifying the etiology of treated injuries are critical for characterizing patient populations and developing prevention and rehabilitation strategies. We examined the accuracy of external cause-of-injury codes (E-codes) in Veterans Health Administration (VHA) administrative data for a population of injured patients. Chart notes and E-codes were extracted for 566 patients treated at any one of four VHA Polytrauma Rehabilitation Center sites between 2001 and 2006. Two expert coders, blinded to VHA E-codes, used chart notes to assign "gold standard" E-codes to injured patients. The accuracy of VHA E-coding was examined based on these gold standard E-codes. Only 382 of 517 (74%) injured patients were assigned E-codes in VHA records. Sensitivity of VHA E-codes varied significantly by site (range: 59%-91%, p < 0.001). Sensitivity was highest for combat-related injuries (81%) and lowest for fall-related injuries (60%). Overall specificity of E-codes was high (92%). E-coding accuracy was markedly higher when we restricted analyses to records that had been assigned VHA E-codes. E-codes may not be valid for ascertaining source-of-injury data for all injuries among VHA rehabilitation inpatients at this time. Enhanced training and policies may ensure more widespread, standardized use and accuracy of E-codes for injured veterans treated in the VHA.
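The accuracy measures reported above follow directly from a per-patient cross-tabulation of administrative E-codes against the gold-standard assignments. A minimal sketch, with hypothetical data, is given below.

    def coding_accuracy(records):
        # records: (vha_ecode_assigned, gold_standard_positive) per patient
        tp = sum(1 for coded, gold in records if coded and gold)
        fp = sum(1 for coded, gold in records if coded and not gold)
        fn = sum(1 for coded, gold in records if not coded and gold)
        tn = sum(1 for coded, gold in records if not coded and not gold)
        sensitivity = tp / (tp + fn) if tp + fn else float("nan")
        specificity = tn / (tn + fp) if tn + fp else float("nan")
        ppv = tp / (tp + fp) if tp + fp else float("nan")
        return sensitivity, specificity, ppv

    # Hypothetical records for four patients
    example = [(True, True), (False, True), (True, False), (False, False)]
    print(coding_accuracy(example))  # (0.5, 0.5, 0.5)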
NASA Astrophysics Data System (ADS)
Ditommaso, Rocco; Carlo Ponzo, Felice; Auletta, Gianluca; Iacovino, Chiara; Nigro, Antonella
2015-04-01
The aim of this study is a comparison between the fundamental periods of reinforced concrete buildings evaluated using the simplified approach proposed by the Italian seismic code (NTC 2008), numerical models, and real values retrieved from an experimental campaign performed on several buildings located in the Basilicata region (Italy). With the intention of proposing simplified relationships to evaluate the fundamental period of reinforced concrete buildings, scientists and engineers have performed several numerical and experimental campaigns on different structures all around the world to calibrate different kinds of formulas. Most formulas retrieved from both numerical and experimental analyses provide vibration periods smaller than those suggested by the Italian seismic code. However, it is well known that the fundamental period of a structure plays a key role in the correct evaluation of the spectral acceleration for seismic static analyses. Generally, simplified approaches impose safety factors greater than those related to in-depth nonlinear analyses, with the aim of covering possible unexpected uncertainties. Using the simplified formula proposed by the Italian seismic code, the fundamental period is considerably higher than the fundamental periods experimentally evaluated on real structures, with the consequence that the spectral acceleration adopted in seismic static analysis may differ significantly from the real spectral acceleration. This approach could produce a decrease in the safety factors obtained using linear and nonlinear seismic static analyses. Finally, the authors suggest a possible update of the Italian seismic code formula for the simplified estimation of the fundamental period of vibration of existing RC buildings, taking into account both elastic and inelastic structural behaviour and the interaction between structural and non-structural elements. Acknowledgements: This study was partially funded by the Italian Civil Protection Department within the project DPC-RELUIS 2014 - RS4 "Seismic observatory of structures and health monitoring". References: R. Ditommaso, M. Vona, M. R. Gallipoli and M. Mucciarelli (2013). Evaluation and considerations about fundamental periods of damaged reinforced concrete buildings. Nat. Hazards Earth Syst. Sci., 13, 1903-1912. www.nat-hazards-earth-syst-sci.net/13/1903/2013. doi:10.5194/nhess-13-1903-2013
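For context, the simplified code formula at issue has the one-parameter power-law form T1 = C1·H^(3/4). The sketch below evaluates it for a few storey heights; taking C1 = 0.075 for reinforced concrete frame buildings is stated here as an assumption about the NTC 2008 provision, not a quotation of the code text.

    def fundamental_period_simplified(height_m, c1=0.075):
        # T1 = C1 * H**(3/4); c1 = 0.075 assumed for RC frame buildings
        return c1 * height_m ** 0.75

    for h in (6.0, 12.0, 24.0):  # roughly two-, four-, and eight-storey heights
        print(f"H = {h:5.1f} m -> T1 = {fundamental_period_simplified(h):.3f} s")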
NASA Technical Reports Server (NTRS)
Bade, W. L.; Yos, J. M.
1975-01-01
The present, third volume of the final report is a programmer's manual for the code. It provides a listing of the FORTRAN 4 source program; a complete glossary of FORTRAN symbols; a discussion of the purpose and method of operation of each subroutine (including mathematical analyses of special algorithms); and a discussion of the operation of the code on IBM/360 and UNIVAC 1108 systems, including required control cards and the overlay structure used to accommodate the code to the limited core size of the 1108. In addition, similar information is provided to document the programming of the NOZFIT code, which is employed to set up nozzle profile curvefits for use in NATA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adrian Miron; Joshua Valentine; John Christenson
2009-10-01
The current state of the art in nuclear fuel cycle (NFC) modeling is an eclectic mixture of codes with various levels of applicability, flexibility, and availability. In support of advanced fuel cycle systems analyses, especially those of the Advanced Fuel Cycle Initiative (AFCI), the University of Cincinnati, in collaboration with Idaho State University, carried out a detailed review of the existing codes describing various aspects of the nuclear fuel cycle and identified the research and development needs required for a comprehensive model of the global nuclear energy infrastructure and the associated nuclear fuel cycles. Relevant information obtained on the NFC codes was compiled into a relational database that allows easy access to the various codes' properties. Additionally, the research analyzed the gaps in the NFC computer codes with respect to their potential integration into programs that perform comprehensive NFC analysis.
Toward a CFD nose-to-tail capability - Hypersonic unsteady Navier-Stokes code validation
NASA Technical Reports Server (NTRS)
Edwards, Thomas A.; Flores, Jolen
1989-01-01
Computational fluid dynamics (CFD) research for hypersonic flows presents new problems in code validation because of the added complexity of the physical models. This paper surveys code validation procedures applicable to hypersonic flow models that include real gas effects. The current status of hypersonic CFD flow analysis is assessed with the Compressible Navier-Stokes (CNS) code as a case study. The methods of code validation discussed go beyond comparison with experimental data to include comparisons with other codes and formulations, component analyses, and estimation of numerical errors. Current results indicate that predicting hypersonic flows of perfect gases and equilibrium air is well in hand. Pressure, shock location, and integrated quantities are relatively easy to predict accurately, while surface quantities such as heat transfer are more sensitive to the solution procedure. Modeling transition to turbulence needs refinement, though preliminary results are promising.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uchibori, Akihiro; Kurihara, Akikazu; Ohshima, Hiroyuki
A multiphysics analysis system for sodium-water reaction phenomena in a steam generator of sodium-cooled fast reactors was newly developed. The analysis system consists of the mechanistic numerical analysis codes SERAPHIM, TACT, and RELAP5. The SERAPHIM code calculates the multicomponent multiphase flow and sodium-water chemical reaction caused by discharging of pressurized water vapor. Applicability of the SERAPHIM code was confirmed through analyses of the experiment on water vapor discharging in liquid sodium. The TACT code was developed to calculate heat transfer from the reacting jet to the adjacent tube and to predict the tube failure occurrence. The numerical models integrated into the TACT code were verified through some related experiments. The RELAP5 code evaluates the thermal-hydraulic behavior of water inside the tube. The original heat transfer correlations were corrected for the tube rapidly heated by the reacting jet. The developed system enables evaluation of the wastage environment and the possibility of failure propagation.
H.264 Layered Coded Video over Wireless Networks: Channel Coding and Modulation Constraints
NASA Astrophysics Data System (ADS)
Ghandi, M. M.; Barmada, B.; Jones, E. V.; Ghanbari, M.
2006-12-01
This paper considers the prioritised transmission of H.264 layered coded video over wireless channels. For appropriate protection of video data, methods such as prioritised forward error correction coding (FEC) or hierarchical quadrature amplitude modulation (HQAM) can be employed, but each imposes system constraints. FEC provides good protection but at the price of a high overhead and complexity. HQAM is less complex and does not introduce any overhead, but permits only fixed data ratios between the priority layers. Such constraints are analysed and practical solutions are proposed for layered transmission of data-partitioned and SNR-scalable coded video where combinations of HQAM and FEC are used to exploit the advantages of both coding methods. Simulation results show that the flexibility of SNR scalability and absence of picture drift imply that SNR scalability as modelled is superior to data partitioning in such applications.
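The HQAM idea can be made concrete with a small mapping sketch: high-priority bits choose the quadrant of a 16-QAM constellation, low-priority bits choose the point within it, and a spacing parameter trades LP robustness for HP protection. The parameterisation below is a generic illustration, not the paper's exact scheme.

    def hqam16_map(hp_bits, lp_bits, alpha=2.0):
        # Gray-mapped quadrant (HP) plus fine offset (LP); alpha > 1 widens
        # the gap between quadrants, protecting HP bits at LP's expense.
        gray = {(0, 0): 1 + 1j, (0, 1): -1 + 1j,
                (1, 1): -1 - 1j, (1, 0): 1 - 1j}
        return alpha * gray[tuple(hp_bits)] + gray[tuple(lp_bits)]

    symbol = hqam16_map((0, 1), (1, 0), alpha=2.5)
    print(symbol)  # the HP bit pair fixes the symbol's quadrant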
Analysis of JSI TRIGA MARK II reactor physical parameters calculated with TRIPOLI and MCNP.
Henry, R; Tiselj, I; Snoj, L
2015-03-01
A new computational model of the JSI TRIGA Mark II research reactor was built for the TRIPOLI computer code and compared with the existing MCNP code model. The same modelling assumptions were used in order to check the differences between the mathematical models of the two Monte Carlo codes. Differences between the TRIPOLI and MCNP predictions of keff were up to 100 pcm. Further validation was performed with analyses of the normalized reaction rates and computations of kinetic parameters for various core configurations. Copyright © 2014 Elsevier Ltd. All rights reserved.
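A pcm-scale comparison of two multiplication factors is a one-line computation; the sketch below shows the convention assumed here (1 pcm = 1e-5 in k).

    def diff_pcm(k_a, k_b):
        # Difference of multiplication factors in pcm (1 pcm = 1e-5);
        # a strict reactivity difference would compare 1 - 1/k terms instead.
        return (k_a - k_b) * 1e5

    print(f"{diff_pcm(1.00250, 1.00150):.1f} pcm")  # 100.0 pcm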
Signal Processing Expert Code (SPEC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ames, H.S.
1985-12-01
The purpose of this paper is to describe a prototype expert system called SPEC, which was developed to demonstrate the utility of providing an intelligent interface for users of SIG, a general-purpose signal processing code. The expert system is written in NIL, runs on a VAX 11/750, and consists of a backward-chaining inference engine and an English-like parser. The inference engine uses knowledge encoded as rules about the formats of SIG commands and about how to perform frequency analyses using SIG. The system demonstrated that expert systems can be used to control existing codes.
ESTEST: A Framework for the Verification and Validation of Electronic Structure Codes
NASA Astrophysics Data System (ADS)
Yuan, Gary; Gygi, Francois
2011-03-01
ESTEST is a verification and validation (V&V) framework for electronic structure codes that supports Qbox, Quantum Espresso, ABINIT, and the Exciting code, with support planned for many more. We discuss various approaches to the electronic structure V&V problem implemented in ESTEST, relating to parsing, formats, data management, search, comparison, and analyses. Additionally, an early experiment in the distribution of V&V ESTEST servers among the electronic structure community will be presented. Supported by NSF-OCI 0749217 and DOE FC02-06ER25777.
NASA Technical Reports Server (NTRS)
Hague, D. S.; Rozendaal, H. L.
1977-01-01
A rapid mission analysis code based on the use of approximate flight path equations of motion is presented. Equation form varies with the segment type, for example: accelerations, climbs, cruises, descents, and decelerations. Realistic and detailed characteristics were specified in tabular form. The code also contains extensive flight envelope performance mapping capabilities. Approximate takeoff and landing analyses were performed. At high speeds, centrifugal lift effects were accounted for. Extensive turbojet and ramjet engine scaling procedures were incorporated in the code.
Integration of a supersonic unsteady aerodynamic code into the NASA FASTEX system
NASA Technical Reports Server (NTRS)
Appa, Kari; Smith, Michael J. C.
1987-01-01
A supersonic unsteady aerodynamic loads prediction method based on the constant pressure method was integrated into the NASA FASTEX system. The updated FASTEX code can be employed for aeroelastic analyses in subsonic and supersonic flow regimes. A brief description of the supersonic constant pressure panel method, as applied to lifting surfaces and body configurations, is followed by a documentation of updates required to incorporate this method in the FASTEX code. Test cases showing correlations of predicted pressure distributions, flutter solutions, and stability derivatives with available data are reported.
Woodman, Jenny; Allister, Janice; Rafi, Imran; de Lusignan, Simon; Belsey, Jonathan; Petersen, Irene; Gilbert, Ruth
2012-01-01
Background Information is lacking on how concerns about child maltreatment are recorded in primary care records. Aim To determine how the recording of child maltreatment concerns can be improved. Design and setting Development of a quality improvement intervention involving: clinical audit, a descriptive survey, telephone interviews, a workshop, database analyses, and consensus development in UK general practice. Method Descriptive analyses and incidence estimates were carried out based on 11 study practices and 442 practices in The Health Improvement Network (THIN). Telephone interviews, a workshop, and a consensus development meeting were conducted with lead GPs from 11 study practices. Results The rate of children with at least one maltreatment-related code was 8.4/1000 child years (11 study practices, 2009–2010), and 8.0/1000 child years (THIN, 2009–2010). Of 25 patients with known maltreatment, six had no maltreatment-related codes recorded, but all had relevant free text, scanned documents, or codes. When stating their reasons for undercoding maltreatment concerns, GPs cited damage to the patient relationship, uncertainty about which codes to use, and having concerns about recording information on other family members in the child’s records. Consensus recommendations are to record the code ‘child is cause for concern’ as a red flag whenever maltreatment is considered, and to use a list of codes arranged around four clinical concepts, with an option for a templated short data entry form. Conclusion GPs under-record maltreatment-related concerns in children’s electronic medical records. As failure to use codes makes it impossible to search or audit these cases, an approach designed to be simple and feasible to implement in UK general practice was recommended. PMID:22781996
Advances in Evaluation of Chronic Diarrhea in Infants.
Thiagarajah, Jay R; Kamin, Daniel S; Acra, Sari; Goldsmith, Jeffrey D; Roland, Joseph T; Lencer, Wayne I; Muise, Aleixo M; Goldenring, James R; Avitzur, Yaron; Martín, Martín G
2018-06-01
Diarrhea is common in infants (children less than 2 years of age), usually acute, and, if chronic, commonly caused by allergies and occasionally by infectious agents. Congenital diarrheas and enteropathies (CODEs) are rare causes of devastating chronic diarrhea in infants. Evaluation of CODEs is a lengthy process and infrequently leads to a clear diagnosis. However, genomic analyses and the development of model systems have increased our understanding of CODE pathogenesis. With these advances, a new diagnostic approach is needed. We propose a revised approach to determine causes of diarrhea in infants, including CODEs, based on stool analysis, histologic features, responses to dietary modifications, and genetic tests. After exclusion of common causes of diarrhea in infants, the evaluation proceeds through analyses of stool characteristics (watery, fatty, or bloody) and histologic features, such as the villus to crypt ratio in intestinal biopsies. Infants with CODEs resulting from defects in digestion, absorption, transport of nutrients and electrolytes, or enteroendocrine cell development or function have normal villus to crypt ratios; defects in enterocyte structure or immune-mediated conditions result in abnormal villus to crypt ratios and morphology. Whole-exome and genome sequencing in the early stages of evaluation can reduce the time required for a definitive diagnosis of CODEs, or lead to identification of new variants associated with these enteropathies. The functional effects of gene mutations can be analyzed in model systems such as enteroids or induced pluripotent stem cells and are facilitated by recent advances in gene editing procedures. Characterization and investigation of new CODE disorders will improve management of patients and advance our understanding of epithelial cells and other cells in the intestinal mucosa. Copyright © 2018 AGA Institute. Published by Elsevier Inc. All rights reserved.
Conducting Retrospective Ontological Clinical Trials in ICD-9-CM in the Age of ICD-10-CM.
Venepalli, Neeta K; Shergill, Ardaman; Dorestani, Parvaneh; Boyd, Andrew D
2014-01-01
To quantify the impact of the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) transition in cancer clinical trials by comparing coding accuracy and data discontinuity in backward ICD-10-CM to ICD-9-CM mapping via two tools, and to develop a standard ICD-9-CM and ICD-10-CM bridging methodology for retrospective analyses. While the transition to ICD-10-CM has been delayed until October 2015, its impact on cancer-related studies utilizing ICD-9-CM diagnoses has been inadequately explored. Three high-impact journals with broad national and international readerships were reviewed for cancer-related studies utilizing ICD-9-CM diagnosis codes in study design, methods, or results. Forward ICD-9-CM to ICD-10-CM mapping was performed using a translational methodology with the Motif web portal ICD-9-CM conversion tool. Backward mapping from ICD-10-CM to ICD-9-CM was performed using both Centers for Medicare and Medicaid Services (CMS) general equivalence mappings (GEMs) files and the Motif web portal tool. Generated ICD-9-CM codes were compared with the original ICD-9-CM codes to assess data accuracy and discontinuity. While both methods yielded additional ICD-9-CM codes, the CMS GEMs method provided incomplete coverage, with 16 of the original ICD-9-CM codes missing, whereas the Motif web portal method provided complete coverage. Of these 16 codes, 12 ICD-9-CM codes were present in 2010 Illinois Medicaid data, and accounted for 0.52% of patient encounters and 0.35% of total Medicaid reimbursements. Extraneous ICD-9-CM codes from both methods (CMS GEMs, n = 161; Motif web portal, n = 246) in excess of the original ICD-9-CM codes accounted for 2.1% and 2.3% of total patient encounters and 3.4% and 4.1% of total Medicaid reimbursements in the 2010 Illinois Medicaid database. Longitudinal data analyses post-ICD-10-CM transition will require backward ICD-10-CM to ICD-9-CM coding and data comparison for accuracy. Researchers must be aware that not all methods for backward coding are comparable in yielding the original ICD-9-CM codes. The mandated delay is an opportunity for organizations to better understand areas of financial risk with regard to data management via backward coding. Our methodology is relevant for all healthcare-related coding data, and can be replicated by organizations as a strategy to mitigate financial risk.
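The round-trip logic the authors describe (forward-map the original ICD-9 codes to ICD-10, then backward-map and compare) can be sketched with toy dictionaries; the mappings below are a hypothetical two-code extract, not the CMS GEMs files:

forward = {"153.3": ["C18.7"], "250.00": ["E11.9"]}             # ICD-9 -> ICD-10
backward = {"C18.7": ["153.3"], "E11.9": ["250.00", "250.02"]}  # ICD-10 -> ICD-9

original = {"153.3", "250.00"}
icd10 = {c10 for c9 in original for c10 in forward[c9]}
recovered = {c9 for c10 in icd10 for c9 in backward.get(c10, [])}

print("missing (data discontinuity):", original - recovered)
print("extraneous codes introduced:", recovered - original)     # {'250.02'}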
USDA-ARS?s Scientific Manuscript database
Coding/functional SNPs change the biological function of a gene and, therefore, could serve as “large-effect” genetic markers. In this study, we used two bioinformatics pipelines, GATK and SAMtools, for discovering coding/functional SNPs with allelic-imbalances associated with total body weight, mus...
Input and Output in Code Switching: A Case Study of a Japanese-Chinese Bilingual Infant
ERIC Educational Resources Information Center
Meng, Hairong; Miyamoto, Tadao
2012-01-01
Code switching (CS) (or language mixing) generally takes place in bilingual children's utterances, even if their parents adhere to the "one parent-one language" principle. The present case study of a Japanese-Chinese bilingual infant provides both quantitative and qualitative analyses on the impact of input on output, as manifested in CS. The…
PCEMCAN - Probabilistic Ceramic Matrix Composites Analyzer: User's Guide, Version 1.0
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Mital, Subodh K.; Murthy, Pappu L. N.
1998-01-01
PCEMCAN (Probabilistic CEramic Matrix Composites ANalyzer) is an integrated computer code developed at NASA Lewis Research Center that simulates uncertainties associated with the constituent properties, manufacturing process, and geometric parameters of fiber-reinforced ceramic matrix composites and quantifies their random thermomechanical behavior. The PCEMCAN code can perform deterministic as well as probabilistic analyses to predict thermomechanical properties. This user's guide details the step-by-step procedure to create an input file and update/modify the material properties database required to run the PCEMCAN computer code. An overview of the geometric conventions, micromechanical unit cell, nonlinear constitutive relationship and probabilistic simulation methodology is also provided in the manual. Fast probability integration as well as Monte Carlo simulation methods are available for the uncertainty simulation. Various options available in the code to simulate probabilistic material properties and quantify sensitivity of the primitive random variables are described, and deterministic as well as probabilistic results are illustrated using demonstration problems. For a detailed theoretical description of the deterministic and probabilistic analyses, the user is referred to the companion documents "Computational Simulation of Continuous Fiber-Reinforced Ceramic Matrix Composite Behavior," NASA TP-3602, 1996, and "Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites," NASA TM 4766, June 1997.
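As a flavor of what such a probabilistic analysis does, the sketch below propagates invented constituent scatter through a rule-of-mixtures longitudinal modulus by Monte Carlo; PCEMCAN's actual micromechanics and distributions are more elaborate:

import numpy as np

rng = np.random.default_rng(0)
N = 100_000
E_f = rng.normal(380e9, 0.05 * 380e9, N)   # fiber modulus, Pa (illustrative)
E_m = rng.normal(110e9, 0.08 * 110e9, N)   # matrix modulus, Pa (illustrative)
V_f = rng.normal(0.40, 0.02, N)            # fiber volume fraction

E_L = V_f * E_f + (1.0 - V_f) * E_m        # rule-of-mixtures estimate per sample
print(f"mean = {E_L.mean() / 1e9:.1f} GPa, CoV = {E_L.std() / E_L.mean():.3f}")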
Development of Reduced-Order Models for Aeroelastic and Flutter Prediction Using the CFL3Dv6.0 Code
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Bartels, Robert E.
2002-01-01
A reduced-order model (ROM) is developed for aeroelastic analysis using the CFL3D version 6.0 computational fluid dynamics (CFD) code, recently developed at the NASA Langley Research Center. This latest version of the flow solver includes a deforming mesh capability, a modal structural definition for nonlinear aeroelastic analyses, and a parallelization capability that provides a significant increase in computational efficiency. Flutter results for the AGARD 445.6 Wing computed using CFL3D v6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are then computed using the CFL3Dv6 code and transformed into state-space form. Important numerical issues associated with the computation of the impulse responses are presented. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is used to rapidly compute aeroelastic transients including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly.
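The abstract does not spell out the transformation from impulse responses to state-space form; a standard route in this literature is the Eigensystem Realization Algorithm (ERA). A Python sketch (the paper's environment was MATLAB/SIMULINK; the two-state test system below is invented):

import numpy as np

def era(markov, order, m=20):
    # Eigensystem Realization Algorithm (SISO): realize (A, B, C) such that
    # C @ A**k @ B reproduces the Markov parameters markov[k].
    H0 = np.array([[markov[i + j] for j in range(m)] for i in range(m)])
    H1 = np.array([[markov[i + j + 1] for j in range(m)] for i in range(m)])
    U, s, Vt = np.linalg.svd(H0)
    Ur, Sr, Vr = U[:, :order], np.diag(np.sqrt(s[:order])), Vt[:order].T
    Si = np.linalg.inv(Sr)
    A = Si @ Ur.T @ H1 @ Vr @ Si
    return A, (Sr @ Vr.T)[:, :1], (Ur @ Sr)[:1, :]

A_t = np.array([[0.9, 0.2], [-0.2, 0.9]])          # hypothetical damped "mode"
B_t, C_t = np.array([[1.0], [0.0]]), np.array([[1.0, 0.0]])
h = [(C_t @ np.linalg.matrix_power(A_t, k) @ B_t).item() for k in range(41)]
A, B, C = era(h, order=2)
print(np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                  np.sort_complex(np.linalg.eigvals(A_t))))   # True: poles recovered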
Holsclaw, Tracy; Hallgren, Kevin A; Steyvers, Mark; Smyth, Padhraic; Atkins, David C
2015-12-01
Behavioral coding is increasingly used for studying mechanisms of change in psychosocial treatments for substance use disorders (SUDs). However, behavioral coding data typically include features that can be problematic in regression analyses, including measurement error in independent variables, non-normal distributions of count outcome variables, and conflation of predictor and outcome variables with third variables, such as session length. Methodological research in econometrics has shown that these issues can lead to biased parameter estimates, inaccurate standard errors, and increased Type I and Type II error rates, yet these statistical issues are not widely known within SUD treatment research or, more generally, within psychotherapy coding research. Using minimally technical language intended for a broad audience of SUD treatment researchers, the present paper illustrates the nature in which these data issues are problematic. We draw on real-world data and simulation-based examples to illustrate how these data features can bias estimation of parameters and interpretation of models. A weighted negative binomial regression is introduced as an alternative to ordinary linear regression that appropriately addresses the data characteristics common to SUD treatment behavioral coding data. We conclude by demonstrating how to use and interpret these models with data from a study of motivational interviewing. SPSS and R syntax for weighted negative binomial regression models is included in online supplemental materials. (c) 2016 APA, all rights reserved.
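The paper's supplemental materials provide SPSS and R syntax; as a hedged Python analogue (not the authors' code), statsmodels can fit a weighted negative binomial GLM on simulated overdispersed counts:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)                            # session-level predictor
mu = np.exp(0.3 + 0.5 * x)                        # log-link mean
y = rng.negative_binomial(2, 2.0 / (2.0 + mu))    # overdispersed counts, mean mu
w = rng.uniform(0.5, 1.5, size=n)                 # hypothetical observation weights

fit = sm.GLM(y, sm.add_constant(x),
             family=sm.families.NegativeBinomial(alpha=0.5),
             var_weights=w).fit()
print(fit.params)                                 # close to [0.3, 0.5]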
Recent improvements of reactor physics codes in MHI
NASA Astrophysics Data System (ADS)
Kosaka, Shinya; Yamaji, Kazuya; Kirimura, Kazuki; Kamiyama, Yohei; Matsumoto, Hideki
2015-12-01
This paper introduces recent improvements to reactor physics codes at Mitsubishi Heavy Industries, Ltd. (MHI). MHI has developed a new neutronics design code system, Galaxy/Cosmo-S (GCS), for PWR core analysis. After TEPCO's Fukushima Daiichi accident, it became necessary to consider design extension conditions that had not been covered explicitly by former safety licensing analyses. Under these circumstances, MHI made several improvements to the GCS code system. A new resonance calculation model for the lattice physics code and a homogeneous cross-section representation model for the core simulator have been developed to cover a wider range of core conditions corresponding to severe accident states, such as anticipated transient without scram (ATWS) analysis and criticality evaluation of a dried-up spent fuel pit. As a result of these improvements, the GCS code system has very wide applicability with good accuracy for any core condition as long as the fuel is not damaged. In this paper, the outline of the GCS code system is described briefly and recent relevant development activities are presented.
Zhu, Shiyou; Li, Wei; Liu, Jingze; Chen, Chen-Hao; Liao, Qi; Xu, Ping; Xu, Han; Xiao, Tengfei; Cao, Zhongzheng; Peng, Jingyu; Yuan, Pengfei; Brown, Myles; Liu, Xiaole Shirley; Wei, Wensheng
2017-01-01
CRISPR/Cas9 screens have been widely adopted to analyse coding gene functions, but high throughput screening of non-coding elements using this method is more challenging, because indels caused by a single cut in non-coding regions are unlikely to produce a functional knockout. A high-throughput method to produce deletions of non-coding DNA is needed. Herein, we report a high throughput genomic deletion strategy to screen for functional long non-coding RNAs (lncRNAs) that is based on a lentiviral paired-guide RNA (pgRNA) library. Applying our screening method, we identified 51 lncRNAs that can positively or negatively regulate human cancer cell growth. We individually validated 9 lncRNAs using CRISPR/Cas9-mediated genomic deletion and functional rescue, CRISPR activation or inhibition, and gene expression profiling. Our high-throughput pgRNA genome deletion method should enable rapid identification of functional mammalian non-coding elements. PMID:27798563
Early Warning Signs of Suicide in Service Members Who Engage in Unauthorized Acts of Violence
2016-06-01
observable to military law enforcement personnel. Statistical analyses tested for differences in warning signs between cases of suicide, violence, or... indicators, (2) Behavioral Change indicators, (3) Social indicators, and (4) Occupational indicators. Statistical analyses were conducted to test for...
CFD Predictions for Transonic Performance of the ERA Hybrid Wing-Body Configuration
NASA Technical Reports Server (NTRS)
Deere, Karen A.; Luckring, James M.; McMillin, S. Naomi; Flamm, Jeffrey D.; Roman, Dino
2016-01-01
A computational study was performed for a Hybrid Wing Body configuration, focused on transonic cruise performance conditions. In the absence of experimental data, two fully independent computational fluid dynamics analyses were conducted to add confidence to the estimated transonic performance predictions. The primary analysis was performed by Boeing with the structured overset-mesh code OVERFLOW. The secondary analysis was performed by NASA Langley Research Center with the unstructured-mesh code USM3D. Both analyses were performed at full-scale flight conditions and included three configurations customary to drag buildup and interference analysis: a powered complete configuration, the configuration with the nacelle/pylon removed, and the powered nacelle in isolation. The results in this paper focus primarily on transonic performance up to cruise and through drag rise. Comparisons between the CFD results showed very good agreement despite some minor geometric differences in the two analyses.
Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1994-01-01
The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
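The contrast the authors draw can be miniaturized: evaluate one fault tree with point probabilities (the probabilistic view) and with interval bounds at a single alpha-cut (a crude stand-in for the fuzzy treatment). The tree and numbers below are invented:

def or_gate(p, q):            # independent basic events
    return p + q - p * q

def and_gate(p, q):
    return p * q

# TOP = OR(pump fails, AND(valve A fails, valve B fails))
print("point estimate:", or_gate(1e-3, and_gate(5e-3, 5e-3)))

# Fuzzy-style alpha-cut: endpoint evaluation gives exact bounds here because
# the structure function is monotone in each basic-event probability.
lo = or_gate(0.5e-3, and_gate(2e-3, 2e-3))
hi = or_gate(2.0e-3, and_gate(8e-3, 8e-3))
print(f"alpha-cut bounds: [{lo:.2e}, {hi:.2e}]")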
Star 48 solid rocket motor nozzle analyses and instrumented firings
NASA Technical Reports Server (NTRS)
Porter, R. L.
1986-01-01
The analyses and testing performed by NASA in support of an expanded and improved nozzle design database for use by the U.S. solid rocket motor industry are presented. A production nozzle with a history of one ground failure and two flight failures was selected for analysis and testing. The stress analysis was performed with the Champion computer code developed by the U.S. Navy. Several improvements were made to the code. Strain predictions were made and compared to test data. Two short-duration motor firings were conducted with highly instrumented nozzles. The first nozzle had 58 thermocouples, 66 strain gages, and 8 bondline pressure measurements. The second nozzle had 59 thermocouples, 68 strain measurements, and 8 bondline pressure measurements. Most of this instrumentation was on the nonmetallic parts, and provided significantly more thermal and strain data on the nonmetallic components of a nozzle than had been accumulated in any solid rocket motor test to date.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milani, Gabriele, E-mail: milani@stru.polimi.it; Valente, Marco, E-mail: milani@stru.polimi.it
2014-10-06
This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the Emilia-Romagna (Italy) seismic events of May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with collapse is considerably lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial in reducing the seismic vulnerability of this kind of historical structure.
Supercomputer description of human lung morphology for imaging analysis.
Martonen, T B; Hwang, D; Guan, X; Fleming, J S
1998-04-01
A supercomputer code that describes the three-dimensional branching structure of the human lung has been developed. The algorithm was written for the Cray C94. In our simulations, the human lung was divided into a matrix containing discrete volumes (voxels) so as to be compatible with analyses of SPECT images. The matrix has 3840 voxels. The matrix can be segmented into transverse, sagittal and coronal layers analogous to human subject examinations. The compositions of individual voxels were identified by the type and respective number of airways present. The code provides a mapping of the spatial positions of the almost 17 million airways in human lungs and unambiguously assigns each airway to a voxel. Thus, the clinician and research scientist in the medical arena have a powerful new tool to be used in imaging analyses. The code was designed to be integrated into diverse applications, including the interpretation of SPECT images, the design of inhalation exposure experiments and the targeted delivery of inhaled pharmacologic drugs.
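The voxel bookkeeping the abstract describes amounts to binning airway coordinates into a regular grid; 3840 voxels could correspond, say, to a 16 x 16 x 15 matrix. A sketch with invented geometry (the paper's actual grid dimensions and airway map are not given here):

import numpy as np

shape = (16, 16, 15)                        # 3840 voxels
voxel_size = np.array([2.0, 1.5, 2.0])      # cm per voxel along x, y, z (assumed)
counts = np.zeros(shape, dtype=int)

airways = np.array([[3.1, 4.2, 10.0],       # hypothetical airway midpoints, cm
                    [3.3, 4.0, 10.2],
                    [20.0, 9.9, 5.5]])
for i, j, k in np.floor(airways / voxel_size).astype(int):
    counts[i, j, k] += 1                    # airways assigned per voxel

print(counts.sum(), counts[1, 2, 5])        # 3 airways total, 2 share one voxel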
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Perry, Boyd III; Chwalowski, Pawel
2014-01-01
Reduced-order modeling (ROM) methods are applied to the CFD-based aeroelastic analysis of the AGARD 445.6 wing in order to gain insight regarding well-known discrepancies between the aeroelastic analyses and the experimental results. The results presented include aeroelastic solutions using the inviscid CAP-TSD code and the FUN3D code (Euler and Navier-Stokes). Full CFD aeroelastic solutions and ROM aeroelastic solutions, computed at several Mach numbers, are presented in the form of root locus plots in order to better reveal the aeroelastic root migrations with increasing dynamic pressure. Important conclusions are drawn from these results including the ability of the linear CAP-TSD code to accurately predict the entire experimental flutter boundary (repeat of analyses performed in the 1980's), that the Euler solutions at supersonic conditions indicate that the third mode is always unstable, and that the FUN3D Navier-Stokes solutions stabilize the unstable third mode seen in the Euler solutions.
Seligmann, Hervé
2013-05-07
GenBank's EST database includes RNAs matching exactly human mitochondrial sequences assuming systematic asymmetric nucleotide exchange-transcription along exchange rules: A→G→C→U/T→A (12 ESTs), A→U/T→C→G→A (4 ESTs), C→G→U/T→C (3 ESTs), and A→C→G→U/T→A (1 EST); no RNAs correspond to other potential asymmetric exchange rules. Hypothetical polypeptides translated from nucleotide-exchanged human mitochondrial protein coding genes align with numerous GenBank proteins, and predicted secondary structures resemble those of their putative GenBank homologues. Two independent methods designed to detect overlapping genes (one based on nucleotide content analyses in relation to replicative deamination gradients at third codon positions, the other on circular code analyses of codon contents based on frame redundancy) confirm nucleotide-exchange-encrypted overlapping genes. The methods converge on which genes are most probably active, and which are not, for the various exchange rules. Mean EST lengths produced by different nucleotide exchanges are proportional to (a) the extent to which various bioinformatics analyses confirm the protein coding status of putative overlapping genes; (b) known kinetic chemistry parameters of the corresponding nucleotide substitutions by the human mitochondrial DNA polymerase gamma (nucleotide DNA misinsertion rates); (c) stop codon densities in predicted overlapping genes (stop codon readthrough and exchanging polymerization regulate gene expression by counterbalancing each other). Numerous rarely expressed proteins seem encoded within regular mitochondrial genes through asymmetric nucleotide exchange, avoiding lengthening genomes. Intersecting evidence from several independent approaches confirms the working-hypothesis status of gene encryption by systematic nucleotide exchanges. Copyright © 2013 Elsevier Ltd. All rights reserved.
Governing sexual behaviour through humanitarian codes of conduct.
Matti, Stephanie
2015-10-01
Since 2001, there has been a growing consensus that sexual exploitation and abuse of intended beneficiaries by humanitarian workers is a real and widespread problem that requires governance. Codes of conduct have been promoted as a key mechanism for governing the sexual behaviour of humanitarian workers and, ultimately, preventing sexual exploitation and abuse (PSEA). This article presents a systematic study of PSEA codes of conduct adopted by humanitarian non-governmental organisations (NGOs) and how they govern the sexual behaviour of humanitarian workers. It draws on Foucault's analytics of governance and speech act theory to examine the findings of a survey of references to codes of conduct made on the websites of 100 humanitarian NGOs, and to analyse some features of the organisation-specific PSEA codes identified. © 2015 The Author(s). Disasters © Overseas Development Institute, 2015.
Severgnini, Marco; Bicciato, Silvio; Mangano, Eleonora; Scarlatti, Francesca; Mezzelani, Alessandra; Mattioli, Michela; Ghidoni, Riccardo; Peano, Clelia; Bonnal, Raoul; Viti, Federica; Milanesi, Luciano; De Bellis, Gianluca; Battaglia, Cristina
2006-06-01
Meta-analysis of microarray data is increasingly important, considering both the availability of multiple platforms using disparate technologies and the accumulation in public repositories of data sets from different laboratories. We addressed the issue of comparing gene expression profiles from two microarray platforms by devising a standardized investigative strategy. We tested this procedure by studying MDA-MB-231 cells, which undergo apoptosis on treatment with resveratrol. Gene expression profiles were obtained using high-density, short-oligonucleotide, single-color microarray platforms: GeneChip (Affymetrix) and CodeLink (Amersham). Interplatform analyses were carried out on 8414 common transcripts represented on both platforms, as identified by LocusLink ID, representing 70.8% and 88.6% of annotated GeneChip and CodeLink features, respectively. We identified 105 differentially expressed genes (DEGs) on CodeLink and 42 DEGs on GeneChip. Among them, only 9 DEGs were commonly identified by both platforms. Multiple analyses (BLAST alignment of probes with target sequences, gene ontology, literature mining, and quantitative real-time PCR) permitted us to investigate the factors contributing to the generation of platform-dependent results in single-color microarray experiments. An effective approach to cross-platform comparison involves microarrays of similar technologies, samples prepared by identical methods, and a standardized battery of bioinformatic and statistical analyses.
EXODUS II: A finite element data model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schoof, L.A.; Yarberry, V.R.
1994-09-01
EXODUS II is a model developed to store and retrieve data for finite element analyses. It is used for preprocessing (problem definition), postprocessing (results visualization), as well as code to code data transfer. An EXODUS II data file is a random access, machine independent, binary file that is written and read via C, C++, or Fortran library routines which comprise the Application Programming Interface (API).
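Exodus II files are built on netCDF, so their contents can be inspected with generic netCDF tooling even without the EXODUS API itself. A minimal Python sketch (the file name is hypothetical, and dimension/variable names vary by file):

from netCDF4 import Dataset

with Dataset("mesh.exo") as ds:             # any Exodus II results file
    print(list(ds.dimensions))              # e.g. num_nodes, num_elem, ...
    print(list(ds.variables))               # coordinates, connectivity, results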
A 3D-CFD code for accurate prediction of fluid flows and fluid forces in seals
NASA Technical Reports Server (NTRS)
Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.
1994-01-01
Current and future turbomachinery requires advanced seal configurations to control leakage, inhibit mixing of incompatible fluids and to control the rotodynamic response. In recognition of a deficiency in the existing predictive methodology for seals, a seven year effort was established in 1990 by NASA's Office of Aeronautics Exploration and Technology, under the Earth-to-Orbit Propulsion program, to develop validated Computational Fluid Dynamics (CFD) concepts, codes and analyses for seals. The effort will provide NASA and the U.S. Aerospace Industry with advanced CFD scientific codes and industrial codes for analyzing and designing turbomachinery seals. An advanced 3D CFD cylindrical seal code has been developed, incorporating state-of-the-art computational methodology for flow analysis in straight, tapered and stepped seals. Relevant computational features of the code include: stationary/rotating coordinates, cylindrical and general Body Fitted Coordinates (BFC) systems, high order differencing schemes, colocated variable arrangement, advanced turbulence models, incompressible/compressible flows, and moving grids. This paper presents the current status of code development, code demonstration for predicting rotordynamic coefficients, numerical parametric study of entrance loss coefficients for generic annular seals, and plans for code extensions to labyrinth, damping, and other seal configurations.
Panzera, Alejandra; Leaché, Adam D; D'Elía, Guillermo; Victoriano, Pedro F
2017-01-01
The genus Liolaemus is one of the most ecologically diverse and species-rich genera of lizards worldwide. It currently includes more than 250 recognized species, which have been subject to many ecological and evolutionary studies. Nevertheless, Liolaemus lizards have a complex taxonomic history, mainly due to the incongruence between morphological and genetic data, incomplete taxon sampling, incomplete lineage sorting and hybridization. In addition, as many species have restricted and remote distributions, this has hampered their examination and inclusion in molecular systematic studies. The aims of this study are to infer a robust phylogeny for a subsample of lizards representing the Chilean clade (subgenus Liolaemus sensu stricto), and to test the monophyly of several of the major species groups. We use a phylogenomic approach, targeting 541 ultra-conserved elements (UCEs) and 44 protein-coding genes for 16 taxa. We conduct a comparison of phylogenetic analyses using maximum-likelihood and several species tree inference methods. The UCEs provide stronger support for phylogenetic relationships compared to the protein-coding genes; however, the UCEs outnumber the protein-coding genes by 10-fold. On average, the protein-coding genes contain over twice the number of informative sites. Based on our phylogenomic analyses, all the groups sampled are polyphyletic. Liolaemus tenuis tenuis is difficult to place in the phylogeny, because only a few loci (nine) were recovered for this species. Topologies and support values did not change dramatically upon exclusion of L. t. tenuis from the analyses, suggesting that missing data did not have a significant impact on phylogenetic inference in this data set. The phylogenomic analyses provide strong support for sister-group relationships between L. fuscus, L. monticola, L. nigroviridis and L. nitidus, and L. platei and L. velosoi. Despite our limited taxon sampling, we have provided a reliable starting hypothesis for the relationships among many major groups of the Chilean clade of Liolaemus that will help future work aimed at resolving the Liolaemus phylogeny.
Coping with Guilt and Shame: A Narrative Approach
ERIC Educational Resources Information Center
Silfver, Mia
2007-01-01
Autobiographical narratives (N = 97) of guilt and shame experiences were analysed to determine how the nature of emotion and context relate to ways of coping in such situations. The coding categories were created by content analysis, and the connections between categories were analysed with optimal scaling and log-linear analysis. Two theoretical…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nicholas R.; Pointer, William David; Sieger, Matt
2016-04-01
The goal of this review is to enable application of codes or software packages for safety assessment of advanced sodium-cooled fast reactor (SFR) designs. To address near-term programmatic needs, the authors have focused on two objectives. First, the authors have focused on identification of requirements for software QA that must be satisfied to enable the application of software to future safety analyses. Second, the authors have collected best practices applied by other code development teams to minimize cost and time of initial code qualification activities and to recommend a path to the stated goal.
Understanding the Code: acting in a patient's best interests.
Griffith, Richard
2015-09-01
The revised Code of the Nursing and Midwifery Council (NMC), the statutory professional regulator for registered district nurses, makes clear that while district nurses can interpret the values and principles for use in community settings, the standards are not negotiable or discretionary. They must be applied or the district nurse's fitness to practice will be called into question. In this article in the continuing series analysing the legal implications of the Code on district nurse practice, the author considers the fourth standard that requires district nurses to act in the best interests of people at all times.
NASA Technical Reports Server (NTRS)
Hague, D. S.; Rozendaal, H. L.
1977-01-01
A rapid mission analysis code based on the use of approximate flight path equations of motion is described. Equation form varies with the segment type, for example, accelerations, climbs, cruises, descents, and decelerations. Realistic and detailed vehicle characteristics are specified in tabular form. In addition to its mission performance calculation capabilities, the code also contains extensive flight envelope performance mapping capabilities. Approximate takeoff and landing analyses can be performed. At high speeds, centrifugal lift effects are taken into account. Extensive turbojet and ramjet engine scaling procedures are incorporated in the code.
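The abstract does not list the segment equations themselves; a representative example of this class of approximation is the Breguet range equation for a cruise segment. A sketch with illustrative numbers only:

import math

def breguet_range_m(V, tsfc, L_over_D, W0, W1):
    # Jet-aircraft cruise range: R = (V / c) * (L/D) * ln(W0 / W1),
    # with V in m/s, thrust-specific fuel consumption c in 1/s,
    # and initial/final weights W0, W1 in consistent units.
    return V / tsfc * L_over_D * math.log(W0 / W1)

R = breguet_range_m(V=230.0, tsfc=1.5e-4, L_over_D=16.0, W0=7.0e5, W1=6.0e5)
print(f"cruise range ~ {R / 1000:.0f} km")   # roughly 3,800 km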
Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom
2010-06-01
The need for a defensible and systematic uncertainty and sensitivity analysis approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.
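The GRS approach propagates input uncertainties by sampling, sizes the number of code runs with Wilks' formula, and extracts sensitivities from rank correlations. A hedged sketch with a cheap surrogate standing in for a real coupled-code run (all inputs and the response model are invented):

import numpy as np
from scipy import stats

# Wilks' formula (one-sided 95/95): smallest n with 1 - 0.95**n >= 0.95
n = int(np.ceil(np.log(1 - 0.95) / np.log(0.95)))
print("required code runs:", n)             # 59

rng = np.random.default_rng(2)
gap = rng.uniform(80e-6, 120e-6, n)         # hypothetical fuel-gap width, m
cond = rng.normal(4.0, 0.4, n)              # hypothetical conductivity, W/m/K
peak_T = 1200 + 2.0e6 * gap - 40.0 * cond + rng.normal(0, 5, n)  # surrogate "code"

for name, xs in [("gap", gap), ("conductivity", cond)]:
    rho, _ = stats.spearmanr(xs, peak_T)    # rank-correlation sensitivity measure
    print(f"Spearman rank correlation, {name}: {rho:+.2f}")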
Nonlinear static and dynamic finite element analysis of an eccentrically loaded graphite-epoxy beam
NASA Technical Reports Server (NTRS)
Fasanella, Edwin L.; Jackson, Karen E.; Jones, Lisa E.
1991-01-01
The Dynamic Crash Analysis of Structures (DYCAST) and NIKE3D nonlinear finite element codes were used to model the static and impulsive response of an eccentrically loaded graphite-epoxy beam. A 48-ply unidirectional composite beam was tested under an eccentric axial compressive load until failure. This loading configuration was chosen to highlight the capabilities of two finite element codes for modeling a highly nonlinear, large-deflection structural problem which has an exact solution. These codes are currently used to perform dynamic analyses of aircraft structures under impact loads to study crashworthiness and energy-absorbing capabilities. Both beam and plate element models were developed to compare with the experimental data using the DYCAST and NIKE3D codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lilienthal, P.
1997-12-01
This paper describes three different computer codes which have been written to model village power applications. The reasons driving the development of these codes include the existence of only limited field data, the diversity of applications to be modeled, the ability of models to allow cost and performance comparisons, and the insights into cost structures that simulations generate. The models discussed are: Hybrid2, a public code which provides detailed engineering simulations to analyze the performance of a particular configuration; HOMER, the hybrid optimization model for electric renewables, which provides economic screening for sensitivity analyses; and VIPOR, the village power model, a network optimization model for comparing mini-grids to individual systems. Examples of the output of these codes are presented for specific applications.
2012-01-01
Background Procedures documented by general practitioners in primary care have not been studied in relation to procedure coding systems. We aimed to describe procedures documented by Swedish general practitioners in electronic patient records and to compare them to the Swedish Classification of Health Interventions (KVÅ) and SNOMED CT. Methods Procedures in 200 record entries were identified, coded, assessed in relation to two procedure coding systems and analysed. Results 417 procedures found in the 200 electronic patient record entries were coded with 36 different Classification of Health Interventions categories and 148 different SNOMED CT concepts. 22.8% of the procedures could not be coded with any Classification of Health Interventions category and 4.3% could not be coded with any SNOMED CT concept. 206 procedure-concept/category pairs were assessed as a complete match in SNOMED CT compared to 10 in the Classification of Health Interventions. Conclusions Procedures documented by general practitioners were present in nearly all electronic patient record entries. Almost all procedures could be coded using SNOMED CT. Classification of Health Interventions covered the procedures to a lesser extent and with a much lower degree of concordance. SNOMED CT is a more flexible terminology system that can be used for different purposes for procedure coding in primary care. PMID:22230095
Use of Health Care Claims Data to Study Patients with Ophthalmologic Conditions
Stein, Joshua D.; Lum, Flora; Lee, Paul P.; Rich, William L.; Coleman, Anne L.
2014-01-01
Objective To describe what information is or is not included in health care claims data, provide an overview of the main advantages and limitations of performing analyses using health care claims data, and offer general guidance on how to report and interpret findings of ophthalmology-related claims data analyses. Design Systematic review. Participants Not applicable. Methods A literature review and synthesis of methods for claims-based data analyses. Main Outcome Measures Not applicable. Results Some advantages of using claims data for analyses include large, diverse sample sizes, longitudinal follow-up, lack of selection bias, and potential for complex, multivariable modeling. The disadvantages include (a) the inherent limitations of claims data, such as incomplete, inaccurate, or missing data, or the lack of specific billing codes for some conditions; and (b) the inability, in some circumstances, to adequately evaluate the appropriateness of care. In general, reports of claims data analyses should include clear descriptions of the following methodological elements: the data source, the inclusion and exclusion criteria, the specific billing codes used, and the potential confounding factors incorporated in the multivariable models. Conclusions The use of claims data for research is expected to increase with the enhanced availability of data from Medicare and other sources. The use of claims data to evaluate resource use and efficiency and to determine the basis for supplementary payment methods for physicians is anticipated. Thus, it will be increasingly important for eye care providers to use accurate and descriptive codes for billing. Adherence to general guidance on the reporting of claims data analyses, as outlined in this article, is important to enhance the credibility and applicability of findings. Guidance on optimal ways to conduct and report ophthalmology-related investigations using claims data will likely continue to evolve as health services researchers refine the metrics to analyze large administrative data sets. PMID:24433971
A History of Rotorcraft Comprehensive Analyses
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2013-01-01
A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.
Nouraei, S A R; Hudovsky, A; Virk, J S; Saleh, H A
2017-04-01
This study aimed to develop a multidisciplinary coded dataset standard for nasal surgery and to assess its impact on data accuracy. An audit of 528 patients undergoing septal and/or inferior turbinate surgery, rhinoplasty and/or septorhinoplasty, and nasal fracture surgery was undertaken. A total of 200 septoplasties, 109 septorhinoplasties, 57 complex septorhinoplasties and 116 nasal fractures were analysed. There were 76 (14.4 per cent) changes to the primary diagnosis. Septorhinoplasties were the most commonly amended procedures. The overall audit-related income change for nasal surgery was £8.78 per patient. Use of a multidisciplinary coded dataset standard revealed that nasal diagnoses were under-coded; a significant proportion of patients received more precise diagnoses following the audit. There was also significant under-coding of both morbidities and revision surgery. The multidisciplinary coded dataset standard approach can improve the accuracy of both data capture and information flow, and thus ultimately create a more reliable dataset for use in outcomes assessment and health planning.
VLF Trimpi modelling on the path NWC-Dunedin using both finite element and 3D Born modelling
NASA Astrophysics Data System (ADS)
Nunn, D.; Hayakawa, K. B. M.
1998-10-01
This paper investigates the numerical modelling of VLF Trimpis, produced by a D region inhomogeneity on the great circle path. Two different codes are used to model Trimpis on the path NWC-Dunedin. The first is a 2D Finite Element Method Code (FEM), whose solutions are rigorous and valid in the strong scattering or non-Born limit. The second code is a 3D model that invokes the Born approximation. The predicted Trimpis from these codes compare very closely, thus confirming the validity of both models. The modal scattering matrices for both codes are analysed in some detail and are found to have a comparable structure. They indicate strong scattering between the dominant TM modes. Analysis of the scattering matrix from the FEM code shows that departure from linear Born behaviour occurs when the inhomogeneity has a horizontal scale size of about 100 km and a maximum electron density enhancement at 75 km altitude of about 6 electrons.
Moderate Deviation Analysis for Classical Communication over Quantum Channels
NASA Astrophysics Data System (ADS)
Chubb, Christopher T.; Tan, Vincent Y. F.; Tomamichel, Marco
2017-11-01
We analyse families of codes for classical data transmission over quantum channels that have both a vanishing probability of error and a code rate approaching capacity as the code length increases. To characterise the fundamental tradeoff between decoding error, code rate and code length for such codes, we introduce a quantum generalisation of the moderate deviation analysis proposed by Altuğ and Wagner as well as Polyanskiy and Verdú. We derive such a tradeoff for classical-quantum (as well as image-additive) channels in terms of the channel capacity and the channel dispersion, giving further evidence that the latter quantity characterises the necessary backoff from capacity when transmitting finite blocks of classical data. To derive these results we also study asymmetric binary quantum hypothesis testing in the moderate deviations regime. Due to the central importance of the latter task, we expect that our techniques will find further applications in the analysis of other quantum information processing tasks.
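The tradeoff has the same shape as the classical moderate-deviation result; schematically in LaTeX, with C the channel capacity and V the channel dispersion (stated here up to the regularity conditions the paper places on the sequence a_n):

\[
R_n = C - a_n, \qquad a_n \to 0,\ \sqrt{n}\, a_n \to \infty
\;\Longrightarrow\;
\lim_{n\to\infty} \frac{1}{n\, a_n^2} \ln \varepsilon_n = -\frac{1}{2V},
\]

i.e. the error probability vanishes sub-exponentially while the rate still converges to capacity.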
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, T.L.; Simonen, F.A.
1992-05-01
Probabilistic fracture mechanics analysis is a major element of the comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimate of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.
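The core of such a probabilistic fracture mechanics calculation is a Monte Carlo comparison of applied stress intensity against sampled toughness. The sketch below is purely illustrative (the distributions, constants, and toughness-curve shape are assumed, not taken from OCA-P or VISA-II):

import numpy as np

rng = np.random.default_rng(3)
N = 200_000
K_I = rng.normal(55.0, 8.0, N)        # applied stress intensity (illustrative)
RT = rng.normal(60.0, 15.0, N)        # irradiation-shifted reference temperature
T = 80.0                              # crack-tip temperature during the transient
# Lower-bound-style toughness curve, capped at an upper shelf (shape assumed):
K_Ic = np.minimum(33.2 + 20.734 * np.exp(0.02 * (T - RT)), 220.0)

print("conditional failure probability:", np.mean(K_I > K_Ic))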
How families cope with diabetes in adolescence. An approach and case analyses.
Hauser, S T; Paul, E L; Jacobson, A M; Weiss-Perry, B; Vieyra, M A; Rufo, P; Spetter, L D; DiPlacido, J; Wertlieb, D; Wolfsdorf, J
1988-01-01
In this paper we describe our newly constructed Family Coping Coding System. This scheme was constructed to identify family coping strategies that involve appraisal, problem solving, and emotion management dimensions. We discuss the theoretical rationale, meanings and reliability of the coping codes, and illustrate them through excerpts drawn from family discussions of a recent stressful situation (the onset of a chronic or acute illness in an adolescent member). Finally, we consider the clinical research relevance of this new assessment technique, exemplifying this potential with respect to medical compliance. We present analyses of two families with diabetic adolescents who strikingly differ with respect to compliance, and explore which family coping strategies may be predictive of an adolescent's favorable or problematic compliance to diabetes management.
NPSS Multidisciplinary Integration and Analysis
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Rasche, Joseph; Simons, Todd A.; Hoyniak, Daniel
2006-01-01
The objective of this task was to enhance the capability of the Numerical Propulsion System Simulation (NPSS) by expanding its reach into the high-fidelity multidisciplinary analysis area. This task investigated numerical techniques for converting between the cold static and hot running geometry of compressor blades. Numerical calculations of blade deformations were performed iteratively, coupling high-fidelity flow simulations with high-fidelity structural analyses of the compressor blade. The flow simulations were performed with the Advanced Ducted Propfan Analysis (ADPAC) code, while structural analyses were performed with the ANSYS code. High-fidelity analyses were used to evaluate the effects on performance of variations in tip clearance, uncertainty in manufacturing tolerances, and variable inlet guide vane scheduling, as well as the effects of rotational speed on the hot running geometry of the compressor blades.
Sequence data and association statistics from 12,940 type 2 diabetes cases and controls.
Flannick, Jason; Fuchsberger, Christian; Mahajan, Anubha; Teslovich, Tanya M; Agarwala, Vineeta; Gaulton, Kyle J; Caulkins, Lizz; Koesterer, Ryan; Ma, Clement; Moutsianas, Loukas; McCarthy, Davis J; Rivas, Manuel A; Perry, John R B; Sim, Xueling; Blackwell, Thomas W; Robertson, Neil R; Rayner, N William; Cingolani, Pablo; Locke, Adam E; Tajes, Juan Fernandez; Highland, Heather M; Dupuis, Josee; Chines, Peter S; Lindgren, Cecilia M; Hartl, Christopher; Jackson, Anne U; Chen, Han; Huyghe, Jeroen R; van de Bunt, Martijn; Pearson, Richard D; Kumar, Ashish; Müller-Nurasyid, Martina; Grarup, Niels; Stringham, Heather M; Gamazon, Eric R; Lee, Jaehoon; Chen, Yuhui; Scott, Robert A; Below, Jennifer E; Chen, Peng; Huang, Jinyan; Go, Min Jin; Stitzel, Michael L; Pasko, Dorota; Parker, Stephen C J; Varga, Tibor V; Green, Todd; Beer, Nicola L; Day-Williams, Aaron G; Ferreira, Teresa; Fingerlin, Tasha; Horikoshi, Momoko; Hu, Cheng; Huh, Iksoo; Ikram, Mohammad Kamran; Kim, Bong-Jo; Kim, Yongkang; Kim, Young Jin; Kwon, Min-Seok; Lee, Juyoung; Lee, Selyeong; Lin, Keng-Han; Maxwell, Taylor J; Nagai, Yoshihiko; Wang, Xu; Welch, Ryan P; Yoon, Joon; Zhang, Weihua; Barzilai, Nir; Voight, Benjamin F; Han, Bok-Ghee; Jenkinson, Christopher P; Kuulasmaa, Teemu; Kuusisto, Johanna; Manning, Alisa; Ng, Maggie C Y; Palmer, Nicholette D; Balkau, Beverley; Stančáková, Alena; Abboud, Hanna E; Boeing, Heiner; Giedraitis, Vilmantas; Prabhakaran, Dorairaj; Gottesman, Omri; Scott, James; Carey, Jason; Kwan, Phoenix; Grant, George; Smith, Joshua D; Neale, Benjamin M; Purcell, Shaun; Butterworth, Adam S; Howson, Joanna M M; Lee, Heung Man; Lu, Yingchang; Kwak, Soo-Heon; Zhao, Wei; Danesh, John; Lam, Vincent K L; Park, Kyong Soo; Saleheen, Danish; So, Wing Yee; Tam, Claudia H T; Afzal, Uzma; Aguilar, David; Arya, Rector; Aung, Tin; Chan, Edmund; Navarro, Carmen; Cheng, Ching-Yu; Palli, Domenico; Correa, Adolfo; Curran, Joanne E; Rybin, Dennis; Farook, Vidya S; Fowler, Sharon P; Freedman, Barry I; Griswold, Michael; Hale, Daniel Esten; Hicks, Pamela J; Khor, Chiea-Chuen; Kumar, Satish; Lehne, Benjamin; Thuillier, Dorothée; Lim, Wei Yen; Liu, Jianjun; Loh, Marie; Musani, Solomon K; Puppala, Sobha; Scott, William R; Yengo, Loïc; Tan, Sian-Tsung; Taylor, Herman A; Thameem, Farook; Wilson, Gregory; Wong, Tien Yin; Njølstad, Pål Rasmus; Levy, Jonathan C; Mangino, Massimo; Bonnycastle, Lori L; Schwarzmayr, Thomas; Fadista, João; Surdulescu, Gabriela L; Herder, Christian; Groves, Christopher J; Wieland, Thomas; Bork-Jensen, Jette; Brandslund, Ivan; Christensen, Cramer; Koistinen, Heikki A; Doney, Alex S F; Kinnunen, Leena; Esko, Tõnu; Farmer, Andrew J; Hakaste, Liisa; Hodgkiss, Dylan; Kravic, Jasmina; Lyssenko, Valeri; Hollensted, Mette; Jørgensen, Marit E; Jørgensen, Torben; Ladenvall, Claes; Justesen, Johanne Marie; Käräjämäki, Annemari; Kriebel, Jennifer; Rathmann, Wolfgang; Lannfelt, Lars; Lauritzen, Torsten; Narisu, Narisu; Linneberg, Allan; Melander, Olle; Milani, Lili; Neville, Matt; Orho-Melander, Marju; Qi, Lu; Qi, Qibin; Roden, Michael; Rolandsson, Olov; Swift, Amy; Rosengren, Anders H; Stirrups, Kathleen; Wood, Andrew R; Mihailov, Evelin; Blancher, Christine; Carneiro, Mauricio O; Maguire, Jared; Poplin, Ryan; Shakir, Khalid; Fennell, Timothy; DePristo, Mark; de Angelis, Martin Hrabé; Deloukas, Panos; Gjesing, Anette P; Jun, Goo; Nilsson, Peter; Murphy, Jacquelyn; Onofrio, Robert; Thorand, Barbara; Hansen, Torben; Meisinger, Christa; Hu, Frank B; Isomaa, Bo; Karpe, Fredrik; Liang, Liming; Peters, Annette; Huth, Cornelia; 
O'Rahilly, Stephen P; Palmer, Colin N A; Pedersen, Oluf; Rauramaa, Rainer; Tuomilehto, Jaakko; Salomaa, Veikko; Watanabe, Richard M; Syvänen, Ann-Christine; Bergman, Richard N; Bharadwaj, Dwaipayan; Bottinger, Erwin P; Cho, Yoon Shin; Chandak, Giriraj R; Chan, Juliana Cn; Chia, Kee Seng; Daly, Mark J; Ebrahim, Shah B; Langenberg, Claudia; Elliott, Paul; Jablonski, Kathleen A; Lehman, Donna M; Jia, Weiping; Ma, Ronald C W; Pollin, Toni I; Sandhu, Manjinder; Tandon, Nikhil; Froguel, Philippe; Barroso, Inês; Teo, Yik Ying; Zeggini, Eleftheria; Loos, Ruth J F; Small, Kerrin S; Ried, Janina S; DeFronzo, Ralph A; Grallert, Harald; Glaser, Benjamin; Metspalu, Andres; Wareham, Nicholas J; Walker, Mark; Banks, Eric; Gieger, Christian; Ingelsson, Erik; Im, Hae Kyung; Illig, Thomas; Franks, Paul W; Buck, Gemma; Trakalo, Joseph; Buck, David; Prokopenko, Inga; Mägi, Reedik; Lind, Lars; Farjoun, Yossi; Owen, Katharine R; Gloyn, Anna L; Strauch, Konstantin; Tuomi, Tiinamaija; Kooner, Jaspal Singh; Lee, Jong-Young; Park, Taesung; Donnelly, Peter; Morris, Andrew D; Hattersley, Andrew T; Bowden, Donald W; Collins, Francis S; Atzmon, Gil; Chambers, John C; Spector, Timothy D; Laakso, Markku; Strom, Tim M; Bell, Graeme I; Blangero, John; Duggirala, Ravindranath; Tai, E Shyong; McVean, Gilean; Hanis, Craig L; Wilson, James G; Seielstad, Mark; Frayling, Timothy M; Meigs, James B; Cox, Nancy J; Sladek, Rob; Lander, Eric S; Gabriel, Stacey; Mohlke, Karen L; Meitinger, Thomas; Groop, Leif; Abecasis, Goncalo; Scott, Laura J; Morris, Andrew P; Kang, Hyun Min; Altshuler, David; Burtt, Noël P; Florez, Jose C; Boehnke, Michael; McCarthy, Mark I
2017-12-19
To investigate the genetic basis of type 2 diabetes (T2D) to high resolution, the GoT2D and T2D-GENES consortia catalogued variation from whole-genome sequencing of 2,657 European individuals and exome sequencing of 12,940 individuals of multiple ancestries. Over 27M SNPs, indels, and structural variants were identified, including 99% of low-frequency (minor allele frequency [MAF] 0.1–5%) non-coding variants in the whole-genome sequenced individuals and 99.7% of low-frequency coding variants in the whole-exome sequenced individuals. Each variant was tested for association with T2D in the sequenced individuals, and, to increase power, most were tested in larger numbers of individuals (>80% of low-frequency coding variants in ~82,000 Europeans via the exome chip, and ~90% of low-frequency non-coding variants in ~44,000 Europeans via genotype imputation). The variants, genotypes, and association statistics from these analyses provide the largest reference to date of human genetic information relevant to T2D, for use in activities such as T2D-focused genotype imputation, functional characterization of variants or genes, and other novel analyses to detect associations between sequence variation and T2D.
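As a hedged illustration of the kind of per-variant analysis the abstract describes, the sketch below classifies a variant by minor allele frequency and runs a simple two-proportion allelic association test; the allele counts are invented placeholders, not the consortia's data, and the actual analyses used far more sophisticated models.

    # Minimal sketch, assuming invented allele counts: MAF classification and
    # a crude two-proportion z-test for allelic association with T2D.
    from math import sqrt
    from statistics import NormalDist

    def maf(minor, total):
        """Minor allele frequency from minor-allele and total allele counts."""
        p = minor / total
        return min(p, 1 - p)

    case_minor, case_total = 130, 13000   # hypothetical counts in cases
    ctrl_minor, ctrl_total = 90, 13000    # hypothetical counts in controls

    freq = maf(case_minor + ctrl_minor, case_total + ctrl_total)
    band = "low-frequency" if 0.001 <= freq < 0.05 else "common or rare"
    print(f"MAF = {freq:.4f} ({band})")

    p1, p2 = case_minor / case_total, ctrl_minor / ctrl_total
    pooled = (case_minor + ctrl_minor) / (case_total + ctrl_total)
    z = (p1 - p2) / sqrt(pooled * (1 - pooled) * (1/case_total + 1/ctrl_total))
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    print(f"z = {z:.2f}, p = {p_value:.3g}")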
A computer code for calculations in the algebraic collective model of the atomic nucleus
NASA Astrophysics Data System (ADS)
Welsh, T. A.; Rowe, D. J.
2016-03-01
A Maple code is presented for algebraic collective model (ACM) calculations. The ACM is an algebraic version of the Bohr model of the atomic nucleus, in which all required matrix elements are derived by exploiting the model's SU(1,1) × SO(5) dynamical group. This paper reviews the mathematical formulation of the ACM, and serves as a manual for the code. The code enables a wide range of model Hamiltonians to be analysed. This range includes essentially all Hamiltonians that are rational functions of the model's quadrupole moments $\hat{q}_M$ and are at most quadratic in the corresponding conjugate momenta $\hat{\pi}_N$ ($-2 \le M, N \le 2$). The code makes use of expressions for matrix elements derived elsewhere and newly derived matrix elements of the operators $[\hat{\pi} \otimes \hat{q} \otimes \hat{\pi}]_0$ and $[\hat{\pi} \otimes \hat{\pi}]_{LM}$. The code is made efficient by use of an analytical expression for the needed SO(5)-reduced matrix elements, and use of SO(5) ⊃ SO(3) Clebsch-Gordan coefficients obtained from precomputed data files provided with the code.
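As a hedged LaTeX illustration of the Hamiltonian family just described, one may write a generic member of this class as follows; the coefficients $c_1$, $c_2$ and the rational function $f$ are our own placeholders, not the code's interface:

    \[
      \hat{H} \;=\; c_{1}\,[\hat{\pi}\otimes\hat{\pi}]_{0}
              \;+\; c_{2}\,[\hat{\pi}\otimes\hat{q}\otimes\hat{\pi}]_{0}
              \;+\; f(\hat{q}),
      \qquad f \ \text{a rational function of the } \hat{q}_{M},
    \]

i.e., at most quadratic in the momenta $\hat{\pi}_N$, as the abstract requires.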
Codes and morals: is there a missing link? (The Nuremberg Code revisited).
Hick, C
1998-01-01
Codes are a well-known and popular but weak form of ethical regulation in medical practice. There is, however, a lack of research on the relations between moral judgments and ethical codes, or on the possibility of morally justifying these codes. Our analysis begins by showing, given the Nuremberg Code, how a typical reference to natural law has historically served as moral justification. We then indicate, following the analyses of H. T. Engelhardt, Jr., and A. MacIntyre, why such general moral justifications of codes must necessarily fail in a society of "moral strangers." Going beyond Engelhardt, we argue that after the genealogical suspicion in morals raised by Nietzsche, not even Engelhardt's "principle of permission" can be rationally justified in a strong sense--a problem of transcendental argumentation in morals already realized by I. Kant. Therefore, we propose to abandon the project of providing general justifications for moral judgments and to replace it with a hermeneutical analysis of ethical meanings in real-world situations, starting with the archetypal ethical situation, the encounter with the Other (E. Levinas).
Administrative database code accuracy did not vary notably with changes in disease prevalence.
van Walraven, Carl; English, Shane; Austin, Peter C
2016-11-01
Previous mathematical analyses of diagnostic tests based on the categorization of a continuous measure have found that test sensitivity and specificity vary significantly with disease prevalence. This study determined whether the accuracy of diagnostic codes varied by disease prevalence. We used data from two previous studies in which the true status of renal disease and primary subarachnoid hemorrhage, respectively, had been determined. In multiple stratified random samples from the two previous studies having varying disease prevalence, we measured the accuracy of diagnostic codes for each disease using sensitivity, specificity, and positive and negative predictive value. Diagnostic code sensitivity and specificity did not change notably across clinically sensible ranges of disease prevalence. In contrast, positive and negative predictive values changed significantly with disease prevalence. Disease prevalence had no important influence on the sensitivity and specificity of diagnostic codes in administrative databases.
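The standard Bayes identities behind this result can be shown in a short sketch with hypothetical sensitivity and specificity: predictive values move with prevalence while sensitivity and specificity stay fixed.

    # Sketch (textbook Bayes identities, not the paper's data): predictive
    # values as a function of prevalence for fixed sensitivity/specificity.
    def predictive_values(sens, spec, prev):
        ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
        npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
        return ppv, npv

    sens, spec = 0.80, 0.95  # hypothetical diagnostic-code accuracy
    for prev in (0.01, 0.05, 0.20):
        ppv, npv = predictive_values(sens, spec, prev)
        print(f"prevalence {prev:4.0%}: PPV = {ppv:.2f}, NPV = {npv:.2f}")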
Are neighborhood-level characteristics associated with indoor allergens in the household?
Rosenfeld, Lindsay; Rudd, Rima; Chew, Ginger L; Emmons, Karen; Acevedo-García, Dolores
2010-02-01
Individual home characteristics have been associated with indoor allergen exposure; however, the influence of neighborhood-level characteristics has not been well studied. We defined neighborhoods as community districts determined by the New York City Department of City Planning. We examined the relationship between neighborhood-level characteristics and the presence of dust mite (Der f 1), cat (Fel d 1), cockroach (Bla g 2), and mouse (MUP) allergens in the household. Using data from the Puerto Rican Asthma Project, a birth cohort of Puerto Rican children at risk of allergic sensitization (n = 261), we examined associations between neighborhood characteristics (percent tree canopy, asthma hospitalizations per 1,000 children, roadway length within 100 meters of buildings, serious housing code violations per 1,000 rental units, poverty rates, and felony crime rates) and the presence of indoor allergens. Allergen cutpoints were used for categorical analyses and defined as follows: dust mite: >0.25 microg/g; cat: >1 microg/g; cockroach: >1 U/g; mouse: >1.6 microg/g. Serious housing code violations were significantly positively associated with dust mite, cat, and mouse allergens (as continuous variables), adjusting for mother's income and education and all neighborhood-level characteristics. In multivariable logistic regression analyses, medium levels of housing code violations were associated with higher dust mite and cat allergens (OR 1.81, 95% CI: 1.08, 3.03 and OR 3.10, 95% CI: 1.22, 7.92, respectively). A high level of serious housing code violations was associated with higher mouse allergen (OR 2.04, 95% CI: 1.15, 3.62). A medium level of housing code violations was associated with higher cockroach allergen (OR 3.30, 95% CI: 1.11, 9.78). Neighborhood-level characteristics, specifically housing code violations, appear to be related to indoor allergens, which may have implications for future research explorations and policy decisions.
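A hedged sketch of the simplest version of such an association estimate follows: a crude 2×2 odds ratio with a Woolf 95% CI for a dichotomized allergen outcome. The counts are invented, and the study's adjusted logistic models are more elaborate.

    # Sketch: odds ratio and Woolf 95% CI for a dichotomized allergen outcome
    # (e.g., dust mite > 0.25 microg/g) versus housing code violation level.
    # The 2x2 counts below are hypothetical, not the study's data.
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """a,b = exposed with/without outcome; c,d = unexposed with/without."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)         # Woolf log-OR standard error
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    print(odds_ratio_ci(35, 45, 25, 70))  # e.g., OR ~ 2.2 (95% CI 1.2, 4.1)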
Louder than words: power and conflict in interprofessional education articles, 1954–2013
Paradis, Elise; Whitehead, Cynthia R
2015-01-01
Context: Interprofessional education (IPE) aspires to enable collaborative practice. Current IPE offerings, although rapidly proliferating, lack evidence of efficacy and theoretical grounding. Objectives: Our research aimed to explore the historical emergence of the field of IPE and to analyse the positioning of this academic field of inquiry. In particular, we sought to investigate the extent to which power and conflict – elements central to interprofessional care – figure in the IPE literature. Methods: We used a combination of deductive and inductive automated coding and manual coding to explore the contents of 2191 articles in the IPE literature published between 1954 and 2013. Inductive coding focused on the presence and use of the sociological (rather than statistical) version of power, which refers to hierarchies and asymmetries among the professions. Articles found to be centrally about power were then analysed using content analysis. Results: Publications on IPE have grown exponentially in the past decade. Deductive coding of identified articles showed an emphasis on students, learning, programmes and practice. Automated inductive coding of titles and abstracts identified 129 articles potentially about power, but manual coding found that only six articles put power and conflict at the centre. Content analysis of these six articles revealed that two provided tentative explorations of power dynamics, one skirted around this issue, and three explicitly theorised and integrated power and conflict. Conclusions: The lack of attention to power and conflict in the IPE literature suggests that many educators do not foreground these issues. Education programmes are expected to transform individuals into effective collaborators, without heed to structural, organisational and institutional factors. In so doing, current constructions of IPE veil the problems that IPE attempts to solve. PMID:25800300
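A minimal sketch of keyword-based automated (deductive-style) coding of titles and abstracts follows; the keyword list and records are illustrative, not the authors' codebook, and the actual study combined this with inductive and manual coding.

    # Sketch: flag articles whose title/abstract tokens match a 'power' lexicon,
    # producing candidates for subsequent manual coding and content analysis.
    POWER_TERMS = {"power", "conflict", "hierarchy", "asymmetry", "dominance"}

    def code_article(text, terms=POWER_TERMS):
        tokens = {w.strip(".,;:()").lower() for w in text.split()}
        return bool(tokens & terms)

    articles = [
        "Interprofessional education and team hierarchy in acute care",
        "Student satisfaction with a simulation-based IPE programme",
    ]
    flagged = [a for a in articles if code_article(a)]
    print(flagged)  # only the first title matches the lexicon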
A genetic scale of reading frame coding.
Michel, Christian J
2014-08-21
The reading frame coding (RFC) of codes (sets) of trinucleotides is a genetic concept which has been largely ignored during the last 50 years. A first objective is the definition of a new and simple statistical parameter PrRFC for analysing the probability (efficiency) of reading frame coding (RFC) of any trinucleotide code. A second objective is to reveal different classes and subclasses of trinucleotide codes involved in reading frame coding: the circular codes of 20 trinucleotides and the bijective genetic codes of 20 trinucleotides coding the 20 amino acids. This approach allows us to propose a genetic scale of reading frame coding which ranges from 1/3 with the random codes (RFC probability identical in the three frames) to 1 with the comma-free circular codes (RFC probability maximal in the reading frame and null in the two shifted frames). This genetic scale shows, in particular, the reading frame coding probabilities of the 12,964,440 circular codes (PrRFC=83.2% on average), the 216 C(3) self-complementary circular codes (PrRFC=84.1% on average) including the code X identified in eukaryotic and prokaryotic genes (PrRFC=81.3%) and the 339,738,624 bijective genetic codes (PrRFC=61.5% on average) including the 52 codes without permuted trinucleotides (PrRFC=66.0% on average). Otherwise, the reading frame coding probabilities of each trinucleotide code coding an amino acid with the universal genetic code are also determined. The four amino acids Gly, Lys, Phe and Pro are coded by codes (not circular) with RFC probabilities equal to 2/3, 1/2, 1/2 and 2/3, respectively. The amino acid Leu is coded by a circular code (not comma-free) with an RFC probability equal to 18/19. The 15 other amino acids are coded by comma-free circular codes, i.e., with RFC probabilities equal to 1. The identification of coding properties in some classes of trinucleotide codes studied here may bring new insights into the origin and evolution of the genetic code.
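A small sketch testing the comma-free property named above for a set of trinucleotides follows; the paper's PrRFC statistic generalizes this pass/fail test to a probability over all three frames.

    # Sketch: a trinucleotide code is comma-free if no codon of the set appears
    # in a shifted frame inside any concatenation of two of its codons.
    def is_comma_free(code):
        code = set(code)
        for x in code:
            for y in code:
                pair = x + y
                # frames shifted by 1 and 2 inside the 6-letter window
                if pair[1:4] in code or pair[2:5] in code:
                    return False
        return True

    print(is_comma_free({"AAC", "ACC"}))   # True: no shifted-frame hits
    print(is_comma_free({"AAA"}))          # False: AAA|AAA contains AAA shifted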
NASA Technical Reports Server (NTRS)
Athavale, Mahesh; Przekwas, Andrzej
2004-01-01
The objectives of the program were to develop computational fluid dynamics (CFD) codes and simpler industrial codes for analyzing and designing advanced seals for air-breathing and space propulsion engines. The CFD code SCISEAL is capable of producing full three-dimensional flow field information for a variety of cylindrical configurations. An implicit multidomain capability allows complex flow domains to be divided for optimum use of computational cells. SCISEAL also has the unique capability to produce cross-coupled stiffness and damping coefficients for rotordynamic computations. The industrial codes consist of a series of separate stand-alone modules designed for expeditious parametric analyses and optimization of a wide variety of cylindrical and face seals. Coupled through a Knowledge-Based System (KBS) that provides a user-friendly Graphical User Interface (GUI), the industrial codes are PC-based, running under the OS/2 operating system. These codes were designed to treat film seals where a clearance exists between the rotating and stationary components. Leakage is inhibited by surface roughness, small but stiff clearance films, and viscous pumping devices. The codes have proven to be a valuable resource for seal development of future air-breathing and space propulsion engines.
Tate, A Rosemary; Dungey, Sheena; Glew, Simon; Beloff, Natalia; Williams, Rachael; Williams, Tim
2017-01-01
Objective: To assess the effect of coding quality on estimates of the incidence of diabetes in the UK between 1995 and 2014. Design: A cross-sectional analysis examining diabetes coding from 1995 to 2014 and how the choice of codes (diagnosis codes vs codes which suggest diagnosis) and quality of coding affect estimated incidence. Setting: Routine primary care data from 684 practices contributing to the UK Clinical Practice Research Datalink (data contributed from Vision (INPS) practices). Main outcome measure: Incidence rates of diabetes and how they are affected by (1) GP coding and (2) excluding ‘poor’ quality practices with at least 10% of incident patients inaccurately coded between 2004 and 2014. Results: Incidence rates and accuracy of coding varied widely between practices, and the trends differed according to the selected category of code. If diagnosis codes were used, the incidence of type 2 diabetes increased sharply until 2004 (when the UK Quality Outcomes Framework was introduced), then flattened off until 2009, after which it decreased. If non-diagnosis codes were included, the numbers continued to increase until 2012. Although coding quality improved over time, 15% of the 666 practices that contributed data between 2004 and 2014 were labelled ‘poor’ quality. When these practices were dropped from the analyses, the downward trend in the incidence of type 2 diabetes after 2009 became less marked and incidence rates were higher. Conclusions: In contrast to some previous reports, diabetes incidence (based on diagnostic codes) appears not to have increased since 2004 in the UK. Choice of codes can make a significant difference to incidence estimates, as can quality of recording. Codes and data quality should be checked when assessing incidence rates using GP data. PMID:28122831
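A minimal sketch, with invented records, of computing annual incidence per 1,000 person-years while excluding 'poor'-quality practices follows; the CPRD analysis itself is far more involved.

    # Sketch: incidence per 1,000 person-years, with and without dropping
    # practices flagged as 'poor' quality. Records and threshold are invented.
    records = [  # (practice_id, year, incident_cases, person_years)
        ("p1", 2009, 40, 12000), ("p2", 2009, 55, 15000), ("p3", 2009, 20, 9000),
    ]
    poor_practices = {"p3"}  # e.g., >=10% of incident patients miscoded

    def incidence(records, exclude=frozenset()):
        cases = sum(c for p, _, c, _ in records if p not in exclude)
        pyears = sum(py for p, _, _, py in records if p not in exclude)
        return 1000 * cases / pyears

    print(f"all practices : {incidence(records):.2f} per 1,000 py")
    print(f"excluding poor: {incidence(records, poor_practices):.2f} per 1,000 py")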
ERIC Educational Resources Information Center
Egmir, Eray; Erdem, Cahit; Koçyigit, Mehmet
2017-01-01
The aim of this study is to analyse the studies published in "International Journal of Instruction" ["IJI"] in the last ten years. This study is a qualitative, descriptive literature review study. The data was collected through document analysis, coded using constant comparison and analysed using content analysis. Frequencies…
Recommendations for open data science.
Gymrek, Melissa; Farjoun, Yossi
2016-01-01
Life science research increasingly relies on large-scale computational analyses. However, the code and data used for these analyses are often lacking in publications. To maximize scientific impact, reproducibility, and reuse, it is crucial that these resources are made publicly available and are fully transparent. We provide recommendations for improving the openness of data-driven studies in life sciences.
ERIC Educational Resources Information Center
Plonsky, Luke
2013-01-01
This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning and Studies in Second Language Acquisition," was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…
Copper benchmark experiment for the testing of JEFF-3.2 nuclear data for fusion applications
NASA Astrophysics Data System (ADS)
Angelone, M.; Flammini, D.; Loreti, S.; Moro, F.; Pillon, M.; Villar, R.; Klix, A.; Fischer, U.; Kodeli, I.; Perel, R. L.; Pohorecky, W.
2017-09-01
A neutronics benchmark experiment on a pure copper block (dimensions 60 × 70 × 70 cm3) aimed at testing and validating the recent nuclear data libraries for fusion applications was performed in the frame of the European Fusion Program at the 14 MeV ENEA Frascati Neutron Generator (FNG). Reaction rates, neutron flux spectra and doses were measured using different experimental techniques (e.g. activation foil techniques, an NE213 scintillator and thermoluminescent detectors). This paper first summarizes the analyses of the experiment carried out using the MCNP5 Monte Carlo code and the European JEFF-3.2 library. Large discrepancies between calculation (C) and experiment (E) were found for the reaction rates both in the high and low neutron energy range. The analysis was complemented by sensitivity/uncertainty (S/U) analyses using the deterministic SUSD3D and Monte Carlo MCSEN codes, respectively. The S/U analyses made it possible to identify the cross sections and energy ranges that most affect the calculated responses. The largest discrepancy among the C/E values was observed for the thermal (capture) reactions, indicating severe deficiencies in the 63,65Cu capture and elastic cross sections at low rather than at high energy. Deterministic and MC codes produced similar results. The 14 MeV copper experiment and its analysis thus call for a revision of the JEFF-3.2 copper cross section and covariance data evaluation. A new analysis of the experiment was performed with the MCNP5 code using the revised JEFF-3.3-T2 library released by NEA and a new, not yet distributed, revised JEFF-3.2 Cu evaluation produced by KIT. A noticeable improvement of the C/E results was obtained with both new libraries.
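A small sketch of the C/E comparison with first-order error propagation follows; the values are placeholders, not FNG measurements.

    # Sketch: calculated-to-experimental (C/E) ratio with its propagated
    # uncertainty, the quantity the benchmark analysis compares per reaction.
    import math

    def c_over_e(calc, calc_rel_unc, expt, expt_rel_unc):
        ratio = calc / expt
        rel = math.sqrt(calc_rel_unc**2 + expt_rel_unc**2)  # independent errors
        return ratio, ratio * rel

    ratio, unc = c_over_e(calc=9.1e-4, calc_rel_unc=0.02,
                          expt=1.0e-3, expt_rel_unc=0.04)
    print(f"C/E = {ratio:.3f} +/- {unc:.3f}")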
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balkey, K.; Witt, F.J.; Bishop, B.A.
1995-06-01
Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine the conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC - VISA and Westinghouse (W) - PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various different codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.
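A toy Monte Carlo sketch in the generic spirit of such PFM calculations follows: sample the fracture toughness and count samples where the applied stress intensity from the transient exceeds it. The distribution and the applied value are invented placeholders, not anything from the benchmarked codes.

    # Sketch: Monte Carlo estimate of the conditional probability of flaw
    # extension for a given PTS transient (all numbers assumed for illustration).
    import random

    random.seed(1)
    K_applied = 60.0          # MPa*sqrt(m), from the transient analysis (assumed)
    trials, failures = 100_000, 0
    for _ in range(trials):
        K_Ic = random.gauss(80.0, 15.0)   # sampled fracture toughness (assumed)
        if K_applied > K_Ic:
            failures += 1
    print(f"conditional P(flaw extension) ~ {failures / trials:.4f}")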
Evolution of coding and non-coding genes in HOX clusters of a marsupial.
Yu, Hongshi; Lindsay, James; Feng, Zhi-Ping; Frankenberg, Stephen; Hu, Yanqiu; Carone, Dawn; Shaw, Geoff; Pask, Andrew J; O'Neill, Rachel; Papenfuss, Anthony T; Renfree, Marilyn B
2012-06-18
The HOX gene clusters are thought to be highly conserved amongst mammals and other vertebrates, but the long non-coding RNAs have only been studied in detail in human and mouse. The sequencing of the kangaroo genome provides an opportunity to use comparative analyses to compare the HOX clusters of a mammal with a distinct body plan to those of other mammals. Here we report a comparative analysis of HOX gene clusters between an Australian marsupial of the kangaroo family and the eutherians. There was a strikingly high level of conservation of HOX gene sequence and structure and of non-protein-coding genes, including the microRNAs miR-196a, miR-196b, miR-10a and miR-10b and the long non-coding RNAs HOTAIR, HOTAIRM1 and HOXA11AS, which play critical roles in regulating gene expression and controlling development. By microRNA deep sequencing and comparative genomic analyses, two conserved microRNAs (miR-10a and miR-10b) were identified, and one new candidate microRNA with a typical hairpin precursor structure that is expressed in both fibroblasts and testes was found. MicroRNA target prediction analysis showed that several known microRNA targets, such as miR-10, miR-414 and miR-464, were found in the tammar HOX clusters. In addition, several novel and putative miRNAs were identified that originated from elsewhere in the tammar genome and that target the tammar HOXB and HOXD clusters. This study confirms that the emergence of known long non-coding RNAs in the HOX clusters clearly predates the marsupial-eutherian divergence 160 Ma ago. It also identified a new potentially functional microRNA as well as conserved miRNAs. These non-coding RNAs may participate in the regulation of HOX genes to influence the body plan of this marsupial.
Incorporation of Dynamic SSI Effects in the Design Response Spectra
NASA Astrophysics Data System (ADS)
Manjula, N. K.; Pillai, T. M. Madhavan; Nagarajan, Praveen; Reshma, K. K.
2018-05-01
Many studies in the past on dynamic soil-structure interaction have revealed the detrimental and advantageous effects of soil flexibility. Based on such studies, the design response spectra of international seismic codes are being improved worldwide. The improvements required for the short-period range of the design response spectra in the Indian seismic code (IS 1893:2002) are presented in this paper. As the recent code revision has not incorporated the short-period amplifications, the proposals given in this paper are equally applicable to the latest code (IS 1893:2016). Analyses of single-degree-of-freedom systems are performed to predict the required improvements. The proposed modifications to the constant-acceleration portion of the spectra are evaluated with respect to the current design spectra in Eurocode 8.
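A minimal sketch of the SDOF analyses behind such spectra follows, assuming a Newmark average-acceleration integrator and a toy sinusoidal ground motion; the damping ratio, periods, and input record are placeholders.

    # Sketch: pseudo-spectral acceleration of an SDOF oscillator (unit mass)
    # under a ground-acceleration history, via Newmark average acceleration.
    import math

    def sdof_peak_accel(ag, dt, period, zeta=0.05):
        w = 2 * math.pi / period
        k, c = w * w, 2 * zeta * w            # stiffness and damping per unit mass
        beta, gamma = 0.25, 0.5               # average-acceleration parameters
        u = v = 0.0
        a = -ag[0]                            # initial acceleration (u = v = 0)
        kh = k + gamma * c / (beta * dt) + 1.0 / (beta * dt * dt)
        umax = 0.0
        for agi in ag[1:]:
            p = (-agi + u / (beta * dt * dt) + v / (beta * dt)
                 + (0.5 / beta - 1) * a
                 + c * (gamma * u / (beta * dt) + (gamma / beta - 1) * v
                        + dt * (gamma / (2 * beta) - 1) * a))
            un = p / kh
            vn = (gamma * (un - u) / (beta * dt) + (1 - gamma / beta) * v
                  + dt * (1 - gamma / (2 * beta)) * a)
            an = (un - u) / (beta * dt * dt) - v / (beta * dt) - (0.5 / beta - 1) * a
            u, v, a = un, vn, an
            umax = max(umax, abs(u))
        return w * w * umax                   # Sa = w^2 * max|u|

    dt = 0.01
    ag = [math.sin(2 * math.pi * 2.0 * i * dt) for i in range(200)]  # 2 Hz burst
    for T in (0.1, 0.3, 0.5, 1.0):
        print(f"T = {T:.1f} s: Sa = {sdof_peak_accel(ag, dt, T):.2f} m/s^2")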
Color and Grey Scale in Sonar Displays
NASA Technical Reports Server (NTRS)
Kraiss, K. F.; Kuettelwesch, K. H.
1984-01-01
In spite of numerous publications, it is still rather unclear whether color is of any help in sonar displays. The work presented here deals with a particular type of sonar data, i.e., LOFAR-grams (low frequency analysing and recording), where acoustic sensor data are continuously written as a time-frequency plot. The question to be answered quantitatively is whether color coding improves target detection when compared with a grey-scale code. The data show significant differences in receiver operating characteristic (ROC) performance for the selected codes. In addition, it turned out that the background noise level affects performance dramatically for some color codes, while others remain stable or even improve. Generally valid rules are presented on how to generate useful color scales for this particular application.
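A hedged sketch of the detection-sensitivity summary behind such ROC comparisons follows: d′ from hit and false-alarm rates; the rates are hypothetical, not the LOFAR-gram results.

    # Sketch: detection sensitivity d' = z(hit rate) - z(false-alarm rate),
    # a standard one-number summary of ROC performance.
    from statistics import NormalDist

    def d_prime(hit_rate, fa_rate):
        z = NormalDist().inv_cdf
        return z(hit_rate) - z(fa_rate)

    print(f"grey scale: d' = {d_prime(0.80, 0.20):.2f}")   # hypothetical rates
    print(f"color code: d' = {d_prime(0.90, 0.15):.2f}")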
Structural and seismic analyses of waste facility reinforced concrete storage vaults
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, C.Y.
1995-07-01
Facility 317 of Argonne National Laboratory consists of several reinforced concrete waste storage vaults designed and constructed in the late 1940s through the early 1960s. In this paper, structural analyses of these concrete vaults subjected to various natural hazards are described, emphasizing the northwest shallow vault. The natural phenomenon hazards considered include both earthquakes and tornados. Because these vaults are deeply embedded in the soil, the SASSI (System Analysis of Soil-Structure Interaction) code was utilized for the seismic calculations. The ultimate strength method was used to analyze the reinforced concrete structures. In all studies, moment and shear strengths at critical locations of the storage vaults were evaluated. Results of the structural analyses show that almost all the waste storage vaults meet the code requirements according to ACI 349-85. These vaults also satisfy the performance goal such that confinement of hazardous materials is maintained and functioning of the facility is not interrupted.
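A hedged sketch of the kind of ultimate-strength check mentioned above follows, for a singly reinforced rectangular section via the rectangular (Whitney) stress block; the dimensions, material strengths, and strength-reduction factor are placeholder values, not the vault's.

    # Sketch: design flexural strength phi*Mn of a singly reinforced
    # rectangular concrete section (SI units, assumed inputs).
    def phi_mn(As, fy, fc, b, d, phi=0.9):
        """As [mm^2], fy/fc [MPa], b/d [mm] -> design moment [kN*m]."""
        a = As * fy / (0.85 * fc * b)    # equivalent stress-block depth
        Mn = As * fy * (d - a / 2)       # nominal moment, N*mm
        return phi * Mn / 1e6

    print(f"phi*Mn = {phi_mn(As=2500, fy=420, fc=28, b=1000, d=450):.1f} kN*m")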
Day, Felix R; Ruth, Katherine S; Thompson, Deborah J; Lunetta, Kathryn L; Pervjakova, Natalia; Chasman, Daniel I; Stolk, Lisette; Finucane, Hilary K; Sulem, Patrick; Bulik-Sullivan, Brendan; Esko, Tõnu; Johnson, Andrew D; Elks, Cathy E; Franceschini, Nora; He, Chunyan; Altmaier, Elisabeth; Brody, Jennifer A; Franke, Lude L; Huffman, Jennifer E; Keller, Margaux F; McArdle, Patrick F; Nutile, Teresa; Porcu, Eleonora; Robino, Antonietta; Rose, Lynda M; Schick, Ursula M; Smith, Jennifer A; Teumer, Alexander; Traglia, Michela; Vuckovic, Dragana; Yao, Jie; Zhao, Wei; Albrecht, Eva; Amin, Najaf; Corre, Tanguy; Hottenga, Jouke-Jan; Mangino, Massimo; Smith, Albert V; Tanaka, Toshiko; Abecasis, Goncalo; Andrulis, Irene L; Anton-Culver, Hoda; Antoniou, Antonis C; Arndt, Volker; Arnold, Alice M; Barbieri, Caterina; Beckmann, Matthias W; Beeghly-Fadiel, Alicia; Benitez, Javier; Bernstein, Leslie; Bielinski, Suzette J; Blomqvist, Carl; Boerwinkle, Eric; Bogdanova, Natalia V; Bojesen, Stig E; Bolla, Manjeet K; Borresen-Dale, Anne-Lise; Boutin, Thibaud S; Brauch, Hiltrud; Brenner, Hermann; Brüning, Thomas; Burwinkel, Barbara; Campbell, Archie; Campbell, Harry; Chanock, Stephen J; Chapman, J Ross; Chen, Yii-Der Ida; Chenevix-Trench, Georgia; Couch, Fergus J; Coviello, Andrea D; Cox, Angela; Czene, Kamila; Darabi, Hatef; De Vivo, Immaculata; Demerath, Ellen W; Dennis, Joe; Devilee, Peter; Dörk, Thilo; Dos-Santos-Silva, Isabel; Dunning, Alison M; Eicher, John D; Fasching, Peter A; Faul, Jessica D; Figueroa, Jonine; Flesch-Janys, Dieter; Gandin, Ilaria; Garcia, Melissa E; García-Closas, Montserrat; Giles, Graham G; Girotto, Giorgia G; Goldberg, Mark S; González-Neira, Anna; Goodarzi, Mark O; Grove, Megan L; Gudbjartsson, Daniel F; Guénel, Pascal; Guo, Xiuqing; Haiman, Christopher A; Hall, Per; Hamann, Ute; Henderson, Brian E; Hocking, Lynne J; Hofman, Albert; Homuth, Georg; Hooning, Maartje J; Hopper, John L; Hu, Frank B; Huang, Jinyan; Humphreys, Keith; Hunter, David J; Jakubowska, Anna; Jones, Samuel E; Kabisch, Maria; Karasik, David; Knight, Julia A; Kolcic, Ivana; Kooperberg, Charles; Kosma, Veli-Matti; Kriebel, Jennifer; Kristensen, Vessela; Lambrechts, Diether; Langenberg, Claudia; Li, Jingmei; Li, Xin; Lindström, Sara; Liu, Yongmei; Luan, Jian'an; Lubinski, Jan; Mägi, Reedik; Mannermaa, Arto; Manz, Judith; Margolin, Sara; Marten, Jonathan; Martin, Nicholas G; Masciullo, Corrado; Meindl, Alfons; Michailidou, Kyriaki; Mihailov, Evelin; Milani, Lili; Milne, Roger L; Müller-Nurasyid, Martina; Nalls, Michael; Neale, Ben M; Nevanlinna, Heli; Neven, Patrick; Newman, Anne B; Nordestgaard, Børge G; Olson, Janet E; Padmanabhan, Sandosh; Peterlongo, Paolo; Peters, Ulrike; Petersmann, Astrid; Peto, Julian; Pharoah, Paul D P; Pirastu, Nicola N; Pirie, Ailith; Pistis, Giorgio; Polasek, Ozren; Porteous, David; Psaty, Bruce M; Pylkäs, Katri; Radice, Paolo; Raffel, Leslie J; Rivadeneira, Fernando; Rudan, Igor; Rudolph, Anja; Ruggiero, Daniela; Sala, Cinzia F; Sanna, Serena; Sawyer, Elinor J; Schlessinger, David; Schmidt, Marjanka K; Schmidt, Frank; Schmutzler, Rita K; Schoemaker, Minouk J; Scott, Robert A; Seynaeve, Caroline M; Simard, Jacques; Sorice, Rossella; Southey, Melissa C; Stöckl, Doris; Strauch, Konstantin; Swerdlow, Anthony; Taylor, Kent D; Thorsteinsdottir, Unnur; Toland, Amanda E; Tomlinson, Ian; Truong, Thérèse; Tryggvadottir, Laufey; Turner, Stephen T; Vozzi, Diego; Wang, Qin; Wellons, Melissa; Willemsen, Gonneke; Wilson, James F; Winqvist, Robert; Wolffenbuttel, Bruce B H R; Wright, Alan F; Yannoukakos, 
Drakoulis; Zemunik, Tatijana; Zheng, Wei; Zygmunt, Marek; Bergmann, Sven; Boomsma, Dorret I; Buring, Julie E; Ferrucci, Luigi; Montgomery, Grant W; Gudnason, Vilmundur; Spector, Tim D; van Duijn, Cornelia M; Alizadeh, Behrooz Z; Ciullo, Marina; Crisponi, Laura; Easton, Douglas F; Gasparini, Paolo P; Gieger, Christian; Harris, Tamara B; Hayward, Caroline; Kardia, Sharon L R; Kraft, Peter; McKnight, Barbara; Metspalu, Andres; Morrison, Alanna C; Reiner, Alex P; Ridker, Paul M; Rotter, Jerome I; Toniolo, Daniela; Uitterlinden, André G; Ulivi, Sheila; Völzke, Henry; Wareham, Nicholas J; Weir, David R; Yerges-Armstrong, Laura M; Price, Alkes L; Stefansson, Kari; Visser, Jenny A; Ong, Ken K; Chang-Claude, Jenny; Murabito, Joanne M; Perry, John R B; Murray, Anna
2015-11-01
Menopause timing has a substantial impact on infertility and risk of disease, including breast cancer, but the underlying mechanisms are poorly understood. We report a dual strategy in ∼70,000 women to identify common and low-frequency protein-coding variation associated with age at natural menopause (ANM). We identified 44 regions with common variants, including two regions harboring additional rare missense alleles of large effect. We found enrichment of signals in or near genes involved in delayed puberty, highlighting the first molecular links between the onset and end of reproductive lifespan. Pathway analyses identified major association with DNA damage response (DDR) genes, including the first common coding variant in BRCA1 associated with any complex trait. Mendelian randomization analyses supported a causal effect of later ANM on breast cancer risk (∼6% increase in risk per year; P = 3 × 10^(-14)), likely mediated by prolonged sex hormone exposure rather than DDR mechanisms.
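A hedged sketch of a two-sample Mendelian randomization estimate via inverse-variance-weighted Wald ratios follows; the per-variant effect sizes are invented, and the paper's actual MR analysis may differ in detail.

    # Sketch: combine per-variant Wald ratios (beta_out / beta_exp) by
    # inverse-variance weighting to estimate a causal effect.
    import math

    def ivw(beta_exp, beta_out, se_out):
        ratios = [bo / be for be, bo in zip(beta_exp, beta_out)]
        ses = [so / abs(be) for be, so in zip(beta_exp, se_out)]  # delta method
        w = [1 / s**2 for s in ses]
        est = sum(wi * r for wi, r in zip(w, ratios)) / sum(w)
        return est, math.sqrt(1 / sum(w))

    est, se = ivw(beta_exp=[0.20, 0.15, 0.30],     # SNP -> ANM (invented)
                  beta_out=[0.012, 0.010, 0.017],  # SNP -> breast cancer (invented)
                  se_out=[0.004, 0.003, 0.005])
    print(f"causal effect ~ {est:.3f} per year of later ANM (SE {se:.3f})")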
Defect tolerance in resistor-logic demultiplexers for nanoelectronics.
Kuekes, Philip J; Robinett, Warren; Williams, R Stanley
2006-05-28
Since defect rates are expected to be high in nanocircuitry, we analyse the performance of resistor-based demultiplexers in the presence of defects. The defects observed to occur in fabricated nanoscale crossbars are stuck-open, stuck-closed, stuck-short, broken-wire, and adjacent-wire-short defects. We analyse the distribution of voltages on the nanowire output lines of a resistor-logic demultiplexer, based on an arbitrary constant-weight code, when defects occur. These analyses show that resistor-logic demultiplexers can tolerate small numbers of stuck-closed, stuck-open, and broken-wire defects on individual nanowires, at the cost of some degradation in the circuit's worst-case voltage margin. For stuck-short and adjacent-wire-short defects, and for nanowires with too many defects of the other types, the demultiplexer can still achieve error-free performance, but with a smaller set of output lines. This design thus has two layers of defect tolerance: the coding layer improves the yield of usable output lines, and an avoidance layer guarantees that error-free performance is achieved.
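The sketch below enumerates a constant-weight address code and computes the worst-case overlap between distinct codewords. Under the simplifying assumption that an output line's voltage grows with the number of address bits it matches, the gap between full agreement (the selected line) and this maximum overlap is what sets the worst-case voltage margin; the parameters are illustrative, not from the paper.

    # Sketch: constant-weight code as sets of 'on' positions; the worst-case
    # pairwise overlap bounds how close an unselected line can come to the
    # selected one.
    from itertools import combinations

    def constant_weight_code(n, w):
        return [frozenset(c) for c in combinations(range(n), w)]

    code = constant_weight_code(n=6, w=3)         # C(6,3) = 20 codewords
    max_overlap = max(len(a & b) for a, b in combinations(code, 2))
    print(f"weight = 3, worst-case overlap = {max_overlap}")  # 2 of 3 positions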
Santos, José; Monteagudo, Ángel
2017-03-27
The canonical code, although prevailing in complex genomes, is not universal. The canonical genetic code has been shown to be more robust than random codes, but it is not clearly determined how it evolved towards its current form. The error minimization theory considers the minimization of the adverse effects of point mutations as the main selection factor in the evolution of the code. We have used simulated evolution in a computer to search for optimized codes, which helps to obtain information about the optimization level of the canonical code in its evolution. A genetic algorithm searches for efficient codes in a fitness landscape that corresponds with the adaptability of possible hypothetical genetic codes. The lower the effects of errors or mutations in the codon bases of a hypothetical code, the more efficient or optimal that code is. The inclusion of the fitness sharing technique in the evolutionary algorithm allows the extent to which the canonical genetic code is in an area corresponding to a deep local minimum to be easily determined, even in the high-dimensional spaces considered. The analyses show that the canonical code is not in a deep local minimum and that the fitness landscape is not a multimodal fitness landscape with deep and separated peaks. Moreover, the canonical code is clearly far away from the areas of higher fitness in the landscape. Given the non-presence of deep local minima in the landscape, although the code could evolve and different forces could shape its structure, the fitness landscape nature considered in the error minimization theory does not explain why the canonical code ended its evolution in a location which is not an area of a localized deep minimum of the huge fitness landscape.
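A minimal sketch of the fitness-sharing step named above follows, under assumed toy genomes and a toy distance; the real algorithm operates on hypothetical genetic codes with a mutation-effect fitness.

    # Sketch: fitness sharing discounts the fitness of individuals crowded
    # into the same niche, so the population spreads across the landscape.
    def shared_fitness(pop, fitness, distance, sigma=3.0, alpha=1.0):
        out = []
        for i in pop:
            niche = sum(
                1 - (distance(i, j) / sigma) ** alpha
                for j in pop if distance(i, j) < sigma
            )
            out.append(fitness(i) / niche)   # niche >= 1, since d(i, i) = 0
        return out

    # toy example: integers stand in for codes, |i - j| for code distance
    pop = [1, 2, 2, 9]
    print(shared_fitness(pop, fitness=lambda x: float(x),
                         distance=lambda a, b: abs(a - b)))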
F-14A aircraft high-speed flow simulations
NASA Technical Reports Server (NTRS)
Boppe, C. W.; Rosen, B. S.
1985-01-01
A model of the Grumman/Navy F-14A aircraft was developed for analyses using the NASA/Grumman Transonic Wing-Body Code. Computations were performed for isolated-wing and wing/fuselage/glove arrangements to determine the extent of aerodynamic interference effects which propagate outward onto the main wing outer panel. Additional studies were conducted using the full-potential analysis code, FLO 22, to calibrate any inaccuracies that might accrue because of small-disturbance code limitations. Comparisons indicate that the NASA/Grumman code provides excellent flow simulations for the range of wing sweep angles and flow conditions that will be of interest for the upcoming F-14 Variable Sweep Flight Transition Experiment.
Combining Thermal And Structural Analyses
NASA Technical Reports Server (NTRS)
Winegar, Steven R.
1990-01-01
Computer code makes programs compatible so that stresses and deformations can be calculated. Paper describes computer code combining thermal analysis with structural analysis. Called SNIP (for SINDA-NASTRAN Interfacing Program), code provides interface between finite-difference thermal model of system and finite-element structural model when there is no node-to-element correlation between models. Eliminates much manual work in converting temperature results of SINDA (Systems Improved Numerical Differencing Analyzer) program into thermal loads for NASTRAN (NASA Structural Analysis) program. Used to analyze concentrating reflectors for solar generation of electric power. Large thermal and structural models needed to predict distortion of surface shapes, and SNIP saves considerable time and effort in combining models.
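A hedged sketch of the kind of node-to-node temperature mapping an interfacing program like SNIP must perform follows, here as simple inverse-distance weighting; SNIP's actual interpolation scheme is not specified in the abstract, and all coordinates and temperatures are made up.

    # Sketch: assign a structural node the inverse-distance-weighted
    # temperature of the thermal-model nodes.
    def idw_temperature(xyz, thermal_nodes, power=2.0):
        """thermal_nodes: list of ((x, y, z), T). Returns interpolated T."""
        num = den = 0.0
        for (x, y, z), T in thermal_nodes:
            d2 = (x - xyz[0])**2 + (y - xyz[1])**2 + (z - xyz[2])**2
            if d2 == 0.0:
                return T                     # coincident node: take it directly
            w = d2 ** (-power / 2.0)
            num += w * T
            den += w
        return num / den

    thermal = [((0, 0, 0), 300.0), ((1, 0, 0), 320.0), ((0, 1, 0), 310.0)]
    print(f"T at structural node = {idw_temperature((0.5, 0.2, 0), thermal):.1f} K")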
Alternative Line Coding Scheme with Fixed Dimming for Visible Light Communication
NASA Astrophysics Data System (ADS)
Niaz, M. T.; Imdad, F.; Kim, H. S.
2017-01-01
An alternative line coding scheme called fixed-dimming on/off keying (FD-OOK) is proposed for visible-light communication (VLC). FD-OOK reduces the flickering caused by a VLC transmitter and can maintain a 50% dimming level. A simple encoder and decoder are proposed that generate codes in which the number of bits representing one is the same as the number of bits representing zero. By keeping the numbers of ones and zeros equal, the change in the brightness of the lighting may be minimized and kept constant at 50%, thereby reducing the flickering in VLC. The performance of FD-OOK is analysed with respect to two parameters: spectral efficiency and power requirement.
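A hedged sketch of a balanced block code in the FD-OOK spirit follows: every codeword contains as many ones as zeros, so the average optical power stays at 50% regardless of the data. The 4-bit-to-6-bit mapping is our own illustration, not the paper's exact encoder.

    # Sketch: enumerate all weight-3 words of length 6 (20 of them), enough to
    # carry 4 data bits (16 symbols) while keeping the dimming ratio at 0.5.
    from itertools import combinations

    N = 6  # codeword length; weight N // 2 keeps dimming at 50%
    CODEWORDS = sorted(
        tuple(1 if i in ones else 0 for i in range(N))
        for ones in combinations(range(N), N // 2)
    )

    def encode(nibble):
        """Map a 4-bit value (0..15) to a balanced 6-bit codeword."""
        return CODEWORDS[nibble]

    def decode(word):
        return CODEWORDS.index(tuple(word))

    w = encode(0b1011)
    print(w, decode(w), sum(w) / len(w))  # dimming ratio is always 0.5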
Tools for Designing and Analyzing Structures
NASA Technical Reports Server (NTRS)
Luz, Paul L.
2005-01-01
Structural Design and Analysis Toolset is a collection of approximately 26 Microsoft Excel spreadsheet programs, each of which performs calculations within a different subdiscipline of structural design and analysis. These programs present input and output data in user-friendly, menu-driven formats. Although these programs cannot solve complex cases like those treated by larger finite element codes, they do yield solutions to numerous common problems more rapidly than the finite element codes, thereby making it possible to quickly perform multiple preliminary analyses - e.g., to establish approximate limits prior to detailed analyses by the larger finite element codes. These programs perform different types of calculations, as follows: 1. determination of geometric properties for a variety of standard structural components; 2. analysis of static, vibrational, and thermal-gradient loads and deflections in certain structures (mostly beams and, in the case of thermal gradients, mirrors); 3. kinetic energies of fans; 4. detailed analysis of stress and buckling in beams, plates, columns, and a variety of shell structures; and 5. temperature-dependent properties of materials, including figures of merit that characterize strength, stiffness, and deformation response to thermal gradients.
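One example of the kind of closed-form check such spreadsheets automate (in the spirit of item 4 above) is the tip deflection and root bending stress of an end-loaded cantilever; the inputs below are placeholders.

    # Sketch: classic cantilever formulas, delta = P*L^3 / (3*E*I) and
    # sigma = P*L*c / I, with assumed placeholder inputs.
    def cantilever_checks(P, L, E, I, c):
        """P [N], L [m], E [Pa], I [m^4], c [m] = extreme-fiber distance."""
        delta = P * L**3 / (3 * E * I)   # tip deflection
        sigma = P * L * c / I            # max bending stress at the root
        return delta, sigma

    delta, sigma = cantilever_checks(P=500.0, L=1.2, E=70e9, I=8.3e-7, c=0.02)
    print(f"deflection = {delta*1000:.2f} mm, stress = {sigma/1e6:.1f} MPa")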
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michaels, A.I.; Sillman, S.; Baylin, F.
1983-05-01
A central solar-heating plant with seasonal heat storage in a deep underground aquifer is designed by means of a solar-seasonal-storage-system simulation code based on the Solar Energy Research Institute (SERI) code for Solar Annual Storage Simulation (SASS). This Solar Seasonal Storage Plant is designed to supply close to 100% of the annual heating and domestic-hot-water (DHW) load of a hypothetical new community, the Fox River Valley Project, for a location in Madison, Wisconsin. Some analyses are also carried out for Boston, Massachusetts and Copenhagen, Denmark, as an indication of weather and insolation effects. Analyses are conducted for five different types of solar collectors, and for an alternate system utilizing seasonal storage in a large water tank. Predicted seasonal performance and system and storage costs are calculated. To provide some validation of the SASS results, a simulation of the solar system with seasonal storage in a large water tank is also carried out with a modified version of the Swedish Solar Seasonal Storage Code MINSUN.
Path Toward a Unified Geometry for Radiation Transport
NASA Astrophysics Data System (ADS)
Lee, Kerry
The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex CAD models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN (high charge and energy transport code developed by NASA LaRC), are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific simplistic geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for doing radiation transport on CAD models using MCNP and FLUKA has been demonstrated, and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and accurately determine the protection provided by as-designed exploration mission vehicles and habitats.
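A toy sketch contrasting the two transport approaches discussed above follows, for the simplest possible case of a purely absorbing slab, where the Boltzmann-equation solution is exp(-Σt) and a Monte Carlo estimate samples exponential free paths; the cross section and thickness are arbitrary.

    # Sketch: transmission through a purely absorbing slab, Monte Carlo
    # versus the closed-form (deterministic) answer.
    import math, random

    random.seed(7)
    sigma_t, thickness = 0.5, 3.0     # macroscopic XS [1/cm], slab depth [cm]
    n = 200_000
    transmitted = sum(random.expovariate(sigma_t) > thickness for _ in range(n))
    print(f"Monte Carlo : {transmitted / n:.4f}")
    print(f"Analytic    : {math.exp(-sigma_t * thickness):.4f}")  # ~0.2231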
Tsopra, Rosy; Peckham, Daniel; Beirne, Paul; Rodger, Kirsty; Callister, Matthew; White, Helen; Jais, Jean-Philippe; Ghosh, Dipansu; Whitaker, Paul; Clifton, Ian J; Wyatt, Jeremy C
2018-07-01
Coding of diagnoses is important for patient care, hospital management and research. However, coding accuracy is often poor and may reflect the methods of coding. This study investigates the impact of three alternative coding methods on the inaccuracy of diagnosis codes and hospital reimbursement. Comparisons of coding inaccuracy were made between a list of coded diagnoses obtained by a coder using (i) the discharge summary alone, (ii) case notes and discharge summary, and (iii) discharge summary with the addition of medical input. For each method, inaccuracy was determined for the primary and secondary diagnoses, Healthcare Resource Group (HRG) and estimated hospital reimbursement. These data were then compared with a gold standard derived by a consultant and coder. 107 consecutive patient discharges were analysed. Inaccuracy of diagnosis codes was highest when a coder used the discharge summary alone, and decreased significantly when the coder used the case notes (70% vs 58%, p < 0.0001) or coded from the discharge summary with medical support (70% vs 60%, p < 0.0001). When compared with the gold standard, the percentage of incorrect HRGs was 42% for the discharge summary alone, 31% for coding with case notes, and 35% for coding with medical support. The three coding methods resulted in an annual estimated loss of hospital remuneration of between £1.8 M and £16.5 M. The accuracy of diagnosis codes and the percentage of correct HRGs improved when coders used either case notes or medical support in addition to the discharge summary. Further emphasis needs to be placed on improving the standard of information recorded in discharge summaries. Copyright © 2018 Elsevier B.V. All rights reserved.
Probabilistic Assessment of National Wind Tunnel
NASA Technical Reports Server (NTRS)
Shah, A. R.; Shiao, M.; Chamis, C. C.
1996-01-01
A preliminary probabilistic structural assessment of the critical section of the National Wind Tunnel (NWT) is performed using the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) computer code, demonstrating the capabilities of the NESSUS code for addressing reliability issues of the NWT. Uncertainties in the geometry, material properties, loads and stiffener location on the NWT are considered in the reliability assessment. Probabilistic stress, frequency, buckling, fatigue and proof-load analyses are performed. These analyses cover the major global and some local design requirements. Based on the assumed uncertainties, the results indicate a minimum reliability of 0.999 for the NWT. Preliminary life-prediction analysis results show that the life of the NWT is governed by fatigue of the welds. A reliability-based proof test assessment is also performed.
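The probabilistic assessment itself relies on the NESSUS code; as a generic illustration of the underlying idea, the following sketch estimates a stress-strength reliability by plain Monte Carlo sampling. The distributions and their parameters are invented for the example and are unrelated to the actual NWT inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical lognormal stress and normal strength distributions (MPa)
stress = rng.lognormal(mean=np.log(300.0), sigma=0.08, size=n)
strength = rng.normal(loc=420.0, scale=25.0, size=n)

# Reliability = probability that stress stays below strength
reliability = np.mean(stress < strength)
print(f"estimated reliability: {reliability:.4f}")
```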
McKenzie, Sam; Keene, Chris; Farovik, Anja; Blandon, John; Place, Ryan; Komorowski, Robert; Eichenbaum, Howard
2016-01-01
Here we consider the value of neural population analysis as an approach to understanding how information is represented in the hippocampus and cortical areas and how these areas might interact as a brain system to support memory. We argue that models based on sparse coding of different individual features by single neurons in these areas (e.g., place cells, grid cells) are inadequate to capture the complexity of experience represented within this system. By contrast, population analyses of neurons with denser coding and mixed selectivity reveal new and important insights into the organization of memories. Furthermore, comparisons of the organization of information in interconnected areas suggest a model of hippocampal-cortical interactions that mediates the fundamental features of memory. PMID:26748022
Mistranslation: from adaptations to applications.
Hoffman, Kyle S; O'Donoghue, Patrick; Brandl, Christopher J
2017-11-01
The conservation of the genetic code indicates that there was a single origin, but like all genetic material, the cell's interpretation of the code is subject to evolutionary pressure. Single nucleotide variations in tRNA sequences can modulate codon assignments by altering codon-anticodon pairing or tRNA charging. Either can increase translation errors and even change the code. The frozen accident hypothesis argued that changes to the code would destabilize the proteome and reduce fitness. In studies of model organisms, mistranslation often acts as an adaptive response. These studies reveal evolutionarily conserved mechanisms that maintain proteostasis even during high rates of mistranslation. This review discusses the evolutionary basis of altered genetic codes, how mistranslation is identified, and how deviations to the genetic code are exploited. We revisit early discoveries of genetic code deviations and provide examples of adaptive mistranslation events in nature. Lastly, we highlight innovations in synthetic biology to expand the genetic code. The genetic code is still evolving. Mistranslation increases proteomic diversity that enables cells to survive stress conditions or suppress a deleterious allele. Genetic code variants have been identified by genome and metagenome sequence analyses, suppressor genetics, and biochemical characterization. Understanding the mechanisms of translation and genetic code deviations enables the design of new codes to produce novel proteins. Engineering the translation machinery and expanding the genetic code to incorporate non-canonical amino acids are valuable tools in synthetic biology that are impacting biomedical research. This article is part of a Special Issue entitled "Biochemistry of Synthetic Biology - Recent Developments", Guest Editors: Dr. Ilka Heinemann and Dr. Patrick O'Donoghue. Copyright © 2017 Elsevier B.V. All rights reserved.
Lorio, Morgan; Martinson, Melissa; Ferrara, Lisa
2016-01-01
Minimally invasive sacroiliac joint arthrodesis ("MI SIJ fusion") received a Category I CPT® code (27279) effective January 1, 2015 and was assigned a work relative value unit ("RVU") of 9.03. The International Society for the Advancement of Spine Surgery ("ISASS") conducted a study consisting of a Rasch analysis of two separate surveys of surgeons to assess the accuracy of the assigned work RVU. A survey was developed and sent to ninety-three ISASS surgeon committee members. Respondents were asked to compare CPT® 27279 to ten other comparator CPT® codes reflective of common spine surgeries. The survey presented each comparator CPT® code with its code descriptor as well as the description of CPT® 27279 and asked respondents to indicate whether CPT® 27279 was greater, equal, or less in terms of work effort than the comparator code. A second survey was sent to 557 U.S.-based spine surgeon members of ISASS and 241 spine surgeon members of the Society for Minimally Invasive Spine Surgery ("SMISS"). The design of the second survey mirrored that of the first survey except for the use of a broader set of comparator CPT® codes (27 vs. 10). Using the work RVUs of the comparator codes, a Rasch analysis was performed to estimate the relative difficulty of CPT® 27279, after which the work RVU of CPT® 27279 was estimated by regression analysis. Twenty surgeons responded to the first survey and thirty-four surgeons responded to the second survey. The regression analysis of the first survey indicates a work RVU for CPT® 27279 of 14.36, and the regression analysis of the second survey indicates a work RVU for CPT® 27279 of 14.1. The Rasch analysis indicates that the current work RVU assigned to CPT® 27279 is undervalued at 9.03. Averaging the results of the regression analyses of the two surveys indicates a work RVU for CPT® 27279 of 14.23.
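The final step described above, regressing comparator RVUs on their Rasch difficulty estimates and reading off the implied RVU of the target code, can be illustrated with a minimal sketch. The difficulty values below are hypothetical placeholders, not the study's estimates.

```python
import numpy as np

# Hypothetical Rasch difficulty estimates (logits) for comparator CPT
# codes, paired with their published work RVUs.
difficulty = np.array([-1.2, -0.6, -0.1, 0.4, 0.9, 1.5])
work_rvu   = np.array([ 7.1,  9.0, 10.4, 12.0, 13.6, 15.8])

# Fit work RVU as a linear function of Rasch difficulty.
slope, intercept = np.polyfit(difficulty, work_rvu, deg=1)

# Difficulty estimated for the target code (hypothetical value).
target_difficulty = 1.1
print(f"implied work RVU: {intercept + slope * target_difficulty:.2f}")
```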
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Connolly, Joseph; Kopasakis, George
2016-01-01
An overview of recent applications of the FUN3D CFD code to computational aeroelastic, sonic boom, and aeropropulsoservoelasticity (APSE) analyses of a low-boom supersonic configuration is presented. The overview includes details of the computational models developed, including multiple unstructured CFD grids suitable for aeroelastic and sonic boom analyses. In addition, aeroelastic Reduced-Order Models (ROMs) are generated and used to rapidly compute the aeroelastic response and flutter boundaries at multiple flight conditions.
Parcels v0.9: prototyping a Lagrangian ocean analysis framework for the petascale age
NASA Astrophysics Data System (ADS)
Lange, Michael; van Sebille, Erik
2017-11-01
As ocean general circulation models (OGCMs) move into the petascale age, where the output of single simulations exceeds petabytes of storage space, tools to analyse the output of these models will need to scale up too. Lagrangian ocean analysis, where virtual particles are tracked through hydrodynamic fields, is an increasingly popular way to analyse OGCM output, by mapping pathways and connectivity of biotic and abiotic particulates. However, the current software stack of Lagrangian ocean analysis codes is not dynamic enough to cope with the increasing complexity, scale and need for customization of use-cases. Furthermore, most community codes are developed for stand-alone use, making it a nontrivial task to integrate virtual particles at runtime of the OGCM. Here, we introduce the new Parcels code, which was designed from the ground up to be sufficiently scalable to cope with petascale computing. We highlight its API design that combines flexibility and customization with the ability to optimize for HPC workflows, following the paradigm of domain-specific languages. Parcels is primarily written in Python, utilizing the wide range of tools available in the scientific Python ecosystem, while generating low-level C code and using just-in-time compilation for performance-critical computation. We show a worked-out example of its API, and validate the accuracy of the code against seven idealized test cases. This version 0.9 of Parcels is focused on laying out the API, with future work concentrating on support for curvilinear grids, optimization, efficiency and at-runtime coupling with OGCMs.
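A minimal usage sketch in the style of the Parcels API follows; the file names, variable names and dimension mappings are assumptions, and API details may differ between Parcels versions.

```python
from datetime import timedelta
from parcels import FieldSet, ParticleSet, JITParticle, AdvectionRK4

# Hypothetical OGCM output files; variable/dimension names are assumptions.
fieldset = FieldSet.from_netcdf(
    filenames={"U": "ogcm_u.nc", "V": "ogcm_v.nc"},
    variables={"U": "uo", "V": "vo"},
    dimensions={"lat": "latitude", "lon": "longitude", "time": "time"},
)

# Release ten particles along a transect; JITParticle selects the
# generated-C, just-in-time-compiled execution path described above.
pset = ParticleSet(fieldset=fieldset, pclass=JITParticle,
                   lon=[-30.0] * 10, lat=list(range(30, 40)))

# Advect with fourth-order Runge-Kutta for 30 days.
pset.execute(AdvectionRK4,
             runtime=timedelta(days=30),
             dt=timedelta(minutes=5))
```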
Dishion, Thomas J; Mun, Chung Jung; Tein, Jenn-Yun; Kim, Hanjoe; Shaw, Daniel S; Gardner, Frances; Wilson, Melvin N; Peterson, Jenene
2017-04-01
This study examined the validity of micro social observations and macro ratings of parent-child interaction in early to middle childhood. Seven hundred and thirty-one families representing multiple ethnic groups were recruited and screened as at risk in the context of Women, Infants, and Children (WIC) Nutritional Supplement service settings. Families were randomly assigned to the Family Checkup (FCU) intervention or the control condition at age 2 and videotaped in structured interactions in the home at ages 2, 3, 4, and 5. Parent-child interaction videotapes were micro-coded using the Relationship Affect Coding System (RACS), which captures the duration of two mutual dyadic states: positive engagement and coercion. Macro ratings of parenting skills were collected after coding the videotapes to assess parent use of positive behavior support and limit-setting skills (or lack thereof). Confirmatory factor analyses revealed that the measurement model of macro ratings of limit setting and positive behavior support was not supported by the data, and these ratings were thus excluded from further analyses. However, there was moderate stability in the families' micro social dynamics across early childhood, and these dynamics showed significant improvements as a function of random assignment to the FCU. Moreover, parent-child dynamics were predictive of chronic behavior problems as rated by parents in middle childhood, but not emotional problems. We conclude with a discussion of the validity of the RACS and of the methodological advantages of micro social coding over the statistical limitations of macro rating observations. Future directions are discussed for observation research in prevention science.
NASA Technical Reports Server (NTRS)
Steele, Gynelle C.
1999-01-01
The NASA Lewis Research Center and Flow Parametrics will enter into an agreement to commercialize the National Combustion Code (NCC). This multidisciplinary combustor design system utilizes computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. This integrated system can facilitate and enhance various phases of the design and analysis process.
Monte Carlo-based validation of neutronic methodology for EBR-II analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liaw, J.R.; Finck, P.J.
1993-01-01
The continuous-energy Monte Carlo code VIM (Ref. 1) has been validated extensively over the years against fast critical experiments and other neutronic analysis codes. A high degree of confidence in VIM for predicting reactor physics parameters has been firmly established. This paper presents a numerical validation of two conventional multigroup neutronic analysis codes, DIF3D (Ref. 4) and VARIANT (Ref. 5), against VIM for two Experimental Breeder Reactor II (EBR-II) core loadings in detailed three-dimensional hexagonal-z geometry. The DIF3D code is based on nodal diffusion theory, and it is used in calculations for day-to-day reactor operations, whereas the VARIANT code is based on nodal transport theory and is used with increasing frequency for specific applications. Both DIF3D and VARIANT rely on multigroup cross sections generated from ENDF/B-V by the ETOE-2/MC²-II/SDX (Ref. 6) code package. Hence, this study also validates the multigroup cross-section processing methodology against the continuous-energy approach used in VIM.
Analysis of SMA Hybrid Composite Structures using Commercial Codes
NASA Technical Reports Server (NTRS)
Turner, Travis L.; Patel, Hemant D.
2004-01-01
A thermomechanical model for shape memory alloy (SMA) actuators and SMA hybrid composite (SMAHC) structures has been recently implemented in the commercial finite element codes MSC.Nastran and ABAQUS. The model may be easily implemented in any code that has the capability for analysis of laminated composite structures with temperature dependent material properties. The model is also relatively easy to use and requires input of only fundamental engineering properties. A brief description of the model is presented, followed by discussion of implementation and usage in the commercial codes. Results are presented from static and dynamic analysis of SMAHC beams of two types: a beam clamped at each end and a cantilevered beam. Nonlinear static (post-buckling) and random response analyses are demonstrated for the first specimen. Static deflection (shape) control is demonstrated for the cantilevered beam. Approaches for modeling SMAHC material systems with embedded SMA in ribbon and small round wire product forms are demonstrated and compared. The results from the commercial codes are compared to those from a research code as validation of the commercial implementations; excellent correlation is achieved in all cases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
S.P. Cramer & Associates, Inc.
2002-05-31
We recently received data on the decoded coded wire tags (CWTs) recovered from spring chinook snouts we collected during spawning surveys in the Clearwater Basin last fall (2001). We were curious about what could be learned from the tags recovered (even though our project is over), so we did some cursory analyses and have described our findings in the attached memo. Snouts were processed and codes determined by Idaho Department of Fish and Game. Most snouts did not contain CWTs, because most ad-clipped fish were not given a CWT. Further, because adults were outplanted live, we do not know what codes they contained. Each of the hatcheries from which outplanted adults were obtained had several CWT code groups returning. That means that the best we can do with the codes recovered is compare the hatchery of origin for the tag with the hatchery from which outplants were taken. The results are interesting and not exactly as we would have predicted.
Kisely, Steve; Crowe, Elizabeth; Lawrence, David; White, Angela; Connor, Jason
2013-08-01
In response to concerns about the health consequences of high-risk drinking by young people, the Australian Government increased the tax on pre-mixed alcoholic beverages ('alcopops') favoured by this demographic. We measured changes in admissions for alcohol-related harm to health throughout Queensland, before and after the tax increase in April 2008. We used data from the Queensland Trauma Register, Hospitals Admitted Patients Data Collection, and the Emergency Department Information System to calculate alcohol-related admission rates per 100,000 people, for 15-29 year-olds. We analysed data over 3 years (April 2006 to April 2009), using interrupted time-series analyses. This covered 2 years before, and 1 year after, the tax increase. We investigated both mental and behavioural consequences (via F10 codes) and intentional/unintentional injuries (S and T codes). We fitted an autoregressive integrated moving average (ARIMA) model to test for any changes following the increased tax. There was no decrease in alcohol-related admissions in 15-29 year-olds. We found similar results for males and females, as well as for definitions of alcohol-related harms that were narrow (F10 codes only) and broad (F10, S and T codes). The increased tax on 'alcopops' was not associated with any reduction in hospital admissions for alcohol-related harms in Queensland 15-29 year-olds.
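An interrupted time-series model of the kind described can be sketched with statsmodels: ARIMA errors plus a step-change regressor that switches on at the intervention. The data below are simulated placeholders, not the Queensland admission series.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical monthly admission rates per 100,000
# (24 months pre-tax, 12 months post-tax).
rng = np.random.default_rng(42)
rates = pd.Series(50 + rng.normal(0, 2, 36))

# Step intervention: 0 before the tax increase, 1 after.
step = np.r_[np.zeros(24), np.ones(12)]

# ARIMA errors with the intervention as an exogenous regressor.
model = ARIMA(rates, exog=step, order=(1, 0, 0)).fit()
print(model.params)  # the 'x1' coefficient estimates the level change
```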
Update and evaluation of decay data for spent nuclear fuel analyses
NASA Astrophysics Data System (ADS)
Simeonov, Teodosi; Wemple, Charles
2017-09-01
Studsvik's approach to spent nuclear fuel analyses combines isotopic concentrations and multi-group cross-sections, calculated by the CASMO5 or HELIOS2 lattice transport codes, with core irradiation history data from the SIMULATE5 reactor core simulator and tabulated isotopic decay data. These data sources are used and processed by the code SNF to predict spent nuclear fuel characteristics. Recent advances in the generation procedure for the SNF decay data are presented. The SNF decay data include basic data, such as decay constants, atomic masses and nuclide transmutation chains; radiation emission spectra for photons from radioactive decay, alpha-n reactions, bremsstrahlung, and spontaneous fission, electrons and alpha particles from radioactive decay, and neutrons from radioactive decay, spontaneous fission, and alpha-n reactions; decay heat production; and electro-atomic interaction data for bremsstrahlung production. These data are compiled from fundamental (ENDF, ENSDF, TENDL) and processed (ESTAR) sources for nearly 3700 nuclides. A rigorous evaluation procedure, comprising internal consistency checks, comparisons to measurements and benchmarks, and code-to-code verification, is performed at the individual isotope level and using integral characteristics at the fuel assembly level (e.g., decay heat, radioactivity, neutron and gamma sources). Significant challenges are presented by the scope and complexity of the data processing, a dearth of relevant detailed measurements, and reliance on theoretical models for some data.
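As a small illustration of one integral characteristic mentioned above, the sketch below computes total decay heat from a nuclide inventory as the sum of activity times mean energy per decay. All inventory numbers are invented placeholders; the SNF code's actual data and methods are far more detailed.

```python
import numpy as np

# Hypothetical nuclide inventory: half-lives (s), atom counts, and
# average recoverable energies per decay (MeV).
half_life = np.array([2.06e8, 9.49e8, 1.57e4])
atoms     = np.array([1.2e20, 8.4e21, 3.1e18])
q_mev     = np.array([1.72, 0.19, 0.57])

MEV_TO_J = 1.602176634e-13
decay_const = np.log(2.0) / half_life        # lambda = ln 2 / T_half

activity = decay_const * atoms               # decays per second
decay_heat_w = np.sum(activity * q_mev * MEV_TO_J)
print(f"decay heat: {decay_heat_w:.3e} W")
```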
Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study.
Weems, Shelley; Heller, Pamela; Fenton, Susan H
2015-01-01
The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedural Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research purposes, and reimbursement. The data collection for this study occurred in multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: 1. Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. 2. Coder training and type of record (inpatient versus outpatient) affect coding productivity. 3. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity.
Enhancement of the CAVE computer code
NASA Astrophysics Data System (ADS)
Rathjen, K. A.; Burk, H. O.
1983-12-01
The computer code CAVE (Conduction Analysis via Eigenvalues) is a convenient and efficient computer code for predicting two-dimensional temperature histories within thermal protection systems for hypersonic vehicles. The capabilities of CAVE were enhanced by incorporating the following features into the code: real gas effects in the aerodynamic heating predictions; a geometry and aerodynamic heating package for analyses of cone-shaped bodies; an input option to change from laminar to turbulent heating predictions on leading edges; a modification to account for the reduction in adiabatic wall temperature with increasing leading-edge sweep; a geometry package for a two-dimensional scramjet engine sidewall, with an option for heat transfer to external and internal surfaces; a printout modification to provide tables of selected temperatures for plotting and storage; and modifications to the radiation calculation procedure to eliminate temperature oscillations induced by high heating rates. These new features are described.
Solov'ev, V V; Kel', A E; Kolchanov, N A
1989-01-01
The factors determining the presence of inverted and symmetrical repeats in genes coding for globular proteins have been analysed. An interesting property of the genetic code was revealed in the analysis of symmetrical repeats: pairs of symmetrical codons correspond to pairs of amino acids with mostly similar physical-chemical parameters. This property may explain the presence of symmetrical repeats and palindromes only in genes coding for beta-structural proteins and polypeptides, where amino acids with similar physical-chemical properties occupy symmetrical positions. A stochastic model of the evolution of polynucleotide sequences was used for the analysis of inverted repeats. The modelling demonstrated that constraints on the sequences (uneven frequencies of codon usage) are by themselves sufficient for nonrandom inverted repeats to arise in genes.
Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E
2014-01-01
Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for everything from simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses, reduce errors, and expand the range of scientific questions it is practical to address.
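The dependency-tracking idea behind the processing engine can be illustrated with a toy sketch; aa itself is a MATLAB framework, so this Python fragment only mirrors the concept, and all names are invented.

```python
import os

def needs_rerun(inputs, output):
    """True if the output is missing or older than any of its inputs."""
    if not os.path.exists(output):
        return True
    out_mtime = os.path.getmtime(output)
    return any(os.path.getmtime(p) > out_mtime for p in inputs)

def run_pipeline(modules):
    """modules: iterable of (name, input_paths, output_path, function);
    run each module only when its product is out of date."""
    for name, inputs, output, func in modules:
        if needs_rerun(inputs, output):
            print(f"running {name}")
            func(inputs, output)
        else:
            print(f"skipping {name} (up to date)")
```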
Yi, Zhenzhen; Song, Weibo; Clamp, John C; Chen, Zigui; Gao, Shan; Zhang, Qianqian
2009-03-01
Comprehensive molecular analyses of phylogenetic relationships within euplotid ciliates are relatively rare, and the relationships among some families remain questionable. We performed phylogenetic analyses of the order Euplotida based on new sequences of the gene coding for small-subunit RNA (SSrRNA) from a variety of taxa across the entire order, as well as sequences from some of these taxa of other genes (the ITS1-5.8S-ITS2 region and histone H4) that have not been included in previous analyses. Phylogenetic trees based on SSrRNA gene sequences constructed with four different methods had a consistent branching pattern that included the following features: (1) the "typical" euplotids comprised a paraphyletic assemblage composed of two divergent clades (family Uronychiidae and families Euplotidae-Certesiidae-Aspidiscidae-Gastrocirrhidae), (2) in the family Uronychiidae, the genera Uronychia and Paradiophrys formed a clearly outlined, well-supported clade that seemed to be rather divergent from Diophrys and Diophryopsis, suggesting that the Diophrys-complex may have had a longer and more separate evolutionary history than previously supposed, (3) inclusion of 12 new SSrRNA sequences in analyses of Euplotidae revealed two new clades of species within the family and cast additional doubt on the present classification of genera within the family, and (4) the intraspecific divergence among five species of Aspidisca was far greater than that of closely related genera. The ITS1-5.8S-ITS2 coding regions and partial histone H4 genes of six morphospecies in the Diophrys-complex were sequenced along with their SSrRNA genes and used to compare phylogenies constructed from single data sets to those constructed from combined sets. Results indicated that combined analyses could be used to construct more reliable, less ambiguous phylogenies of complex groups like the order Euplotida, because they provide a greater amount and diversity of information.
Dynamics of face and annular seals with two-phase flow
NASA Technical Reports Server (NTRS)
Hughes, William F.; Basu, Prithwish; Beatty, Paul A.; Beeler, Richard M.; Lau, Stephen
1989-01-01
A detailed study was made of face and annular seals under conditions where boiling, i.e., phase change of the leaking fluid, occurs within the seal. Many seals operate in this mode because of flashing due to pressure drop and/or heat input from frictional heating. High-pressure water pumps, industrial chemical pumps, and cryogenic pumps are a few of many applications. The initial motivation was the LOX-GOX seals for the space shuttle main engine, but the study was expanded to include any face or annular seal where boiling occurs. Some of the distinctive behavior characteristics of two-phase seals are discussed, particularly their axial stability. While two-phase seals probably exhibit instability to disturbances in other degrees of freedom, such as wobble, under certain conditions, such analyses are too complex to be treated at present. Since an all-liquid seal (with parallel faces) has a neutral axial stiffness curve and is stabilized axially by convergent coning, stability analyses for other degrees of freedom are necessary. However, the axial stability behavior of the two-phase seal is always a consideration, no matter how well the seal is aligned and regardless of the speed. Hence, axial stability is the primary design consideration for two-phase seals, and indeed the stability behavior under sub-cooling variations probably overshadows other concerns. The main thrust was the dynamic analysis of axial motion of two-phase face seals, principally the determination of axial stiffness, and the steady behavior of two-phase annular seals. The main conclusion is that seals with two-phase flow may be unstable if improperly balanced. Detailed theoretical analyses of low-leakage (laminar) and high-leakage (turbulent) seals are presented along with computer codes, parametric studies, and in particular a simplified PC-based code that allows for rapid performance prediction. A simplified combined computer code for performance prediction over the laminar and turbulent ranges of a two-phase seal is described and documented. The analyses, results, and computer codes are summarized.
Enrichment of Circular Code Motifs in the Genes of the Yeast Saccharomyces cerevisiae.
Michel, Christian J; Ngoune, Viviane Nguefack; Poch, Olivier; Ripp, Raymond; Thompson, Julie D
2017-12-03
A set X of 20 trinucleotides has been found to have the highest average occurrence in the reading frame, compared to the two shifted frames, of genes of bacteria, archaea, eukaryotes, plasmids and viruses. This set X has an interesting mathematical property, since X is a maximal C3 self-complementary trinucleotide circular code. Furthermore, any motif obtained from this circular code X has the capacity to retrieve, maintain and synchronize the original (reading) frame. Since 1996, the theory of circular codes in genes has mainly been developed by analysing the properties of the 20 trinucleotides of X, using combinatorics and statistical approaches. For the first time, we test this theory by analysing the X motifs, i.e., motifs from the circular code X, in the complete genome of the yeast Saccharomyces cerevisiae. Several properties of X motifs are identified by basic statistics (at the frequency level), and evaluated by comparison to R motifs, i.e., random motifs generated from 30 different random codes R. We first show that the frequency of X motifs is significantly greater than that of R motifs in the genome of S. cerevisiae. We then verify that no significant difference is observed between the frequencies of X and R motifs in the non-coding regions of S. cerevisiae, but that the occurrence number of X motifs is significantly higher than that of R motifs in the genes (protein-coding regions). This property is true for all cardinalities of X motifs (from 4 to 20) and for all 16 chromosomes. We further investigate the distribution of X motifs in the three frames of S. cerevisiae genes and show that they occur more frequently in the reading frame, regardless of their cardinality or their length. Finally, the ratio of X genes, i.e., genes with at least one X motif, to non-X genes in the set of verified genes is significantly different to that observed in the set of putative or dubious genes with no experimental evidence. These results, taken together, represent the first evidence for a significant enrichment of X motifs in the genes of an extant organism. They raise two hypotheses: the X motifs may be evolutionary relics of the primitive codes used for translation, or they may continue to play a functional role in the complex processes of genome decoding and protein synthesis.
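Counting occurrences of a trinucleotide set in the three frames of a gene, the basic operation behind these statistics, can be sketched as follows. Only an illustrative subset of trinucleotides is shown; the real circular code X contains 20.

```python
from collections import Counter

def frame_counts(seq, motifs):
    """Count occurrences of a trinucleotide set in each of the three
    frames of a nucleotide sequence."""
    counts = [0, 0, 0]
    for frame in range(3):
        codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
        c = Counter(codons)
        counts[frame] = sum(c[m] for m in motifs)
    return counts

# Illustrative subset only; the actual circular code X has 20 trinucleotides.
X_SUBSET = {"AAC", "GAG", "CTC", "GTA"}
print(frame_counts("ATGAACGAGCTCGTAAACTGA", X_SUBSET))
```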
Numerical predictions of EML (electromagnetic launcher) system performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schnurr, N.M.; Kerrisk, J.F.; Davidson, R.F.
1987-01-01
The performance of an electromagnetic launcher (EML) depends on a large number of parameters, including the characteristics of the power supply, rail geometry, rail and insulator material properties, injection velocity, and projectile mass. EML system performance is frequently limited by structural or thermal effects in the launcher (railgun). A series of computer codes has been developed at the Los Alamos National Laboratory to predict EML system performance and to determine the structural and thermal constraints on barrel design. These codes include FLD, a two-dimensional electrostatic code used to calculate the high-frequency inductance gradient and surface current density distribution for the rails; TOPAZRG, a two-dimensional finite-element code that simultaneously analyzes thermal and electromagnetic diffusion in the rails; and LARGE, a code that predicts the performance of the entire EML system. The NIKE2D code, developed at the Lawrence Livermore National Laboratory, is used to perform structural analyses of the rails. These codes have been instrumental in the design of the Lethality Test System (LTS) at Los Alamos, which has an ultimate goal of accelerating a 30-g projectile to a velocity of 15 km/s. The capabilities of the individual codes and the coupling of these codes to perform a comprehensive analysis are discussed in relation to the LTS design. Numerical predictions are compared with experimental data and presented for the LTS prototype tests.
Haghighi, Mohammad Hosein Hayavi; Dehghani, Mohammad; Teshnizi, Saeid Hoseini; Mahmoodi, Hamid
2014-01-01
Accurate cause-of-death coding leads to organised and usable death information, but there are factors that influence documentation on death certificates and therefore affect the coding. We reviewed the role of documentation errors in the accuracy of death coding at Shahid Mohammadi Hospital (SMH), Bandar Abbas, Iran. We studied the death certificates of all deceased patients in SMH from October 2010 to March 2011. Researchers determined and coded the underlying cause of death on the death certificates according to the guidelines issued by the World Health Organization in Volume 2 of the International Statistical Classification of Diseases and Related Health Problems, 10th revision (ICD-10). The necessary ICD coding rules (such as the General Principle, Rules 1-3, the modification rules and other instructions on death coding) were applied to select the underlying cause of death on each certificate. Demographic details and documentation errors were then extracted. Data were analysed with descriptive statistics and chi-square tests. The accuracy rate of cause-of-death coding was 51.7%, demonstrating a statistically significant relationship (p=.001) with major errors but no such relationship with minor errors. Factors that result in poor quality of cause-of-death coding in SMH are lack of coder training, documentation errors and the undesirable structure of death certificates.
Airfoil Vibration Dampers program
NASA Technical Reports Server (NTRS)
Cook, Robert M.
1991-01-01
The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.
NASA Astrophysics Data System (ADS)
Zhang, Chongfu; Qiu, Kun; Xu, Bo; Ling, Yun
2008-05-01
This paper proposes an all-optical label processing scheme that uses multiple optical orthogonal codes sequences (MOOCS)-based optical labels for optical packet switching (OPS) (MOOCS-OPS) networks. In this scheme, each MOOCS is a permutation or combination of multiple optical orthogonal codes (MOOC) selected from multiple-group optical orthogonal codes (MGOOC). Following a comparison of different optical label processing (OLP) schemes, the principles of the MOOCS-OPS network are given and analyzed. First, theoretical analyses are used to prove that MOOCS greatly enlarges the number of available optical labels when compared to the previous single optical orthogonal code (SOOC) for OPS (SOOC-OPS) network. Then, the key units of the MOOCS-based optical label packets, including optical packet generation, optical label erasing, optical label extraction and optical label rewriting, are given and studied. These results are used to verify that the proposed MOOCS-OPS scheme is feasible.
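The label-capacity argument can be made concrete with a little combinatorics: if n optical orthogonal codes are available and each label is a sequence of k of them, permutations or combinations of codes multiply the label space well beyond the n labels available to a SOOC scheme. The parameter values below are arbitrary.

```python
from math import comb, perm

# Label capacity comparison (a sketch; parameters are hypothetical).
n_codes = 8       # optical orthogonal codes available per group
seq_len = 3       # codes per MOOCS label

sooc_labels = n_codes                   # one code per label
moocs_perm  = perm(n_codes, seq_len)    # ordered sequences, no repeats
moocs_comb  = comb(n_codes, seq_len)    # unordered combinations

print(sooc_labels, moocs_perm, moocs_comb)   # 8, 336, 56
```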
NASA Astrophysics Data System (ADS)
Boonsong, S.; Siharak, S.; Srikanok, V.
2018-02-01
The purpose of this research was to develop learning management for enhancing students' moral ethics and code of ethics of the teaching profession at Rajamangala University of Technology Thanyaburi (RMUTT). The contextual study and the ideas for developing the learning management were based on document study, the focus group method, and content analysis of documents on the moral ethics and code of ethics of the teaching profession in the Graduate Diploma for Teaching Profession Program. The main research tools were summary and analysis forms. The results showed that the learning management developed for the moral ethics and code of ethics of the teaching profession could promote the desired character in Graduate Diploma for Teaching Profession students through integrated learning techniques consisting of Service Learning, Contract System, Value Clarification, Role Playing, and Concept Mapping. The learning management was presented in three steps.
Oh, Chang Seok; Lee, Soong Deok; Kim, Yi-Suk; Shin, Dong Hoon
2015-01-01
A previous study showed that East Asian mtDNA haplogroups, especially those of Koreans, could be successfully assigned by the coupled use of analyses of coding region SNP markers and control region mutation motifs. In this study, we examined whether the same triple multiplex analysis of coding region SNPs could also be applied to ancient samples from East Asia as a complement to sequence analysis of the mtDNA control region. In our study of Joseon-period skeletal samples, the mtDNA haplogroup determined by coding region SNP markers consistently fell within the haplogroup assigned by sequence analysis of the control region. Considering that ancient samples in previous studies have produced a considerable number of errors in control region mtDNA sequencing, coding region SNP analysis can serve as a good complement to conventional haplogroup determination, especially for archaeological human bone samples buried underground over long periods. PMID:26345190
Data Assimilation - Advances and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Brian J.
2014-07-30
This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
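As a sketch of one of the update techniques mentioned, the following implements a stochastic ensemble Kalman filter analysis step with perturbed observations. It is a generic textbook formulation, not the presentation's specific algorithm.

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """Stochastic ensemble Kalman filter analysis step.
    X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observation;
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs error cov.
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                      # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    # Perturb observations so the analysis ensemble has the correct spread
    Y = y[:, None] + rng.multivariate_normal(
        np.zeros(len(y)), R, size=n_ens).T
    return X + K @ (Y - H @ X)
```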
NASA Astrophysics Data System (ADS)
Sedmak, M. R.
The effects of the provisions of the existing corporate and personal income tax codes on solar investment decisions are analyzed. It is shown that the provisions of a tax code do not discriminate against investment in solar technologies if the present value of depreciation and interest expense tax deductions over the relevant decision period is equal to the present value of actual capital expenses. However, on the basis of quantitative analyses, it is concluded that the existing corporate income tax code does discriminate against solar investments for the majority of corporations, although the 25 percent tax credit available to businesses for solar investments is sufficient to alleviate the distortion in most cases. In contrast, the provisions of the existing personal income tax code favor solar investments over investments in less capital-intensive energy generating units, as the interest paid on loans used to finance solar investments made by individuals is tax deductible, while conventional fuel expenses are not.
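The present-value criterion described above is easy to make concrete. The sketch below compares the discounted value of a stream of depreciation deductions against the up-front capital expense; all dollar figures, the discount rate, and the depreciation schedule are hypothetical.

```python
def present_value(cashflows, rate):
    """Discounted present value of a list of annual cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, 1))

# Hypothetical figures: straight-line depreciation deductions on a
# $10,000 solar system vs. the actual up-front capital expense.
rate = 0.10
depreciation_deductions = [2000.0] * 5   # 5-year straight line
pv_deductions = present_value(depreciation_deductions, rate)
pv_capital = 10_000.0                    # paid at time zero

# A gap here means the code favors one investment pattern over another.
print(pv_deductions, pv_capital)         # ~7582 vs 10000
```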
Ringdal, Kjetil G; Skaga, Nils Oddvar; Hestnes, Morten; Steen, Petter Andreas; Røislien, Jo; Rehn, Marius; Røise, Olav; Krüger, Andreas J; Lossius, Hans Morten
2013-05-01
Injury severity is most frequently classified using the Abbreviated Injury Scale (AIS) as a basis for the Injury Severity Score (ISS) and the New Injury Severity Score (NISS), which are used for assessment of overall injury severity in the multiply injured patient and in outcome prediction. European trauma registries recommended the AIS 2008 edition, but the levels of inter-rater agreement and reliability of ISS and NISS, associated with its use, have not been reported. Nineteen Norwegian AIS-certified trauma registry coders were invited to score 50 real, anonymised patient medical records using AIS 2008. Rater agreements for ISS and NISS were analysed using Bland-Altman plots with 95% limits of agreement (LoA). A clinically acceptable LoA range was set at ± 9 units. Reliability was analysed using a two-way mixed model intraclass correlation coefficient (ICC) statistics with corresponding 95% confidence intervals (CI) and hierarchical agglomerative clustering. Ten coders submitted their coding results. Of their AIS codes, 2189 (61.5%) agreed with a reference standard, 1187 (31.1%) real injuries were missed, and 392 non-existing injuries were recorded. All LoAs were wider than the predefined, clinically acceptable limit of ± 9, for both ISS and NISS. The joint ICC (range) between each rater and the reference standard was 0.51 (0.29,0.86) for ISS and 0.51 (0.27,0.78) for NISS. The joint ICC (range) for inter-rater reliability was 0.49 (0.19,0.85) for ISS and 0.49 (0.16,0.82) for NISS. Univariate linear regression analyses indicated a significant relationship between the number of correctly AIS-coded injuries and total number of cases coded during the rater's career, but no significant relationship between the rater-against-reference ISS and NISS ICC values and total number of cases coded during the rater's career. Based on AIS 2008, ISS and NISS were not reliable for summarising anatomic injury severity in this study. This result indicates a limitation in their use as benchmarking tools for trauma system performance. Copyright © 2012 Elsevier Ltd. All rights reserved.
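The Bland-Altman limits of agreement used in the study are simply the mean inter-rater difference plus or minus 1.96 standard deviations. A minimal sketch, with invented scores rather than the study's data:

```python
import numpy as np

def limits_of_agreement(scores_a, scores_b):
    """Bland-Altman 95% limits of agreement between two raters."""
    d = np.asarray(scores_a, float) - np.asarray(scores_b, float)
    bias = d.mean()
    half_width = 1.96 * d.std(ddof=1)
    return bias - half_width, bias + half_width

# Hypothetical ISS scores from one rater vs the reference standard.
rater = [25, 17, 9, 34, 22, 13, 41, 29]
reference = [22, 17, 14, 38, 18, 13, 34, 29]
print(limits_of_agreement(rater, reference))
```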
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, S.D.; Dearing, J.F.
An understanding of conditions that may cause sodium boiling, and of boiling propagation that may lead to dryout and fuel failure, is crucial in liquid-metal fast-breeder reactor safety. In this study, the SABRE-2P subchannel analysis code has been used to analyze the ultimate transient of the in-core W-1 Sodium Loop Safety Facility experiment. This code has a simple, nondynamic 3-D boiling model that is able to predict the flow instability which caused dryout. In other analyses, dryout has been predicted for out-of-core test bundles, so this study provides additional confirmation of the model.
Aeroelastic Tailoring Study of N+2 Low-Boom Supersonic Commercial Transport Aircraft
NASA Technical Reports Server (NTRS)
Pak, Chan-gi
2015-01-01
Lockheed Martin's N+2 Low-Boom Supersonic Commercial Transport (LSCT) aircraft is optimized in this study through the use of a multidisciplinary design optimization tool developed at the NASA Armstrong Flight Research Center. A total of 111 design variables are used in the first optimization run, with total structural weight as the objective function. Design requirements for strength, buckling, and flutter are selected as constraint functions. The MSC Nastran code is used to obtain the modal, strength, and buckling characteristics. Flutter and trim analyses are based on the ZAERO code, and landing and ground-control loads are computed using an in-house code.
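A toy stand-in for this kind of constrained weight minimization can be written with scipy.optimize; the objective, constraint functions, and bounds below are invented for illustration and bear no relation to the actual LSCT model.

```python
import numpy as np
from scipy.optimize import minimize

def weight(x):
    """Weight proxy: sum of panel thicknesses (invented objective)."""
    return np.sum(x)

def constraints(x):
    """Margins that must stay non-negative (all functions invented),
    standing in for strength, buckling, and flutter requirements."""
    return np.array([
        x[0] * x[1] - 1.0,     # strength margin >= 0
        x[1] + x[2] - 1.5,     # buckling margin >= 0
        np.prod(x) - 0.5,      # flutter margin  >= 0
    ])

x0 = np.ones(3)
res = minimize(weight, x0, method="SLSQP",
               bounds=[(0.1, 5.0)] * 3,
               constraints={"type": "ineq", "fun": constraints})
print(res.x, res.fun)
```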
Tate, A Rosemary; Dungey, Sheena; Glew, Simon; Beloff, Natalia; Williams, Rachael; Williams, Tim
2017-01-25
To assess the effect of coding quality on estimates of the incidence of diabetes in the UK between 1995 and 2014. A cross-sectional analysis examining diabetes coding from 1995 to 2014 and how the choice of codes (diagnosis codes vs codes which suggest diagnosis) and quality of coding affect estimated incidence. Routine primary care data from 684 practices contributing to the UK Clinical Practice Research Datalink (data contributed from Vision (INPS) practices). Incidence rates of diabetes and how they are affected by (1) GP coding and (2) excluding 'poor'-quality practices with at least 10% of incident patients inaccurately coded between 2004 and 2014. Incidence rates and accuracy of coding varied widely between practices, and the trends differed according to the selected category of code. If diagnosis codes were used, the incidence of type 2 diabetes increased sharply until 2004 (when the UK Quality and Outcomes Framework was introduced), then flattened off until 2009, after which it decreased. If non-diagnosis codes were included, the numbers continued to increase until 2012. Although coding quality improved over time, 15% of the 666 practices that contributed data between 2004 and 2014 were labelled 'poor' quality. When these practices were dropped from the analyses, the downward trend in the incidence of type 2 diabetes after 2009 became less marked and incidence rates were higher. In contrast to some previous reports, diabetes incidence (based on diagnostic codes) appears not to have increased since 2004 in the UK. The choice of codes can make a significant difference to incidence estimates, as can the quality of recording. Codes and data quality should be checked when assessing incidence rates using GP data.
NASA Technical Reports Server (NTRS)
Heinrichs, J. A.; Fee, J. J.
1972-01-01
Space station and solar array data are presented, together with the analyses performed in support of the integrated dynamic analysis study. The analysis methods and the formulated digital simulation are described. Control systems for space station attitude control and solar array orientation control include generic-type control systems. These systems have been digitally coded and included in the simulation.
Finite Element Analysis of a NASA National Transonic Facility Wind Tunnel Balance
NASA Technical Reports Server (NTRS)
Lindell, Michael C.
1996-01-01
This paper presents the results of finite element analyses and correlation studies performed on a NASA National Transonic Facility (NTF) Wind Tunnel balance. In the past NASA has relied primarily on classical hand analyses, coupled with relatively large safety factors, for predicting maximum stresses in wind tunnel balances. Now, with the significant advancements in computer technology and sophistication of general purpose analysis codes, it is more reasonable to pursue finite element analyses of these balances. The correlation studies of the present analyses show very good agreement between the analyses and data measured with strain gages and therefore the studies give higher confidence for using finite element analyses to analyze and optimize balance designs in the future.
Reliability and coverage analysis of non-repairable fault-tolerant memory systems
NASA Technical Reports Server (NTRS)
Cox, G. W.; Carroll, B. D.
1976-01-01
A method was developed for the construction of probabilistic state-space models for nonrepairable systems. Models were developed for several systems which achieved reliability improvement by means of error-coding, modularized sparing, massive replication and other fault-tolerant techniques. From the models developed, sets of reliability and coverage equations for the systems were developed. Comparative analyses of the systems were performed using these equation sets. In addition, the effects of varying subunit reliabilities on system reliability and coverage were described. The results of these analyses indicated that a significant gain in system reliability may be achieved by use of combinations of modularized sparing, error coding, and software error control. For sufficiently reliable system subunits, this gain may far exceed the reliability gain achieved by use of massive replication techniques, yet result in a considerable saving in system cost.
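The trade-off between replication and sparing can be illustrated with standard closed-form reliabilities for triple modular redundancy and a coverage-limited two-unit parallel arrangement. This is a generic textbook comparison under idealized assumptions (independent failures, static reliabilities), not the paper's state-space models.

```python
def tmr(r):
    """Triple modular redundancy with a perfect majority voter."""
    return 3 * r**2 - 2 * r**3

def spared_pair(r, c=1.0):
    """Primary plus one spare in parallel; c is the probability the
    fault is detected and the switch-over succeeds (coverage)."""
    return r + c * (1.0 - r) * r

for r in (0.90, 0.99, 0.999):
    print(f"R={r}: TMR={tmr(r):.6f}  spared pair={spared_pair(r):.6f}")
```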
Turbopump Design and Analysis Approach for Nuclear Thermal Rockets
NASA Technical Reports Server (NTRS)
Chen, Shu-cheng S.; Veres, Joseph P.; Fittje, James E.
2006-01-01
A rocket propulsion system, whether it is a chemical rocket or a nuclear thermal rocket, is fairly complex in detail but rather simple in principle. Among all the interacting parts, three components stand out: the pumps and turbines (turbopumps), and the thrust chamber. To obtain an understanding of the overall rocket propulsion system characteristics, one starts from analyzing the interactions among these three components. It is therefore of utmost importance to be able to satisfactorily characterize the turbopump, level by level, at all phases of a vehicle design cycle. Here at NASA Glenn Research Center, as the starting phase of a rocket engine design, specifically a Nuclear Thermal Rocket Engine design, we adopted the approach of using a high-level system cycle analysis code (NESS) to obtain an initial analysis of the operational characteristics of a turbopump required in the propulsion system. A set of turbopump design codes (PumpDes and TurbDes) was then executed to obtain sizing and performance characteristics of the turbopump that were consistent with the mission requirements. A set of turbopump analysis codes (PUMPA and TURBA) was applied to obtain the full performance map for each of the turbopump components; a two-dimensional layout of the turbopump based on these mean-line analyses was also generated. Adequacy of the turbopump conceptual design will later be determined by further analyses and evaluation. In this paper, descriptions and discussions of the aforementioned approach are provided and future outlooks are discussed.
2017-10-01
for all project Aims. Timeline: months 3-6. Status: completed. Task 6: Complete primary analyses and hypothesis testing for Aim 2, including ... glucose. For each of these lab tests, each VA site can name them something different, and the names can change over time. Labs should be linked to Logical Observation Identifiers Names and Codes (LOINC), an international standard system that assigns a numeric code to specific lab tests. However, VA data
A review of lossless audio compression standards and algorithms
NASA Astrophysics Data System (ADS)
Muin, Fathiah Abdul; Gunawan, Teddy Surya; Kartiwi, Mira; Elsheikh, Elsheikh M. A.
2017-09-01
Over the years, lossless audio compression has gained popularity as researchers and businesses have become more aware of the need for better quality and growing storage demands. This paper analyses various lossless audio coding algorithms and standards that are used and available in the market, focusing specifically on Linear Predictive Coding (LPC) due to its popularity and robustness in audio compression; other prediction methods are also compared for verification. Advanced representations of LPC, such as LSP decomposition techniques, are also discussed in this paper.
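A minimal LPC sketch using the autocorrelation method shows the core of such coders: fit a linear predictor, then code the (much smaller) residual losslessly. The signal and the order-4 model are arbitrary choices for the example.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def lpc_coefficients(x, order):
    """Linear predictive coding via the autocorrelation method:
    solve the Toeplitz normal equations R a = r for the predictor a."""
    x = np.asarray(x, float)
    full = np.correlate(x, x, mode="full")
    r = full[len(x) - 1:len(x) + order]      # autocorrelation lags 0..order
    return solve_toeplitz(r[:order], r[1:order + 1])

# A lossless coder stores the quantized predictor coefficients plus the
# entropy-coded prediction error (residual).
x = np.sin(0.3 * np.arange(256)) + 0.01 * np.random.default_rng(0).normal(size=256)
a = lpc_coefficients(x, order=4)
pred = np.array([a @ x[n - 4:n][::-1] for n in range(4, len(x))])
residual = x[4:] - pred
print(residual.var() / x.var())              # much smaller than 1
```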
1983-01-13
Naval Ordnance Systems Command codes are detailed propagation simulations, mostly at lower frequencies. These are combined with WEPH code phenomenology...AD B062349L. Scope/Abstract: This report describes a simple model for predicting the loads on box-like target structures subject to air blast. A...model and applying it to targets which can be approximated by a series of rectangular parallelepipeds. In this report the physical phenomena of high
Computational fluid mechanics utilizing the variational principle of modeling damping seals
NASA Technical Reports Server (NTRS)
Abernathy, J. M.
1986-01-01
A computational fluid dynamics code for application to traditional incompressible flow problems has been developed. The method is actually a slight compressibility approach which takes advantage of the bulk modulus and finite sound speed of all real fluids. The finite element numerical analog uses a dynamic differencing scheme based, in part, on a variational principle for computational fluid dynamics. The code was developed in order to study the feasibility of damping seals for high speed turbomachinery. Preliminary seal analyses have been performed.
Defending Norway and the Northern Flank: Analysis of NATO’s Strategic Options.
1985-12-01
On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods
NASA Technical Reports Server (NTRS)
Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.
2003-01-01
Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
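A minimal sketch of one way such derivatives can be exploited: a first-order Taylor control variate whose exact mean is known. Everything here (f, its derivative, the Gaussian input) is an invented stand-in for the expensive analysis code and its sensitivity output:

    import math, random

    def f(x):                        # stand-in for the expensive analysis code
        return math.exp(0.3 * x) + 0.1 * x**2

    def dfdx(x):                     # its cheap sensitivity derivative
        return 0.3 * math.exp(0.3 * x) + 0.2 * x

    mu, sigma, N = 1.0, 0.2, 10_000
    random.seed(1)
    xs = [random.gauss(mu, sigma) for _ in range(N)]

    plain = sum(f(x) for x in xs) / N
    # control variate g(x) = f(mu) + f'(mu)(x - mu) has known mean f(mu)
    g = lambda x: f(mu) + dfdx(mu) * (x - mu)
    cv = sum(f(x) - g(x) for x in xs) / N + f(mu)
    print(plain, cv)                 # same estimand; cv has far smaller variance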
[Variations in patient data coding affect hospital standardized mortality ratio (HSMR)].
van den Bosch, Wim F; Silberbusch, Joseph; Roozendaal, Klaas J; Wagner, Cordula
2010-01-01
To investigate the impact of coding variations on the 'hospital standardized mortality ratio' (HSMR) and to define variation reduction measures. Retrospective, descriptive. We analysed coding variations in HSMR parameters for main diagnosis, urgency of admission and comorbidity in the national medical registration (LMR) database of admissions in 6 Dutch top clinical hospitals during 2003-2007. More than a quarter of these admission records had been included in the HSMR calculation. Admissions with ICD-9 main diagnosis codes that were excluded from HSMR calculations were investigated for inter-hospital variability and correct exclusion. Variation in coding admission type was signalled by analysing admission records with diagnoses that had an emergency nature by their title. Variation in the average number of comorbidity diagnoses per admission was determined as an indicator for coding variation. Interviews with coding teams were used to check whether the conclusions of the analysis were correct. More than 165,000 admissions were excluded from HSMR calculations, with large variability between hospitals; this equalled 40% of all admissions that were included. Of the admissions with a main diagnosis indicating an emergency, between 34% and 93% were recorded as an emergency. The average number of comorbidity diagnoses per admission varied between hospitals from 0.9 to 3.0. Coding of main diagnoses, urgency of admission and comorbidities showed strong inter-hospital variation with a potentially large impact on the HSMR outcomes of the hospitals. Coding variations originated from differences in interpretation of coding rules, differences in coding capacity, quality of patient records and discharge documentation, and timely delivery of these.
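For orientation, the standard definition (not specific to this study) is

    \mathrm{HSMR} = 100 \times \frac{\sum_i d_i}{\sum_i \hat{p}_i},

where d_i indicates in-hospital death for admission i and \hat{p}_i is the death probability predicted by a case-mix model built from exactly the coded fields examined above (main diagnosis, admission urgency, comorbidity); coding variation therefore propagates directly into the denominator.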
Mason, Marc A; Fanelli Kuczmarski, Marie; Allegro, Deanne; Zonderman, Alan B; Evans, Michele K
2015-08-01
Analysing dietary data to capture how individuals typically consume foods is dependent on the coding variables used. Individual foods consumed simultaneously, like coffee with milk, are given codes to identify these combinations. Our literature review revealed a lack of discussion about using combination codes in analysis. The present study identified foods consumed at mealtimes and by race when combination codes were or were not utilized. Duplicate analysis methods were performed on separate data sets. The original data set consisted of all foods reported; each food was coded as if it had been consumed individually. The revised data set was derived from the original data set by first isolating coded foods consumed as individual items from those foods consumed simultaneously and assigning a code to designate a combination. Foods assigned a combination code, like pancakes with syrup, were aggregated and associated with a food group, defined by the major food component (i.e. pancakes), and then appended to the isolated coded foods. Healthy Aging in Neighborhoods of Diversity across the Life Span study. African-American and White adults with two dietary recalls (n = 2177). Differences existed in the lists of foods most frequently consumed by mealtime and race when comparing results based on the original and revised data sets. African Americans reported consumption of sausage/luncheon meat and poultry, while Whites reported ready-to-eat cereals and cakes/doughnuts/pastries. Use of combination codes provided a more accurate representation of how foods were consumed by populations. This information is beneficial when creating interventions and exploring diet-health relationships.
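A rough sketch of the revised-data-set construction described above (the field names and the "first item is the major component" rule are illustrative assumptions, not the study's actual coding scheme):

    def revise(occasions):
        # occasions: list of eating occasions, each a list of food records
        revised = []
        for foods in occasions:
            if len(foods) == 1:
                revised.append(foods[0])            # food consumed individually
            else:
                combo = dict(foods[0])              # aggregate under the major component
                combo["combination_code"] = True    # e.g. pancakes WITH syrup -> 'pancakes'
                revised.append(combo)
        return revised

    meals = [[{"group": "coffee"}, {"group": "milk"}], [{"group": "apple"}]]
    print(revise(meals))
    # -> [{'group': 'coffee', 'combination_code': True}, {'group': 'apple'}]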
Insights Gained from Forensic Analysis with MELCOR of the Fukushima-Daiichi Accidents.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, Nathan C.; Gauntt, Randall O.
Since the accidents at Fukushima-Daiichi, Sandia National Laboratories has been modeling these accident scenarios using the severe accident analysis code MELCOR. MELCOR is a widely used computer code developed at Sandia National Laboratories since ~1982 for the U.S. Nuclear Regulatory Commission. Insights from the modeling of these accidents are being used to better inform future code development and potentially improve accident management. To date, the need to better capture in-vessel thermal-hydraulics and ex-vessel melt coolability and concrete interactions has led to the implementation of new models. The most recent analyses, presented in this paper, have been in support of the Organization for Economic Cooperation and Development Nuclear Energy Agency's (OECD/NEA) Benchmark Study of the Accident at the Fukushima Daiichi Nuclear Power Station (BSAF) Project. The goal of this project is to accurately capture the source term from all three releases and then model the atmospheric dispersion. To do this, a forensic approach is used in which available plant data and release timings inform the modeled MELCOR accident scenario. For example, containment failures, core slumping events and lower head failure timings are all enforced parameters in these analyses. This approach is fundamentally different from the blind code assessment analysis often used in standard problem exercises. The timings of these events are informed by representative spikes or decreases in plant data. The combination of improvements to the MELCOR source code resulting from analyses of previous accidents and this forensic approach has allowed Sandia to generate representative and plausible source terms for all three accidents at Fukushima Daiichi out to three weeks after the accident, capturing both early and late releases. In particular, using the source terms developed by MELCOR as input to the MACCS software code, which models atmospheric dispersion and deposition, we are able to reasonably capture the deposition of radionuclides to the northwest of the reactor site.
Are Neighborhood-Level Characteristics Associated with Indoor Allergens in the Household?
Rosenfeld, Lindsay; Rudd, Rima; Chew, Ginger L.; Emmons, Karen; Acevedo-García, Dolores
2010-01-01
Background Individual home characteristics have been associated with indoor allergen exposure; however, the influence of neighborhood-level characteristics has not been well-studied. We defined neighborhoods as community districts determined by the New York Department of City Planning. Objective We examined the relationship between neighborhood-level characteristics and the presence of dust mite (Der f 1), cat (Fel d 1), cockroach (Bla g 2), and mouse (MUP) allergens in the household. Methods Using data from the Puerto Rican Asthma Project, a birth cohort of Puerto Rican children at risk of allergic sensitization (n=261) we examined associations between neighborhood characteristics (percent tree canopy, asthma hospitalizations per 1000 children, roadway length within 100 meters of buildings, serious housing code violations per 1000 rental units, poverty rates, and felony crime rates) and the presence of indoor allergens. Allergen cutpoints were used for categorical analyses and defined as follows: dust mite: >0.25 μg/g; cat: >1 μg/g; cockroach: >1 U/g; mouse: >1.6 μg/g. Results Serious housing code violations were statistically significantly positively associated with dust mite, cat and mouse allergens (continuous variables), adjusting for mother's income and education, and all neighborhood-level characteristics. In multivariable logistic regression analyses, medium levels of housing code violations were associated with higher dust mite and cat allergens (1.81, 95%CI: 1.08, 3.03 and 3.10, 95%CI: 1.22, 7.92, respectively). A high level of serious housing code violations was associated with higher mouse allergen (2.04, 95%CI: 1.15, 3.62). A medium level of housing code violations was associated with higher cockroach allergen (3.30, 95%CI: 1.11, 9.78). Conclusions Neighborhood-level characteristics, specifically housing code violations, appear to be related to indoor allergens, which may have implications for future research explorations and policy decisions. PMID:20100024
Robinson, Emily J; Goldstein, Laura H; McCrone, Paul; Perdue, Iain; Chalder, Trudie; Mellers, John D C; Richardson, Mark P; Murray, Joanna; Reuber, Markus; Medford, Nick; Stone, Jon; Carson, Alan; Landau, Sabine
2017-06-06
Dissociative seizures (DSs), also called psychogenic non-epileptic seizures, are a distressing and disabling problem for many patients in neurological settings with high and often unnecessary economic costs. The COgnitive behavioural therapy versus standardised medical care for adults with Dissociative non-Epileptic Seizures (CODES) trial is an evaluation of a specifically tailored psychological intervention with the aims of reducing seizure frequency and severity and improving psychological well-being in adults with DS. The aim of this paper is to report in detail the quantitative and economic analysis plan for the CODES trial, as agreed by the trial steering committee. The CODES trial is a multicentre, pragmatic, parallel group, randomised controlled trial performed to evaluate the clinical effectiveness and cost-effectiveness of 13 sessions of cognitive behavioural therapy (CBT) plus standardised medical care (SMC) compared with SMC alone for adult outpatients with DS. The objectives and design of the trial are summarised, and the aims and procedures of the planned analyses are illustrated. The proposed analysis plan addresses statistical considerations such as maintaining blinding, monitoring adherence with the protocol, describing aspects of treatment and dealing with missing data. The formal analysis approach for the primary and secondary outcomes is described, as are the descriptive statistics that will be reported. This paper provides transparency to the planned inferential analyses for the CODES trial prior to the extraction of outcome data. It also provides an update to the previously published trial protocol and guidance to those conducting similar trials. ISRCTN registry ISRCTN05681227 (registered on 5 March 2014); ClinicalTrials.gov NCT02325544 (registered on 15 December 2014).
FDNS CFD Code Benchmark for RBCC Ejector Mode Operation
NASA Technical Reports Server (NTRS)
Holt, James B.; Ruf, Joe
1999-01-01
Computational Fluid Dynamics (CFD) analysis results are compared with benchmark quality test data from the Propulsion Engineering Research Center's (PERC) Rocket Based Combined Cycle (RBCC) experiments to verify fluid dynamic code and application procedures. RBCC engine flowpath development will rely on CFD applications to capture the multi-dimensional fluid dynamic interactions and to quantify their effect on the RBCC system performance. Therefore, the accuracy of these CFD codes must be determined through detailed comparisons with test data. The PERC experiments build upon the well-known 1968 rocket-ejector experiments of Odegaard and Stroup by employing advanced optical and laser based diagnostics to evaluate mixing and secondary combustion. The Finite Difference Navier Stokes (FDNS) code was used to model the fluid dynamics of the PERC RBCC ejector mode configuration. Analyses were performed for both Diffusion and Afterburning (DAB) and Simultaneous Mixing and Combustion (SMC) test conditions. Results from both the 2D and the 3D models are presented.
Development of a thermal and structural analysis procedure for cooled radial turbines
NASA Technical Reports Server (NTRS)
Kumar, Ganesh N.; Deanna, Russell G.
1988-01-01
A procedure for computing the rotor temperature and stress distributions in a cooled radial turbine is considered. Existing codes for modeling the external mainstream flow and the internal cooling flow are used to compute boundary conditions for the heat transfer and stress analyses. An inviscid, quasi-three-dimensional code computes the external free stream velocity. The external velocity is then used in a boundary layer analysis to compute the external heat transfer coefficients. Coolant temperatures are computed by a viscous one-dimensional internal flow code solving the momentum and energy equations. These boundary conditions are input to a three-dimensional heat conduction code for calculation of rotor temperatures. The rotor stress distribution may be determined for the given thermal, pressure and centrifugal loading. The procedure is applied to a cooled radial turbine which will be tested at the NASA Lewis Research Center. Representative results from this case are included.
Fuzzy support vector machines for adaptive Morse code recognition.
Yang, Cheng-Hong; Jin, Li-Cheng; Chuang, Li-Yeh
2006-11-01
Morse code is now being harnessed for use in rehabilitation applications of augmentative-alternative communication and assistive technology, facilitating mobility, environmental control and adapted worksite access. In this paper, Morse code is selected as a communication adaptive device for persons who suffer from muscle atrophy, cerebral palsy or other severe handicaps. A stable typing rate is strictly required for Morse code to be effective as a communication tool. Therefore, an adaptive automatic recognition method with a high recognition rate is needed. The proposed system uses both fuzzy support vector machines and the variable-degree variable-step-size least-mean-square algorithm to achieve these objectives. We apply fuzzy memberships to each point, and provide different contributions to the decision learning function for support vector machines. Statistical analyses demonstrated that the proposed method elicited a higher recognition rate than other algorithms in the literature.
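As a flavor of the adaptation problem (a deliberately simplified LMS-style tracker, not the authors' variable-degree variable-step-size algorithm), the dot/dash duration boundary must follow a drifting typing rate:

    # Nominal Morse timing: dot = 1 unit, dash = 3 units; the boundary sits
    # near 2 units and must drift with the user's typing speed.
    def classify_stream(durations, mu=0.2, threshold=2.0):
        symbols = []
        for d in durations:
            symbols.append('dash' if d > threshold else 'dot')
            target = 3.0 if d > threshold else 1.0     # inferred nominal length
            unit = d / target                          # current estimate of one unit
            threshold += mu * (2.0 * unit - threshold) # LMS-style step toward 2 units
        return symbols

    # A user slowing down: dots and dashes both stretch over time.
    print(classify_stream([1.0, 3.1, 1.2, 3.5, 1.4, 4.0, 1.5]))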
Development of Tripropellant CFD Design Code
NASA Technical Reports Server (NTRS)
Farmer, Richard C.; Cheng, Gary C.; Anderson, Peter G.
1998-01-01
A tripropellant, such as GO2/H2/RP-1, CFD design code has been developed to predict the local mixing of multiple propellant streams as they are injected into a rocket motor. The code utilizes real fluid properties to account for the mixing and finite-rate combustion processes which occur near an injector faceplate, thus the analysis serves as a multi-phase homogeneous spray combustion model. Proper accounting of the combustion allows accurate gas-side temperature predictions which are essential for accurate wall heating analyses. The complex secondary flows which are predicted to occur near a faceplate cannot be quantitatively predicted by less accurate methodology. Test cases have been simulated to describe an axisymmetric tripropellant coaxial injector and a 3-dimensional RP-1/LO2 impinger injector system. The analysis has been shown to realistically describe such injector combustion flowfields. The code is also valuable to design meaningful future experiments by determining the critical location and type of measurements needed.
NASA Technical Reports Server (NTRS)
Rathjen, K. A.; Burk, H. O.
1983-01-01
The computer code CAVE (Conduction Analysis via Eigenvalues) is a convenient and efficient computer code for predicting two-dimensional temperature histories within thermal protection systems for hypersonic vehicles. The capabilities of CAVE were enhanced by incorporating the following features into the code: real gas effects in the aerodynamic heating predictions; a geometry and aerodynamic heating package for analyses of cone-shaped bodies; an input option to change from laminar to turbulent heating predictions on leading edges; a modification to account for the reduction in adiabatic wall temperature with increase in leading-edge sweep; a geometry package for a two-dimensional scramjet engine sidewall, with an option for heat transfer to external and internal surfaces; a printout modification to provide tables of selected temperatures for plotting and storage; and modifications to the radiation calculation procedure to eliminate temperature oscillations induced by high heating rates. These new features are described.
Guillerme, Thomas; Cooper, Natalie
2016-05-01
Analyses of living and fossil taxa are crucial for understanding biodiversity through time. The total evidence method allows living and fossil taxa to be combined in phylogenies, using molecular data for living taxa and morphological data for living and fossil taxa. With this method, substantial overlap of coded anatomical characters among living and fossil taxa is vital for accurately inferring topology. However, although molecular data for living species are widely available, scientists generating morphological data mainly focus on fossils. Therefore, there are fewer coded anatomical characters in living taxa, even in well-studied groups such as mammals. We investigated the number of coded anatomical characters available in phylogenetic matrices for living mammals and how these were phylogenetically distributed across orders. Eleven of 28 mammalian orders have less than 25% species with available characters; this has implications for the accurate placement of fossils, although the issue is less pronounced at higher taxonomic levels. In most orders, species with available characters are randomly distributed across the phylogeny, which may reduce the impact of the problem. We suggest that increased morphological data collection efforts for living taxa are needed to produce accurate total evidence phylogenies. © 2016 The Authors.
Spectroradiometric calibration of the Thematic Mapper and Multispectral Scanner system
NASA Technical Reports Server (NTRS)
Slater, P. N.; Palmer, J. M. (Principal Investigator)
1985-01-01
The results of analyses of Thematic Mapper (TM) images acquired on July 8 and October 28, 1984, and of a check of the calibration of the 1.22-m integrating sphere at Santa Barbara Research Center (SBRC) are described. The results obtained from the in-flight calibration attempts disagree with the pre-flight calibrations for bands 2 and 4. Considerable effort was expended in an attempt to explain the disagreement. The difficult point to explain is that the difference between the radiances predicted by the radiative transfer code (the code radiances) and the radiances predicted by the preflight calibration (the pre-flight radiances) fluctuate with spectral band. Because the spectral quantities measured at White Sands show little change with spectral band, these fluctuations are not anticipated. Analyses of other targets at White Sands such as clouds, cloud shadows, and water surfaces tend to support the pre-flight and internal calibrator calibrations. The source of the disagreement has not been identified. It could be due to: (1) a computational error in the data reduction; (2) an incorrect assumption in the input to the radiative transfer code; or (3) incorrect operation of the field equipment.
A Multilevel Shape Fit Analysis of Neutron Transmission Data
NASA Astrophysics Data System (ADS)
Naguib, K.; Sallam, O. H.; Adib, M.; Ashry, A.
A multilevel shape fit analysis of neutron transmission data is presented. A multilevel computer code, SHAPE, is used to analyse clean transmission data obtained from time-of-flight (TOF) measurements. The shape analysis deduces the parameters of the observed resonances in the energy region considered in the measurements. The shape code is based upon a least-squares fit of a multilevel Breit-Wigner formula and includes both instrumental resolution and Doppler broadening. Operating the SHAPE code on a test example of measured transmission data for 151Eu, 153Eu and natural Eu in the energy range 0.025-1 eV yielded good results for the analysis technique used.
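For reference (standard resonance theory, not a formula quoted from the paper), the fitted model near isolated resonances combines single-level Breit-Wigner cross sections with broadened transmission:

    \sigma(E) = \sum_j \sigma_{0,j}\,\frac{\Gamma_j^2/4}{(E - E_{0,j})^2 + \Gamma_j^2/4},
    \qquad
    T(E) = \left[ R \ast e^{-n\sigma} \right](E),

where n is the sample areal density and R the combined Doppler/instrumental broadening kernel; the least-squares fit adjusts the resonance parameters E_{0,j}, \Gamma_j and \sigma_{0,j}.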
NASA Technical Reports Server (NTRS)
Amundsen, R. M.; Feldhaus, W. S.; Little, A. D.; Mitchum, M. V.
1995-01-01
Electronic integration of design and analysis processes was achieved and refined at Langley Research Center (LaRC) during the development of an optical bench for a laser-based aerospace experiment. Mechanical design has been integrated with thermal, structural and optical analyses. Electronic import of the model geometry eliminates the repetitive steps of geometry input to develop each analysis model, leading to faster and more accurate analyses. Guidelines for integrated model development are given. This integrated analysis process has been built around software that was already in use by designers and analysts at LaRC. The process as currently implemented uses Pro/Engineer for design, Pro/Manufacturing for fabrication, PATRAN for solid modeling, NASTRAN for structural analysis, SINDA-85 and P/Thermal for thermal analysis, and Code V for optical analysis. Currently, the only analysis model to be built manually is the Code V model; all others can be imported from the Pro/E geometry. The translator from PATRAN results to Code V optical analysis (PATCOD) was developed and tested at LaRC. Directions for use of the translator and the other models are given.
IPAC-Inlet Performance Analysis Code
NASA Technical Reports Server (NTRS)
Barnhart, Paul J.
1997-01-01
A series of analyses have been developed which permit the calculation of the performance of common inlet designs. The methods presented are useful for determining the inlet weight flows, total pressure recovery, and aerodynamic drag coefficients for given inlet geometric designs. Limited geometric input data is required to use this inlet performance prediction methodology. The analyses presented here may also be used to perform inlet preliminary design studies. The calculated inlet performance parameters may be used in subsequent engine cycle analyses or installed engine performance calculations for existing uninstalled engine data.
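As an illustration of the kind of relation such a code evaluates (this particular schedule, the MIL-E-5008B ram-recovery rule, is a common textbook example and an assumption here, not a statement about IPAC's internals), supersonic inlet total pressure recovery is often estimated as

    \frac{p_{t,2}}{p_{t,0}} = 1 - 0.075\,(M_0 - 1)^{1.35}, \qquad 1 < M_0 < 5,

so recovery degrades steadily with flight Mach number and feeds directly into installed engine performance.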
ADAPTION OF NONSTANDARD PIPING COMPONENTS INTO PRESENT DAY SEISMIC CODES
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. T. Clark; M. J. Russell; R. E. Spears
2009-07-01
With spiraling energy demand and flat energy supply, there is a need to extend the life of older nuclear reactors. This sometimes requires that existing systems be evaluated to present-day seismic codes. Older reactors built in the 1960s and early 1970s often used fabricated piping components that were code compliant during their initial construction time period, but are outside the standard parameters of present-day piping codes. There are several approaches available to the analyst in evaluating these non-standard components to modern codes. The simplest approach is to use the flexibility factors and stress indices for similar standard components with the assumption that the non-standard component's flexibility factors and stress indices will be very similar. This approach can require significant engineering judgment. A more rational approach, available in Section III of the ASME Boiler and Pressure Vessel Code and the subject of this paper, involves calculation of flexibility factors using finite element analysis of the non-standard component. Such analysis allows modeling of geometric and material nonlinearities. Flexibility factors based on these analyses are sensitive to the load magnitudes used in their calculation, load magnitudes that need to be consistent with those produced by the linear system analyses where the flexibility factors are applied. This can lead to iteration, since the magnitude of the loads produced by the linear system analysis depends on the magnitude of the flexibility factors. After the loading applied to the nonstandard component finite element model has been matched to loads produced by the associated linear system model, the component finite element model can then be used to evaluate the performance of the component under the loads with the nonlinear analysis provisions of the Code, should the load levels lead to calculated stresses in excess of allowable stresses. This paper details the application of component-level finite element modeling to account for geometric and material nonlinear component behavior in a linear elastic piping system model. Note that this technique can be applied to the analysis of B31 piping systems.
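The load/flexibility iteration described above can be pictured as a fixed-point loop; the fragment below is a conceptual stand-in (the two lambdas are toy models, not piping results):

    def iterate(flex0, system_loads, component_flex, tol=1e-3, max_it=20):
        k = flex0
        for _ in range(max_it):
            loads = system_loads(k)         # linear piping analysis at current flexibility
            k_new = component_flex(loads)   # FE analysis of the non-standard component
            if abs(k_new - k) / k < tol:    # loads and flexibility are now consistent
                return k_new
            k = k_new
        return k

    # toy stand-ins: system load falls as the joint gets more flexible
    print(iterate(1.0, lambda k: 100.0 / k, lambda L: 1.0 + 0.01 * L))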
Red River Waterway Thermal Studies. Report 2. Thermal Stress Analyses
1991-12-01
stress relaxation, shrinkage of the concrete, and thermal properties of the concrete including coefficient of thermal expansion and specific heat...Finite-Element Code. The thermal stress analyses in this investigation were performed using ABAQUS, a general-purpose heat-transfer and structural...model (the UMAT subroutine discussed below) may be incorporated as an external subroutine linked to the ABAQUS library. In order to model the
Extremely accurate sequential verification of RELAP5-3D
Mesina, George L.; Aumiller, David L.; Buschman, Francis X.
2015-11-19
Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
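In outline (an illustrative script, not the actual RELAP5-3D tooling), the sequential comparison reduces to checking plot variables from consecutive code versions against a round-off tolerance:

    def compare_runs(old, new, rel_tol=1e-12):
        # old/new: dict mapping variable name -> time series from each version
        drifted = []
        for key in old:
            for a, b in zip(old[key], new[key]):
                denom = max(abs(a), abs(b), 1e-30)
                if abs(a - b) / denom > rel_tol:
                    drifted.append(key)     # unintended change in this variable
                    break
        return drifted

    old = {"pressure": [1.0e7, 9.9e6], "void": [0.0, 0.01]}
    new = {"pressure": [1.0e7, 9.9e6], "void": [0.0, 0.010000000001]}
    print(compare_runs(old, new))           # -> ['void']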
Path Toward a Unified Geometry for Radiation Transport
NASA Technical Reports Server (NTRS)
Lee, Kerry; Barzilla, Janet; Davis, Andrew; Zachmann
2014-01-01
The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex computer-aided design (CAD) models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN [high charge and energy transport code developed by NASA Langley Research Center (LaRC)], are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific simplistic geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for achieving radiation transport on CAD models using MCNP and FLUKA has been demonstrated and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and accurately determine the protection provided by as-designed exploration mission vehicles and habitats.
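For reference, the balance statement above corresponds to the steady-state linear Boltzmann transport equation that deterministic codes such as HZETRN solve (written here in generic textbook notation):

    \left[ \boldsymbol{\Omega}\cdot\nabla + \sigma_j(E) \right] \phi_j(\mathbf{x},\boldsymbol{\Omega},E)
    = \sum_k \int \sigma_{jk}(E,E',\boldsymbol{\Omega},\boldsymbol{\Omega}')\,\phi_k(\mathbf{x},\boldsymbol{\Omega}',E')\,dE'\,d\boldsymbol{\Omega}',

where \phi_j is the flux of particle type j, \sigma_j its total macroscopic cross section, and \sigma_{jk} the cross sections for producing type j from type k in atomic and nuclear collisions.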
Application of CFX-10 to the Investigation of RPV Coolant Mixing in VVER Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moretti, Fabio; Melideo, Daniele; Terzuoli, Fulvio
2006-07-01
Coolant mixing phenomena occurring in the pressure vessel of a nuclear reactor constitute one of the main objectives of investigation by researchers concerned with nuclear reactor safety. For instance, mixing plays a relevant role in reactivity-induced accidents initiated by de-boration or boron dilution events, followed by transport of a de-borated slug into the vessel of a pressurized water reactor. Another example is constituted by temperature mixing, which may sensitively affect the consequences of a pressurized thermal shock scenario. Predictive analysis of mixing phenomena is strongly improved by the availability of computational tools able to cope with the inherent three-dimensionality of such problems, like system codes with three-dimensional capabilities, and Computational Fluid Dynamics (CFD) codes. The present paper deals with numerical analyses of coolant mixing in the reactor pressure vessel of a VVER-1000 reactor, performed by the ANSYS CFX-10 CFD code. In particular, the 'swirl' effect that has been observed to take place in the downcomer of this kind of reactor has been addressed, with the aim of assessing the capability of the codes to predict that effect, and to understand the reasons for its occurrence. Results have been compared against experimental data from the V1000CT-2 Benchmark. Moreover, a boron mixing problem has been investigated, under the hypothesis that a de-borated slug, transported by natural circulation, enters the vessel. Sensitivity analyses have been conducted on some geometrical features, model parameters and boundary conditions.
Building code challenging the ethics behind adobe architecture in North Cyprus.
Hurol, Yonca; Yüceer, Hülya; Şahali, Öznem
2015-04-01
Adobe masonry is part of the vernacular architecture of Cyprus. Thus, it is possible to use this technology in a meaningful way on the island. On the other hand, although adobe architecture is more sustainable in comparison to other building technologies, the use of it is diminishing in North Cyprus. The application of Turkish building code in the north of the island has created complications in respect of the use of adobe masonry, because this building code demands that reinforced concrete vertical tie-beams are used together with adobe masonry. The use of reinforced concrete elements together with adobe masonry causes problems in relation to the climatic response of the building as well as causing other technical and aesthetic problems. This situation makes the design of adobe masonry complicated and various types of ethical problems also emerge. The objective of this article is to analyse the ethical problems which arise as a consequence of the restrictive character of the building code, by analysing two case studies and conducting an interview with an architect who was involved with the use of adobe masonry in North Cyprus. According to the results of this article there are ethical problems at various levels in the design of both case studies. These problems are connected to the responsibilities of architects in respect of the social benefit, material production, aesthetics and affordability of the architecture as well as presenting distrustful behaviour where the obligations of architects to their clients is concerned.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.
2011-03-01
This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. Access to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that the users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements.
Based on the gap analysis results, we have made the following recommendations for the code selection and code development for the NEAMS waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.
NASA Astrophysics Data System (ADS)
Zhou, Abel; White, Graeme L.; Davidson, Rob
2018-02-01
Anti-scatter grids are commonly used in x-ray imaging systems to reduce scatter radiation reaching the image receptor. Anti-scatter grid performance and validation can be simulated through use of Monte Carlo (MC) methods. Our recently reported work has modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids from the recently reported new MC codes against experimental results and results previously reported in other literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmissions of primary (Tp), scatter (Ts), and total (Tt) radiation determined using this new MC code system have strong agreement with the experimental results and the results reported in the literature. Tp, Ts, Tt, and SPR determined in this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating the designs of grids.
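For clarity, these are the conventional grid metrics (standard definitions, not notation introduced by the paper):

    T_p = \frac{P_g}{P_0}, \qquad T_s = \frac{S_g}{S_0}, \qquad
    T_t = \frac{P_g + S_g}{P_0 + S_0}, \qquad \mathrm{SPR} = \frac{S}{P},

where P and S are the primary and scattered fluences reaching the receptor, and subscripts g and 0 denote measurements with and without the grid.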
RAY-RAMSES: a code for ray tracing on the fly in N-body simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barreira, Alexandre; Llinares, Claudio; Bose, Sownak
2016-05-01
We present a ray tracing code to compute integrated cosmological observables on the fly in AMR N-body simulations. Unlike conventional ray tracing techniques, our code takes full advantage of the time and spatial resolution attained by the N-body simulation by computing the integrals along the line of sight on a cell-by-cell basis through the AMR simulation grid. Moreover, since it runs on the fly in the N-body run, our code can produce maps of the desired observables without storing large (or any) amounts of data for post-processing. We implemented our routines in the RAMSES N-body code and tested the implementation using an example of a weak lensing simulation. We analyse basic statistics of lensing convergence maps and find good agreement with semi-analytical methods. The ray tracing methodology presented here can be used in several cosmological analyses, such as Sunyaev-Zel'dovich and integrated Sachs-Wolfe effect studies, as well as modified gravity. Our code can also be used in cross-checks of the more conventional methods, which can be important in tests of theory systematics in preparation for upcoming large scale structure surveys.
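For orientation, the line-of-sight integral accumulated cell by cell is, in the Born approximation, the standard weak-lensing convergence (a textbook result, not a formula quoted from the paper):

    \kappa(\boldsymbol{\theta}) = \frac{3 H_0^2 \Omega_m}{2 c^2}
    \int_0^{\chi_s} d\chi\; \frac{\chi\,(\chi_s - \chi)}{\chi_s}\,
    \frac{\delta(\chi\boldsymbol{\theta}, \chi)}{a(\chi)},

with \chi the comoving distance, \chi_s that of the source plane, \delta the matter overdensity and a the scale factor; evaluating it on the fly means \delta is read directly from the AMR cells at each timestep rather than from stored snapshots.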
Progressive changes in non-coding RNA profile in leucocytes with age
Muñoz-Culla, Maider; Irizar, Haritz; Gorostidi, Ana; Alberro, Ainhoa; Osorio-Querejeta, Iñaki; Ruiz-Martínez, Javier; Olascoaga, Javier; de Munain, Adolfo López; Otaegui, David
2017-01-01
It has been observed that immune cell deterioration occurs in the elderly, as well as a chronic low-grade inflammation called inflammaging. These cellular changes must be driven by numerous changes in gene expression and, in fact, both protein-coding and non-coding RNA expression alterations have been observed in peripheral blood mononuclear cells from elderly people. In the present work we have studied the expression of small non-coding RNA (microRNA and small nucleolar RNA, snoRNA) from healthy individuals from 24 to 79 years old. We have observed that the expression of 69 non-coding RNAs (56 microRNAs and 13 snoRNAs) changes progressively with chronological age. According to our results, the age range from 47 to 54 is critical, given that it is the period when the expression trend (increasing or decreasing) of age-related small non-coding RNAs is most pronounced. Furthermore, age-related miRNAs regulate genes that are involved in immune, cell cycle and cancer-related processes, which had already been associated with human aging. Therefore, human aging could be studied as a result of progressive molecular changes, and different age ranges should be analysed to cover the whole aging process. PMID:28448962
Visual information processing; Proceedings of the Meeting, Orlando, FL, Apr. 20-22, 1992
NASA Technical Reports Server (NTRS)
Huck, Friedrich O. (Editor); Juday, Richard D. (Editor)
1992-01-01
Topics discussed in these proceedings include nonlinear processing and communications; feature extraction and recognition; image gathering, interpolation, and restoration; image coding; and wavelet transform. Papers are presented on noise reduction for signals from nonlinear systems; driving nonlinear systems with chaotic signals; edge detection and image segmentation of space scenes using fractal analyses; a vision system for telerobotic operation; a fidelity analysis of image gathering, interpolation, and restoration; restoration of images degraded by motion; and information, entropy, and fidelity in visual communication. Attention is also given to image coding methods and their assessment, hybrid JPEG/recursive block coding of images, modified wavelets that accommodate causality, modified wavelet transform for unbiased frequency representation, and continuous wavelet transform of one-dimensional signals by Fourier filtering.
TRAC-PF1/MOD1 support calculations for the MIST/OTIS program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fujita, R.K.; Knight, T.D.
1984-01-01
We are using the Transient Reactor Analysis Code (TRAC), specifically version TRAC-PF1/MOD1, to perform analyses in support of the MultiLoop Integral-System Test (MIST) and the Once-Through Integral-System (OTIS) experiment program. We have analyzed Geradrohr Dampferzeuger Anlage (GERDA) Test 1605AA to benchmark the TRAC-PF1/MOD1 code against phenomena expected to occur in a raised-loop B&W plant during a small-break loss-of-coolant accident (SBLOCA). These results show that the code can calculate both single- and two-phase natural circulation, flow interruption, boiler-condenser-mode (BCM) heat transfer, and primary-system refill in a B&W-type geometry with low-elevation auxiliary feedwater. 19 figures, 7 tables.
NASA Technical Reports Server (NTRS)
Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd, III
1994-01-01
NASA Langley Research Center has, for several years, conducted research in the area of time-correlated gust loads for linear and nonlinear aircraft. The results of this work led NASA to recommend that the Matched-Filter-Based One-Dimensional Search Method be used for gust load analyses of nonlinear aircraft. This manual describes this method, describes a FORTRAN code which performs this method, and presents example calculations for a sample nonlinear aircraft model. The name of the code is MFD1DS (Matched-Filter-Based One-Dimensional Search). The program source code, the example aircraft equations of motion, a sample input file, and a sample program output are all listed in the appendices.
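The matched-filter fact the method exploits can be stated compactly: for a linear load channel with impulse response h, the unit-energy excitation that maximizes the peak output is the time-reversed impulse response, and the peak equals ||h|| (Cauchy-Schwarz). A toy check (the impulse response is invented, not the sample aircraft's):

    def matched_excitation(h):
        energy = sum(v * v for v in h) ** 0.5
        return [v / energy for v in reversed(h)]   # time-reversed, unit energy

    h = [0.0, 0.4, 1.0, 0.7, 0.3, 0.1]             # made-up load impulse response
    u = matched_excitation(h)
    peak = sum(a * b for a, b in zip(u, reversed(h)))   # output at full overlap
    print(round(peak, 6), round(sum(v * v for v in h) ** 0.5, 6))   # both equal ||h||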
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chambers, Robert S.; Neidigk, Matthew A.
Sandia SPECabq is a FORTRAN code that defines the user-supplied subroutines needed to perform nonlinear viscoelastic analyses in the ABAQUS commercial finite element code based on the Simplified Potential Energy Clock (SPEC) model. The SPEC model was published in the open literature in 2009. The code must be compiled and linked with the ABAQUS libraries under the user-supplied subroutine option of the ABAQUS executable script. The subroutine is used to analyze the thermomechanical behavior of isotropic polymers, predicting, for example, how a polymer may undergo stress or volume relaxation under different temperature and loading environments. This subroutine enables the ABAQUS finite element code to be used for analyzing the thermomechanical behavior of samples and parts that are made from glassy polymers.
The 2002 PhRMA Code and Pharmaceutical Marketing: did anybody bother to ask the reps?
Sillup, George P; Trombetta, Bill; Klimberg, Ronald
2010-10-01
After marketing tactics resulted in $1.2 billion in fines, the 2002 PhRMA Code attempted to standardize marketing and sales practices. Self-regulation has had varied success in other industries and in the pharmaceutical industries of other countries. Similarly, the Code addressed negative responses to pharmaceutical industry practices but had no provisions for monitoring violations. Representatives' (reps') perspectives were assessed using an 18-item instrument with 72 reps from 25 companies. Analyses indicated that reps from bigger companies, PhRMA and non-PhRMA alike, adhered better. The way reps adhered was split between adhering reluctantly and following faithfully. Two thirds felt it was more difficult to do their jobs, as a result of prior entertainment-based relationships with physicians.
NASA. Marshall Space Flight Center Hydrostatic Bearing Activities
NASA Technical Reports Server (NTRS)
Benjamin, Theodore G.
1991-01-01
The basic approach for analyzing hydrostatic bearing flows at the Marshall Space Flight Center (MSFC) is briefly discussed. The Hydrostatic Bearing Team has responsibility for assessing and evaluating flow codes; evaluating friction, ignition, and galling effects; evaluating wear; and performing tests. The Office of Aerospace and Exploration Technology Turbomachinery Seals Tasks consist of tests and analysis. The MSFC in-house analyses utilize one-dimensional bulk-flow codes. Computational fluid dynamics (CFD) analysis is used to enhance understanding of bearing flow physics or to perform parametric analyses that are outside the bulk-flow database. As long as the bulk-flow codes are accurate enough for most needs, they will be utilized accordingly and will be supported by CFD analysis on an as-needed basis.
NASA Technical Reports Server (NTRS)
Hague, D. S.; Rozendaal, H. L.
1977-01-01
Program NSEG is a rapid mission analysis code based on the use of approximate flight path equations of motion. The equation form varies with the segment type, for example, accelerations, climbs, cruises, descents, and decelerations. Realistic and detailed vehicle characteristics are specified in tabular form. In addition to its mission performance calculation capabilities, the code also contains extensive flight envelope performance mapping capabilities. For example, rate-of-climb, turn rates, and energy maneuverability parameter values may be mapped in the Mach-altitude plane. Approximate takeoff and landing analyses are also performed. At high speeds, centrifugal lift effects are accounted for. Extensive turbojet and ramjet engine scaling procedures are incorporated in the code.
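The energy-maneuverability quantities mentioned derive from the standard point-mass energy balance (generic form, consistent with but not copied from the NSEG documentation):

    P_s = \frac{(T - D)\,V}{W} = \frac{dh}{dt} + \frac{V}{g}\,\frac{dV}{dt},

where P_s is specific excess power, T thrust, D drag, V true airspeed and W weight; contours of P_s in the Mach-altitude plane are precisely the kind of map the code produces.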
NASA Astrophysics Data System (ADS)
Knowlton, R. G.; Arnold, B. W.; Mattie, P. D.; Kuo, M.; Tien, N.
2006-12-01
For several years now, Taiwan has been engaged in a process to select a low-level radioactive waste (LLW) disposal site. Taiwan is generating LLW from operational and decommissioning wastes associated with nuclear power reactors, as well as research, industrial, and medical radioactive wastes. The preliminary selection process has narrowed the search to four potential candidate sites. These sites are to be evaluated in a performance assessment analysis to determine the likelihood of meeting the regulatory criteria for disposal. Sandia National Laboratories and Taiwan's Institute of Nuclear Energy Research have been working together to develop the necessary performance assessment methodology and associated computer models to perform these analyses. The methodology utilizes both deterministic (e.g., single run) and probabilistic (e.g., multiple statistical realizations) analyses to achieve the goals. The probabilistic approach provides a means of quantitatively evaluating uncertainty in the model predictions and a more robust basis for performing sensitivity analyses to better understand what is driving the dose predictions from the models. Two types of disposal configurations are under consideration: a shallow land burial concept and a cavern disposal concept. The shallow land burial option includes a protective cover to limit infiltration potential to the waste. Both conceptual designs call for the disposal of 55-gallon waste drums within concrete-lined trenches or tunnels, backfilled with grout. Waste emplaced in the drums may be solidified. Both types of sites are underlain by or placed within saturated fractured bedrock material. These factors have influenced the conceptual model development of each site, as well as the selection of the models to employ for the performance assessment analyses. Several existing codes were integrated in order to facilitate a comprehensive performance assessment methodology to evaluate the potential disposal sites. First, a need existed to simulate the failure processes of the waste containers, with subsequent leaching of the waste form to the underlying host rock. The Breach, Leach, and Transport Multiple Species (BLT-MS) code was selected to meet these needs. BLT-MS also has a 2-D finite-element advective-dispersive transport module, with radionuclide in-growth and decay. BLT-MS does not solve the groundwater flow equation, but instead requires the input of Darcy flow velocity terms. These terms were abstracted from a groundwater flow model using the FEHM code. For the shallow land burial site, the HELP code was also used to evaluate the performance of the protective cover. The GoldSim code was used for two purposes: quantifying uncertainties in the predictions, and providing a platform to evaluate an alternative conceptual model involving matrix-diffusion transport. Results of the preliminary performance assessment analyses using examples to illustrate the computational framework will be presented. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
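Schematically, the probabilistic layer wraps the deterministic transport chain in a sampling loop like the following (the distributions and the dose kernel are invented placeholders, not values from these analyses):

    import random
    random.seed(0)

    def peak_dose(kd, infil):
        # stand-in for the container-failure/leach/transport chain
        return 1.0e-3 * infil / (1.0 + kd)

    doses = []
    for _ in range(1000):                        # one realization per sampled parameter set
        kd = random.lognormvariate(0.0, 1.0)     # sorption coefficient (assumed distribution)
        infil = random.uniform(0.01, 0.1)        # infiltration rate (assumed range)
        doses.append(peak_dose(kd, infil))

    doses.sort()
    print(doses[int(0.95 * len(doses))])         # 95th-percentile surrogate statistic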
Peterson, Rachel; Gundlapalli, Adi V; Metraux, Stephen; Carter, Marjorie E; Palmer, Miland; Redd, Andrew; Samore, Matthew H; Fargo, Jamison D
2015-01-01
Researchers at the U.S. Department of Veterans Affairs (VA) have used administrative criteria to identify homelessness among U.S. Veterans. Our objective was to explore the use of these codes in VA health care facilities. We examined VA health records (2002-2012) of Veterans recently separated from the military and identified as homeless using VA conventional identification criteria (ICD-9-CM code V60.0, VA specific codes for homeless services), plus closely allied V60 codes indicating housing instability. Logistic regression analyses examined differences between Veterans who received these codes. Health care services and co-morbidities were analyzed in the 90 days post-identification of homelessness. VA conventional criteria identified 21,021 homeless Veterans from Operations Enduring Freedom, Iraqi Freedom, and New Dawn (rate 2.5%). Adding allied V60 codes increased that to 31,260 (rate 3.3%). While certain demographic differences were noted, Veterans identified as homeless using conventional or allied codes were similar with regards to utilization of homeless, mental health, and substance abuse services, as well as co-morbidities. Differences were noted in the pattern of usage of homelessness-related diagnostic codes in VA facilities nation-wide. Creating an official VA case definition for homelessness, which would include additional ICD-9-CM and other administrative codes for VA homeless services, would likely allow improved identification of homeless and at-risk Veterans. This also presents an opportunity for encouraging uniformity in applying these codes in VA facilities nationwide as well as in other large health care organizations.
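In pseudocode terms, the case definition amounts to set membership over per-patient code lists; the sketch below is illustrative only, and the allied-code list shown is an assumption rather than the study's exact enumeration:

    CONVENTIONAL = {"V60.0"}                  # plus VA-specific homeless-service codes
    ALLIED = {"V60.1", "V60.2", "V60.3", "V60.8", "V60.9"}   # housing instability (illustrative)

    def classify(records):
        flags = {}
        for veteran, codes in records.items():
            present = set(codes)
            if present & CONVENTIONAL:
                flags[veteran] = "conventional"
            elif present & ALLIED:
                flags[veteran] = "allied"
        return flags

    print(classify({"A": ["V60.0"], "B": ["V60.2"], "C": ["250.00"]}))
    # -> {'A': 'conventional', 'B': 'allied'}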
Current and anticipated uses of the thermal hydraulics codes at the NRC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caruso, R.
1997-07-01
The focus of Thermal-Hydraulic computer code usage in nuclear regulatory organizations has undergone a considerable shift since the codes were originally conceived. Less work is being done in the area of "Design Basis Accidents," and much more emphasis is being placed on analysis of operational events, probabilistic risk/safety assessment, and maintenance practices. All of these areas need support from Thermal-Hydraulic computer codes to model the behavior of plant fluid systems, and they all need the ability to perform large numbers of analyses quickly. It is therefore important for the T/H codes of the future to be able to support these needs, by providing robust, easy-to-use tools that produce easy-to-understand results for a wider community of nuclear professionals. These tools need to take advantage of the great advances that have occurred recently in computer software, by providing users with graphical user interfaces for both input and output. In addition, reduced costs of computer memory and other hardware have removed the need for excessively complex data structures and numerical schemes, which make the codes more difficult and expensive to modify, maintain, and debug, and which increase problem run-times. Future versions of the T/H codes should also be structured in a modular fashion, to allow for the easy incorporation of new correlations, models, or features, and to simplify maintenance and testing. Finally, it is important that future T/H code developers work closely with the code user community, to ensure that the codes meet the needs of those users.
An algebraic hypothesis about the primeval genetic code architecture.
Sánchez, Robersy; Grau, Ricardo
2009-09-01
A plausible architecture of an ancient genetic code is derived from an extended base triplet vector space over the Galois field of the extended base alphabet {D,A,C,G,U}, where symbol D represents one or more hypothetical bases with unspecific pairings. We hypothesized that the high degeneration of a primeval genetic code with five bases and the gradual origin and improvement of a primeval DNA repair system could make possible the transition from ancient to modern genetic codes. Our results suggest that the Watson-Crick base pairings G≡C and A=U and the non-specific base pairing of the hypothetical ancestral base D used to define the sum and product operations are enough features to determine the coding constraints of the primeval and the modern genetic code, as well as the transition from the former to the latter. Geometrical and algebraic properties of this vector space reveal that the present codon assignment of the standard genetic code could be induced from a primeval codon assignment. Besides, the Fourier spectrum of the extended DNA genome sequences derived from the multiple sequence alignment suggests that the so-called period-3 property of the present coding DNA sequences could also exist in the ancient coding DNA sequences. The phylogenetic analyses achieved with metrics defined in the N-dimensional vector space (B(3))(N) of DNA sequences and with the new evolutionary model presented here also suggest that an ancient DNA coding sequence with five or more bases does not contradict the expected evolutionary history.
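The algebraic construction can be made concrete in a few lines. The sketch below is a minimal illustration under assumptions: the ordering D=0, A=1, C=2, G=3, U=4 is chosen here for demonstration and may differ from the paper's own assignment. Codons are treated as vectors in GF(5)^3 and added componentwise.

```python
# Minimal sketch: identify the extended alphabet {D, A, C, G, U} with the
# field GF(5) = Z/5Z and add codons componentwise. The base ordering below
# is an illustrative assumption, not necessarily the paper's assignment.
BASES = "DACGU"
idx = {b: i for i, b in enumerate(BASES)}

def add_codons(x, y):
    """Componentwise sum of two triplets in the vector space GF(5)^3."""
    return "".join(BASES[(idx[a] + idx[b]) % 5] for a, b in zip(x, y))

print(add_codons("AUG", "GCD"))  # (1,4,3) + (3,2,0) = (4,1,3) -> "UAG"
```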
Valenzuela-Miranda, Diego; Gallardo-Escárate, Cristian
2016-12-01
Despite the high prevalence and impact to Chilean salmon aquaculture of the intracellular bacterium Piscirickettsia salmonis, the molecular underpinnings of host-pathogen interactions remain unclear. Herein, the interplay of coding and non-coding transcripts has been proposed as a key mechanism involved in immune response. Therefore, the aim of this study was to evidence how coding and non-coding transcripts are modulated during the infection process of Atlantic salmon with P. salmonis. For this, RNA-seq was conducted in brain, spleen, and head kidney samples, revealing different transcriptional profiles according to bacterial load. Additionally, while most of the regulated genes were annotated to diverse biological processes during infection, a common response associated with clathrin-mediated endocytosis and iron homeostasis was present in all tissues. Interestingly, while endocytosis-promoting factors and clathrin inductions were upregulated, endocytic receptors were mainly downregulated. Furthermore, the regulation of genes related to iron homeostasis suggested an intracellular accumulation of iron, a process in which heme biosynthesis/degradation pathways might play an important role. Regarding the non-coding response, 918 putative long non-coding RNAs were identified, of which 425 were newly characterized for S. salar. Finally, co-localization and co-expression analyses revealed a strong correlation between the modulation of long non-coding RNAs and genes associated with endocytosis and iron homeostasis. These results represent the first comprehensive study of putative interplaying mechanisms of coding and non-coding RNAs during bacterial infection in salmonids.
Worry, problem elaboration and suppression of imagery: the role of concreteness.
Stöber, J
1998-01-01
Both lay concept and scientific theory claim that worry may be helpful for defining and analyzing problems. Recent studies, however, indicate that worrisome problem elaborations are less concrete than worry-free problem elaborations. This challenges the problem-solving view of worry because abstract problem analyses are unlikely to lead to concrete problem solutions. Instead the findings support the avoidance theory of worry, which claims that worry suppresses aversive imagery. Following research findings in the dual-coding framework [Paivio, A. (1971). Imagery and verbal processes. New York: Holt, Rinehart and Winston; Paivio, A. (1986). Mental representations: a dual coding approach. New York: Oxford University Press.], the present article proposes that reduced concreteness may play a central role in the understanding of worry. First, reduced concreteness can explain how worry reduces imagery. Second, it offers an explanation why worrisome problem analyses are unlikely to arrive at solutions. Third, it provides a key for the understanding of worry maintenance.
NASA Technical Reports Server (NTRS)
Thompson, E.
1979-01-01
A finite element computer code for the analysis of mantle convection is described. The coupled equations for creeping viscous flow and heat transfer can be solved for either a transient analysis or steady-state analysis. For transient analyses, either a control volume or a control mass approach can be used. Non-Newtonian fluids with viscosities which have thermal and spatial dependencies can be easily incorporated. All material parameters may be written as function statements by the user or simply specified as constants. A wide range of boundary conditions, both for the thermal analysis and the viscous flow analysis, can be specified. For steady-state analyses, elastic strain rates can be included. Although this manual was specifically written for users interested in mantle convection, the code is equally well suited for analysis in a number of other areas including metal forming, glacial flows, and creep of rock and soil.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kontogeorgakos, D.; Derstine, K.; Wright, A.
2013-06-01
The purpose of the TREAT reactor is to generate large transient neutron pulses in test samples without over-heating the core to simulate fuel assembly accident conditions. The power transients in the present HEU core are inherently self-limiting such that the core prevents itself from overheating even in the event of a reactivity insertion accident. The objective of this study was to support the assessment of the feasibility of the TREAT core conversion based on the present reactor performance metrics and the technical specifications of the HEU core. The LEU fuel assembly studied had the same overall design, materials (UO2 particles finely dispersed in graphite) and impurities content as the HEU fuel assembly. The Monte Carlo N-Particle code (MCNP) and the point kinetics code TREKIN were used in the analyses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steiner, J.L.; Lime, J.F.; Elson, J.S.
One-dimensional TRAC transient calculations of the process inherent ultimate safety (PIUS) advanced reactor design were performed for a pump-trip SCRAM. The TRAC calculations showed that the reactor power response and shutdown were in qualitative agreement with the one-dimensional analyses presented in the PIUS Preliminary Safety Information Document (PSID) submitted by Asea Brown Boveri (ABB) to the US Nuclear Regulatory Commission for preapplication safety review. The PSID analyses were performed with the ABB-developed RIGEL code. The TRAC-calculated phenomena and trends were also similar to those calculated with another one-dimensional PIUS model, the Brookhaven National Laboratory developed PIPA code. A TRAC pump-trip SCRAM transient has also been calculated with a TRAC model containing a multi-dimensional representation of the PIUS internal flow structures and core region. The results obtained using the TRAC fully one-dimensional PIUS model are compared to the RIGEL, PIPA, and TRAC multi-dimensional results.
Reiche, Kristin; Kasack, Katharina; Schreiber, Stephan; Lüders, Torben; Due, Eldri U.; Naume, Bjørn; Riis, Margit; Kristensen, Vessela N.; Horn, Friedemann; Børresen-Dale, Anne-Lise; Hackermüller, Jörg; Baumbusch, Lars O.
2014-01-01
Breast cancer, the second leading cause of cancer death in women, is a highly heterogeneous disease, characterized by distinct genomic and transcriptomic profiles. Transcriptome analyses have prevalently assessed protein-coding genes; however, the majority of the mammalian genome is expressed in numerous non-coding transcripts. Emerging evidence supports that many of these non-coding RNAs are specifically expressed during development, tumorigenesis, and metastasis. The focus of this study was to investigate the expression features and molecular characteristics of long non-coding RNAs (lncRNAs) in breast cancer. We investigated 26 breast tumor and 5 normal tissue samples utilizing a custom expression microarray enclosing probes for mRNAs as well as novel and previously identified lncRNAs. We identified more than 19,000 unique regions significantly differentially expressed between normal and breast tumor tissue; half of these regions were non-coding, without any evidence for functional open reading frames or sequence similarity to known proteins. The identified non-coding regions were primarily located in introns (53%) or in the intergenic space (33%), frequently orientated in antisense direction of protein-coding genes (14%), and commonly distributed at promoter, transcription factor binding, or enhancer sites. Analyzing the most diverse mRNA breast cancer subtypes, Basal-like versus Luminal A and B, resulted in 3,025 significantly differentially expressed unique loci, including 682 (23%) for non-coding transcripts. A notable number of differentially expressed protein-coding genes displayed non-synonymous expression changes compared to their nearest differentially expressed lncRNA, including an antisense lncRNA strongly anticorrelated to the mRNA coding for histone deacetylase 3 (HDAC3), which was investigated in more detail. Previously identified chromatin-associated lncRNAs (CARs) were predominantly downregulated in breast tumor samples, including CARs located in the protein-coding genes for CALD1, FTX, and HNRNPH1. In conclusion, a number of differentially expressed lncRNAs have been identified with relation to cancer-related protein-coding genes. PMID:25264628
Validation of the WIMSD4M cross-section generation code with benchmark results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leal, L.C.; Deen, J.R.; Woodruff, W.L.
1995-02-01
The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment for Research and Test Reactors (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the procedure to generate cross-section libraries for reactor analyses and calculations utilizing the WIMSD4M code. To do so, the results of calculations performed with group cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected critical spheres, the TRX critical experiments, and calculations of a modified Los Alamos highly-enriched heavy-water-moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.
Mason, Marc A; Kuczmarski, Marie Fanelli; Allegro, Deanne; Zonderman, Alan B; Evans, Michele K
2016-01-01
Objective: Analysing dietary data to capture how individuals typically consume foods is dependent on the coding variables used. Individual foods consumed simultaneously, like coffee with milk, are given codes to identify these combinations. Our literature review revealed a lack of discussion about using combination codes in analysis. The present study identified foods consumed at mealtimes and by race when combination codes were or were not utilized. Design: Duplicate analysis methods were performed on separate data sets. The original data set consisted of all foods reported; each food was coded as if it was consumed individually. The revised data set was derived from the original data set by first isolating coded foods consumed as individual items from those foods consumed simultaneously and assigning a code to designate a combination. Foods assigned a combination code, like pancakes with syrup, were aggregated and associated with a food group, defined by the major food component (i.e. pancakes), and then appended to the isolated coded foods. Setting: Healthy Aging in Neighborhoods of Diversity across the Life Span study. Subjects: African-American and White adults with two dietary recalls (n 2177). Results: Differences existed in lists of foods most frequently consumed by mealtime and race when comparing results based on original and revised data sets. African Americans reported consumption of sausage/luncheon meat and poultry, while ready-to-eat cereals and cakes/doughnuts/pastries were reported by Whites on recalls. Conclusions: Use of combination codes provided a more accurate representation of how foods were consumed by populations. This information is beneficial when creating interventions and exploring diet-health relationships. PMID:25435191
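The aggregation step the Design section describes can be sketched in a few lines. The field names, the toy recall records, and the major-component rule below are assumptions for illustration, not the study's actual coding manual.

```python
# Illustrative sketch of the "combination code" idea: foods reported as eaten
# together (e.g. pancakes with syrup) are collapsed into one record keyed by
# the major food component. Field names and data are hypothetical.
from collections import defaultdict

recalls = [
    {"meal_id": 1, "food": "pancakes", "combo_id": "A", "major": True},
    {"meal_id": 1, "food": "syrup",    "combo_id": "A", "major": False},
    {"meal_id": 1, "food": "coffee",   "combo_id": None, "major": True},
]

combos, singles = defaultdict(list), []
for item in recalls:
    (combos[item["combo_id"]] if item["combo_id"] else singles).append(item)

for cid, items in combos.items():
    major = next(i["food"] for i in items if i["major"])
    print(f"combination {cid}: reported as '{major}' "
          f"({' + '.join(i['food'] for i in items)})")
for item in singles:
    print(f"individual item: {item['food']}")
```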
NASA Astrophysics Data System (ADS)
Villiger, Arturo; Schaer, Stefan; Dach, Rolf; Prange, Lars; Jäggi, Adrian
2017-04-01
It is common to handle code biases in Global Navigation Satellite System (GNSS) data analysis as conventional differential code biases (DCBs): P1-C1, P1-P2, and P2-C2. Due to the increasing number of signals and systems in conjunction with various tracking modes for the different signals (as defined in the RINEX 3 format), the number of DCBs would increase drastically and the bookkeeping becomes almost unbearable. The Center for Orbit Determination in Europe (CODE) has thus changed its processing scheme to observable-specific signal biases (OSB). This means that for each observation involved, all related satellite and receiver biases are considered. The OSB contributions from various ionosphere analyses (geometry-free linear combination) using different observables and frequencies and from clock analyses (ionosphere-free linear combination) are then combined on normal equation level. In this way, one consistent set of OSB values per satellite and receiver can be obtained that contains all information needed for GNSS-related processing. This advanced procedure of code bias handling is now also applied to the IGS (International GNSS Service) MGEX (Multi-GNSS Experiment) processing at CODE. Results for the biases from the legacy IGS solution as well as the CODE MGEX processing (considering GPS, GLONASS, Galileo, BeiDou, and QZSS) are presented. The consistency with the traditional method is confirmed and the new results are discussed regarding their long-term stability. When processing code data, it is essential to know the true observable types in order to correct for the associated biases. CODE has been verifying the receiver tracking technologies for GPS based on estimated DCB multipliers (for the RINEX 2 case). With the change to OSB, the original verification approach was extended to search for the best-fitting observable types based on known OSB values. In essence, a multiplier parameter is estimated for each involved GNSS observable type. This implies that we could recover, for receivers tracking a combination of signals, even the factors of these combinations. The verification of the observable types is crucial to identify the correct observable types of RINEX 2 data (which, unlike RINEX 3, does not record the signal modulation). The correct information on the used observable types is essential for precise point positioning (PPP) applications and GNSS ambiguity resolution. Multi-GNSS OSBs and verified receiver tracking modes are essential to get the best possible multi-GNSS solutions for geodynamic purposes and other applications.
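The relation between OSBs and conventional DCBs can be illustrated with a toy calculation: each DCB is simply the difference of the two underlying observable-specific biases. The bias values below are made-up numbers; the observable names follow the RINEX 3 convention (C1C = C/A code on L1, C1W = P-code on L1, C2W = P-code on L2).

```python
# Sketch of the OSB/DCB consistency relation with illustrative bias values.
osb = {"C1C": 2.31, "C1W": 1.10, "C2W": -0.45}  # ns, hypothetical satellite OSBs

def dcb(obs_a, obs_b):
    """Conventional differential code bias reconstructed from two OSBs."""
    return osb[obs_a] - osb[obs_b]

print(f"P1-C1 DCB = {dcb('C1W', 'C1C'):+.2f} ns")
print(f"P1-P2 DCB = {dcb('C1W', 'C2W'):+.2f} ns")
```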
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, B.C.J.; Sha, W.T.; Doria, M.L.
1980-11-01
The governing equations, i.e., conservation equations for mass, momentum, and energy, are solved as a boundary-value problem in space and an initial-value problem in time. The BODYFIT-1FE code uses the technique of boundary-fitted coordinate systems, where all the physical boundaries are transformed to be coincident with constant coordinate lines in the transformed space. By using this technique, one can prescribe boundary conditions accurately without interpolation. The transformed governing equations in terms of the boundary-fitted coordinates are then solved by using an implicit cell-by-cell procedure with a choice of either central or upwind convective derivatives. It is a true benchmark rod-bundle code without invoking any assumptions in the case of laminar flow. However, for turbulent flow, some empiricism must be employed due to the closure problem of turbulence modeling. The detailed velocity and temperature distributions calculated from the code can be used to benchmark and calibrate empirical coefficients employed in subchannel codes and porous-medium analyses.
CASL VMA FY16 Milestone Report (L3:VMA.VUQ.P13.07) Westinghouse Mixing with COBRA-TF
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon, Natalie
2016-09-30
COBRA-TF (CTF) is a low-resolution code currently maintained as CASL's subchannel analysis tool. CTF operates as a two-phase, compressible code over a mesh comprised of subchannels and axially discretized nodes. In part because CTF is a low-resolution code, simulation run time is not computationally expensive, only on the order of minutes. High-resolution codes such as STAR-CCM+ can be used to train lower-fidelity codes such as CTF. Unlike STAR-CCM+, CTF has no turbulence model, only a two-phase turbulent mixing coefficient, β. β can be set to a constant value or calculated in terms of Reynolds number using an empirical correlation. Results from STAR-CCM+ can be used to inform the appropriate value of β. Once β is calibrated, CTF runs can be an inexpensive alternative to costly STAR-CCM+ runs for scoping analyses. Based on the results of CTF runs, STAR-CCM+ can be run for specific parameters of interest. CASL areas of application are CIPS for single-phase analysis and DNB-CTF for two-phase analysis.
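The calibration workflow described above reduces to a one-parameter fit. The sketch below uses a placeholder mixing model and stand-in reference values (neither is actual CTF or STAR-CCM+ output) to show β being chosen by least squares against high-fidelity results.

```python
# Sketch: calibrate a turbulent mixing coefficient beta so a CTF-like mixing
# prediction matches reference (e.g. STAR-CCM+) results. The model function
# and numbers are placeholders, not CTF physics or real CFD output.
import numpy as np

def mixing_model(beta, reynolds):
    # stand-in for the subchannel mixing response at a given Re
    return beta * np.sqrt(reynolds)

re_points = np.array([1e4, 5e4, 1e5])
reference = np.array([0.52, 1.11, 1.63])   # placeholder high-fidelity values

betas = np.linspace(0.001, 0.02, 200)      # least-squares scan over beta
errors = [np.sum((mixing_model(b, re_points) - reference) ** 2) for b in betas]
print(f"calibrated beta ~ {betas[int(np.argmin(errors))]:.4f}")
```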
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vance, J.N.; Holderness, J.H.; James, D.W.
1992-12-01
Waste stream scaling factors based on sampling programs are vulnerable to one or more of the following factors: sample representativeness, analytic accuracy, and measurement sensitivity. As an alternative to sample analyses or as a verification of the sampling results, this project proposes the use of the RADSOURCE code, which accounts for the release of fuel-source radionuclides. Once the release rates of these nuclides from fuel are known, the code develops scaling factors for waste streams based on easily measured Cobalt-60 (Co-60) and Cesium-137 (Cs-137). The project team developed mathematical models to account for the appearance rate of 10 CFR 61 radionuclides in reactor coolant. They based these models on the chemistry and nuclear physics of the radionuclides involved. Next, they incorporated the models into a computer code that calculates plant waste stream scaling factors based on reactor coolant gamma-isotopic data. Finally, the team performed special sampling at 17 reactors to validate the models in the RADSOURCE code.
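The scaling-factor idea itself is a simple ratio: a hard-to-measure nuclide's activity divided by that of an easily measured key nuclide. The sketch below uses placeholder activities and an assumed mapping of nuclides to key nuclides (activation products scaled on Co-60, fission products on Cs-137), not plant data or RADSOURCE logic.

```python
# Illustrative sketch of waste stream scaling factors from coolant activities.
# Activities are placeholder numbers in arbitrary units.
coolant = {"Co-60": 4.0e3, "Cs-137": 1.2e3, "Ni-63": 8.0e2, "Sr-90": 3.6e1}

SCALE_KEY = {"Ni-63": "Co-60",   # activation product -> scale on Co-60
             "Sr-90": "Cs-137"}  # fission product    -> scale on Cs-137

for nuclide, key in SCALE_KEY.items():
    sf = coolant[nuclide] / coolant[key]
    print(f"{nuclide}/{key} scaling factor = {sf:.3f}")
```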
Analysis of SMA Hybrid Composite Structures in MSC.Nastran and ABAQUS
NASA Technical Reports Server (NTRS)
Turner, Travis L.; Patel, Hemant D.
2005-01-01
A thermoelastic constitutive model for shape memory alloy (SMA) actuators and SMA hybrid composite (SMAHC) structures was recently implemented in the commercial finite element codes MSC.Nastran and ABAQUS. The model may be easily implemented in any code that has the capability for analysis of laminated composite structures with temperature-dependent material properties. The model is also relatively easy to use and requires input of only fundamental engineering properties. A brief description of the model is presented, followed by discussion of implementation and usage in the commercial codes. Results are presented from static and dynamic analysis of SMAHC beams of two types: a beam clamped at each end and a cantilever beam. Nonlinear static (post-buckling) and random response analyses are demonstrated for the first specimen. Static deflection (shape) control is demonstrated for the cantilever beam. Approaches for modeling SMAHC material systems with embedded SMA in ribbon and small round wire product forms are demonstrated and compared. The results from the commercial codes are compared to those from a research code as validation of the commercial implementations; excellent correlation is achieved in all cases.
Second Generation Integrated Composite Analyzer (ICAN) Computer Code
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Ginty, Carol A.; Sanfeliz, Jose G.
1993-01-01
This manual updates the original 1986 NASA TP-2515, Integrated Composite Analyzer (ICAN) Users and Programmers Manual. The various enhancements and newly added features are described to enable the user to prepare the appropriate input data to run this updated version of the ICAN code. For reference, the micromechanics equations are provided in an appendix and should be compared to those in the original manual for modifications. A complete output for a sample case is also provided in a separate appendix. The input to the code includes constituent material properties, factors reflecting the fabrication process, and laminate configuration. The code performs micromechanics, macromechanics, and laminate analyses, including the hygrothermal response of polymer-matrix-based fiber composites. The output includes the various ply and composite properties, the composite structural response, and the composite stress analysis results with details on failure. The code is written in FORTRAN 77 and can be used efficiently as a self-contained package (or as a module) in complex structural analysis programs. The input-output format has changed considerably from the original version of ICAN and is described extensively through the use of a sample problem.
Transient Ejector Analysis (TEA) code user's guide
NASA Technical Reports Server (NTRS)
Drummond, Colin K.
1993-01-01
A FORTRAN computer program for the semi-analytic prediction of unsteady thrust-augmenting ejector performance has been developed, based on a theoretical analysis for ejectors. That analysis blends classic self-similar turbulent jet descriptions with control-volume mixing region elements. Division of the ejector into an inlet, diffuser, and mixing region allowed flexibility in the modeling of the physics for each region. In particular, the inlet and diffuser analyses are simplified by a quasi-steady analysis, justified by the assumption that pressure is the forcing function in those regions. Only the mixing region is assumed to be dominated by viscous effects. The present work provides an overview of the code structure, a description of the required input and output data file formats, and the results for a test case. Since there are limitations to the code for applications outside the bounds of the test case, the user should consider TEA as a research code (not as a production code), designed specifically as an implementation of the proposed ejector theory. Program error flags are discussed, and some diagnostic routines are presented.
Xu, Yun; Muhamadali, Howbeer; Sayqal, Ali; Dixon, Neil; Goodacre, Royston
2016-10-28
Partial least squares (PLS) is one of the most commonly used supervised modelling approaches for analysing multivariate metabolomics data. PLS is typically employed as either a regression model (PLS-R) or a classification model (PLS-DA). However, in metabolomics studies it is common to investigate multiple, potentially interacting, factors simultaneously following a specific experimental design. Such data often cannot be considered as a "pure" regression or a classification problem. Nevertheless, these data have often still been treated as a regression or classification problem and this could lead to ambiguous results. In this study, we investigated the feasibility of designing a hybrid target matrix Y that better reflects the experimental design than simple regression or binary class membership coding commonly used in PLS modelling. The new design of Y coding was based on the same principle used by structural modelling in machine learning techniques. Two real metabolomics datasets were used as examples to illustrate how the new Y coding can improve the interpretability of the PLS model compared to classic regression/classification coding.
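One way to picture the hybrid Y coding is a target matrix with one column per design factor, rather than a single regression target or one-hot class labels. The sketch below is a minimal illustration assuming a two-factor design (group by time point) and random placeholder data; it is not the paper's datasets or its exact coding scheme.

```python
# Sketch of a hybrid PLS target matrix Y encoding two design factors at once.
# Data are random placeholders; the design is an assumed example.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 50))      # 12 samples x 50 metabolite features

group = np.repeat([0, 1], 6)       # factor 1: control vs. treated
time = np.tile([1, 2, 3], 4)       # factor 2: sampling time point

# hybrid target: one binary column for group, one ordinal column for time
Y = np.column_stack([group, time]).astype(float)

pls = PLSRegression(n_components=2).fit(X, Y)
print(pls.predict(X[:2]))          # predictions for both factors at once
```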
ICD-10 procedure codes produce transition challenges.
Boyd, Andrew D; Li, Jianrong 'John'; Kenost, Colleen; Zaim, Samir Rachid; Krive, Jacob; Mittal, Manish; Satava, Richard A; Burton, Michael; Smith, Jacob; Lussier, Yves A
2018-01-01
The transition of procedure coding from ICD-9-CM-Vol-3 to ICD-10-PCS has generated problems for the medical community at large, resulting from the lack of clarity required to integrate two non-congruent coding systems. We hypothesized that quantifying these issues with network topology analyses offers a better understanding of the issues, and therefore we developed solutions (online tools) to empower hospital administrators and researchers to address these challenges. Five topologies were identified: "identity" (I), "class-to-subclass" (C2S), "subclass-to-class" (S2C), "convoluted" (C), and "no mapping" (NM). The procedure codes in the 2010 Illinois Medicaid dataset (3,290 patients, 116 institutions) were categorized as C=55%, C2S=40%, I=3%, NM=2%, and S2C=1%. The majority of the problematic and ambiguous mappings (convoluted) pertained to operations in ophthalmology, cardiology, urology, gyneco-obstetrics, and dermatology. Finally, the algorithms were expanded into a user-friendly tool to identify problematic topologies and specify lists of procedural codes utilized by medical professionals and researchers for mitigating error-prone translations, simplifying research, and improving quality. http://www.lussiergroup.org/transition-to-ICD10PCS
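A simplified version of the topology classification can be derived from a many-to-many mapping table by counting each code's forward and reverse images. The sketch below is an illustration only: the toy mapping pairs and the classification thresholds are assumptions, and the paper's network analysis is more involved.

```python
# Sketch: classify ICD-9-CM -> ICD-10-PCS mapping topologies from mapping pairs.
from collections import defaultdict

pairs = [("81.54", "0SRC0J9"), ("81.54", "0SRD0JA"),   # one-to-many (toy data)
         ("37.22", "4A023N7"),                         # one-to-one
         ("39.29", "041K09K"), ("39.28", "041K09K")]   # many-to-one

fwd, rev = defaultdict(set), defaultdict(set)
for i9, i10 in pairs:
    fwd[i9].add(i10)
    rev[i10].add(i9)

def topology(i9):
    targets = fwd[i9]
    sources = {s for t in targets for s in rev[t]}
    if not targets:
        return "no mapping"
    if len(targets) == 1 and len(sources) == 1:
        return "identity"
    if len(targets) > 1 and len(sources) == 1:
        return "class-to-subclass"
    if len(targets) == 1 and len(sources) > 1:
        return "subclass-to-class"
    return "convoluted"

for code in ["81.54", "37.22", "39.29"]:
    print(code, "->", topology(code))
```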
Kopčavar Guček, Nena; Petek, Davorina; Švab, Igor; Selič, Polona
2016-03-01
In 1996 the World Health Organization declared intimate partner violence (IPV) the most important public health problem. Meta-analyses in 2013 showed every third female globally had been a victim of violence. Experts find screening controversial; family medicine is the preferred environment for identifying victims of violence, but barriers on both sides prevent patients from discussing it with doctors. In July 2014, a qualitative study was performed through semi-structured interviews with ten family doctors of different ages and gender, working in rural or urban environments. Sound recordings of the interviews were transcribed, and the record verified. The data were interpreted using content analysis. A coding scheme was developed and later verified and analysed by two independent researchers. The text of the interviews was analysed according to the coding scheme. Two coding schemes were developed: one for screening, and the other for the active detection of IPV. The main themes emerging as barriers to screening were lack of time, staff turnover, inadequate finance, ignorance of a clear definition, poor commitment to screening, obligatory follow-up, risk of deterioration of the doctor-patient relationship, and insincerity on the part of the patient. Additionally, cultural aspects of violence, uncertainty/helplessness, fear, lack of competence and qualifications, autonomy/negative experience, and passive role/stigma/fear on the part of the patients were barriers to active detection. All the participating doctors had had previous experience with active detection of IPV and were aware of its importance. Due to several barriers to screening for violence, they preferred active detection.
Zahran, Sammy; Magzamen, Sheryl; Breunig, Ian M; Mielke, Howard W
2014-08-01
Previous studies link maternal blood lead (Pb) levels and pregnancy-related hypertensive disorders. Our objective was to assess the relationship between neighborhood soil Pb and maternal eclampsia risk. Zip-code-summarized high-density soil survey data of New Orleans, collected before and after Hurricanes Katrina and Rita (HKR), were merged with pregnancy outcome data on 75,501 mothers from the Louisiana office of public health. Cross-sectional logistic regression analyses were performed testing the association between pre-HKR accumulation of Pb in soils in thirty-two neighborhoods and eclampsia risk. We then examined whether measured declines in soil Pb following the flooding of the city resulted in corresponding reductions of eclampsia risk. Cross-sectional analyses show that a one standard deviation increase in soil Pb increases the odds of eclampsia by a factor of 1.48 (95% CI: 1.31, 1.66). Mothers in zip code areas with soil Pb > 333 mg/kg were 4.00 (95% CI: 3.00, 5.35) times more likely to experience eclampsia than mothers residing in neighborhoods with soil Pb < 50 mg/kg. Difference-in-differences analyses capturing the exogenous reduction in soil Pb following the 2005 flooding of New Orleans indicate that mothers residing in zip codes experiencing a decrease in soil Pb (-387.9 to -33.6 mg/kg) experienced a significant decline in eclampsia risk (OR=0.619; 95% CI: 0.397, 0.963). Mothers residing in neighborhoods with high accumulation of Pb in soils are at heightened risk of experiencing eclampsia.
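The difference-in-differences estimate reported above comes from the interaction term of a regression on period and exposure-change indicators. A minimal sketch of that setup, using simulated placeholder data rather than the Louisiana records, might look as follows.

```python
# Sketch of a difference-in-differences logistic model: eclampsia odds as a
# function of period (pre/post hurricanes), a soil-Pb-decline indicator, and
# their interaction, whose exponentiated coefficient is the DiD odds ratio.
# Data are simulated placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
post = rng.integers(0, 2, n)       # 0 = pre-HKR birth, 1 = post-HKR birth
decline = rng.integers(0, 2, n)    # 1 = zip code with a large soil-Pb decline
logit = -4.0 + 0.1 * post + 0.3 * decline - 0.5 * post * decline
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([post, decline, post * decline]))
fit = sm.Logit(y, X).fit(disp=False)
print("DiD odds ratio:", np.exp(fit.params[3]))  # recovers roughly exp(-0.5)
```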
The gene coding for the B cell surface protein CD19 is localized on human chromosome 16p11.
Stapleton, P; Kozmik, Z; Weith, A; Busslinger, M
1995-02-01
The CD19 gene codes for one of the earliest markers of the human B cell lineage and is a target for the B lymphoid-specific transcription factor BSAP (Pax-5). The transmembrane protein CD19 has been implicated in controlling proliferation of mature B lymphocytes by modulating signal transduction through the antigen receptor. In this study, we have employed Southern blot and fluorescence in situ hybridization analyses to localize the CD19 gene to human chromosome 16p11.
A Study to Determine the Need for a Standard Limiting the Horsepower of Recreational Boats.
1978-09-01
[Abstract partially garbled in scanning; only fragments are recoverable. The report tabulates fatal and non-fatal accidents by cause (e.g. loss of control, no attempt to avoid collision, attempted to avoid but unable), describes the accident data base and the computer model designed to aid in organizing and analyzing the data, and presents the results of the analyses, including the non-powering-related accident sample, the coded information and coding form, and an effectiveness evaluation of the current standard.]
Impulse Response Measurements Over Space-Earth Paths Using the GPS Coarse/Acquisition Codes
NASA Technical Reports Server (NTRS)
Lemmon, J. J.; Papazian, P. B.
1995-01-01
The impulse responses of radio transmission channels over space-earth paths were measured using the coarse/acquisition code signals from the Global Positioning System of satellites. The data acquisition system and signal processing techniques used to develop the impulse responses are described. Examples of impulse response measurements are presented. The results indicate that this measurement approach enables detection of multipath signals that are 20 dB or more below the power of the direct arrival. Channel characteristics that could be investigated with additional measurements and analyses are discussed.
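The measurement principle rests on the sharp autocorrelation of the spreading code: cross-correlating the received signal with a code replica approximates the channel impulse response. The sketch below illustrates the idea with a random ±1 sequence standing in for a real 1023-chip C/A (Gold) code; real C/A codes have lower correlation sidelobes, which is what permits detecting echoes 20 dB or more below the direct arrival.

```python
# Conceptual sketch: channel impulse response via code cross-correlation.
# A random +/-1 sequence stands in for a real C/A code; the channel is simulated.
import numpy as np

rng = np.random.default_rng(2)
code = rng.choice([-1.0, 1.0], size=1023)   # stand-in for a 1023-chip C/A code

# simulated channel: direct path plus one echo about 10 dB down, 5 chips late
received = code + 0.3 * np.roll(code, 5)

# circular cross-correlation via FFT approximates the impulse response
corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code))).real
corr /= code.size
for d in sorted(np.argsort(corr)[::-1][:2]):
    print(f"path at delay {d} chips, {20 * np.log10(abs(corr[d])):+.1f} dB")
```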
Comparisons of Flutter Analyses for an Experimental Fan
NASA Technical Reports Server (NTRS)
Bakhle, Milind A.; Reddy, T. S. R.; Stefko, George L.
2010-01-01
Two propulsion aeroelasticity codes were used to model the aeroelastic characteristics of an experimental forward-swept fan that encountered flutter during wind tunnel testing. Both of these three-dimensional codes model the unsteady flowfield due to blade vibrations using the Navier-Stokes equations. In the first approach, the unsteady flow equations are solved using an implicit time-marching scheme. In the second approach, the unsteady flow equations are converted to a harmonic balance form and solved using a pseudo-time marching method. This paper describes the flutter calculations and compares the results to experimental measurements.
Manual of phosphoric acid fuel cell power plant optimization model and computer program
NASA Technical Reports Server (NTRS)
Lu, C. Y.; Alkasab, K. A.
1984-01-01
An optimized cost and performance model for a phosphoric acid fuel cell power plant system was derived and developed into a modular FORTRAN computer code. Cost, energy, mass, and electrochemical analyses were combined to develop a mathematical model for optimizing the steam-to-methane ratio in the reformer, the hydrogen utilization in the PAFC, and the number of plates per stack. The nonlinear programming code, COMPUTE, was used to solve this model, in which a mixed penalty function method combined with Hooke-Jeeves pattern search was chosen to evaluate this specific optimization problem.
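Hooke-Jeeves pattern search, the optimizer named above, alternates exploratory coordinate moves with pattern (extrapolation) moves and shrinks the step when neither improves the objective. A compact sketch with a placeholder objective follows; the penalty-function machinery and the actual plant model are omitted.

```python
# Minimal sketch of Hooke-Jeeves pattern search on a placeholder objective.
import numpy as np

def explore(f, x, step):
    """Exploratory moves: try +/- step along each coordinate, keep improvements."""
    x = x.copy()
    for i in range(x.size):
        for s in (+step, -step):
            trial = x.copy()
            trial[i] += s
            if f(trial) < f(x):
                x = trial
                break
    return x

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6):
    base = np.asarray(x0, dtype=float)
    while step > tol:
        x = explore(f, base, step)
        if f(x) < f(base):
            # pattern move: extrapolate along the successful direction, re-explore
            candidate = explore(f, x + (x - base), step)
            base = candidate if f(candidate) < f(x) else x
        else:
            step *= shrink  # no improvement anywhere: refine the mesh
    return base

rosenbrock = lambda v: (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2
print(hooke_jeeves(rosenbrock, [-1.2, 1.0]))  # converges toward (1, 1)
```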
Compositional correlations in the chicken genome.
Musto, H; Romero, H; Zavala, A; Bernardi, G
1999-09-01
This paper analyses the compositional correlations that hold in the chicken genome. Significant linear correlations were found among the regions studied (coding sequences and their first, second, and third codon positions, 5' and 3' flanking regions, and introns), as is the case in the human genome. We found that these compositional correlations are not limited to global GC levels but extend even to individual bases. Furthermore, an analysis of 1037 coding sequences has confirmed a correlation among GC(3), GC(2), and GC(1). The implications of these results are discussed.
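The correlated quantities are simple codon-position GC fractions. The sketch below computes GC1-GC3 for a few toy sequences (not the chicken data set) and a Pearson correlation between GC1 and GC3.

```python
# Sketch: GC content at codon positions 1-3 and a GC1/GC3 Pearson correlation.
# Sequences are toy examples.
def gc_by_position(cds):
    """Return (GC1, GC2, GC3) fractions for an in-frame coding sequence."""
    return tuple(sum(b in "GC" for b in cds[p::3]) / len(cds[p::3])
                 for p in range(3))

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

seqs = ["ATGGCCGCATTGGCG", "ATGACGGGCCCGTAG", "ATGTTTAAATTTTAA"]
stats = [gc_by_position(s) for s in seqs]
print("per-sequence (GC1, GC2, GC3):", stats)
print("GC1 vs GC3 Pearson r =",
      round(pearson([s[0] for s in stats], [s[2] for s in stats]), 2))
```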
Gene and genon concept: coding versus regulation
2007-01-01
We analyse here the definition of the gene in order to distinguish, on the basis of modern insight in molecular biology, what the gene is coding for, namely a specific polypeptide, and how its expression is realized and controlled. Before the coding role of the DNA was discovered, a gene was identified with a specific phenotypic trait, from Mendel through Morgan up to Benzer. Subsequently, however, molecular biologists ventured to define a gene at the level of the DNA sequence in terms of coding. As is becoming ever more evident, the relations between information stored at DNA level and functional products are very intricate, and the regulatory aspects are as important and essential as the information coding for products. This approach led, thus, to a conceptual hybrid that confused coding, regulation and functional aspects. In this essay, we develop a definition of the gene that once again starts from the functional aspect. A cellular function can be represented by a polypeptide or an RNA. In the case of the polypeptide, its biochemical identity is determined by the mRNA prior to translation, and that is where we locate the gene. The steps from specific, but possibly separated sequence fragments at DNA level to that final mRNA then can be analysed in terms of regulation. For that purpose, we coin the new term “genon”. In that manner, we can clearly separate product and regulative information while keeping the fundamental relation between coding and function without the need to introduce a conceptual hybrid. In mRNA, the program regulating the expression of a gene is superimposed onto and added to the coding sequence in cis - we call it the genon. The complementary external control of a given mRNA by trans-acting factors is incorporated in its transgenon. A consequence of this definition is that, in eukaryotes, the gene is, in most cases, not yet present at DNA level. Rather, it is assembled by RNA processing, including differential splicing, from various pieces, as steered by the genon. It emerges finally as an uninterrupted nucleic acid sequence at mRNA level just prior to translation, in faithful correspondence with the amino acid sequence to be produced as a polypeptide. After translation, the genon has fulfilled its role and expires. The distinction between the protein coding information as materialised in the final polypeptide and the processing information represented by the genon allows us to set up a new information theoretic scheme. The standard sequence information determined by the genetic code expresses the relation between coding sequence and product. Backward analysis asks from which coding region in the DNA a given polypeptide originates. The (more interesting) forward analysis asks in how many polypeptides of how many different types a given DNA segment is expressed. This concerns the control of the expression process for which we have introduced the genon concept. Thus, the information theoretic analysis can capture the complementary aspects of coding and regulation, of gene and genon. PMID:18087760
Nonlinear heat transfer and structural analyses of SSME turbine blades
NASA Technical Reports Server (NTRS)
Abdul-Aziz, A.; Kaufman, A.
1987-01-01
Three-dimensional nonlinear finite-element heat transfer and structural analyses were performed for the first-stage high-pressure fuel turbopump blade of the space shuttle main engine (SSME). Directionally solidified (DS) MAR-M 246 material properties were considered for the analyses. Analytical conditions were based on a typical test stand engine cycle. Blade temperature and stress-strain histories were calculated using the MARC finite-element computer code. The study was undertaken to assess the structural response of an SSME turbine blade and to gain greater understanding of blade damage mechanisms, convective cooling effects, and thermal-mechanical effects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walter, Matthew; Yin, Shengjun; Stevens, Gary
2012-01-01
In past years, the authors have undertaken various studies of nozzles in both boiling water reactors (BWRs) and pressurized water reactors (PWRs) located in the reactor pressure vessel (RPV) adjacent to the core beltline region. Those studies described stress and fracture mechanics analyses performed to assess various RPV nozzle geometries, which were selected based on their proximity to the core beltline region, i.e., those nozzle configurations that are located close enough to the core region such that they may receive sufficient fluence prior to end-of-life (EOL) to require evaluation of embrittlement as part of the RPV analyses associated with pressure-temperature (P-T) limits. In this paper, additional stress and fracture analyses are summarized that were performed for additional PWR nozzles with the following objectives: to expand the population of PWR nozzle configurations evaluated, which was limited in the previous work to just two nozzles (one inlet and one outlet nozzle); to model and understand differences in stress results obtained for an internal pressure load case using a two-dimensional (2-D) axisymmetric finite element model (FEM) vs. a three-dimensional (3-D) FEM for these PWR nozzles, in particular the ovalization (stress concentration) effect of two intersecting cylinders, which is typical of RPV nozzle configurations; and to investigate the applicability of previously recommended linear elastic fracture mechanics (LEFM) hand solutions for calculating the Mode I stress intensity factor for a postulated nozzle corner crack under pressure loading for these PWR nozzles. These analyses were performed to further expand earlier work completed to support potential revision and refinement of Title 10 of the U.S. Code of Federal Regulations (CFR), Part 50, Appendix G, Fracture Toughness Requirements, and are intended to supplement similar evaluations of nozzles presented at the 2008, 2009, and 2011 Pressure Vessels and Piping (PVP) Conferences. This work is also relevant to the ongoing efforts of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel (B&PV) Code, Section XI, Working Group on Operating Plant Criteria (WGOPC) to incorporate nozzle fracture mechanics solutions into a revision to ASME B&PV Code, Section XI, Nonmandatory Appendix G.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinov, Velimir; O'Malley, Daniel; Lin, Youzuo
2016-07-01
Mads.jl (Model analysis and decision support in Julia) is a code that streamlines the process of using data and models for analysis and decision support. It is based on another open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). Mads.jl can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. It enables a number of data- and model-based analyses including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. The code also can use a series of alternative adaptive computational techniques for Bayesian sampling, Monte Carlo, and Bayesian Information-Gap Decision Theory. The code is implemented in the Julia programming language and has high-performance (parallel) and memory management capabilities. The code uses a series of third-party modules developed by others. The code development will also include contributions to the existing third-party modules written in Julia; these contributions will be important for the efficient implementation of the algorithms used by Mads.jl. The code also uses a series of LANL-developed modules written by Dan O'Malley; these modules will also be part of the Mads.jl release. Mads.jl will be released under the GPL v3 license. The code will be distributed as a Git repo at gitlab.com and github.com. The Mads.jl manual and documentation will be posted at madsjulia.lanl.gov.
Shielding evaluation for solar particle events using MCNPX, PHITS and OLTARIS codes
NASA Astrophysics Data System (ADS)
Aghara, S. K.; Sriprisan, S. I.; Singleterry, R. C.; Sato, T.
2015-01-01
Detailed analyses of Solar Particle Events (SPE) were performed to calculate primary and secondary particle spectra behind aluminum, at various thicknesses in water. The simulations were based on the Monte Carlo (MC) radiation transport codes MCNPX 2.7.0 and PHITS 2.64, and the space radiation analysis website called OLTARIS (On-Line Tool for the Assessment of Radiation in Space) version 3.4, which uses the deterministic code HZETRN for transport. The study investigates the impact of SPE spectra transported through a 10 or 20 g/cm2 Al shield followed by a 30 g/cm2 water slab. Four historical SPE events were selected and used as input source spectra; particle differential spectra for protons, neutrons, and photons are presented. The total particle fluence as a function of depth is presented. In addition to particle flux, the dose and dose equivalent values are calculated and compared between the codes and with other published results. Overall, the particle fluence spectra from all three codes show good agreement, with the MC codes agreeing more closely with each other than with the OLTARIS results. The neutron particle fluence from OLTARIS is lower than the results from the MC codes at lower energies (E < 100 MeV). Based on mean-square-difference analysis, the results from MCNPX and PHITS agree better for fluence, dose, and dose equivalent when compared to the OLTARIS results.
Radiation from advanced solid rocket motor plumes
NASA Technical Reports Server (NTRS)
Farmer, Richard C.; Smith, Sheldon D.; Myruski, Brian L.
1994-01-01
The overall objective of this study was to develop an understanding of solid rocket motor (SRM) plumes in sufficient detail to accurately explain the majority of plume radiation test data. Improved flowfield and radiation analysis codes were developed to accurately and efficiently account for all the factors which affect radiative heating from rocket plumes. These codes were verified by comparing predicted plume behavior with measured NASA/MSFC ASRM test data. Upon conducting a thorough review of the current state of the art of SRM plume flowfield and radiation prediction methodology and the pertinent data base, the following analyses were developed for future design use. The NOZZRAD code was developed for preliminary base heating design and Al2O3 particle optical property data evaluation using a generalized two-flux solution to the radiative transfer equation. The IDARAD code was developed for rapid evaluation of plume radiation effects using the spherical harmonics method of differential approximation to the radiative transfer equation. The FDNS CFD code with fully coupled Euler-Lagrange particle tracking was validated by comparison to predictions made with the industry-standard RAMP code for SRM nozzle flowfield analysis. The FDNS code provides the ability to analyze not only rocket nozzle flow, but also axisymmetric and three-dimensional plume flowfields with state-of-the-art CFD methodology. Procedures for conducting meaningful thermo-vision camera studies were developed.
Towards a European code of medical ethics. Ethical and legal issues.
Patuzzo, Sara; Pulice, Elisabetta
2017-01-01
The feasibility of a common European code of medical ethics is discussed, with consideration and evaluation of the difficulties such a project will face, from both the legal and ethical points of view. On the one hand, the analysis will underline the limits of a common European code of medical ethics as an instrument for harmonising national professional rules in the European context; on the other hand, we will highlight some of the potentials of this project, which could be increased and strengthened through a proper rulemaking process and through adequate and careful choice of content. We will also stress specific elements and devices that should be taken into consideration during the establishment of the code, from both procedural and content perspectives. Regarding methodological issues, the limits and potentialities of a common European code of medical ethics will be analysed first from an ethical point of view and then from a legal perspective. The aim of this paper is to clarify the framework for the potential but controversial role of the code in the European context, showing the difficulties in enforcing and harmonising national ethical rules into a European code of medical ethics.
Analysis of typical WWER-1000 severe accident scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorokin, Yu.S.; Shchekoldin, V.V.; Borisov, L.N.
2004-07-01
EDO 'Gidropress' has accumulated substantial experience in performing severe accident analyses for reactor plants with WWER using domestic and foreign codes. Important data were also obtained from calculational modeling of integrated experiments involving the melting of fuel assemblies containing real fuel. Systematizing and accounting for these data in code development and validation are extremely important, given the large uncertainty that still exists in understanding and adequately describing the phenomenology of severe accidents. This report compares analysis results for severe accidents of a reactor plant with WWER-1000 for two typical scenarios, obtained with the American MELCOR code and the Russian RATEG/SVECHA/HEFEST code. Calculated results from both codes are also compared with data from experiment FPT1, involving the melting of a fuel assembly containing real fuel, carried out at the Phebus facility (France). The obtained results are considered in the report from the viewpoint of: the adequacy of calculational modeling of separate phenomena during severe accidents of reactor plants with WWER using the above codes; the influence of uncertainties (degree of detail of calculation models, choice of model parameters, etc.); the choice of particular setup variables (options) in the codes used; and the necessity of detailed modeling of processes and phenomena as applied to the design justification of safety of reactor plants with WWER.
Changing Beliefs about Trauma: A Qualitative Study of Cognitive Processing Therapy.
Price, Jennifer L; MacDonald, Helen Z; Adair, Kathryn C; Koerner, Naomi; Monson, Candice M
2016-03-01
Controlled qualitative methods complement quantitative treatment outcome research and enable a more thorough understanding of the effects of therapy and the suspected mechanisms of action. Thematic analyses were used to examine outcomes of cognitive processing therapy (CPT) for posttraumatic stress disorder (PTSD) in a randomized controlled trial of individuals diagnosed with military-related PTSD (n = 15). After sessions 1 and 11, participants wrote "impact statements" describing their appraisals of their trauma and beliefs potentially impacted by traumatic events. Trained raters coded each of these statements using a thematic coding scheme. An analysis of thematic coding revealed positive changes over the course of therapy in participants' perspective on their trauma and their future, supporting the purported mechanisms of CPT. Implications of this research for theory and clinical practice are discussed.
AphasiaBank: a resource for clinicians.
Forbes, Margaret M; Fromm, Davida; Macwhinney, Brian
2012-08-01
AphasiaBank is a shared, multimedia database containing videos and transcriptions of ~180 aphasic individuals and 140 nonaphasic controls performing a uniform set of discourse tasks. The language in the videos is transcribed in Codes for the Human Analysis of Transcripts (CHAT) format and coded for analysis with Computerized Language ANalysis (CLAN) programs, which can perform a wide variety of language analyses. The database and the CLAN programs are freely available to aphasia researchers and clinicians for educational, clinical, and scholarly uses. This article describes the database, suggests some ways in which clinicians and clinician researchers might find these materials useful, and introduces a new language analysis program, EVAL, designed to streamline the transcription and coding processes, while still producing an extensive and useful language profile.
Heat transfer in rocket engine combustion chambers and regeneratively cooled nozzles
NASA Technical Reports Server (NTRS)
1993-01-01
A conjugate heat transfer computational fluid dynamics (CFD) model to describe regenerative cooling in the main combustion chamber and nozzle and in the injector faceplate region for a launch vehicle class liquid rocket engine was developed. An injector model for sprays which treats the fluid as a variable density, single-phase media was formulated, incorporated into a version of the FDNS code, and used to simulate the injector flow typical of that in the Space Shuttle Main Engine (SSME). Various chamber related heat transfer analyses were made to verify the predictive capability of the conjugate heat transfer analysis provided by the FDNS code. The density based version of the FDNS code with the real fluid property models developed was successful in predicting the streamtube combustion of individual injector elements.
NASA Astrophysics Data System (ADS)
The present conference on the development status of communications systems in the context of electronic warfare gives attention to topics in spread spectrum code acquisition, digital speech technology, fiber-optics communications, free space optical communications, the networking of HF systems, and applications and evaluation methods for digital speech. Also treated are issues in local area network system design, coding techniques and applications, technology applications for HF systems, receiver technologies, software development status, channel simulation/prediction methods, C3 networking, spread spectrum networks, the improvement of communication efficiency and reliability through technical control methods, mobile radio systems, and adaptive antenna arrays. Finally, communications system cost analyses, spread spectrum performance, voice and image coding, switched networks, and microwave GaAs ICs are considered.
Boundary-Layer Stability Analysis of the Mean Flows Obtained Using Unstructured Grids
NASA Technical Reports Server (NTRS)
Liao, Wei; Malik, Mujeeb R.; Lee-Rausch, Elizabeth M.; Li, Fei; Nielsen, Eric J.; Buning, Pieter G.; Chang, Chau-Lyan; Choudhari, Meelan M.
2012-01-01
Boundary-layer stability analyses of mean flows extracted from unstructured-grid Navier-Stokes solutions have been performed. A procedure has been developed to extract mean flow profiles from the FUN3D unstructured-grid solutions. Extensive code-to-code validations have been performed by comparing the extracted mean flows as well as the corresponding stability characteristics to the predictions based on structured-grid solutions. Comparisons are made on a range of problems from a simple flat plate to a full aircraft configuration: a modified Gulfstream-III with a natural laminar flow glove. The future aim of the project is to extend the adjoint-based design capability in FUN3D to include natural laminar flow and laminar flow control by integrating it with boundary-layer stability analysis codes, such as LASTRAC.
HSR combustion analytical research
NASA Technical Reports Server (NTRS)
Nguyen, H. Lee
1992-01-01
Increasing the pressure and temperature of the engines of a new generation of supersonic airliners increases the emissions of nitrogen oxides (NO(x)) to a level that would have an adverse impact on the Earth's protective ozone layer. In the process of evolving and implementing low emissions combustor technologies, NASA LeRC has pursued a combustion analysis code program to guide combustor design processes, to identify potential concepts of the greatest promise, and to optimize them at low cost, with short turnaround time. The computational analyses are evaluated at actual engine operating conditions. The approach is to upgrade and apply advanced computer programs for gas turbine applications. Efforts were made in further improving the code capabilities for modeling the physics and the numerical methods of solution. Then test cases and measurements from experiments are used for code validation.
Standardized verification of fuel cycle modeling
Feng, B.; Dixon, B.; Sunny, E.; ...
2016-04-05
A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.
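As an illustration of the year-by-year cross-check described above, the sketch below compares one code's mass flows against a reference spreadsheet solution within a tolerance. It is a minimal sketch under assumptions: the function, tolerance, and data are hypothetical, not the benchmark's actual tooling.

```python
# Illustrative sketch (not the benchmark's actual tooling): cross-check a
# fuel cycle code's mass flows against a reference spreadsheet solution,
# year by year, within a relative tolerance. All names/data are invented.
def compare_mass_flows(code_results, reference, rel_tol=0.01):
    """code_results, reference: dicts mapping year -> mass flow (tonnes)."""
    mismatches = []
    for year, ref_mass in reference.items():
        calc = code_results.get(year)
        if calc is None or abs(calc - ref_mass) > rel_tol * abs(ref_mass):
            mismatches.append((year, calc, ref_mass))
    return mismatches

# Example: a transition scenario where LWR fuel demand ramps down.
reference = {2030: 2000.0, 2040: 1500.0, 2050: 800.0}
code_like = {2030: 2001.2, 2040: 1499.0, 2050: 812.5}
print(compare_mass_flows(code_like, reference))  # [(2050, 812.5, 800.0)]
```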
TRAC-PF1/MOD1 pretest predictions of MIST experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyack, B.E.; Steiner, J.L.; Siebe, D.A.
Los Alamos National Laboratory is a participant in the Integral System Test (IST) program initiated in June 1983 to provide integral system test data on specific issues and phenomena relevant to post small-break loss-of-coolant accidents (SBLOCAs) in Babcock and Wilcox plant designs. The Multi-Loop Integral System Test (MIST) facility is the largest single component in the IST program. During Fiscal Year 1986, Los Alamos performed five MIST pretest analyses. The five experiments were chosen on the basis of their potential either to approach the facility limits or to challenge the predictive capability of the TRAC-PF1/MOD1 code. Three SBLOCA tests were examined, which included nominal test conditions, throttled auxiliary feedwater and asymmetric steam-generator cooldown, and reduced high-pressure-injection (HPI) capacity, respectively. Also analyzed were two ''feed-and-bleed'' cooling tests with reduced HPI and delayed HPI initiation. Results of the tests showed that the MIST facility limits would not be approached in the five tests considered. Early comparisons with preliminary test data indicate that the TRAC-PF1/MOD1 code is correctly calculating the dominant phenomena occurring in the MIST facility during the tests. Posttest analyses are planned to provide a quantitative assessment of the code's ability to predict MIST transients.
Constitutive modeling for isotropic materials (HOST)
NASA Technical Reports Server (NTRS)
Lindholm, U. S.; Chan, K. S.; Bodner, S. R.; Weber, R. M.; Walker, K. P.; Cassenti, B. N.
1985-01-01
This report presents the results of the second year of work on a problem which is part of the NASA HOST Program. Its goals are: (1) to develop and validate unified constitutive models for isotropic materials, and (2) to demonstrate their usefulness for structural analyses of hot section components of gas turbine engines. The unified models selected for development and evaluation are those of Bodner-Partom and Walker. For model evaluation purposes, a large constitutive data base is generated for a B1900 + Hf alloy by performing uniaxial tensile, creep, cyclic, stress relaxation, and thermomechanical fatigue (TMF) tests as well as biaxial (tension/torsion) tests under proportional and nonproportional loading over a wide range of strain rates and temperatures. Systematic approaches for evaluating material constants from a small subset of the data base are developed. Correlations of the uniaxial and biaxial test data with the theories of Bodner-Partom and Walker are performed to establish the accuracy, range of applicability, and integrability of the models. Both models are implemented in the MARC finite element computer code and used for TMF analyses. Benchmark notch round experiments are conducted and the results compared with finite-element analyses using the MARC code and the Walker model.
NASA Technical Reports Server (NTRS)
Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.
2014-01-01
This paper presents the results of statistical analyses performed to predict the thrust imbalance between two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty three variables which could impact the performance of the motors during the ignition transient and thirty eight variables which could impact the performance of the motors during steady state operation of the motor were identified and treated as statistical variables for the analyses. The effects of motor to motor variation as well as variations between motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five segment booster and from the Space Shuttle booster when appropriate. The results obtained for the statistical envelope are compared with the design specification thrust imbalance limits for the SLS launch vehicle.
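The sketch below illustrates the general shape of such a Monte Carlo thrust-imbalance study: sample motor-to-motor variables, difference the thrust traces of each simulated pair, and take a statistical envelope over many pairs. The variable set, distributions, and toy thrust model are invented stand-ins, not the legacy internal ballistics codes.

```python
# Minimal Monte Carlo sketch of a thrust-imbalance envelope, in the spirit
# of the analysis above. All numbers here are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 120.0, 601)               # burn time, s

def motor_thrust(rng, t):
    """Toy thrust trace: nominal profile perturbed by sampled variables."""
    burn_rate_factor = rng.normal(1.0, 0.01)    # propellant burn-rate scatter
    nozzle_factor = rng.normal(1.0, 0.005)      # throat-area scatter
    nominal = 16e6 * (1.0 - 0.3 * (t / t[-1]))  # N, crude steady-state shape
    return nominal * burn_rate_factor * nozzle_factor

imbalance = np.empty((1000, t.size))
for i in range(1000):                           # 1000 simulated motor pairs
    imbalance[i] = motor_thrust(rng, t) - motor_thrust(rng, t)

envelope = np.percentile(np.abs(imbalance), 99.7, axis=0)  # ~3-sigma bound
print(f"peak imbalance bound: {envelope.max() / 1e6:.2f} MN")
```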
Understanding the Flow Physics of Shock Boundary-Layer Interactions Using CFD and Numerical Analyses
NASA Technical Reports Server (NTRS)
Friedlander, David J.
2013-01-01
Computational fluid dynamic (CFD) analyses of the University of Michigan (UM) Shock/Boundary-Layer Interaction (SBLI) experiments were performed as an extension of the CFD SBLI Workshop held at the 48th AIAA Aerospace Sciences Meeting in 2010. In particular, the UM Mach 2.75 Glass Tunnel with a semi-spanning 7.75-deg wedge was analyzed in attempts to explore key physics pertinent to SBLIs, including thermodynamic and viscous boundary conditions as well as turbulence modeling. Most of the analyses were 3D CFD simulations using the OVERFLOW flow solver, with additional quasi-1D simulations performed with an in-house MATLAB code interfacing with the NIST REFPROP code to explore perfect versus non-ideal air. A fundamental exploration pertaining to the effects of particle image velocimetry (PIV) on post-processing data is also shown. Results from the CFD simulations showed an improvement in agreement with experimental data, with key contributions including adding a laminar zone upstream of the wedge and the necessity of mimicking PIV particle lag for comparisons. Results from the quasi-1D simulation showed that there was little difference between perfect and non-ideal air for the configuration presented.
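For reference, the perfect-gas baseline in such a quasi-1D comparison typically rests on the isentropic area-Mach relation below (the non-ideal case replaces the perfect-gas state equation with real-fluid properties, here from REFPROP):

```latex
\frac{A}{A^*} = \frac{1}{M}\left[\frac{2}{\gamma+1}\left(1+\frac{\gamma-1}{2}M^2\right)\right]^{\frac{\gamma+1}{2(\gamma-1)}}
```

where A* is the sonic-throat area, M the Mach number, and γ the ratio of specific heats.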
ASSESSING THE IMPACT OF LANDUSE/LANDCOVER ON STREAM CHEMISTRY IN MARYLAND
Spatial and statistical analyses were conducted to investigate the relationships between stream chemistry (nitrate, sulfate, dissolved organic carbon, etc.), habitat and satellite-derived landuse maps for the state of Maryland. Hydrologic Unit Code (HUC) watershed boundaries (8-...
Software Assurance Measurement -- State of the Practice
2013-11-01
quality and productivity. 30+ languages: C/C++, Java, .NET, Oracle, PeopleSoft, SAP, Siebel, Spring, Struts, Hibernate, and all major databases. ChecKing... Contents entries list coverage of .NET, ActionScript, Ada, C/C++, Java, JavaScript, Objective-C, Opa, packages, Perl, PHP, and Python. Formal Methods... Suite: a tool for Ada, C, C++, C#, and Java code that comprises various analyses such as architecture checking, interface analyses, and clone detection
Mun, Chung Jung; Tein, Jenn-Yun; Kim, Hanjoe; Shaw, Daniel S.; Gardner, Frances; Wilson, Melvin N.; Peterson, Jenene
2018-01-01
This study examined the validity of microsocial observations and macro ratings of parent–child interaction in early to middle childhood. Seven hundred and thirty-one families representing multiple ethnic groups were recruited and screened as at risk in the context of Women, Infants, and Children (WIC) Nutritional Supplement service settings. Families were randomly assigned to the Family Check-Up (FCU) intervention or the control condition at age 2 and videotaped in structured interactions in the home at ages 2, 3, 4, and 5. Parent–child interaction videotapes were microcoded using the Relationship Affect Coding System (RACS), which captures the duration of two mutual dyadic states: positive engagement and coercion. Macro ratings of parenting skills were collected after coding the videotapes to assess parent use of positive behavior support and limit setting skills (or lack thereof). Confirmatory factor analyses revealed that the measurement model of macro ratings of limit setting and positive behavior support was not supported by the data and, thus, these ratings were excluded from further analyses. However, there was moderate stability in the families' microsocial dynamics across early childhood, and these dynamics showed significant improvements as a function of random assignment to the FCU. Moreover, parent–child dynamics were predictive of chronic behavior problems as rated by parents in middle childhood, but not emotional problems. We conclude with a discussion of the validity of the RACS and of the methodological advantages of microsocial coding over the statistical limitations of macro rating observations. Future directions are discussed for observation research in prevention science. PMID:27620623
Recurrent and functional regulatory mutations in breast cancer.
Rheinbay, Esther; Parasuraman, Prasanna; Grimsby, Jonna; Tiao, Grace; Engreitz, Jesse M; Kim, Jaegil; Lawrence, Michael S; Taylor-Weiner, Amaro; Rodriguez-Cuevas, Sergio; Rosenberg, Mara; Hess, Julian; Stewart, Chip; Maruvka, Yosef E; Stojanov, Petar; Cortes, Maria L; Seepo, Sara; Cibulskis, Carrie; Tracy, Adam; Pugh, Trevor J; Lee, Jesse; Zheng, Zongli; Ellisen, Leif W; Iafrate, A John; Boehm, Jesse S; Gabriel, Stacey B; Meyerson, Matthew; Golub, Todd R; Baselga, Jose; Hidalgo-Miranda, Alfredo; Shioda, Toshi; Bernards, Andre; Lander, Eric S; Getz, Gad
2017-07-06
Genomic analysis of tumours has led to the identification of hundreds of cancer genes on the basis of the presence of mutations in protein-coding regions. By contrast, much less is known about cancer-causing mutations in non-coding regions. Here we perform deep sequencing in 360 primary breast cancers and develop computational methods to identify significantly mutated promoters. Clear signals are found in the promoters of three genes. FOXA1, a known driver of hormone-receptor positive breast cancer, harbours a mutational hotspot in its promoter leading to overexpression through increased E2F binding. RMRP and NEAT1, two non-coding RNA genes, carry mutations that affect protein binding to their promoters and alter expression levels. Our study shows that promoter regions harbour recurrent mutations in cancer with functional consequences and that the mutations occur at similar frequencies as in coding regions. Power analyses indicate that more such regions remain to be discovered through deep sequencing of adequately sized cohorts of patients.
Development of a new version of the Vehicle Protection Factor Code (VPF3)
NASA Astrophysics Data System (ADS)
Jamieson, Terrance J.
1990-10-01
The Vehicle Protection Factor (VPF) Code is an engineering tool for estimating the radiation protection afforded by armoured vehicles and other structures exposed to neutron and gamma ray radiation from fission, thermonuclear, and fusion sources. A number of suggestions for modifications have been offered by users of early versions of the code. These include: implementing some of the more advanced features of the air transport rating code, ATR5, used to perform the air-over-ground radiation transport analyses; adding the ability to study specific vehicle orientations within the free field; implementing an adjoint transport scheme to reduce the number of transport runs required; investigating the possibility of accelerating the transport scheme; and upgrading the computer-aided design (CAD) package used by VPF. The generation of radiation free-field fluences for infinite air geometries as required for aircraft analysis can be accomplished by using ATR with the air-over-ground correction factors disabled. Analysis of the effects of fallout-bearing debris clouds on aircraft will require additional modelling of VPF.
Short-lived non-coding transcripts (SLiTs): Clues to regulatory long non-coding RNA.
Tani, Hidenori
2017-03-22
Whole transcriptome analyses have revealed a large number of novel long non-coding RNAs (lncRNAs). Although the importance of lncRNAs has been documented in previous reports, the biological and physiological functions of lncRNAs remain largely unknown. The role of lncRNAs remains an elusive problem. Here, I propose a clue to the identification of regulatory lncRNAs. The key point is RNA half-life. RNAs with a long half-life (t1/2 > 4 h) contain a significant proportion of ncRNAs, as well as mRNAs involved in housekeeping functions, whereas RNAs with a short half-life (t1/2 < 4 h) include known regulatory ncRNAs and regulatory mRNAs. This novel class of ncRNAs with a short half-life can be categorized as Short-Lived non-coding Transcripts (SLiTs). I consider that SLiTs are likely to be rich in functionally uncharacterized regulatory RNAs. This review describes recent progress in research into SLiTs.
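A minimal sketch of the proposed partition, assuming only a table of measured half-lives (the transcript names and values below are invented for illustration):

```python
# Sketch of the half-life partition proposed above (threshold t1/2 = 4 h).
# Transcript names and half-lives are hypothetical, not measured data.
half_lives = {                 # hours
    "lncRNA-A": 1.2, "lncRNA-B": 7.9, "mRNA-housekeeping": 10.5,
    "mRNA-regulatory": 2.3, "lncRNA-C": 0.8,
}
THRESHOLD_H = 4.0
slits = [name for name, t in half_lives.items() if t < THRESHOLD_H]
stable = [name for name, t in half_lives.items() if t >= THRESHOLD_H]
print("SLiT candidates:", slits)   # short-lived, putatively regulatory
print("long-lived:", stable)       # housekeeping-enriched fraction
```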
Posttest analysis of the 1:6-scale reinforced concrete containment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pfeiffer, P.A.; Kennedy, J.M.; Marchertas, A.H.
A prediction of the response of the Sandia National Laboratories 1:6-scale reinforced concrete containment model test was made by Argonne National Laboratory. ANL, along with nine other organizations, performed a detailed nonlinear response analysis of the 1:6-scale model containment subjected to overpressurization in the fall of 1986. The two-dimensional code TEMP-STRESS and the three-dimensional NEPTUNE code were utilized (1) to predict the global response of the structure, (2) to identify global failure sites and the corresponding failure pressures, and (3) to identify some local failure sites and pressure levels. A series of axisymmetric models was studied with the two-dimensional computer program TEMP-STRESS. The comparison of these pretest computations with test data from the containment model has provided a test for the capability of the respective finite element codes to predict global failure modes, and hence serves as a validation of these codes. Only the two-dimensional analyses will be discussed in this paper. 3 refs., 10 figs.
Overview of the relevant CFD work at Thiokol Corporation
NASA Technical Reports Server (NTRS)
Chwalowski, Pawel; Loh, Hai-Tien
1992-01-01
An in-house developed proprietary advanced computational fluid dynamics code called SHARP (Trademark) is a primary tool for many flow simulations and design analyses. The SHARP code is a time dependent, two dimensional (2-D) axisymmetric numerical solution technique for the compressible Navier-Stokes equations. The solution technique in SHARP uses a vectorizable implicit, second order accurate in time and space, finite volume scheme based on an upwind flux-difference splitting of a Roe-type approximated Riemann solver, Van Leer's flux vector splitting, and a fourth order artificial dissipation scheme with a preconditioning to accelerate the flow solution. Turbulence is simulated by an algebraic model, and ultimately the kappa-epsilon model. Some other capabilities of the code are 2-D two-phase Lagrangian particle tracking and cell blockages. Extensive development and testing has been conducted on the 3-D version of the code with flow, combustion, and turbulence interactions. The emphasis here is on the specific applications of SHARP in Solid Rocket Motor design. Information is given in viewgraph form.
FDNS CFD Code Benchmark for RBCC Ejector Mode Operation: Continuing Toward Dual Rocket Effects
NASA Technical Reports Server (NTRS)
West, Jeff; Ruf, Joseph H.; Turner, James E. (Technical Monitor)
2000-01-01
Computational Fluid Dynamics (CFD) analysis results are compared with benchmark quality test data from the Propulsion Engineering Research Center's (PERC) Rocket Based Combined Cycle (RBCC) experiments to verify fluid dynamic code and application procedures. RBCC engine flowpath development will rely on CFD applications to capture the multi-dimensional fluid dynamic interactions and to quantify their effect on the RBCC system performance. Therefore, the accuracy of these CFD codes must be determined through detailed comparisons with test data. The PERC experiments build upon the well-known 1968 rocket-ejector experiments of Odegaard and Stroup by employing advanced optical and laser based diagnostics to evaluate mixing and secondary combustion. The Finite Difference Navier Stokes (FDNS) code [2] was used to model the fluid dynamics of the PERC RBCC ejector mode configuration. Analyses were performed for the Diffusion and Afterburning (DAB) test conditions at the 200-psia thruster operation point. Results with and without downstream fuel injection are presented.
Ince, Robin A. A.; Jaworska, Katarzyna; Gross, Joachim; Panzeri, Stefano; van Rijsbergen, Nicola J.; Rousselet, Guillaume A.; Schyns, Philippe G.
2016-01-01
A key to understanding visual cognition is to determine “where”, “when”, and “how” brain responses reflect the processing of the specific visual features that modulate categorization behavior—the “what”. The N170 is the earliest Event-Related Potential (ERP) that preferentially responds to faces. Here, we demonstrate that a paradigmatic shift is necessary to interpret the N170 as the product of an information processing network that dynamically codes and transfers face features across hemispheres, rather than as a local stimulus-driven event. Reverse-correlation methods coupled with information-theoretic analyses revealed that visibility of the eyes influences face detection behavior. The N170 initially reflects coding of the behaviorally relevant eye contralateral to the sensor, followed by a causal communication of the other eye from the other hemisphere. These findings demonstrate that the deceptively simple N170 ERP hides a complex network information processing mechanism involving initial coding and subsequent cross-hemispheric transfer of visual features. PMID:27550865
Processes involved in solving mathematical problems
NASA Astrophysics Data System (ADS)
Shahrill, Masitah; Putri, Ratu Ilma Indra; Zulkardi; Prahmana, Rully Charitas Indra
2018-04-01
This study examines one of the instructional practices features utilized within the Year 8 mathematics lessons in Brunei Darussalam. The codes from the TIMSS 1999 Video Study were applied and strictly followed, and from the 183 mathematics problems recorded, there were 95 problems with a solution presented during the public segments of the video-recorded lesson sequences of the four sampled teachers. The analyses involved firstly, identifying the processes related to mathematical problem statements, and secondly, examining the different processes used in solving the mathematical problems for each problem publicly completed during the lessons. The findings revealed that for three of the teachers, their problem statements coded as 'using procedures' ranged from 64% to 83%, while the remaining teacher had 40% of his problem statements coded as 'making connections.' The processes used when solving the problems were mainly 'using procedures', and none of the problems were coded as 'giving results only'. Furthermore, all four teachers made use of making the relevant connections in solving the problems given to their respective students.
The complete mitochondrial genome of Papilio glaucus and its phylogenetic implications.
Shen, Jinhui; Cong, Qian; Grishin, Nick V
2015-09-01
Due to the intriguing morphology, lifecycle, and diversity of butterflies and moths, Lepidoptera are emerging as model organisms for the study of genetics, evolution and speciation. The progress of these studies relies on decoding Lepidoptera genomes, both nuclear and mitochondrial. Here we describe a protocol to obtain mitogenomes from Next Generation Sequencing reads performed for whole-genome sequencing and report the complete mitogenome of Papilio (Pterourus) glaucus. The circular mitogenome is 15,306 bp in length and rich in A and T. It contains 13 protein-coding genes (PCGs), 22 transfer-RNA-coding genes (tRNA), and 2 ribosomal-RNA-coding genes (rRNA), with a gene order typical for mitogenomes of Lepidoptera. We performed phylogenetic analyses based on PCG and RNA-coding genes or protein sequences using Bayesian Inference and Maximum Likelihood methods. The phylogenetic trees consistently show that among species with available mitogenomes Papilio glaucus is the closest to Papilio (Agehana) maraho from Asia.
Causes of Death Data in the Global Burden of Disease Estimates for Ischemic and Hemorrhagic Stroke.
Truelsen, Thomas; Krarup, Lars-Henrik; Iversen, Helle K; Mensah, George A; Feigin, Valery L; Sposato, Luciano A; Naghavi, Mohsen
2015-01-01
Stroke mortality estimates in the Global Burden of Disease (GBD) study are based on routine mortality statistics and redistribution of ill-defined codes that cannot be a cause of death, the so-called 'garbage codes' (GCs). This study describes the contribution of these codes to stroke mortality estimates. All available mortality data were compiled and non-specific cause codes were redistributed based on literature review and statistical methods. Ill-defined codes were redistributed to their specific cause of disease by age, sex, country and year. The reassignment was done based on the International Classification of Diseases and the pathology behind each code, by checking multiple causes of death and literature review. Unspecified stroke and primary and secondary hypertension are the leading contributing GCs to stroke mortality estimates for hemorrhagic stroke (HS) and ischemic stroke (IS). There were marked differences in the fraction of deaths assigned to IS and HS for unspecified stroke and hypertension between GBD regions and between age groups. A large proportion of stroke fatalities are derived from the redistribution of 'unspecified stroke' and 'hypertension', with marked regional differences. Future advancements in stroke certification, data collection and statistical analyses may improve the estimation of the global stroke burden. © 2015 S. Karger AG, Basel.
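The sketch below illustrates the general idea of proportional redistribution of an ill-defined code within a stratum; it is not GBD's actual algorithm, and the counts are invented.

```python
# Hedged sketch of proportional 'garbage code' redistribution: deaths coded
# to unspecified stroke are reassigned to ischemic (IS) and hemorrhagic (HS)
# stroke in proportion to the specific-coded deaths in the same stratum.
def redistribute(stratum):
    """stratum: dict with keys 'IS', 'HS', 'unspecified' (death counts)."""
    specific_total = stratum["IS"] + stratum["HS"]
    share_is = stratum["IS"] / specific_total
    return {
        "IS": stratum["IS"] + stratum["unspecified"] * share_is,
        "HS": stratum["HS"] + stratum["unspecified"] * (1.0 - share_is),
    }

# Example stratum: one age-sex-country-year cell with invented counts.
print(redistribute({"IS": 600, "HS": 300, "unspecified": 90}))
# {'IS': 660.0, 'HS': 330.0}
```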
RETRAN03 benchmarks for Beaver Valley plant transients and FSAR analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beaumont, E.T.; Feltus, M.A.
1993-01-01
Any best-estimate code (e.g., RETRAN03) results must be validated against plant data and final safety analysis report (FSAR) predictions. Two independent means of benchmarking are needed to ensure that the results are not biased toward a particular data set and to establish a certain degree of accuracy. The code results need to be compared with previous results and show improvements over previous code results. Ideally, the two best means of benchmarking a thermal-hydraulics code are comparing results from previous versions of the same code along with actual plant data. This paper describes RETRAN03 benchmarks against RETRAN02 results, actual plant data, and FSAR predictions. RETRAN03, the Electric Power Research Institute's latest version of the RETRAN thermal-hydraulic analysis codes, offers several upgrades over its predecessor, RETRAN02 Mod5. RETRAN03 can use either implicit or semi-implicit numerics, whereas RETRAN02 Mod5 uses only semi-implicit numerics. Another major upgrade deals with slip model options. RETRAN03 added several new models, including a five-equation model for more accurate modeling of two-phase flow. RETRAN02 Mod5 should give similar but slightly more conservative results than RETRAN03 when executed with RETRAN02 Mod5 options.
The Initial Atmospheric Transport (IAT) Code: Description and Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrow, Charles W.; Bartel, Timothy James
The Initial Atmospheric Transport (IAT) computer code was developed at Sandia National Laboratories as part of their nuclear launch accident consequences analysis suite of computer codes. The purpose of IAT is to predict the initial puff/plume rise resulting from either a solid rocket propellant or liquid rocket fuel fire. The code generates initial conditions for subsequent atmospheric transport calculations. The Initial Atmospheric Transfer (IAT) code has been compared to two data sets which are appropriate to the design space of space launch accident analyses. The primary model uncertainties are the entrainment coefficients for the extended Taylor model. The Titan 34D accident (1986) was used to calibrate these entrainment settings for a prototypic liquid propellant accident, while the recent Johns Hopkins University Applied Physics Laboratory (JHU/APL, or simply APL) large propellant block tests (2012) were used to calibrate the entrainment settings for prototypic solid propellant accidents. North American Meteorology (NAM) formatted weather data profiles are used by IAT to determine the local buoyancy force balance. The IAT comparisons for the APL solid propellant tests illustrate the sensitivity of the plume elevation to the weather profiles; that is, the weather profile is a dominant factor in determining the plume elevation. The IAT code performed remarkably well and is considered validated for neutral weather conditions.
De Matteis, Sara; Jarvis, Deborah; Young, Heather; Young, Alan; Allen, Naomi; Potts, James; Darnton, Andrew; Rushton, Lesley; Cullinan, Paul
2017-03-01
Objectives The standard approach to the assessment of occupational exposures is through the manual collection and coding of job histories. This method is time-consuming and costly and makes it potentially unfeasible to perform high quality analyses on occupational exposures in large population-based studies. Our aim was to develop a novel, efficient web-based tool to collect and code lifetime job histories in the UK Biobank, a population-based cohort of over 500 000 participants. Methods We developed OSCAR (occupations self-coding automatic recording) based on the hierarchical structure of the UK Standard Occupational Classification (SOC) 2000, which allows individuals to collect and automatically code their lifetime job histories via a simple decision-tree model. Participants were asked to find each of their jobs by selecting appropriate job categories until they identified their job title, which was linked to a hidden 4-digit SOC code. For each occupation a job title in free text was also collected to estimate Cohen's kappa (κ) inter-rater agreement between SOC codes assigned by OSCAR and an expert manual coder. Results OSCAR was administered to 324 653 UK Biobank participants with an existing email address between June and September 2015. Complete 4-digit SOC-coded lifetime job histories were collected for 108 784 participants (response rate: 34%). Agreement between the 4-digit SOC codes assigned by OSCAR and the manual coder for a random sample of 400 job titles was moderately good [κ=0.45, 95% confidence interval (95% CI) 0.42-0.49], and improved when broader job categories were considered (κ=0.64, 95% CI 0.61-0.69 at a 1-digit SOC-code level). Conclusions OSCAR is a novel, efficient, and reasonably reliable web-based tool for collecting and automatically coding lifetime job histories in large population-based studies. Further application in other research projects for external validation purposes is warranted.
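A minimal sketch of the reported agreement check, using scikit-learn's Cohen's kappa on invented SOC codes at the full 4-digit level and again after truncating to the broader 1-digit level:

```python
# Sketch of the agreement check described above: kappa between
# automatically assigned and manually assigned SOC codes. The codes
# below are invented examples, not UK Biobank data.
from sklearn.metrics import cohen_kappa_score

oscar  = ["2314", "5223", "9233", "2314", "3543", "5223"]  # tool-assigned
manual = ["2314", "5221", "9233", "2315", "3543", "5223"]  # expert-assigned

kappa_4digit = cohen_kappa_score(oscar, manual)
kappa_1digit = cohen_kappa_score([c[0] for c in oscar],
                                 [c[0] for c in manual])
print(f"4-digit kappa: {kappa_4digit:.2f}, 1-digit kappa: {kappa_1digit:.2f}")
```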
Humphreys-Pereira, Danny A; Elling, Axel A
2014-01-01
Root-knot nematodes (Meloidogyne spp.) are among the most important plant pathogens. In this study, the mitochondrial (mt) genomes of the root-knot nematodes M. chitwoodi and M. incognita were sequenced. PCR analyses suggest that both mt genomes are circular, with estimated sizes of 19.7 and 18.6-19.1 kb, respectively. The mt genomes each contain a large non-coding region with tandem repeats and the control region. The mt gene arrangement of M. chitwoodi and M. incognita is unlike that of other nematodes. Sequence alignments of the two Meloidogyne mt genomes showed three translocations; two in transfer RNAs and one in cox2. Compared with other nematode mt genomes, the gene arrangement of M. chitwoodi and M. incognita was most similar to Pratylenchus vulnus. Phylogenetic analyses (Maximum Likelihood and Bayesian inference) were conducted using 78 complete mt genomes of diverse nematode species. Analyses based on nucleotides and amino acids of the 12 protein-coding mt genes showed strong support for the monophyly of class Chromadorea, but only amino acid-based analyses supported the monophyly of class Enoplea. The suborder Spirurina was not monophyletic in any of the phylogenetic analyses, contradicting the Clade III model, which groups Ascaridomorpha, Spiruromorpha and Oxyuridomorpha based on the small subunit ribosomal RNA gene. Importantly, comparisons of mt gene arrangement and tree-based methods placed Meloidogyne as a sister taxon of Pratylenchus, a migratory plant endoparasitic nematode, and not with the sedentary endoparasitic Heterodera. Thus, comparative analyses of mt genomes suggest that sedentary endoparasitism in Meloidogyne and Heterodera is based on convergent evolution. Copyright © 2014 Elsevier B.V. All rights reserved.
2011-01-01
Background Electronic patient records are generally coded using extensive sets of codes but the significance of the utilisation of individual codes may be unclear. Item response theory (IRT) models are used to characterise the psychometric properties of items included in tests and questionnaires. This study asked whether the properties of medical codes in electronic patient records may be characterised through the application of item response theory models. Methods Data were provided by a cohort of 47,845 participants from 414 family practices in the UK General Practice Research Database (GPRD) with a first stroke between 1997 and 2006. Each eligible stroke code, out of a set of 202 OXMIS and Read codes, was coded as either recorded or not recorded for each participant. A two parameter IRT model was fitted using marginal maximum likelihood estimation. Estimated parameters from the model were considered to characterise each code with respect to the latent trait of stroke diagnosis. The location parameter is referred to as a calibration parameter, while the slope parameter is referred to as a discrimination parameter. Results There were 79,874 stroke code occurrences available for analysis. Utilisation of codes varied between family practices with intraclass correlation coefficients of up to 0.25 for the most frequently used codes. IRT analyses were restricted to 110 Read codes. Calibration and discrimination parameters were estimated for 77 (70%) codes that were endorsed for 1,942 stroke patients. Parameters were not estimated for the remaining more frequently used codes. Discrimination parameter values ranged from 0.67 to 2.78, while calibration parameter values ranged from 4.47 to 11.58. The two parameter model gave a better fit to the data than either the one- or three-parameter models. However, high chi-square values for about a fifth of the stroke codes were suggestive of poor item fit. Conclusion The application of item response theory models to coded electronic patient records might potentially contribute to identifying medical codes that offer poor discrimination or low calibration. This might indicate the need for improved coding sets or a requirement for improved clinical coding practice. However, in this study estimates were only obtained for a small proportion of participants and there was some evidence of poor model fit. There was also evidence of variation in the utilisation of codes between family practices, raising the possibility that, in practice, properties of codes may vary for different coders. PMID:22176509
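For reference, the standard two-parameter logistic (2PL) IRT model takes the form below, where θ_i is the latent trait (here, stroke diagnosis) for patient i, and a_j and b_j are the discrimination and calibration (location) parameters of code j; the paper's exact parameterization may differ:

```latex
P(X_{ij} = 1 \mid \theta_i) = \frac{1}{1 + \exp\left[-a_j(\theta_i - b_j)\right]}
```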
Magnetic Feature Tracking in the SDO Era: Past Sacrifices, Recent Advances, and Future Possibilities
NASA Astrophysics Data System (ADS)
Lamb, D. A.; DeForest, C. E.; Van Kooten, S.
2014-12-01
When implementing computer vision codes, a common reaction to the high angular resolution and the high cadence of SDO's image products has been to reduce the resolution and cadence of the data so that it "looks like" SOHO data. This can be partially justified on physical grounds: if the phenomenon that a computer vision code is trying to detect was characterized in low-resolution, low cadence data, then the higher quality data may not be needed. But sacrificing at least two, and sometimes all four main advantages of SDO's imaging data (the other two being a higher duty cycle and additional data products) threatens to also discard the perhaps more subtle discoveries waiting to be made: a classic baby-with-the-bath-water situation. In this presentation, we discuss some of the sacrifices made in implementing SWAMIS-EF, an automatic emerging magnetic flux region detection code for SDO/HMI, and how those sacrifices simultaneously simplified and complicated development of the code. SWAMIS-EF is a feature-finding code, and we will describe some situations and analyses in which a feature-finding code excels, and some in which a different type of algorithm may produce more favorable results. In particular, because the solar magnetic field is irreducibly complex at the currently observed spatial scales, searching for phenomena such as flux emergence using even semi-strict physical criteria often leads to large numbers of false or missed detections. This undesirable behavior can be mitigated by relaxing the imposed physical criteria, but here too there are tradeoffs: decreased numbers of missed detections may increase the number of false detections if the selection criteria are not both sensitive and specific to the searched-for phenomenon. Finally, we describe some recent steps we have taken to overcome these obstacles, by fully embracing the high resolution, high cadence SDO data, optimizing and partially parallelizing our existing code as a first step to allow fast magnetic feature tracking of full resolution HMI magnetograms. Even with the above caveats, if used correctly such a tool can provide a wealth of information on the positions, motions, and patterns of features, enabling large, cross-scale analyses that can answer important questions related to the solar dynamo and to coronal heating.
NASA Astrophysics Data System (ADS)
Carlo Ponzo, Felice; Ditommaso, Rocco; Nigro, Antonella; Nigro, Domenico S.; Iacovino, Chiara
2017-04-01
After the Mw 6.0 mainshock of August 24, 2016 at 03:36 a.m. (local time), with the epicenter located between the towns of Accumoli (province of Rieti), Amatrice (province of Rieti) and Arquata del Tronto (province of Ascoli Piceno), several activities were started in order to perform some preliminary evaluations of the characteristics of the recent seismic sequence in the areas affected by the earthquake. Ambient vibration acquisitions have been performed using two three-directional velocimetric synchronized stations, with a natural frequency of 0.5 Hz and a digitizer resolution of 24 bit. The activities continued after the events of the seismic sequence of October 26 and October 30, 2016. In this paper, in order to compare recorded and code-provision values in terms of peak (PGA, PGV and PGD), spectral and integral (Housner Intensity) seismic parameters, several preliminary analyses have been performed on accelerometric time-histories acquired by three near-fault stations of the RAN (Italian Accelerometric Network): Amatrice station (station code AMT), Norcia station (station code NRC) and Castelsantangelo sul Nera station (station code CNE). Several comparisons between the elastic response spectra derived from accelerometric recordings and the elastic demand spectra provided by the Italian seismic code (NTC 2008) have been performed. Preliminary results retrieved from these analyses highlight several apparent differences between experimental data and conventional code provisions. The ongoing seismic sequence appears compatible with the historical seismicity in terms of integral parameters, but not in terms of peak and spectral values. It seems appropriate to reconsider the need to revise the simplified design approach based on conventional spectral values. Acknowledgements This study was partially funded by the Italian Department of Civil Protection within the project DPC-RELUIS 2016 - RS4 ''Seismic observatory of structures and health monitoring'' and by the "Centre of Integrated Geomorphology for the Mediterranean Area - CGIAM" within the Framework Agreement with the University of Basilicata "Study, Research and Experimentation in the Field of Analysis and Monitoring of Seismic Vulnerability of Strategic and Relevant Buildings for the purposes of Civil Protection and Development of Innovative Strategies of Seismic Reinforcement".
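For reference, the Housner Intensity compared in such analyses is commonly defined as the integral of the pseudo-velocity response spectrum PSV over periods of 0.1 to 2.5 s at 5% damping:

```latex
SI = \int_{0.1}^{2.5} PSV(T, \xi = 0.05)\, dT
```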
Barriers to healthy eating among food pantry clients
USDA-ARS?s Scientific Manuscript database
This study explored perspectives on barriers of eating healthy among food pantry clients. Food pantry clients participated in focus groups/interviews. Qualitative data were coded and analyzed using content analyses and grounded theory approach. Themes were then identified. Quantitative data were ana...
77 FR 59768 - Shipping and Transportation; Technical, Organizational, and Conforming Amendments
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-01
.... Abbreviations; II. Regulatory History; III. Basis and Purpose; IV. Background; V. Regulatory Analyses; A. Regulatory.... U.S.C., United States Code. II. Regulatory History: We did not publish a notice of proposed rulemaking... its place, the text ``(CG-ENG)''.
DOT National Transportation Integrated Search
2012-11-01
Generic, code-based design procedures cannot account for the anticipated short-period attenuation and long-period amplification of earthquake ground motions in the deep, soft sediments of the Mississippi Embayment within the New Madrid Seismic Zone (...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans
This document outlines the development of a high fidelity, best estimate nuclear power plant severe transient simulation capability that will complement or enhance the integral system codes historically used for licensing and analysis of severe accidents. As with other tools in the Risk Informed Safety Margin Characterization (RISMC) Toolkit, the ultimate user of Enhanced Severe Transient Analysis and Prevention (ESTAP) capability is the plant decision-maker; the deliverable to that customer is a modern, simulation-based safety analysis capability, applicable to a much broader class of safety issues than is traditional Light Water Reactor (LWR) licensing analysis. Currently, the RISMC pathway's major emphasis is placed on developing RELAP-7, a next-generation safety analysis code, and on showing how to use RELAP-7 to analyze margin from a modern point of view: that is, by characterizing margin in terms of the probabilistic spectra of the "loads" applied to systems, structures, and components (SSCs), and the "capacity" of those SSCs to resist those loads without failing. The first objective of the ESTAP task, and the focus of one task of this effort, is to augment RELAP-7 analyses with user-selected multi-dimensional, multi-phase models of specific plant components to simulate complex phenomena that may lead to, or exacerbate, severe transients and core damage. Such phenomena include: coolant crossflow between PWR assemblies during a severe reactivity transient, stratified single or two-phase coolant flow in primary coolant piping, inhomogeneous mixing of emergency coolant water or boric acid with hot primary coolant, and water hammer. These are well-documented phenomena associated with plant transients but that are generally not captured in system codes. They are, however, generally limited to specific components, structures, and operating conditions. The second ESTAP task is to similarly augment a severe (post-core damage) accident integral analyses code with high fidelity simulations that would allow investigation of multi-dimensional, multi-phase containment phenomena that are only treated approximately in established codes.
Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results
NASA Technical Reports Server (NTRS)
Jones, Scott M.
2015-01-01
Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial compressors and turbines at design and off-design conditions.
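For reference, the simple radial equilibrium relation used for spanwise property variations in streamline mode is conventionally written, neglecting radial velocity and streamline curvature terms, as

```latex
\frac{\partial p}{\partial r} = \frac{\rho\, v_\theta^2}{r}
```

where p is static pressure, ρ density, and v_θ the tangential velocity at radius r.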
Spatial panel analyses of alcohol outlets and motor vehicle crashes in California: 1999–2008
Ponicki, William R.; Gruenewald, Paul J.; Remer, Lillian G.
2014-01-01
Although past research has linked alcohol outlet density to higher rates of drinking and many related social problems, there is conflicting evidence of density’s association with traffic crashes. An abundance of local alcohol outlets simultaneously encourages drinking and reduces driving distances required to obtain alcohol, leading to an indeterminate expected impact on alcohol-involved crash risk. This study separately investigates the effects of outlet density on (1) the risk of injury crashes relative to population and (2) the likelihood that any given crash is alcohol-involved, as indicated by police reports and single-vehicle nighttime status of crashes. Alcohol outlet density effects are estimated using Bayesian misalignment Poisson analyses of all California ZIP codes over the years 1999–2008. These misalignment models allow panel analysis of ZIP-code data despite frequent redefinition of postal-code boundaries, while also controlling for overdispersion and the effects of spatial autocorrelation. Because models control for overall retail density, estimated alcohol-outlet associations represent the extra effect of retail establishments selling alcohol. The results indicate a number of statistically well-supported associations between retail density and crash behavior, but the implied effects on crash risks are relatively small. Alcohol-serving restaurants have a greater impact on overall crash risks than on the likelihood that those crashes involve alcohol, whereas bars primarily affect the odds that crashes are alcohol-involved. Off-premise outlet density is negatively associated with risks of both crashes and alcohol involvement, while the presence of a tribal casino in a ZIP code is linked to higher odds of police-reported drinking involvement. Alcohol outlets in a given area are found to influence crash risks both locally and in adjacent ZIP codes, and significant spatial autocorrelation also suggests important relationships across geographical units. These results suggest that each type of alcohol outlet can have differing impacts on risks of crashing as well as the alcohol involvement of those crashes. PMID:23537623
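The skeleton below shows the core structure of such a rate model: a Poisson regression of crash counts with a population offset and outlet counts as covariates. The published analysis was a Bayesian spatial misalignment model; this non-spatial GLM on simulated data only illustrates the offset structure.

```python
# Non-spatial Poisson GLM skeleton with a population offset. Data are
# simulated stand-ins, not the California ZIP-code panel.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200                                     # hypothetical ZIP codes
bars = rng.poisson(5, n)                    # bars per ZIP
offprem = rng.poisson(8, n)                 # off-premise outlets per ZIP
pop = rng.integers(5_000, 60_000, n)        # population
rate = np.exp(-6.5 + 0.02 * bars - 0.01 * offprem)  # true per-capita rate
crashes = rng.poisson(rate * pop)

X = sm.add_constant(np.column_stack([bars, offprem]))
fit = sm.GLM(crashes, X, family=sm.families.Poisson(),
             offset=np.log(pop)).fit()
print(fit.params)   # intercept, bar effect, off-premise effect (log scale)
```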
de Hoyos-Alonso, María del Canto; Bonis, Julio; Bryant, Verónica; Castell Alcalá, María Victoria; Otero Puime, Ángel
2016-01-01
To ascertain the diagnoses associated with specific treatment for dementia in the Primary Care Electronic Clinical Record (PC-ECR) and to analyse the factors associated with the quality of registration. Descriptive study of patients taking cholinesterase inhibitors or memantine registered in the Database for Pharmacoepidemiological Research in Primary Care (BIFAP) 2011: 24,575 patients between 2002 and 2011. Diagnoses associated with the first prescription of these drugs were grouped into 5 categories: "dementia", "memory impairment", "dementia-related diseases", "intercurrent processes" and "convenience codes". We calculated the prevalence of each category by age and sex for each study year (95% CI) and analysed the associations and trends for 2002-2011 using differences in proportions in independent samples and binary logistic regression. A code of "dementia" was associated with the first prescription in 56.5% (95% CI: 55.8-57.1) of patients. It was higher in women [OR 1.09 (95% CI: 1.03-1.15)] and with increasing follow-up time [OR 1.07 (95% CI: 1.06-1.08) for each year of follow-up]. "Convenience codes" [16.3% (95% CI: 15.8-16.7)] were coded more frequently in women and in those ≥80 years; "memory impairment" [12.4% (95% CI: 12.0-12.8)], "related diseases" [4.6% (95% CI: 4.4-4.8)] and "intercurrent processes" [10.3% (95% CI: 9.9-10.6)] were used more in men and in persons <80 years. Between 2002 and 2011, the use of "convenience codes" improved. Almost half of the patients taking cholinesterase inhibitors or memantine do not have a diagnosis of dementia registered in their PC-ECR. Registration improves with increasing time of follow-up. Improvements are needed in the PC-ECR, adequate care coordination, and a proactive approach to increase the quality of dementia registration. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
Neuwirth, Alexander L; Stitzlein, Russell N; Neuwirth, Madalyn G; Kelz, Rachel K; Mehta, Samir
2018-01-17
Future generations of orthopaedic surgeons must continue to be trained in the surgical management of hip fractures. This study assesses the effect of resident participation on outcomes for the treatment of intertrochanteric hip fractures. The National Surgical Quality Improvement Program (NSQIP) database (2010 to 2013) was queried for intertrochanteric hip fractures (International Classification of Diseases, 9th Revision, Clinical Modification [ICD-9-CM] code 820.21) treated with either extramedullary (Current Procedural Terminology [CPT] code 27244) or intramedullary (CPT code 27245) fixation. Demographic variables, including resident participation, as well as primary (death and serious morbidity) and secondary outcome variables were extracted for analysis. Univariate, propensity score-matched, and multivariate logistic regression analyses were performed to evaluate outcome variables. Data on resident participation were available for 1,764 cases (21.0%). Univariate analyses for all intertrochanteric hip fractures demonstrated no significant difference in 30-day mortality (6.3% versus 7.8%; p = 0.264) or serious morbidity (44.9% versus 43.2%; p = 0.506) between the groups with and without resident participation. Multivariate and propensity score-matched analyses gave similar results. Resident involvement was associated with prolonged operating-room time, length of stay, and time to discharge when a prolonged case was defined as one above the 90th percentile for time parameters. Resident participation was not associated with an increase in morbidity or mortality but was associated with an increase in time-related secondary outcome measures. While attending surgeon supervision is necessary, residents can and should be involved in the care of these patients without concern that resident involvement negatively impacts perioperative morbidity and mortality. Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.
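A hedged sketch of a propensity-score matched comparison of this kind, on simulated data (the covariates and their distributions are invented; this is not the NSQIP analysis itself):

```python
# Estimate each case's probability of resident participation from
# covariates, then pair each resident case with the nearest non-resident
# case by propensity score (nearest-neighbor, with replacement).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 1764
X = np.column_stack([rng.normal(80, 8, n),      # age, hypothetical
                     rng.integers(1, 5, n)])    # ASA class, hypothetical
resident = rng.integers(0, 2, n).astype(bool)   # resident participation

ps = LogisticRegression().fit(X, resident).predict_proba(X)[:, 1]
nn = NearestNeighbors(n_neighbors=1).fit(ps[~resident].reshape(-1, 1))
_, idx = nn.kneighbors(ps[resident].reshape(-1, 1))
controls = np.flatnonzero(~resident)[idx.ravel()]  # matched control cases
print(f"{resident.sum()} resident cases matched to "
      f"{len(set(controls))} distinct controls")
```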
Kühne, Annett; Kaiser, Rolf; Schirmer, Markus; Heider, Ulrike; Muhlke, Sabine; Niere, Wiebke; Overbeck, Tobias; Hohloch, Karin; Trümper, Lorenz; Sezer, Orhan; Brockmöller, Jürgen
2007-07-01
Melphalan is widely used in the treatment of multiple myeloma. The pharmacokinetics of this alkylating drug show high inter-individual variability. As melphalan is a phenylalanine derivative, the pharmacokinetic variability may be determined by genetic polymorphisms in the L-type amino acid transporters LAT1 (SLC7A5) and LAT2 (SLC7A8). Pharmacokinetics were analysed in 64 patients after first administration of intravenous melphalan. Severity of side effects was documented according to WHO criteria. Genomic DNA was analysed for polymorphisms in LAT1 and LAT2 by sequencing of the entire coding region, intron-exon boundaries and 2 kb of the upstream promoter region. Selected polymorphisms in the common heavy chain of both transporters, the protein 4F2hc (SLC3A2), were analysed by single nucleotide primer extension. Melphalan pharmacokinetics were highly variable, with up to 6.2-fold differences in total clearance. A total of 44 polymorphisms were identified in LAT1 and 21 polymorphisms in LAT2. Of all the variants, only five were in the coding region, and only one heterozygous non-synonymous polymorphism (Ala94Thr) was found in LAT2. Numerous polymorphisms were found in the LAT1 and LAT2 5'-flanking regions but did not correlate with expression of the respective genes. No significant correlations were observed between the polymorphisms in 4F2hc, LAT1, and LAT2 and melphalan pharmacokinetics or side effects. The study confirmed that these transporter genes are highly conserved, particularly in the coding sequences. Genetic variation in 4F2hc, LAT1, and LAT2 does not appear to be a major cause of inter-individual variability in pharmacokinetics and of adverse reactions to melphalan.
Atkinson, Sophie; Marguerat, Samuel; Bitton, Danny; Bachand, Francois; Rodriguez-Lopez, Maria; Rallis, Charalampos; Lemay, Jean-Francois; Cotobal, Cristina; Malecki, Michal; Smialowski, Pawel; Mata, Juan; Korber, Philipp; Bahler, Jurg
2018-06-18
Long non-coding RNAs (lncRNAs), which are longer than 200 nucleotides but often unstable, contribute a substantial and diverse portion to pervasive non-coding transcriptomes. Most lncRNAs are poorly annotated and understood, although several play important roles in gene regulation and diseases. Here we systematically uncover and analyse lncRNAs in Schizosaccharomyces pombe. Based on RNA-seq data from twelve RNA-processing mutants and nine physiological conditions, we identify 5775 novel lncRNAs, nearly four times the number previously annotated. The expression of most lncRNAs is strongly induced under the genetic and physiological perturbations, most notably during late meiosis. Most lncRNAs are cryptic and suppressed by three RNA-processing pathways: the nuclear exosome, the cytoplasmic exonuclease, and RNAi. Double-mutant analyses reveal substantial coordination and redundancy among these pathways. We classify lncRNAs by their dominant pathway into cryptic unstable transcripts (CUTs), Xrn1-sensitive unstable transcripts (XUTs), and Dicer-sensitive unstable transcripts (DUTs). XUTs and DUTs are enriched for antisense lncRNAs, while CUTs are often bidirectional and actively translated. The cytoplasmic exonuclease, along with RNAi, dampens the expression of thousands of lncRNAs and mRNAs that become induced during meiosis. Antisense lncRNA expression mostly negatively correlates with sense mRNA expression in the physiological, but not the genetic, conditions. Intergenic and bidirectional lncRNAs emerge from nucleosome-depleted regions, upstream of positioned nucleosomes. Our results highlight both similarities and differences to lncRNA regulation in budding yeast. This broad survey of the lncRNA repertoire and characteristics in S. pombe, and of the interwoven regulatory pathways that target lncRNAs, provides a rich framework for their further functional analyses. Published by Cold Spring Harbor Laboratory Press for the RNA Society.
Ferrero, Giulio; Cordero, Francesca; Tarallo, Sonia; Arigoni, Maddalena; Riccardo, Federica; Gallo, Gaetano; Ronco, Guglielmo; Allasia, Marco; Kulkarni, Neha; Matullo, Giuseppe; Vineis, Paolo; Calogero, Raffaele A; Pardini, Barbara; Naccarati, Alessio
2018-01-09
The role of non-coding RNAs in different biological processes and diseases is continuously expanding. Next-generation sequencing, together with the parallel improvement of bioinformatics analyses, allows the accurate detection and quantification of an increasing number of RNA species. With the aim of exploring new potential biomarkers for disease classification, a clear overview of the expression levels of common/unique small RNA species among different biospecimens is necessary. However, except for miRNAs in plasma, there are no substantial indications about the pattern of expression of various small RNAs in multiple specimens among healthy humans. By analysing small RNA-sequencing data from 243 samples, we have identified and compared the most abundantly and uniformly expressed miRNAs and non-miRNA species of a size comparable to that selected during library preparation, in four different specimens (plasma exosomes, stool, urine, and cervical scrapes). Eleven miRNAs were commonly detected among all the different specimens, while 231 miRNAs were unique to a single specimen type. Classification analysis using these miRNAs provided an accuracy of 99.6% in recognizing the sample types. piRNAs and tRNAs were the most represented non-miRNA small RNAs detected in all the specimen types analysed, particularly in urine samples. With the present data, the most uniformly expressed small RNAs in each sample type were also identified. A signature of small RNAs for each specimen could represent a reference gene set in validation studies by RT-qPCR. Overall, the data reported hereby provide an insight into the constitution of the human miRNome and of other small non-coding RNAs in various specimens of healthy individuals.
ICD-10 procedure codes produce transition challenges
Boyd, Andrew D.; Li, Jianrong ‘John’; Kenost, Colleen; Zaim, Samir Rachid; Krive, Jacob; Mittal, Manish; Satava, Richard A.; Burton, Michael; Smith, Jacob; Lussier, Yves A.
2018-01-01
The transition of procedure coding from ICD-9-CM-Vol-3 to ICD-10-PCS has generated problems for the medical community at large, resulting from the lack of clarity required to integrate two non-congruent coding systems. We hypothesized that quantifying these issues with network topology analyses offers a better understanding of them, and we therefore developed solutions (online tools) to empower hospital administrators and researchers to address these challenges. Five topologies were identified: “identity” (I), “class-to-subclass” (C2S), “subclass-to-class” (S2C), “convoluted” (C), and “no mapping” (NM). The procedure codes in the 2010 Illinois Medicaid dataset (3,290 patients, 116 institutions) were categorized as C=55%, C2S=40%, I=3%, NM=2%, and S2C=1%. The majority of the problematic and ambiguous mappings (“convoluted”) pertained to operations in ophthalmology, cardiology, urology, gyneco-obstetrics, and dermatology. Finally, the algorithms were expanded into a user-friendly tool to identify problematic topologies and specify lists of procedural codes utilized by medical professionals and researchers for mitigating error-prone translations, simplifying research, and improving quality. http://www.lussiergroup.org/transition-to-ICD10PCS PMID:29888037
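The five topologies reduce to cardinality checks on the forward and reverse code mappings. A minimal sketch, assuming the mapping is supplied as a dict from each ICD-9 code to its set of ICD-10-PCS targets; this illustrates the classification logic only and is not the authors' released tool:

```python
from collections import defaultdict

def classify_topologies(fwd):
    """fwd: dict mapping each ICD-9 code to its set of ICD-10-PCS targets
    (an empty set means no mapping)."""
    rev = defaultdict(set)                       # icd10 -> set of icd9 sources
    for i9, targets in fwd.items():
        for t in targets:
            rev[t].add(i9)
    topology = {}
    for i9, targets in fwd.items():
        back = set().union(*(rev[t] for t in targets)) if targets else set()
        if not targets:
            topology[i9] = "no mapping"
        elif len(targets) == 1 and back == {i9}:
            topology[i9] = "identity"
        elif back == {i9}:
            topology[i9] = "class-to-subclass"   # one code fans out, unshared
        elif len(targets) == 1:
            topology[i9] = "subclass-to-class"   # several codes collapse to one
        else:
            topology[i9] = "convoluted"          # many-to-many entanglement
    return topology

# Toy mapping with an invented fan-out and an unmapped code.
print(classify_topologies({"81.51": {"0SR9019", "0SR901A"}, "99.99": set()}))
```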
Local variations in the timing of RSV epidemics.
Noveroske, Douglas B; Warren, Joshua L; Pitzer, Virginia E; Weinberger, Daniel M
2016-11-11
Respiratory syncytial virus (RSV) is a primary cause of hospitalizations in children worldwide. The timing of seasonal RSV epidemics needs to be known in order to administer prophylaxis to high-risk infants at the appropriate time. We used data from the Connecticut State Inpatient Database to identify RSV hospitalizations based on ICD-9 diagnostic codes. Harmonic regression analyses were used to evaluate RSV epidemic timing at the county and ZIP code levels. Linear regression was used to investigate associations between the socioeconomic status of a locality and RSV epidemic timing. 9,740 hospitalizations coded as RSV occurred among children less than 2 years old between July 1, 1997 and June 30, 2013. The earliest ZIP code had a seasonal RSV epidemic that peaked, on average, 4.64 weeks earlier than the latest ZIP code. Earlier epidemic timing was significantly associated with demographic characteristics (higher population density and larger fraction of the population that was black). Seasonal RSV epidemics in Connecticut occurred earlier in areas that were more urban (higher population density and larger fraction of the population that was black). These findings could be used to better time the administration of prophylaxis to high-risk infants.
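Harmonic regression recovers epidemic timing by fitting an annual sinusoid to the case series and reading the peak off the fitted phase. A minimal sketch on toy weekly counts; the log transform and single annual harmonic are assumptions, not necessarily the authors' exact specification:

```python
# Fit mean + annual cosine/sine to log counts, then convert the fitted
# phase into the average peak week of the year.
import numpy as np

def peak_week(counts, period=52):
    t = np.arange(len(counts), dtype=float)
    omega = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
    b0, b1, b2 = np.linalg.lstsq(X, np.log(np.asarray(counts) + 1.0), rcond=None)[0]
    return (np.arctan2(b2, b1) / omega) % period   # peak of b1*cos + b2*sin

rng = np.random.default_rng(0)
t = np.arange(52 * 5)
lam = np.exp(1.0 + 1.5 * np.cos(2 * np.pi * (t - 2) / 52))  # true peak: week 2
counts = rng.poisson(lam)
print(f"estimated peak: week {peak_week(counts):.1f}")      # recovers ~2
```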
Design Analysis of SNS Target Station Biological Shielding Monolith with Proton Power Uprate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bekar, Kursat B.; Ibrahim, Ahmad M.
2017-05-01
This report documents the analysis of the dose rate in the experiment area outside the Spallation Neutron Source (SNS) target station shielding monolith with a proton beam energy of 1.3 GeV. The analysis implemented a coupled three-dimensional (3D)/two-dimensional (2D) approach that used both the Monte Carlo N-Particle Extended (MCNPX) 3D Monte Carlo code and the Discrete Ordinates Transport (DORT) 2D deterministic code. The analysis with a proton beam energy of 1.3 GeV showed that the dose rate in continuously occupied areas on the lateral surface outside the SNS target station shielding monolith is less than 0.25 mrem/h, which complies with the SNS facility design objective. However, the methods and codes used in this analysis are out of date and unsupported, and the 2D approximation of the target shielding monolith does not accurately represent the geometry. We recommend that this analysis be updated with modern codes and libraries such as ADVANTG or SHIFT. These codes have demonstrated very high efficiency in performing full 3D radiation shielding analyses of similar and even more difficult problems.
RNA editing differently affects protein-coding genes in D. melanogaster and H. sapiens.
Grassi, Luigi; Leoni, Guido; Tramontano, Anna
2015-07-14
When an RNA editing event occurs within a coding sequence it can lead to a different encoded amino acid. The biological significance of these events remains an open question: they can modulate protein functionality, increase the complexity of transcriptomes or arise from a loose specificity of the involved enzymes. We analysed the editing events in coding regions that do or do not produce a change in the encoded amino acid (nonsynonymous and synonymous events, respectively) in D. melanogaster and in H. sapiens and compared them with the appropriate random models. Interestingly, our results show that the phenomenon has rather different characteristics in the two organisms. For example, we confirm the observation that editing events occur more frequently in non-coding than in coding regions, and report that this effect is much more evident in H. sapiens. Additionally, in this latter organism, editing events tend to affect less conserved residues. The less frequently occurring editing events in Drosophila tend to avoid drastic amino acid changes. Interestingly, we find that, in Drosophila, changes from less frequently used codons to more frequently used ones are favoured, while this is not the case in H. sapiens.
Fukushima Daiichi Radionuclide Inventories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cardoni, Jeffrey N.; Jankovsky, Zachary Kyle
Radionuclide inventories are generated to permit detailed analyses of the Fukushima Daiichi meltdowns. This is necessary information for severe accident calculations, dose calculations, and source term and consequence analyses. Inventories are calculated using SCALE6 and compared to values predicted by international researchers supporting the OECD/NEA's Benchmark Study on the Accident at Fukushima Daiichi Nuclear Power Station (BSAF). Both sets of inventory information are acceptable for best-estimate analyses of the Fukushima reactors. Consistent nuclear information for severe accident codes, including radionuclide class masses and core decay powers, is also derived from the SCALE6 analyses. Key nuclide activity ratios are calculated as functions of burnup and nuclear data in order to explore their utility for nuclear forensics and to support future decommissioning efforts.
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Perry, Boyd, III; Florance, James R.; Sanetrik, Mark D.; Wieseman, Carol D.; Stevens, William L.; Funk, Christie J.; Hur, Jiyoung; Christhilf, David M.; Coulson, David A.
2011-01-01
A summary of computational and experimental aeroelastic and aeroservoelastic (ASE) results for the Semi-Span Super-Sonic Transport (S4T) wind-tunnel model is presented. A broad range of analyses and multiple ASE wind-tunnel tests of the S4T have been performed in support of the ASE element in the Supersonics Program, part of NASA's Fundamental Aeronautics Program. The computational results to be presented include linear aeroelastic and ASE analyses, nonlinear aeroelastic analyses using an aeroelastic CFD code, and rapid aeroelastic analyses using CFD-based reduced-order models (ROMs). Experimental results from two closed-loop wind-tunnel tests performed at NASA Langley's Transonic Dynamics Tunnel (TDT) will be presented as well.
NASA Technical Reports Server (NTRS)
Whitlow, W., Jr.; Bennett, R. M.
1982-01-01
Since the aerodynamic theory is nonlinear, the method requires the coupling of two iterative processes - an aerodynamic analysis and a structural analysis. A full potential analysis code, FLO22, is combined with a linear structural analysis to yield aerodynamic load distributions on and deflections of elastic wings. This method was used to analyze an aeroelastically-scaled wind tunnel model of a proposed executive-jet transport wing and an aeroelastic research wing. The results are compared with the corresponding rigid-wing analyses, and some effects of elasticity on the aerodynamic loading are noted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Platania, P., E-mail: platania@ifp.cnr.it; Figini, L.; Farina, D.
The purpose of this work is the optical modeling and evaluation of the physical performance of the JT-60SA ECRF launcher system. The beams have been simulated with the electromagnetic code GRASP® and used as input for ECCD calculations performed with the beam tracing code GRAY, which is capable of modeling the propagation, absorption and current drive of an EC Gaussian beam with general astigmatism. Full details of the optical analysis have been taken into account to model the launched beams. Inductive and advanced reference scenarios have been analysed for physical evaluations over the full poloidal and toroidal steering ranges for two slightly different layouts of the launcher system.
Calculations of the energy levels and oscillator strengths of the Ne-like Fe Ion (Fe XVII)
NASA Astrophysics Data System (ADS)
Zhong, Jia-yong; Zhang, Jie; Zhao, Gang; Lu, Xin
Energy levels and oscillator strengths among the 27 fine-structure levels belonging to the (1s²2s²)2p⁶, 2p⁵3s, 2p⁵3p and 2p⁵3d configurations of the neon-like iron ion have been calculated using three atomic structure codes: RCN/RCG, AUTOSTRUCTURE (AS) and GRASP. Relativistic corrections of the wave functions are taken into account in the RCN/RCG calculation. The results agree well with the available experimental and theoretical data. The accuracy of the three codes is analysed.
Komeda, Masao; Kawasaki, Kozo; Obara, Toru
2013-04-01
We studied a new silicon irradiation holder with a neutron filter designed to make the vertical neutron flux profile uniform. Since an irradiation holder has to be made of a low activation material, we applied aluminum blended with B4C as the holder material. Irradiation methods to achieve uniform flux with a filter are discussed using Monte-Carlo calculation code MVP. Validation of the use of the MVP code for the holder's analyses is also discussed via characteristic experiments. Copyright © 2013 Elsevier Ltd. All rights reserved.
Application of numerical methods to heat transfer and thermal stress analysis of aerospace vehicles
NASA Technical Reports Server (NTRS)
Wieting, A. R.
1979-01-01
The paper describes a thermal-structural design analysis study of a fuel-injection strut for a hydrogen-cooled scramjet engine for a supersonic transport, utilizing finite-element methodology. Applications of finite-element and finite-difference codes to the thermal-structural design-analysis of space transports and structures are discussed. The interaction between the thermal and structural analyses has led to development of finite-element thermal methodology to improve the integration between these two disciplines. The integrated thermal-structural analysis capability developed within the framework of a computer code is outlined.
Validity of administrative coding in identifying patients with upper urinary tract calculi.
Semins, Michelle J; Trock, Bruce J; Matlaga, Brian R
2010-07-01
Administrative databases are increasingly used for epidemiological investigations. We performed a study to assess the validity of ICD-9 codes for upper urinary tract stone disease in an administrative database. We retrieved the records of all inpatients and outpatients at Johns Hopkins Hospital between November 2007 and October 2008 with an ICD-9 code of 592, 592.0, 592.1 or 592.9 as one of the first 3 diagnosis codes. A random number generator selected 100 encounters for further review. We considered a patient to have a true diagnosis of an upper tract stone if the medical records specifically referenced a kidney stone event, or included current or past treatment for a kidney stone. Descriptive and comparative analyses were performed. A total of 8,245 encounters coded as upper tract calculus were identified and 100 were randomly selected for review. Two patients could not be identified within the electronic medical record and were excluded from the study. The positive predictive value of using all ICD-9 codes for an upper tract calculus (592, 592.0, 592.1) to identify subjects with renal or ureteral stones was 95.9%. For 592.0 only the positive predictive value was 85%. However, although the positive predictive value for 592.1 only was 100%, 26 subjects (76%) with a ureteral stone were not appropriately billed with this code. ICD-9 coding for urinary calculi is likely to be sufficiently valid to be useful in studies using administrative data to analyze stone disease. However, ICD-9 coding is not a reliable means to distinguish between subjects with renal and ureteral calculi. Copyright (c) 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
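The headline figure here is simple validation arithmetic: the PPV is the fraction of code-flagged, reviewable charts confirmed as true stone events. A sketch in which 94 confirmed charts out of the 98 reviewable ones reproduce the 95.9% reported above; the normal-approximation interval helper is illustrative:

```python
from math import sqrt

def ppv_with_ci(true_positives, reviewed, z=1.96):
    """PPV = confirmed events / reviewed code-flagged charts, with a 95% CI."""
    p = true_positives / reviewed
    half_width = z * sqrt(p * (1 - p) / reviewed)
    return p, p - half_width, p + half_width

p, lo, hi = ppv_with_ci(94, 98)   # 94 of 98 usable charts confirmed
print(f"PPV {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```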
System Level RBDO for Military Ground Vehicles using High Performance Computing
2008-01-01
platform. Only the analyses that required more than 24 processors were conducted on the Onyx 350 due to the limited number of processors on the ... optimization constraints varied. The queues set the number of processors and the number of finite element code licenses available to the analyses. sgi ONYX 3900 (unix): 24 MIPS R16000 processors, 4 IR2 graphics pipes, 4 IR3 graphics pipes, 24 Gbytes memory, 36 Gbytes local disk space. sgi ONYX 350 (unix): 32 MIPS ...
Performance simulation in high altitude platforms (HAPs) communications systems
NASA Astrophysics Data System (ADS)
Ulloa-Vásquez, Fernando; Delgado-Penin, J. A.
2002-07-01
This paper considers the analysis by simulation of a digital narrowband communication system for a scenario consisting of a High-Altitude aeronautical Platform (HAP) and fixed/mobile terrestrial transceivers. The aeronautical channel is modelled using geometrical (angle of elevation vs. horizontal distance of the terrestrial reflectors) and statistical arguments, and under these circumstances a serially concatenated coded digital transmission is analysed for several hypotheses related to radio-electric coverage areas. The results indicate good feasibility for the proposed communication system.
Munasinghe, A; Chang, D; Mamidanna, R; Middleton, S; Joy, M; Penninckx, F; Darzi, A; Livingston, E; Faiz, O
2014-07-01
Significant variation in colorectal surgery outcomes exists between different countries. Better understanding of the sources of variable outcomes using administrative data requires alignment of differing clinical coding systems. We aimed to map similar diagnoses and procedures across administrative coding systems used in different countries. Administrative data were collected in a central database as part of the Global Comparators (GC) Project. In order to unify these data, a systematic translation of diagnostic and procedural codes was undertaken. Codes for colorectal diagnoses, resections, operative complications and reoperative interventions were mapped across the respective national healthcare administrative coding systems. Discharge data from January 2006 to June 2011 for patients who had undergone colorectal surgical resections were analysed to generate risk-adjusted models for mortality, length of stay, readmissions and reoperations. In all, 52,544 case records were collated from 31 institutions in five countries. Mapping of all the coding systems was achieved so that diagnoses and procedures from the participant countries could be compared. Using the aligned coding systems to develop risk-adjusted models, the 30-day mortality rate for colorectal surgery was 3.95% (95% CI 0.86-7.54), the 30-day readmission rate was 11.05% (5.67-17.61), the 28-day reoperation rate was 6.13% (3.68-9.66) and the mean length of stay was 14 (7.65-46.76) days. The linkage of international hospital administrative data that we developed enabled comparison of documented surgical outcomes between countries. This methodology may facilitate international benchmarking. Colorectal Disease © 2014 The Association of Coloproctology of Great Britain and Ireland.
Pang, Jack X Q; Ross, Erin; Borman, Meredith A; Zimmer, Scott; Kaplan, Gilaad G; Heitman, Steven J; Swain, Mark G; Burak, Kelly W; Quan, Hude; Myers, Robert P
2015-09-11
Epidemiologic studies of alcoholic hepatitis (AH) have been hindered by the lack of a validated International Classification of Disease (ICD) coding algorithm for use with administrative data. Our objective was to validate coding algorithms for AH using a hospitalization database. The Hospital Discharge Abstract Database (DAD) was used to identify consecutive adults (≥18 years) hospitalized in the Calgary region with a diagnosis code for AH (ICD-10, K70.1) between 01/2008 and 08/2012. Medical records were reviewed to confirm the diagnosis of AH, defined as a history of heavy alcohol consumption, elevated AST and/or ALT (<300 U/L), serum bilirubin >34 μmol/L, and elevated INR. Subgroup analyses were performed according to the diagnosis field in which the code was recorded (primary vs. secondary) and AH severity. Algorithms that incorporated ICD-10 codes for cirrhosis and its complications were also examined. Of 228 potential AH cases, 122 patients had confirmed AH, corresponding to a positive predictive value (PPV) of 54% (95% CI 47-60%). PPV improved when AH was the primary versus a secondary diagnosis (67% vs. 21%; P < 0.001). Algorithms that included diagnosis codes for ascites (PPV 75%; 95% CI 63-86%), cirrhosis (PPV 60%; 47-73%), and gastrointestinal hemorrhage (PPV 62%; 51-73%) had improved performance; however, the prevalence of these diagnoses in confirmed AH cases was low (29-39%). In conclusion, the low PPV of the diagnosis code for AH suggests that caution is necessary if this hospitalization database is used in large-scale epidemiologic studies of this condition.
Thermal-structural analyses of Space Shuttle Main Engine (SSME) hot section components
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Thompson, Robert L.
1988-01-01
Three dimensional nonlinear finite element heat transfer and structural analyses were performed for the first stage high pressure fuel turbopump (HPFTP) blade of the space shuttle main engine (SSME). Directionally solidified (DS) MAR-M 246 and single crystal (SC) PWA-1480 material properties were used for the analyses. Analytical conditions were based on a typical test stand engine cycle. Blade temperature and stress strain histories were calculated by using the MARC finite element computer code. The structural response of an SSME turbine blade was assessed and a greater understanding of blade damage mechanisms, convective cooling effects, and thermal mechanical effects was gained.
Composite blade structural analyzer (COBSTRAN) user's manual
NASA Technical Reports Server (NTRS)
Aiello, Robert A.
1989-01-01
The installation and use of a computer code, COBSTRAN (COmposite Blade STRuctural ANalyzer), developed for the design and analysis of composite turbofan and turboprop blades and also for composite wind turbine blades, are described. This code combines composite mechanics and laminate theory with an internal database of fiber and matrix properties. Inputs to the code are constituent fiber and matrix material properties, factors reflecting the fabrication process, composite geometry and blade geometry. COBSTRAN performs the micromechanics, macromechanics and laminate analyses of these fiber composites. COBSTRAN generates a NASTRAN model with equivalent anisotropic homogeneous material properties. Stress output from NASTRAN is used to calculate individual ply stresses, strains, interply stresses, through-the-thickness stresses and failure margins. Curved panel structures may be modeled provided the curvature of a cross-section is defined by a single-valued function. COBSTRAN is written in FORTRAN 77.
NASA Astrophysics Data System (ADS)
Caminata, A.; Agostini, M.; Altenmüller, K.; Appel, S.; Bellini, G.; Benziger, J.; Berton, N.; Bick, D.; Bonfini, G.; Bravo, D.; Caccianiga, B.; Calaprice, F.; Cavalcante, P.; Chepurnov, A.; Choi, K.; Cribier, M.; D'Angelo, D.; Davini, S.; Derbin, A.; Di Noto, L.; Drachnev, I.; Durero, M.; Empl, A.; Etenko, A.; Farinon, S.; Fischer, V.; Fomenko, K.; Franco, D.; Gabriele, F.; Gaffiot, J.; Galbiati, C.; Ghiano, C.; Giammarchi, M.; Goeger-Neff, M.; Goretti, A.; Gromov, M.; Hagner, C.; Houdy, T.; Hungerford, E.; Ianni, Aldo; Ianni, Andrea; Jonquères, N.; Jedrzejczak, K.; Kaiser, M.; Kobychev, V.; Korablev, D.; Korga, G.; Kornoukhov, V.; Kryn, D.; Lachenmaier, T.; Lasserre, T.; Laubenstein, M.; Lehnert, B.; Link, J.; Litvinovich, E.; Lombardi, F.; Lombardi, P.; Ludhova, L.; Lukyanchenko, G.; Machulin, I.; Manecki, S.; Maneschg, W.; Marcocci, S.; Maricic, J.; Mention, G.; Meroni, E.; Meyer, M.; Miramonti, L.; Misiaszek, M.; Montuschi, M.; Mosteiro, P.; Muratova, V.; Musenich, R.; Neumair, B.; Oberauer, L.; Obolensky, M.; Ortica, F.; Pallavicini, M.; Papp, L.; Perasso, L.; Pocar, A.; Ranucci, G.; Razeto, A.; Re, A.; Romani, A.; Roncin, R.; Rossi, N.; Schönert, S.; Scola, L.; Semenov, D.; Simgen, H.; Skorokhvatov, M.; Smirnov, O.; Sotnikov, A.; Sukhotin, S.; Suvorov, Y.; Tartaglia, R.; Testera, G.; Thurn, J.; Toropova, M.; Unzhakov, E.; Veyssiere, C.; Vishneva, A.; Vivier, M.; Vogelaar, R. B.; von Feilitzsch, F.; Wang, H.; Weinz, S.; Winter, J.; Wojcik, M.; Wurm, M.; Yokley, Z.; Zaimidoroga, O.; Zavatarelli, S.; Zuber, K.; Zuzel, G.
2016-02-01
Borexino is an unsegmented neutrino detector operating at LNGS in central Italy. The experiment has demonstrated its performance through unprecedented accomplishments in solar and geoneutrino detection. This performance makes it an ideal tool for a state-of-the-art experiment able to test the existence of sterile neutrinos (the SOX experiment). For both the solar and the SOX analyses, a good understanding of the detector response is fundamental. Consequently, calibration campaigns with radioactive sources have been performed over the years. The calibration data are of extreme importance for developing an accurate Monte Carlo code. This code is used in all the neutrino analyses. The Borexino-SOX calibration techniques and program, and the advances in the detector simulation code in view of the start of SOX data taking, are presented.
Developments in REDES: The rocket engine design expert system
NASA Technical Reports Server (NTRS)
Davidian, Kenneth O.
1990-01-01
The Rocket Engine Design Expert System (REDES) is being developed at NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes are included in REDES: a gas properties program named GASP, a nozzle design program named RAO, a regenerative cooling channel performance evaluation code named RTE, and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES is built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.
Hypersonic CFD applications for the National Aero-Space Plane
NASA Technical Reports Server (NTRS)
Richardson, Pamela F.; Mcclinton, Charles R.; Bittner, Robert D.; Dilley, A. Douglas; Edwards, Kelvin W.
1989-01-01
Design and analysis of the NASP depends heavily upon developing the critical technology areas that cover the entire engineering design of the vehicle. These areas include materials, structures, propulsion systems, propellants, integration of airframe and propulsion systems, controls, subsystems, and aerodynamics. Currently, verification of many of the classical engineering tools relies heavily on computational fluid dynamics. Advances are being made in the development of CFD codes to accomplish nose-to-tail analyses for hypersonic aircraft. Additional details involving the ongoing development, analysis, verification, and application of the CFL3D code and the SPARK combustor code are discussed. A nonequilibrium version of CFL3D that is presently being developed and tested is also described. Examples are given of calculations for portions of research hypersonic aircraft geometries, and comparisons with experimental data show good agreement.
Space shuttle rendezvous, radiation and reentry analysis code
NASA Technical Reports Server (NTRS)
Mcglathery, D. M.
1973-01-01
A preliminary space shuttle mission design and analysis tool is reported, emphasizing versatility, flexibility, and user interaction through the use of a relatively small computer (IBM-7044). The Space Shuttle Rendezvous, Radiation and Reentry Analysis Code is used to perform mission and space radiation environmental analyses for four typical space shuttle missions. Also included is a version of the proposed Apollo/Soyuz rendezvous and docking test mission. Tangential-steering, circle-to-circle low-thrust tug orbit raising and the effects of the trapped radiation environment on trajectory shaping due to solar electric power losses are also features of this mission analysis code. The computational results include a parametric study on single-impulse versus double-impulse deorbiting for relatively low space shuttle orbits as well as some definitive data on the magnetically trapped protons and electrons encountered on a particular mission.
Musshauser, Doris; Bader, Angelika; Wildt, Beatrice; Hochleitner, Margarethe
2006-09-01
The aim of the present study was to evaluate the physical and mental health status of female workers from five different occupational groups and to identify possible sociodemographic and gender-coded family-related factors as well as work characteristics influencing women's health. The identified predictors of health status were subjected to a gender-sensitive analysis and their relations to one another are discussed. A total of 1083 female hospital workers including medical doctors, technical and administrative personnel, nurses and a group mainly consisting of scientific personnel and psychologists completed a questionnaire measuring work- and family-related variables, sociodemographic data and the Short-form 36 Health Questionnaire (SF-36). Data were analysed by multivariate regression analyses. Female medical doctors reported the highest scores for all physical health dimensions except General Health. In our study population, mental health status was generally low among administrative personnel, whereas the heterogeneous group "others" scored highest on all mental health component scores. A series of eight regression analyses was performed. Three variables contributed highly significantly to all SF-36 subscale scores: age, satisfaction with work schedule, and the unpaid work variable. Age had the strongest influence on all physical dimensions except General Health (beta=-0.17) and had no detectable influence on mental health scores. The unpaid work variable (beta=-0.23; p<0.001) exerted a stronger influence on General Health than did age. Nevertheless, these variables were limited predictors of physical and mental health status. In all occupational groups the amount of time spent daily on child care and household tasks, as a traditional gender-coded factor, and satisfaction with work schedule were the only contributors to mental health among the working women in this study. Traditional sociodemographic data had no effect on mental health status. In addition to age, these factors were shown to be the only predictors of the physical health status of female workers. Gender-coded factors matter. These findings underline the importance of including gender-coded family- and work-related variables in medical research over and above basic sociodemographic data in order to describe study populations more clearly.
Impact of recent molecular phylogenetic studies on classification of ascomycete yeasts
USDA-ARS's Scientific Manuscript database
Analyses of concatenated gene sequences as well as whole genome sequences are resolving relationships among the ascomycete yeasts (Saccharomycotina), thus allowing classification of members of this subphylum to be based on phylogeny. In addition, changes implemented in the new Botanical Code [Intern...
Ethnography of Communication: Cultural Codes and Norms.
ERIC Educational Resources Information Center
Carbaugh, Donal
The primary tasks of the ethnographic researcher are to discover, describe, and comparatively analyze different speech communities' ways of speaking. Two general abstractions occurring in ethnographic analyses are normative and cultural. Communicative norms are formulated in analyzing and explaining the "patterned use of speech."…
Translanguaging: Developing Its Conceptualisation and Contextualisation
ERIC Educational Resources Information Center
Lewis, Gwyn; Jones, Bryn; Baker, Colin
2012-01-01
Following from Lewis, Jones, and Baker (this issue), this article analyses the relationship between the new concept of "translanguaging" particularly in the classroom context and more historic terms such as code-switching and translation, indicating differences in (socio)linguistic and ideological understandings as well as in classroom…
Aerodynamic Database Development for Mars Smart Lander Vehicle Configurations
NASA Technical Reports Server (NTRS)
Bobskill, Glenn J.; Parikh, Paresh C.; Prabhu, Ramadas K.; Tyler, Erik D.
2002-01-01
An aerodynamic database has been generated for the Mars Smart Lander Shelf-All configuration using computational fluid dynamics (CFD) simulations. Three different CFD codes were used: USM3D and FELISA, based on unstructured grid technology, and LAURA, an established and validated structured CFD code. As part of this database development, the results for the Mars continuum were validated with experimental data and comparisons made where applicable. The validation of USM3D and LAURA with the Unitary experimental data, the use of intermediate LAURA check analyses, as well as the validation of FELISA with the Mach 6 CF₄ experimental data provided higher confidence in the ability of CFD to provide aerodynamic data for determining the static trim characteristics for longitudinal stability. The analyses of the noncontinuum regime showed the existence of multiple trim angles of attack that can be unstable or stable trim points. This information is needed to design the guidance controller throughout the trajectory.
Assessment of seismic design response factors of concrete wall buildings
NASA Astrophysics Data System (ADS)
Mwafy, Aman
2011-03-01
To verify the seismic design response factors of high-rise buildings, five reference structures, varying in height from 20 to 60 stories, were selected and designed according to modern design codes to represent a wide range of concrete wall structures. Verified fiber-based analytical models for inelastic simulation were developed, considering the geometric nonlinearity and material inelasticity of the structural members. The ground motion uncertainty was accounted for by employing 20 earthquake records representing two seismic scenarios, consistent with the latest understanding of the tectonic setting and seismicity of the selected reference region (UAE). A large number of Inelastic Pushover Analyses (IPAs) and Incremental Dynamic Collapse Analyses (IDCAs) were deployed for the reference structures to estimate the seismic design response factors. It is concluded that the factors adopted by the design code are adequately conservative. The results of this systematic assessment of seismic design response factors apply to a wide variety of contemporary concrete wall buildings with various characteristics.
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.; Inger, George R.
1999-01-01
The local viscous-inviscid interaction field generated by a wall temperature jump on a flat plate in supersonic flow and on the windside of a Reusable Launch Vehicle in hypersonic flow is studied in detail by both a Navier-Stokes numerical code and an analytical triple-deck model. Treatment of the rapid heat transfer changes both upstream and downstream of the jump is included. Closed form relationships derived from the triple-deck theory are presented. The analytically predicted pressure and heating variations including upstream influence are found to be in generally good agreement with the Computational Fluid Dynamic (CFD) predictions. These analyses not only clarify the interactive physics involved but also are useful in preliminary design of thermal protection systems and as an insertable module to improve CFD code efficiency when applied to such small-scale interaction problems. The analyses only require conditions at the wall and boundary-layer edge which are easily extracted from a baseline, constant wall temperature, CFD solution.
Deep intronic GPR143 mutation in a Japanese family with ocular albinism
Naruto, Takuya; Okamoto, Nobuhiko; Masuda, Kiyoshi; Endo, Takao; Hatsukawa, Yoshikazu; Kohmoto, Tomohiro; Imoto, Issei
2015-01-01
Deep intronic mutations are often ignored as possible causes of human disease. Using whole-exome sequencing, we analysed genomic DNAs of a Japanese family with two male siblings affected by ocular albinism and congenital nystagmus. Although mutations or copy number alterations of coding regions were not identified in candidate genes, the novel intronic mutation c.659-131 T > G within GPR143 intron 5 was identified as hemizygous in affected siblings and as heterozygous in the unaffected mother. This mutation was predicted to create a cryptic splice donor site within intron 5 and activate a cryptic acceptor site at 41nt upstream, causing the insertion into the coding sequence of an out-of-frame 41-bp pseudoexon with a premature stop codon in the aberrant transcript, which was confirmed by minigene experiments. This result expands the mutational spectrum of GPR143 and suggests the utility of next-generation sequencing integrated with in silico and experimental analyses for improving the molecular diagnosis of this disease. PMID:26061757
Trajectory-based heating analysis for the European Space Agency/Rosetta Earth Return Vehicle
NASA Technical Reports Server (NTRS)
Henline, William D.; Tauber, Michael E.
1994-01-01
A coupled, trajectory-based flowfield and material thermal-response analysis is presented for the European Space Agency proposed Rosetta comet nucleus sample return vehicle. The probe returns to earth along a hyperbolic trajectory with an entry velocity of 16.5 km/s and requires an ablative heat shield on the forebody. Combined radiative and convective ablating flowfield analyses were performed for the significant heating portion of the shallow ballistic entry trajectory. Both quasisteady ablation and fully transient analyses were performed for a heat shield composed of carbon-phenolic ablative material. Quasisteady analysis was performed using the two-dimensional axisymmetric codes RASLE and BLIMPK. Transient computational results were obtained from the one-dimensional ablation/conduction code CMA. Results are presented for heating, temperature, and ablation rate distributions over the probe forebody for various trajectory points. Comparison of transient and quasisteady results indicates that, for the heating pulse encountered by this probe, the quasisteady approach is conservative from the standpoint of predicted surface recession.
An organizational perspective on ethics as a form of regulation.
Hoeyer, Klaus; Lynöe, Niels
2009-11-01
In this paper we propose a theoretical framework for analysing the history and function of ethics as a form of regulation. Ethics in the form of codes, rules and declarations constitutes regulatory policy, and we wish to suggest analysing such policies from an organizational perspective. In many instances ethics policies are reactions to particular events involving harm to patients or research participants. As such they seem to come forward as solutions to specific problems. However, not all such events instigate the making of new policies, and policies often have other effects and are used for other purposes than we might expect from the events preceding them: when ethics takes on the form of policy making, the relationship between problems and solutions is more complex. We suggest that an organizational perspective on ethics codes, rules and declarations can deliver a relevant framework for future studies of the implications of wanting to address ethical problems through policy making.
Global analysis of the Burkholderia thailandensis quorum sensing-controlled regulon.
Majerczyk, Charlotte; Brittnacher, Mitchell; Jacobs, Michael; Armour, Christopher D; Radey, Mathew; Schneider, Emily; Phattarasokul, Somsak; Bunt, Richard; Greenberg, E Peter
2014-04-01
Burkholderia thailandensis contains three acyl-homoserine lactone quorum sensing circuits and has two additional LuxR homologs. To identify B. thailandensis quorum sensing-controlled genes, we carried out transcriptome sequencing (RNA-seq) analyses of quorum sensing mutants and their parent. The analyses were grounded in the fact that we identified genes coding for factors shown previously to be regulated by quorum sensing among a larger set of quorum-controlled genes. We also found that genes coding for contact-dependent inhibition were induced by quorum sensing and confirmed that specific quorum sensing mutants had a contact-dependent inhibition defect. Additional quorum-controlled genes included those for the production of numerous secondary metabolites, an uncharacterized exopolysaccharide, and a predicted chitin-binding protein. This study provides insights into the roles of the three quorum sensing circuits in the saprophytic lifestyle of B. thailandensis, and it provides a foundation on which to build an understanding of the roles of quorum sensing in the biology of B. thailandensis and the closely related pathogenic Burkholderia pseudomallei and Burkholderia mallei.
Origin and evolution of the long non-coding genes in the X-inactivation center.
Romito, Antonio; Rougeulle, Claire
2011-11-01
Random X chromosome inactivation (XCI), the eutherian mechanism of X-linked gene dosage compensation, is controlled by a cis-acting locus termed the X-inactivation center (Xic). One of the striking features that characterize the Xic landscape is the abundance of loci transcribing non-coding RNAs (ncRNAs), including Xist, the master regulator of the inactivation process. Recent comparative genomic analyses have depicted the evolutionary scenario behind the origin of the X-inactivation center, revealing that this locus evolved from a region harboring protein-coding genes. During mammalian radiation, this ancestral protein-coding region was disrupted in the marsupial group, whilst in the eutherian lineage it provided the starting material for the non-translated RNAs of the X-inactivation center. The emergence of non-coding genes occurred by a dual mechanism involving loss of protein-coding function of the pre-existing genes and integration of different classes of mobile elements, some of which modeled the structure and sequence of the non-coding genes in a species-specific manner. The nascent genes started to produce transcripts that acquired functions in regulating the epigenetic status of the X chromosome, as shown for Xist, its antisense Tsix and Jpx, and as recently suggested for Ftx. Thus, the appearance of the Xic, which occurred after the divergence between eutherians and marsupials, was the basis for the evolution of random X inactivation as a strategy to achieve dosage compensation. Copyright © 2011. Published by Elsevier Masson SAS.
Shielding evaluation for solar particle events using MCNPX, PHITS and OLTARIS codes.
Aghara, S K; Sriprisan, S I; Singleterry, R C; Sato, T
2015-01-01
Detailed analyses of Solar Particle Events (SPE) were performed to calculate primary and secondary particle spectra behind aluminum, at various thicknesses in water. The simulations were based on the Monte Carlo (MC) radiation transport codes MCNPX 2.7.0 and PHITS 2.64, and on the space radiation analysis website OLTARIS (On-Line Tool for the Assessment of Radiation in Space) version 3.4, which uses the deterministic code HZETRN for transport. The study investigates the impact of SPE spectra transported through a 10 or 20 g/cm² Al shield followed by a 30 g/cm² water slab. Four historical SPE events were selected and used as input source spectra; particle differential spectra for protons, neutrons, and photons are presented. The total particle fluence as a function of depth is presented. In addition to particle flux, dose and dose equivalent values are calculated and compared between the codes and with other published results. Overall, the particle fluence spectra from all three codes show good agreement, with the MC codes agreeing more closely with each other than with the OLTARIS results. The neutron particle fluence from OLTARIS is lower than the results from the MC codes at lower energies (E<100 MeV). Based on mean-square-difference analysis, the results from MCNPX and PHITS agree better for fluence, dose and dose equivalent when compared to the OLTARIS results. Copyright © 2015 The Committee on Space Research (COSPAR). All rights reserved.
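The agreement metric cited above can be as simple as a mean square difference over matched energy bins. A toy sketch; the energy grid and spectra here are invented for illustration:

```python
# Mean square difference between two flux spectra tabulated on the same
# energy grid; smaller values mean closer code-to-code agreement.
import numpy as np

def mean_square_difference(flux_a, flux_b):
    a, b = np.asarray(flux_a, float), np.asarray(flux_b, float)
    return float(np.mean((a - b) ** 2))

energy = np.logspace(0, 3, 50)                 # MeV, toy grid
mcnpx = np.exp(-energy / 200.0)                # toy spectrum
phits = mcnpx * (1 + 0.02 * np.random.default_rng(0).standard_normal(50))
print(f"MSD(MCNPX, PHITS) = {mean_square_difference(mcnpx, phits):.3e}")
```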
Hjerpe, Per; Boström, Kristina Bengtsson; Lindblad, Ulf; Merlo, Juan
2012-12-01
To investigate the impact on ICD coding behaviour of a new case-mix reimbursement system based on coded patient diagnoses. The main hypothesis was that after the introduction of the new system the coding of chronic diseases like hypertension and cancer would increase and the variance in the propensity for coding would decrease at both the physician and health care centre (HCC) levels. Cross-sectional multilevel logistic regression analyses were performed for periods covering the time before and after the introduction of the new reimbursement system. Skaraborg primary care, Sweden. All patients (n = 76,546 to 79,826) 50 years of age and older visiting 468 to 627 physicians at the 22 public HCCs in five consecutive time periods of one year each. Registered codes for hypertension and cancer diseases in the Skaraborg primary care database (SPCD). After the introduction of the new reimbursement system the adjusted prevalence of hypertension and cancer in the SPCD increased from 17.4% to 32.2% and from 0.79% to 2.32%, respectively, probably partly due to increased diagnosis coding of indirect patient contacts. The total variance in the propensity for coding declined simultaneously at the physician level for both diagnosis groups. Changes in the healthcare reimbursement system may directly influence the contents of a research database that retrieves data from clinical practice. This should be taken into account when using such a database for research purposes, and the data should be validated for each diagnosis.
Predicting Spike Occurrence and Neuronal Responsiveness from LFPs in Primary Somatosensory Cortex
Storchi, Riccardo; Zippo, Antonio G.; Caramenti, Gian Carlo; Valente, Maurizio; Biella, Gabriele E. M.
2012-01-01
Local Field Potentials (LFPs) integrate multiple neuronal events like synaptic inputs and intracellular potentials. LFP spatiotemporal features are particularly relevant in view of their applications both in research (e.g. for understanding brain rhythms, inter-areal neural communication and neuronal coding) and in the clinic (e.g. for improving invasive Brain-Machine Interface devices). However, the relation between LFPs and spikes is complex and not fully understood. As spikes represent the fundamental currency of neuronal communication, this gap in knowledge strongly limits our comprehension of the neuronal phenomena underlying LFPs. We investigated the LFP-spike relation during tactile stimulation in the primary somatosensory (S-I) cortex of the rat. First we quantified how reliably LFPs and spikes code for a stimulus occurrence. Then we used the information obtained from our analyses to design a predictive model for spike occurrence based on LFP inputs. The model was endowed with a flexible meta-structure whose exact form, in both parameters and structure, was estimated by using a multi-objective optimization strategy. Our method provided a set of simple nonlinear equations that maximized the match between models and true neurons in terms of spike timings and Peri-Stimulus Time Histograms. We found that both LFPs and spikes can code for stimulus occurrence with millisecond precision, showing, however, high variability. Spike patterns were predicted significantly above chance for 75% of the neurons analysed. Crucially, the level of prediction accuracy depended on the reliability in coding for the stimulus occurrence. The best predictions were obtained when both spikes and LFPs were highly responsive to the stimuli. Spike reliability is known to depend on neuron intrinsic properties (i.e. on channel noise) and on spontaneous local network fluctuations. Our results suggest that the latter, measured through the LFP response variability, play a dominant role. PMID:22586452
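A drastically simplified stand-in for the predictive model described above: a logistic regression mapping a short window of past LFP samples to a per-bin spike probability. The lag window, toy signals, and GLM form are all assumptions; the authors' model was a flexible nonlinear structure found by multi-objective optimization:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def lagged(signal, n_lags):
    """Design matrix whose row t holds the n_lags samples preceding bin t."""
    return np.column_stack([signal[i:len(signal) - n_lags + i]
                            for i in range(n_lags)])

rng = np.random.default_rng(0)
raw = rng.standard_normal(5050)
lfp = np.convolve(raw, np.ones(50) / 50, mode="valid")  # smooth toy LFP trace
spikes = (lfp < -0.1).astype(int)                       # toy spikes on LFP dips

X, y = lagged(lfp, 20), spikes[20:]                     # predict from past LFP
model = LogisticRegression(max_iter=1000).fit(X, y)
print("per-bin spike probabilities:", model.predict_proba(X[:3])[:, 1])
```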
Advanced Small Perturbation Potential Flow Theory for Unsteady Aerodynamic and Aeroelastic Analyses
NASA Technical Reports Server (NTRS)
Batina, John T.
2005-01-01
An advanced small perturbation (ASP) potential flow theory has been developed to improve upon the classical transonic small perturbation (TSP) theories that have been used in various computer codes. These computer codes are typically used for unsteady aerodynamic and aeroelastic analyses in the nonlinear transonic flight regime. The codes exploit the simplicity of stationary Cartesian meshes with the movement or deformation of the configuration under consideration incorporated into the solution algorithm through a planar surface boundary condition. The new ASP theory was developed methodically by first determining the essential elements required to produce full-potential-like solutions with a small perturbation approach on the requisite Cartesian grid. This level of accuracy required a higher-order streamwise mass flux and a mass conserving surface boundary condition. The ASP theory was further developed by determining the essential elements required to produce results that agreed well with Euler solutions. This level of accuracy required mass conserving entropy and vorticity effects, and second-order terms in the trailing wake boundary condition. Finally, an integral boundary layer procedure, applicable to both attached and shock-induced separated flows, was incorporated for viscous effects. The resulting ASP potential flow theory, including entropy, vorticity, and viscous effects, is shown to be mathematically more appropriate and computationally more accurate than the classical TSP theories. The formulaic details of the ASP theory are described fully and the improvements are demonstrated through careful comparisons with accepted alternative results and experimental data. The new theory has been used as the basis for a new computer code called ASP3D (Advanced Small Perturbation - 3D), which also is briefly described with representative results.
A New Capability for Nuclear Thermal Propulsion Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amiri, Benjamin W.; Nuclear and Radiological Engineering Department, University of Florida, Gainesville, FL 32611; Kapernick, Richard J.
2007-01-30
This paper describes a new capability for Nuclear Thermal Propulsion (NTP) design that has been developed, and presents the results of some analyses performed with this design tool. The purpose of the tool is to design to specified mission and material limits, while maximizing system thrust-to-weight. The head end of the design tool utilizes the ROCket Engine Transient Simulation (ROCETS) code to generate a system design and system design requirements as inputs to the core analysis. ROCETS is a modular system-level code which has been used extensively in the liquid rocket engine industry for many years. The core design tool performs high-fidelity reactor core nuclear and thermal-hydraulic design analysis. At the heart of this process are two codes, TMSS-NTP and NTPgen, which together greatly automate the analysis, providing the capability to rapidly produce designs that meet all specified requirements while minimizing mass. A PERL-based command script, called CORE DESIGNER, controls the execution of these two codes and checks for convergence throughout the process. TMSS-NTP is executed first, to produce a suite of core designs that meet the specified reactor core mechanical, thermal-hydraulic and structural requirements. The suite of designs consists of a set of core layouts and, for each core layout, specific designs that span a range of core fuel volumes. NTPgen generates MCNPX models for each of the core designs from TMSS-NTP. Iterative analyses are performed in NTPgen until a reactor design (fuel volume) is identified for each core layout that meets cold and hot operation reactivity requirements and that is zoned to meet a radial core power distribution requirement.
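The CORE DESIGNER workflow described above is essentially a fixed-point iteration between a sizing code and a neutronics code. A heavily simplified sketch of that outer loop; all function names and the toy update rule are stand-ins, not the actual TMSS-NTP/NTPgen interfaces:

```python
# Toy stand-ins for the two codes: the sizing step returns a core layout for
# a given fuel volume; the neutronics step returns the fuel volume needed to
# meet the reactivity targets for that layout.
def run_sizing(fuel_volume):                      # stand-in for TMSS-NTP
    return {"fuel_volume": fuel_volume}

def run_neutronics(layout):                       # stand-in for NTPgen+MCNPX
    return 0.5 * layout["fuel_volume"] + 0.05     # toy fixed-point update (m^3)

def converge_core_design(fuel_volume=1.0, tol=1e-4, max_iter=50):
    """Iterate sizing and neutronics until the fuel volume stops moving."""
    for _ in range(max_iter):
        layout = run_sizing(fuel_volume)
        new_volume = run_neutronics(layout)
        if abs(new_volume - fuel_volume) <= tol * fuel_volume:
            return layout
        fuel_volume = new_volume
    raise RuntimeError("core design did not converge")

print(converge_core_design())                     # converges to ~0.1 m^3
```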
Chung, Kevin C.; Song, Jae W.; Shauver, Melissa J.; Cullison, Terry M.; Noone, R. Barrett
2011-01-01
Background To evaluate the case mix of plastic surgeons in their early years of practice by examining candidate case-logs submitted for the Oral Examination. Methods De-identified data from 2000–2009, consisting of case-logs submitted by young plastic surgery candidates for the Oral Examination, were analyzed. Data consisted of exam year, CPT (Current Procedural Terminology) codes, the designation of each CPT code as cosmetic or reconstructive by the candidate, and patient age and gender. Subgroup analyses for the comprehensive, cosmetic, craniomaxillofacial, and hand surgery modules were performed using the CPT code list designated by the American Board of Plastic Surgery Maintenance of Certification in Plastic Surgery module framework. Results We examined case-logs from a yearly average of 261 candidates over 10 years. Yearly percent changes in median case volume varied more widely for cosmetic surgery (−62.5% to 30%) than for reconstructive surgery (−18.0% to 25.7%). Compared with cosmetic surgery cases per candidate, which varied significantly from year to year (p<0.0001), reconstructive surgery cases per candidate did not vary significantly (p=0.954). Subgroup analyses of the proportions of types of surgical procedures, based on CPT code categories, revealed hand surgery to be the least frequently performed relative to comprehensive, craniomaxillofacial, and cosmetic surgery procedures. Conclusions Graduates of plastic surgery training programs are committed to performing a broad spectrum of reconstructive and cosmetic surgical procedures in their first years of practice. However, hand surgery continues to have a small presence in the practice profiles of young plastic surgeons. PMID:21788850
NASA Astrophysics Data System (ADS)
Brandelik, Andreas
2009-07-01
CALCMIN, an open-source Visual Basic program, was implemented in EXCEL™. The program was primarily developed to support geoscientists in their routine task of calculating structural formulae of minerals on the basis of chemical analyses, mainly obtained by electron microprobe (EMP) techniques. Calculation programs for various minerals are already included in the form of subroutines. These routines are arranged in separate modules containing a minimum of code. The architecture of CALCMIN allows the user to easily develop new calculation routines, or modify existing ones, with little knowledge of programming techniques. By means of a simple mouse click, the program automatically generates a rudimentary framework of code using the object model of the Visual Basic Editor (VBE). Within this framework, simple commands and functions provided by the program can be used, for example, to perform various normalization procedures or to output the results of the computations. For clarity of the code, element symbols are used as variables that are initialized by the program automatically. CALCMIN places no limit on the complexity of the code used, resulting in a wide range of possible applications. Thus, matrix and optimization methods can be included, for instance, to determine end-member contents for subsequent thermodynamic calculations. Diverse input procedures are provided, such as the automated read-in of output files created by the EMP. Furthermore, a subsequent filter routine enables the user to extract specific analyses for use in a corresponding calculation routine. An event-driven, interactive operating mode was selected for easy application of the program. CALCMIN leads the user from the beginning to the end of the calculation process.
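As an illustration of the kind of computation CALCMIN automates (CALCMIN itself is VBA inside EXCEL; this independent Python sketch uses standard molecular weights and invented wt% values), here is the normalization of an olivine microprobe analysis to a 4-oxygen structural formula:

```python
# oxide: (molecular weight in g/mol, cations per oxide, oxygens per oxide)
OXIDES = {
    "SiO2": (60.08, 1, 2),
    "MgO":  (40.30, 1, 1),
    "FeO":  (71.84, 1, 1),
}

def structural_formula(wt_percent, oxygen_basis=4.0):
    mol_oxygen, cations = 0.0, {}
    for oxide, wt in wt_percent.items():
        mw, n_cat, n_oxy = OXIDES[oxide]
        moles = wt / mw                     # moles of oxide per 100 g of sample
        cations[oxide] = moles * n_cat
        mol_oxygen += moles * n_oxy
    scale = oxygen_basis / mol_oxygen       # renormalize to the oxygen basis
    return {oxide: n * scale for oxide, n in cations.items()}

# invented forsteritic olivine analysis; yields roughly Si 1.00, Mg 1.81, Fe 0.19
print(structural_formula({"SiO2": 40.8, "MgO": 49.4, "FeO": 9.2}))
```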
Predicting spike occurrence and neuronal responsiveness from LFPs in primary somatosensory cortex.
Storchi, Riccardo; Zippo, Antonio G; Caramenti, Gian Carlo; Valente, Maurizio; Biella, Gabriele E M
2012-01-01
Local Field Potentials (LFPs) integrate multiple neuronal events such as synaptic inputs and intracellular potentials. LFP spatiotemporal features are particularly relevant in view of their applications both in research (e.g. for understanding brain rhythms, inter-areal neural communication and neuronal coding) and in the clinic (e.g. for improving invasive Brain-Machine Interface devices). However, the relation between LFPs and spikes is complex and not fully understood. As spikes represent the fundamental currency of neuronal communication, this gap in knowledge strongly limits our comprehension of the neuronal phenomena underlying LFPs. We investigated the LFP-spike relation during tactile stimulation in the primary somatosensory (S-I) cortex of the rat. First, we quantified how reliably LFPs and spikes code for stimulus occurrence. Then we used the information obtained from our analyses to design a predictive model for spike occurrence based on LFP inputs. The model was endowed with a flexible meta-structure whose exact form, in both parameters and structure, was estimated using a multi-objective optimization strategy. Our method provided a set of simple nonlinear equations that maximized the match between models and true neurons in terms of spike timings and peri-stimulus time histograms. We found that both LFPs and spikes can code for stimulus occurrence with millisecond precision, showing, however, high variability. Spike patterns were predicted significantly above chance for 75% of the neurons analysed. Crucially, the level of prediction accuracy depended on the reliability of coding for stimulus occurrence. The best predictions were obtained when both spikes and LFPs were highly responsive to the stimuli. Spike reliability is known to depend on neuron-intrinsic properties (i.e. on channel noise) and on spontaneous local network fluctuations. Our results suggest that the latter, measured through the LFP response variability, play a dominant role.
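The paper's model is a set of optimized nonlinear equations estimated per neuron; as a far simpler stand-in for the same idea, the sketch below (synthetic data, assuming scikit-learn) predicts spike occurrence in each time bin from a short window of preceding LFP samples:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
lfp = rng.standard_normal(5000)               # stand-in LFP trace
spikes = np.zeros(lfp.size, dtype=int)
spikes[1:] = lfp[:-1] < -1.2                  # toy rule: spike follows a trough

LAGS = 10                                     # LFP samples preceding each bin
X = np.stack([lfp[i - LAGS:i] for i in range(LAGS, lfp.size)])
y = spikes[LAGS:]

clf = LogisticRegression(max_iter=1000).fit(X[:4000], y[:4000])
print(f"held-out accuracy: {clf.score(X[4000:], y[4000:]):.2f}")
```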
Snyder, Matthew J; Nguyen, Dana R; Womack, Jasmyne J; Bunt, Christopher W; Westerfield, Katie L; Bell, Adriane E; Ledford, Christy J W
2018-03-01
Collection of feedback regarding medical student clinical experiences for formative or summative purposes remains a challenge across clinical settings. The purpose of this study was to determine whether the use of a quick response (QR) code-linked online feedback form improves the frequency and efficiency of rater feedback. In 2016, we compared paper-based feedback forms, an online feedback form, and a QR code-linked online feedback form at 15 family medicine clerkship sites across the United States. Outcome measures included usability, number of feedback submissions per student, number of unique raters providing feedback, and timeliness of feedback provided to the clerkship director. The feedback method was significantly associated with usability, with the QR code scoring highest and paper second. Accessing feedback via QR code was associated with the shortest time to prepare feedback. Across four rotations, separate repeated-measures analyses of variance showed no effect of feedback system on the number of submissions per student or the number of unique raters. The results of this study demonstrate that preceptors in the family medicine clerkship rate QR code-linked feedback as a highly usable platform. Additionally, this platform resulted in faster form completion than paper or online forms. An overarching finding of this study is that feedback forms must be portable and easily accessible. Potential implementation barriers and the social norms for providing feedback in this manner need to be considered.
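A minimal sketch of the QR-code side of such a workflow, assuming the third-party Python qrcode package and a hypothetical form URL:

```python
import qrcode  # third-party package: pip install qrcode[pil]

FORM_URL = "https://example.org/clerkship-feedback"  # hypothetical form address
img = qrcode.make(FORM_URL)      # build a QR code image pointing at the form
img.save("feedback_qr.png")      # print on pocket cards for preceptors to scan
```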
Studying the genetic basis of speciation in high gene flow marine invertebrates
2016-01-01
A growing number of genes responsible for reproductive incompatibilities between species (barrier loci) exhibit signals of positive selection. However, the possibility that genes experiencing positive selection diverge early in speciation and commonly cause reproductive incompatibilities has not been systematically investigated on a genome-wide scale. Here, I outline a research program for studying the genetic basis of speciation in broadcast-spawning marine invertebrates that uses a priori genome-wide information on a large, unbiased sample of genes tested for positive selection. A targeted sequence-capture approach is proposed that scores single-nucleotide polymorphisms (SNPs) in widely separated species populations at an early stage of allopatric divergence. The targeted capture of both coding and non-coding sequences enables SNPs to be characterized at known locations across the genome and at genes with known selective or neutral histories. The neutral coding and non-coding SNPs provide robust background distributions for identifying FST outliers within genes, which can, in principle, pinpoint specific mutations experiencing diversifying selection. If natural hybridization occurs between species, the neutral coding and non-coding SNPs can also provide a neutral admixture model for genomic cline analyses aimed at finding genes exhibiting strong blocks to introgression. Strongylocentrotid sea urchins are used as a model system to outline the approach, but it can be applied to any group with a complete reference genome. PMID:29491951
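The outlier scan rests on a per-SNP differentiation statistic compared against a neutral background; a minimal sketch of Wright's FST for one biallelic SNP scored in two populations (allele frequencies invented):

```python
def fst_two_pops(p1, p2):
    """Wright's FST from allele frequencies p1, p2 in two populations."""
    p_bar = (p1 + p2) / 2.0                        # mean allele frequency
    h_total = 2.0 * p_bar * (1.0 - p_bar)          # expected total heterozygosity
    h_within = (2*p1*(1-p1) + 2*p2*(1-p2)) / 2.0   # mean within-population value
    return 0.0 if h_total == 0 else (h_total - h_within) / h_total

print(fst_two_pops(0.9, 0.2))   # strongly differentiated SNP: FST ~ 0.49
print(fst_two_pops(0.5, 0.5))   # undifferentiated SNP: FST = 0.0
```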
Zhang, Haiyun; Sun, Dejun; Li, Defu; Zheng, Zeguang; Xu, Jingyi; Liang, Xue; Zhang, Chenting; Wang, Sheng; Wang, Jian; Lu, Wenju
2018-05-15
Long non-coding RNAs (lncRNAs) have critical regulatory roles in protein-coding gene expression, and aberrant lncRNA expression profiles have been observed in various human diseases. In this study, we investigated transcriptome profiles in lung tissues of a chronic cigarette smoke (CS)-induced COPD mouse model. We found that 109 lncRNAs and 260 mRNAs were significantly differentially expressed in lungs of the chronic CS-induced COPD mouse model compared with control animals. GO and KEGG analyses indicated that the protein-coding genes associated with differentially expressed lncRNAs were mainly involved in the protein processing in endoplasmic reticulum pathway and the taurine and hypotaurine metabolism pathway. Combining high-throughput data analysis with qRT-PCR validation in lungs of the chronic CS-induced COPD mouse model, in 16HBE cells treated with CSE, and in PBMCs from patients with COPD revealed that NR_102714 and its associated protein-coding gene UCHL1 might be involved in the development of COPD in both mouse and human. In conclusion, our study demonstrated that aberrant expression profiles of lncRNAs and mRNAs exist in lungs of the chronic CS-induced COPD mouse model. From an animal-model perspective, these results may provide further clues for investigating the biological functions of lncRNAs and their potential target protein-coding genes in the pathogenesis of COPD.
A distributed code for color in natural scenes derived from center-surround filtered cone signals
Kellner, Christian J.; Wachtler, Thomas
2013-01-01
In the retina of trichromatic primates, chromatic information is encoded in an opponent fashion and transmitted to the lateral geniculate nucleus (LGN) and visual cortex via parallel pathways. Chromatic selectivities of neurons in the LGN form two separate clusters, corresponding to two classes of cone opponency. In the visual cortex, however, the chromatic selectivities are more distributed, which is in accordance with a population code for color. Previous studies of cone signals in natural scenes typically found opponent codes with chromatic selectivities corresponding to two directions in color space. Here we investigated how the non-linear spatio-chromatic filtering in the retina influences the encoding of color signals. Cone signals were derived from hyper-spectral images of natural scenes and preprocessed by center-surround filtering and rectification, resulting in parallel ON and OFF channels. Independent Component Analysis (ICA) on these signals yielded a highly sparse code with basis functions that showed spatio-chromatic selectivities. In contrast to previous analyses of linear transformations of cone signals, chromatic selectivities were not restricted to two main chromatic axes, but were more continuously distributed in color space, similar to the population code of color in the early visual cortex. Our results indicate that spatio-chromatic processing in the retina leads to a more distributed and more efficient code for natural scenes. PMID:24098289
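A simplified sketch of that pipeline, assuming NumPy, SciPy, and scikit-learn; a random image stands in for the hyperspectral-derived cone signals, so only the structure of the computation (center-surround filtering, ON/OFF rectification, ICA on patches) is illustrated:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
cone = rng.random((128, 128))                        # stand-in cone-signal image
dog = gaussian_filter(cone, 1.0) - gaussian_filter(cone, 3.0)  # center-surround
on, off = np.maximum(dog, 0.0), np.maximum(-dog, 0.0)          # rectified channels

size, n_patches = 8, 2000                   # sample spatially aligned patch pairs
ys = rng.integers(0, on.shape[0] - size, n_patches)
xs = rng.integers(0, on.shape[1] - size, n_patches)
X = np.stack([np.concatenate([on[y:y+size, x:x+size].ravel(),
                              off[y:y+size, x:x+size].ravel()])
              for y, x in zip(ys, xs)])

ica = FastICA(n_components=32, random_state=0, max_iter=1000)
activations = ica.fit_transform(X)   # sparse component activations
basis = ica.mixing_                  # columns ~ learned spatial basis functions
```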
Perspectives on barriers to eating healthy among food pantry clients
USDA-ARS?s Scientific Manuscript database
The objective of this study was to explore perspectives on barriers to eating healthy among food pantry clients. Food pantry clients participated in focus groups/interviews. Qualitative data were coded and analyzed using content analyses and a grounded theory approach. Themes were then identified.
Economic incentives and diagnostic coding in a public health care system.
Anthun, Kjartan Sarheim; Bjørngaard, Johan Håkon; Magnussen, Jon
2017-03-01
We analysed the association between economic incentives and diagnostic coding practice in the Norwegian public health care system. Data included 3,180,578 hospital discharges in Norway covering the period 1999-2008. For reimbursement purposes, all discharges are grouped in diagnosis-related groups (DRGs). We examined pairs of DRGs where the addition of one or more specific diagnoses places the patient in a complicated rather than an uncomplicated group, yielding higher reimbursement. The economic incentive was measured as the potential gain in income by coding a patient as complicated, and we analysed the association between this gain and the share of complicated discharges within the DRG pairs. Using multilevel linear regression modelling, we estimated both differences between hospitals for each DRG pair and changes within hospitals for each DRG pair over time. Over the whole period, a one-DRG-point difference in price was associated with an increased share of complicated discharges of 14.2 (95 % confidence interval [CI] 11.2-17.2) percentage points. However, a one-DRG-point change in prices between years was only associated with a 0.4 (95 % CI [Formula: see text] to 1.8) percentage point change of discharges into the most complicated diagnostic category. Although there was a strong increase in complicated discharges over time, this was not as closely related to price changes as expected.
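A hedged sketch of this kind of multilevel regression using statsmodels (not the authors' code; the variable names and the synthetic data below are invented):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400                                             # synthetic DRG-pair records
df = pd.DataFrame({
    "hospital": rng.integers(0, 20, n),             # grouping level
    "price_gain": rng.uniform(0.1, 1.5, n),         # potential DRG-point gain
})
df["complicated_share"] = 0.2 + 0.14 * df["price_gain"] + rng.normal(0, 0.05, n)

# random intercept per hospital, fixed effect for the economic incentive
model = smf.mixedlm("complicated_share ~ price_gain", df, groups=df["hospital"])
print(model.fit().summary())                  # slope estimates the association
```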
NASA Astrophysics Data System (ADS)
Zhou, Zhenggan; Ma, Baoquan; Jiang, Jingtao; Yu, Guang; Liu, Kui; Zhang, Dongmei; Liu, Weiping
2014-10-01
The air-coupled ultrasonic testing (ACUT) technique has been viewed as a viable solution for defect detection in advanced composites used in the aerospace and aviation industries. However, the large mismatch of acoustic impedance at the air-solid interface makes the transmission efficiency of ultrasound low and leads to a poor signal-to-noise ratio (SNR) in the received signal, so signal-processing techniques are highly valuable in non-destructive testing. This paper presents a hybrid method combining wavelet filtering and phase-coded pulse compression to improve the SNR and output power of the received signal. The wavelet transform is used to filter insignificant components from the noisy ultrasonic signal, and pulse compression is used to improve the power of the correlated signal based on a cross-correlation algorithm. For reasonable parameter selection, different wavelet families (Daubechies, Symlet and Coiflet) and decomposition levels in the discrete wavelet transform are analysed, and different Barker codes (5-13 bits) are also analysed to acquire a higher main-to-side-lobe ratio. The performance of the hybrid method was verified on a honeycomb composite sample. Experimental results demonstrated that the proposed method is very efficient in improving the SNR and signal strength, and it appears to be a promising tool for evaluating the integrity of high-ultrasound-attenuation composite materials using ACUT.
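An illustrative sketch of the hybrid method on synthetic data, assuming PyWavelets; the wavelet, level, threshold, and code length are arbitrary choices, not the parameters selected in the paper:

```python
import numpy as np
import pywt

BARKER13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

def wavelet_denoise(x, wavelet="db4", level=4, k=3.0):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise estimate
    coeffs = [coeffs[0]] + [pywt.threshold(c, k * sigma, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

rng = np.random.default_rng(1)
chips = np.repeat(BARKER13, 20)                 # 20 samples per code chip
rx = np.zeros(2000)
rx[600:600 + chips.size] = chips                # buried phase-coded echo
rx += 2.0 * rng.standard_normal(rx.size)        # heavy noise: low SNR

clean = wavelet_denoise(rx)
compressed = np.correlate(clean, chips, mode="same")   # pulse-compression gain
print(int(np.argmax(np.abs(compressed))))              # ~ centre of the echo
```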
Garita-Cambronero, Jerson; Palacio-Bielsa, Ana; López, María M; Cubero, Jaime
2016-01-01
Xanthomonas arboricola is a species in the genus Xanthomonas, which mainly comprises plant pathogens. Among the members of this taxon, X. arboricola pv. pruni, the causal agent of bacterial spot disease of stone fruits and almond, is distributed worldwide, although it is considered a quarantine pathogen in the European Union. Herein, we report the draft genome sequences, classification, annotation and sequence analyses of a virulent strain, IVIA 2626.1, and an avirulent strain, CITA 44, of X. arboricola associated with Prunus spp. The draft genome sequence of IVIA 2626.1 consists of 5,027,671 bp, 4,720 protein-coding genes and 50 RNA-encoding genes. The draft genome sequence of strain CITA 44 consists of 4,760,482 bp, 4,250 protein-coding genes and 56 RNA-coding genes. Initial comparative analyses reveal differences in the presence of structural and regulatory components of the type IV pilus and the type III secretion system, in the type III effectors, and in the number of type IV secretion systems. The genome sequence data for these strains will facilitate the development of molecular diagnostic protocols that differentiate virulent and avirulent strains. In addition, comparative genome analysis will provide insights into the plant-pathogen interaction during the bacterial spot disease process.
Nuclear data uncertainty propagation by the XSUSA method in the HELIOS2 lattice code
NASA Astrophysics Data System (ADS)
Wemple, Charles; Zwermann, Winfried
2017-09-01
Uncertainty quantification has been extensively applied to nuclear criticality analyses for many years and has recently begun to be applied to depletion calculations. Regulatory bodies worldwide are trending toward requiring such analyses for reactor fuel cycle calculations, which also require uncertainty propagation for isotopics and nuclear reaction rates. XSUSA is a proven methodology for cross-section uncertainty propagation based on random sampling of the nuclear data according to covariance data in multi-group representation; HELIOS2 is a lattice code widely used for commercial and research reactor fuel cycle calculations. This work describes a technique to automatically propagate nuclear data uncertainties via the XSUSA approach through fuel lattice calculations in HELIOS2. Applying the XSUSA methodology in HELIOS2 presented some unusual challenges because of the highly processed multi-group cross-section data used in commercial lattice codes. Currently, uncertainties based on the SCALE 6.1 covariance data file are used, but the implementation can be adapted to other covariance data in multi-group structure. Pin-cell and assembly depletion calculations, based on models described in the UAM-LWR Phase I and II benchmarks, are performed, and uncertainties in the multiplication factor, reaction rates, isotope concentrations, and delayed-neutron data are calculated. With this extension, it will be possible for HELIOS2 users to propagate nuclear data uncertainties directly from the microscopic cross sections to subsequent core simulations.
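The XSUSA idea in miniature: draw random cross-section sets from a covariance model, run each through the (here, toy) lattice calculation, and read uncertainties off the output distribution. The mean vector, covariance matrix, and response function below are invented stand-ins, not HELIOS2 data:

```python
import numpy as np

mean = np.array([1.00, 0.05])            # toy two-group "cross sections"
cov = np.array([[4.0e-4, 1.0e-5],
                [1.0e-5, 1.0e-6]])       # toy covariance matrix

rng = np.random.default_rng(42)
samples = rng.multivariate_normal(mean, cov, size=500)

def k_model(xs):
    # stand-in for a lattice calculation returning a multiplication factor
    return xs[0] / (1.0 + 10.0 * xs[1])

k = np.apply_along_axis(k_model, 1, samples)
print(f"k mean = {k.mean():.5f}, relative std = {k.std(ddof=1) / k.mean():.3%}")
```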
Chimeric mitochondrial peptides from contiguous regular and swinger RNA.
Seligmann, Hervé
2016-01-01
Previous mass spectrometry analyses described human mitochondrial peptides entirely translated from swinger RNAs, RNAs where polymerization systematically exchanged nucleotides. Exchanges follow one among 23 bijective transformation rules: nine symmetric exchanges (X ↔ Y, e.g. A ↔ C) and fourteen asymmetric exchanges (X → Y → Z → X, e.g. A → C → G → A), multiplying DNA's protein-coding potential by 24. Abrupt switches from regular to swinger polymerization produce chimeric RNAs. Here, human mitochondrial proteomic analyses assuming abrupt switches between regular and swinger transcription detect chimeric peptides, encoded by part regular, part swinger RNA. Contiguous regular- and swinger-encoded residues within single peptides are stronger evidence for translation of swinger RNA than the previously detected, entirely swinger-encoded peptides: regular parts are positive controls matched with contiguous swinger parts, increasing confidence in the results. Chimeric peptides are 200× rarer than swinger peptides (3/100,000 versus 6/1,000). Among 186 peptides with >8 residues for each of the regular and swinger parts, the regular parts of eleven chimeric peptides correspond to six of the thirteen recognized mitochondrial protein-coding genes. Chimeric peptides matching partly regular proteins are rarer and less expressed than chimeric peptides matching non-coding sequences, suggesting targeted degradation of misfolded proteins. The present results strengthen the hypothesis that the short mitogenome encodes far more proteins than hitherto assumed. Entirely swinger-encoded proteins could exist.
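A minimal sketch of one asymmetric exchange rule named above (A → C → G → A, with T unchanged), applied after a mid-sequence switch to mimic a chimeric transcript:

```python
SWINGER = str.maketrans("ACG", "CGA")   # one asymmetric rule: A->C, C->G, G->A

def chimeric(seq, switch):
    """Regular (identity) polymerization up to `switch`, swinger after it."""
    return seq[:switch] + seq[switch:].translate(SWINGER)

print(chimeric("ATGGCCAAATTT", 6))      # ATGGCC + swinger tail CCCTTT
```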
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milani, Gabriele, E-mail: milani@stru.polimi.it, E-mail: gabriele.milani@polimi.it; Valente, Marco
This study presents some FE results regarding the behavior under horizontal loads of eight existing masonry towers located in the North-East of Italy. The towers, albeit each unique in its geometric and architectural features, show affinities that justify a comparative analysis, for instance their location and similar masonry material. Their structural behavior under horizontal loads is therefore influenced by geometrical issues, such as slenderness, wall thickness, perforations, irregularities, and the presence of internal vaults, all features that may produce distinctive behavior. The geometry of the towers is deduced from both existing available documentation and in-situ surveys. On the basis of such geometrical data, a detailed 3D realistic mesh is conceived, with a point-by-point characterization of each single geometric element. The FE models are analysed under seismic loads acting along the geometric axes of the plan section, under both non-linear static (pushover) and non-linear dynamic excitation assumptions. A damage-plasticity material model exhibiting softening in both tension and compression, already available in the commercial code Abaqus, is used for masonry. Pushover analyses are performed with both G1 and G2 horizontal load distributions, according to Italian code requirements, along the X+/− and Y+/− directions. Non-linear dynamic analyses are performed along both the X and Y directions with a real accelerogram scaled to different peak ground accelerations. A few results are presented in this paper. It is found that the results obtained with pushover analyses fit reasonably well the expensive non-linear dynamic simulations, with a slightly less conservative trend.
Target gene analyses of 39 amelogenesis imperfecta kindreds
Chan, Hui-Chen; Estrella, Ninna M. R. P.; Milkovich, Rachel N.; Kim, Jung-Wook; Simmer, James P.; Hu, Jan C-C.
2012-01-01
Previously, mutational analyses identified six disease-causing mutations in 24 amelogenesis imperfecta (AI) kindreds. We have since expanded the number of AI kindreds to 39, and performed mutation analyses covering the coding exons and adjoining intron sequences for the six proven AI candidate genes [amelogenin (AMELX), enamelin (ENAM), family with sequence similarity 83, member H (FAM83H), WD repeat containing domain 72 (WDR72), enamelysin (MMP20), and kallikrein-related peptidase 4 (KLK4)] and for ameloblastin (AMBN) (a suspected candidate gene). All four of the X-linked AI families (100%) had disease-causing mutations in AMELX, suggesting that AMELX is the only gene involved in the aetiology of X-linked AI. Eighteen families showed an autosomal-dominant pattern of inheritance. Disease-causing mutations were identified in 12 (67%): eight in FAM83H, and four in ENAM. No FAM83H coding-region or splice-junction mutations were identified in three probands with autosomal-dominant hypocalcification AI (ADHCAI), suggesting that a second gene may contribute to the aetiology of ADHCAI. Six families showed an autosomal-recessive pattern of inheritance, and disease-causing mutations were identified in three (50%): two in MMP20, and one in WDR72. No disease-causing mutations were found in 11 families with only one affected member. We conclude that mutation analyses of the current candidate genes for AI have about a 50% chance of identifying the disease-causing mutation in a given kindred. PMID:22243262
GobyWeb: Simplified Management and Analysis of Gene Expression and DNA Methylation Sequencing Data
Dorff, Kevin C.; Chambwe, Nyasha; Zeno, Zachary; Simi, Manuele; Shaknovich, Rita; Campagne, Fabien
2013-01-01
We present GobyWeb, a web-based system that facilitates the management and analysis of high-throughput sequencing (HTS) projects. The software provides integrated support for a broad set of HTS analyses and offers a simple plugin extension mechanism. Analyses currently supported include quantification of gene expression for messenger and small RNA sequencing, estimation of DNA methylation (i.e., reduced bisulfite sequencing and whole-genome methyl-seq), and the detection of pathogens in sequenced data. In contrast to previous analysis pipelines developed for HTS data, GobyWeb requires significantly less storage space, runs analyses efficiently on a parallel grid, and scales gracefully to process tens or hundreds of multi-gigabyte samples, yet it can be used effectively by researchers who are comfortable using a web browser. We conducted performance evaluations of the software and found it to either outperform or have similar performance to analysis programs developed for specialized analyses of HTS data. We found that most biologists who took a one-hour GobyWeb training session were readily able to analyze RNA-seq data with state-of-the-art analysis tools. GobyWeb can be obtained at http://gobyweb.campagnelab.org and is freely available for non-commercial use. GobyWeb plugins are distributed in source code and licensed under the open-source LGPL3 license to facilitate code inspection, reuse, and independent extensions (http://github.com/CampagneLaboratory/gobyweb2-plugins). PMID:23936070
Gillard, Steven; Borschmann, Rohan; Turner, Kati; Goodrich‐Purnell, Norman; Lovell, Kathleen; Chambers, Mary
2010-01-01
Abstract Background Interest in the involvement of members of the public in health services research is increasingly focussed on evaluating the impact of involvement on the research process and on the production of knowledge about health. Service user involvement in mental health research is well‐established, yet empirical studies into the impact of involvement are lacking. Objective To investigate the potential to provide empirical evidence of the impact of service user researchers (SURs) on the research process. Design The study uses a range of secondary analyses of interview transcripts from a qualitative study of the experiences of psychiatric patients detained under the Mental Health Act (1983) to compare the ways in which SURs and conventional university researchers (URs) conduct and analyse qualitative interviews. Results Analyses indicated some differences in the ways in which SURs and conventional URs conducted qualitative interviews. SURs were much more likely to code (analyse) interview transcripts in terms of interviewees' experiences and feelings, while conventional URs coded the same transcripts largely in terms of processes and procedures related to detention. The limitations of a secondary analysis based on small numbers of researchers are identified and discussed. Conclusions The study demonstrates the potential to develop a methodologically robust approach to evaluating empirically the impact of SURs on research processes and findings, and it is indicative of the potential benefits of collaborative research for informing evidence‐based practice in mental health services. PMID:20536538
Becker, Betsy Jane; Aloe, Ariel M; Duvendack, Maren; Stanley, T D; Valentine, Jeffrey C; Fretheim, Atle; Tugwell, Peter
2017-09-01
To outline issues of importance to analytic approaches to the synthesis of quasi-experiments (QEs) and to provide a statistical model for use in analysis. We drew on studies of statistics, epidemiology, and social-science methodology to outline methods for synthesis of QE studies. The design and conduct of QEs, effect sizes from QEs, and moderator variables for the analysis of those effect sizes were discussed. Biases, confounding, design complexities, and comparisons across designs offer serious challenges to syntheses of QEs. Key components of meta-analyses of QEs were identified, including the aspects of QE study design to be coded and analyzed. Of utmost importance are the design and statistical controls implemented in the QEs. Such controls and any potential sources of bias and confounding must be modeled in analyses, along with aspects of the interventions and populations studied. Because of such controls, effect sizes from QEs are more complex than those from randomized experiments. A statistical meta-regression model that incorporates important features of the QEs under review was presented. Meta-analyses of QEs provide particular challenges, but thorough coding of intervention characteristics and study methods, along with careful analysis, should allow for sound inferences. Copyright © 2017 Elsevier Inc. All rights reserved.
Injury risks of EMS responders: evidence from the National Fire Fighter Near-Miss Reporting System
Taylor, Jennifer A; Davis, Andrea L; Barnes, Brittany; Lacovara, Alicia V; Patel, Reema
2015-01-01
Objectives We analysed near-miss and injury events reported to the National Fire Fighter Near-Miss Reporting System (NFFNMRS) to investigate the workplace hazards and safety concerns of Emergency Medical Services (EMS) responders in the USA. Methods We reviewed 769 'non-fire emergency event' reports from the NFFNMRS using a mixed-methods approach. We identified 185 emergency medical calls and analysed their narrative text fields. We assigned Mechanism of Near-Miss/Injury and Nature of Injury codes and then tabulated frequencies (quantitative). We coded major themes regarding work hazards and safety concerns reported by the EMS responders (qualitative). Results Of the 185 emergency medical calls, the most commonly identified Mechanism of Near-Miss/Injury to EMS responders was Assaults, followed by Struck-by Motor Vehicle and Motor Vehicle Collision. The most commonly identified weapon used in an assault was a firearm. We identified five major domains of workplace hazards and safety concerns: Assaults by Patients; Risks from Motor Vehicles; Personal Protective Equipment; Relationships between Emergency Responders; and Policies, Procedures and Practices. Conclusions Narrative text from the NFFNMRS is a rich source of data that can be analysed quantitatively and qualitatively to provide insight into near-misses and injuries sustained by EMS responders. Near-miss reporting systems are critical components of occupational hazard surveillance. PMID:26068510
Transcriptional profiling of murine osteoblast differentiation based on RNA-seq expression analyses.
Khayal, Layal Abo; Grünhagen, Johannes; Provazník, Ivo; Mundlos, Stefan; Kornak, Uwe; Robinson, Peter N; Ott, Claus-Eric
2018-04-11
Osteoblastic differentiation is a multistep process characterized by osteogenic induction of mesenchymal stem cells, which then differentiate into proliferative pre-osteoblasts that produce copious amounts of extracellular matrix, followed by stiffening of the extracellular matrix and matrix mineralization by hydroxylapatite deposition. Although these processes have been well characterized biologically, a detailed transcriptional analysis of murine primary calvaria osteoblast differentiation based on RNA sequencing (RNA-seq) has not previously been reported. Here, we used RNA-seq to obtain expression values of 29,148 genes at four time points as murine primary calvaria osteoblasts differentiated in vitro, until the onset of mineralization was clearly detectable by microscopic inspection. Expression of marker genes confirmed osteogenic differentiation. We explored differential expression of 1,386 protein-coding genes using unsupervised clustering and GO analyses. One hundred differentially expressed lncRNAs were investigated by co-expression with protein-coding genes localized within the same topologically associated domain. Additionally, we monitored the expression of 237 genes that are silent or active at distinct time points and compared differential exon usage. Our data represent an in-depth profiling of murine primary calvaria osteoblast differentiation by RNA-seq and contribute to our understanding of the genetic regulation of this key process in osteoblast biology. Copyright © 2018 Elsevier Inc. All rights reserved.
KENO-VI Primer: A Primer for Criticality Calculations with SCALE/KENO-VI Using GeeWiz
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, Stephen M
2008-09-01
The SCALE (Standardized Computer Analyses for Licensing Evaluation) computer software system developed at Oak Ridge National Laboratory is widely used and accepted around the world for criticality safety analyses. The well-known KENO-VI three-dimensional Monte Carlo criticality computer code is one of the primary criticality safety analysis tools in SCALE. The KENO-VI primer is designed to help a new user understand and use the SCALE/KENO-VI Monte Carlo code for nuclear criticality safety analyses. It assumes that the user has a college education in a technical field. There is no assumption of familiarity with Monte Carlo codes in general or with SCALE/KENO-VI in particular. The primer is designed to teach by example, with each example illustrating two or three features of SCALE/KENO-VI that are useful in criticality analyses. The primer is based on SCALE 6, which includes the Graphically Enhanced Editing Wizard (GeeWiz) Windows user interface. Each example uses GeeWiz to provide the framework for preparing input data and viewing output results. Starting with a Quickstart section, the primer gives an overview of the basic requirements for SCALE/KENO-VI input and allows the user to quickly run a simple criticality problem with SCALE/KENO-VI. The sections that follow Quickstart include a list of basic objectives at the beginning that identifies the goal of the section and the individual SCALE/KENO-VI features that are covered in detail in the sample problems in that section. Upon completion of the primer, a new user should be comfortable using GeeWiz to set up criticality problems in SCALE/KENO-VI. The primer provides a starting point for the criticality safety analyst who uses SCALE/KENO-VI. Complete descriptions are provided in the SCALE/KENO-VI manual. Although the primer is self-contained, it is intended as a companion volume to the SCALE/KENO-VI documentation. (The SCALE manual is provided on the SCALE installation DVD.) The primer provides specific examples of using SCALE/KENO-VI for criticality analyses; the SCALE/KENO-VI manual provides information on the use of SCALE/KENO-VI and all its modules. The primer also contains an appendix with sample input files.
Ince, Robin A A; Jaworska, Katarzyna; Gross, Joachim; Panzeri, Stefano; van Rijsbergen, Nicola J; Rousselet, Guillaume A; Schyns, Philippe G
2016-08-22
A key to understanding visual cognition is to determine "where", "when", and "how" brain responses reflect the processing of the specific visual features that modulate categorization behavior: the "what". The N170 is the earliest Event-Related Potential (ERP) that preferentially responds to faces. Here, we demonstrate that a paradigmatic shift is necessary to interpret the N170 as the product of an information-processing network that dynamically codes and transfers face features across hemispheres, rather than as a local stimulus-driven event. Reverse-correlation methods coupled with information-theoretic analyses revealed that visibility of the eyes influences face-detection behavior. The N170 initially reflects coding of the behaviorally relevant eye contralateral to the sensor, followed by a causal communication of the other eye from the other hemisphere. These findings demonstrate that the deceptively simple N170 ERP hides a complex network information-processing mechanism involving initial coding and subsequent cross-hemispheric transfer of visual features. © The Author 2016. Published by Oxford University Press.
Ribeiro, Aridiane Alves; Arantes, Cássia Irene Spinelli; Gualda, Dulce Maria Rosa; Rossi, Lídia Aparecida
2017-06-01
This case study aimed to interpret the underlying historical and cultural aspects of the provision of care at an indigenous healthcare service facility. This is interpretive, case study-type research with a qualitative approach, conducted in 2012 at the Indigenous Health Support Center (CASAI) of the State of Mato Grosso do Sul, Brazil. Data were collected by means of systematic observation, documentary analyses, and semi-structured interviews with ten health professionals. Data analysis was performed according to an approach based on social anthropology and health anthropology. The anthropological concepts of social code and ethnocentrism underpinned the interpretation of outcomes. Two categories were identified: CASAI, a space between streets and village; and ethnocentrism and indigenous health care. Healthcare practice and the prevailing social code influence each other, and the street social code prevails in the social environment under study. The institutional organization and the professionals' focus on the indigenous biological body are decisive to the provision of care from the street social code perspective. The professionals' concepts evidence ethnocentrism in healthcare. Workers, however, try to adopt a relativized view vis-à-vis indigenous people at CASAI.
Burnup calculations and chemical analysis of irradiated fuel samples studied in LWR-PROTEUS phase II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grimm, P.; Guenther-Leopold, I.; Berger, H. D.
2006-07-01
The isotopic compositions of 5 UO₂ samples irradiated in a Swiss PWR power plant, which were investigated in the LWR-PROTEUS Phase II programme, were calculated using the CASMO-4 and BOXER assembly codes. The burnups of the samples range from 50 to 90 MWd/kg. The results for a large number of actinide and fission product nuclides were compared to those of chemical analyses performed using a combination of chromatographic separation and mass spectrometry. A good agreement of calculated and measured concentrations is found for many of the nuclides investigated with both codes. The concentrations of the Pu isotopes are mostly predicted within ±10%, the two codes giving quite different results, except for ²⁴²Pu. Relatively significant deviations are found for some isotopes of Cs and Sm, and large discrepancies are observed for Eu and Gd. The overall quality of the predictions by the two codes is comparable, and the deviations from the experimental data do not generally increase with burnup. (authors)
Investigation of neutral particle dynamics in Aditya tokamak plasma with DEGAS2 code
NASA Astrophysics Data System (ADS)
Dey, Ritu; Ghosh, Joydeep; Chowdhuri, M. B.; Manchanda, R.; Banerjee, S.; Ramaiya, N.; Sharma, Deepti; Srinivasan, R.; Stotler, D. P.; Aditya Team
2017-08-01
Neutral particle behavior in the Aditya tokamak, which has a circular poloidal ring limiter at one particular toroidal location, has been investigated using the DEGAS2 code. The code is based on Monte Carlo algorithms and is mainly used in tokamaks with a divertor configuration; here it has been successfully applied to the Aditya tokamak's limiter configuration. The penetration of neutral hydrogen atoms is studied with the various atomic and molecular contributions included, and it is found that the maximum contribution comes from dissociation processes. The Hα spectrum is also simulated and matched with the experimental one. The dominant contribution, around 64%, comes from molecular dissociation processes, and the neutral particles generated by those processes have an energy of ~2.0 eV. Furthermore, the variation of the neutral hydrogen density and Hα emissivity profiles is analysed for various edge temperature profiles, and it is found that there is little change in the Hα emission at the plasma edge with variation of the edge temperature (7-40 eV).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J; Gehl, S M
1979-01-01
GRASS-SST and FASTGRASS are mechanistic computer codes for predicting fission-gas behavior in UO₂-base fuels during steady-state and transient conditions. FASTGRASS was developed to satisfy the need for a fast-running alternative to GRASS-SST. Although based on GRASS-SST, FASTGRASS is approximately an order of magnitude quicker in execution. The GRASS-SST transient analysis has evolved through comparisons of code predictions with the fission-gas release and physical phenomena that occur during reactor operation and transient direct-electrical-heating (DEH) testing of irradiated light-water reactor fuel. The FASTGRASS calculational procedure is described in this paper, along with models of key physical processes included in both FASTGRASS and GRASS-SST. Predictions of fission-gas release obtained from GRASS-SST and FASTGRASS analyses are compared with experimental observations from a series of DEH tests. The major conclusion is that the computer codes should include an improved model for the evolution of grain-edge porosity.
Complete Mitochondrial Genome of Echinostoma hortense (Digenea: Echinostomatidae).
Liu, Ze-Xuan; Zhang, Yan; Liu, Yu-Ting; Chang, Qiao-Cheng; Su, Xin; Fu, Xue; Yue, Dong-Mei; Gao, Yuan; Wang, Chun-Ren
2016-04-01
Echinostoma hortense (Digenea: Echinostomatidae) is one of the intestinal flukes of medical importance in humans; however, its mitochondrial (mt) genome had not previously been determined. The present study determined the complete mt genome sequence of E. hortense and assessed its phylogenetic relationships with other digenean species for which complete mt genome sequences are available in GenBank, using concatenated amino acid sequences inferred from the 12 protein-coding genes. The mt genome of E. hortense contains 12 protein-coding genes, 22 transfer RNA genes, 2 ribosomal RNA genes, and 1 non-coding region. At 14,994 bp, it is somewhat smaller than the mt genomes of other trematode species. Phylogenetic analyses based on concatenated nucleotide sequence datasets for all 12 protein-coding genes using the maximum parsimony (MP) method showed that E. hortense and Hypoderaeum conoideum clustered together and were closer to each other than to the Fasciolidae and other echinostomatid trematodes. The availability of the complete mt genome sequence of E. hortense provides important genetic markers for diagnostic, population genetic, and evolutionary studies of digeneans.
Chiavassa, S; Lemosquet, A; Aubineau-Lanièce, I; de Carlan, L; Clairand, I; Ferrer, L; Bardiès, M; Franck, D; Zankl, M
2005-01-01
This paper compares dosimetric assessments performed with three Monte Carlo codes (EGS4, MCNP4c2 and MCNPX2.5e) using a realistic voxel phantom, namely the Zubal phantom, in two exposure configurations. The first deals with an external irradiation corresponding to the example of a radiological accident. The results are obtained using the EGS4 and MCNP4c2 codes and expressed in terms of the mean absorbed dose (in Gy per source particle) for the brain, lungs, liver and spleen. The second deals with an internal exposure corresponding to the treatment of a medullary thyroid cancer by a 131I-labelled radiopharmaceutical. The results are obtained with EGS4 and MCNPX2.5e and compared in terms of S-values (expressed in mGy per kBq per hour) for the liver, kidney, whole body and thyroid. The results of these two studies are presented, and differences between the codes are analysed and discussed.
Development of PRIME for irradiation performance analysis of U-Mo/Al dispersion fuel
NASA Astrophysics Data System (ADS)
Jeong, Gwan Yoon; Kim, Yeon Soo; Jeong, Yong Jin; Park, Jong Man; Sohn, Dong-Seong
2018-04-01
A prediction code for the thermo-mechanical performance of research reactor fuel (PRIME) has been developed with the implementation of developed models to analyze the irradiation behavior of U-Mo dispersion fuel. The code is capable of predicting the two-dimensional thermal and mechanical performance of U-Mo dispersion fuel during irradiation. A finite element method was employed to solve the governing equations for thermal and mechanical equilibria. Temperature- and burnup-dependent material properties of the fuel meat constituents and cladding were used. The numerical solution schemes in PRIME were verified by benchmarking solutions obtained using a commercial finite element analysis program (ABAQUS). The code was validated using irradiation data from RERTR, HAMP-1, and E-FUTURE tests. The measured irradiation data used in the validation were IL thickness, volume fractions of fuel meat constituents for the thermal analysis, and profiles of the plate thickness changes and fuel meat swelling for the mechanical analysis. The prediction results were in good agreement with the measurement data for both thermal and mechanical analyses, confirming the validity of the code.
Space coding for sensorimotor transformations can emerge through unsupervised learning.
De Filippo De Grazia, Michele; Cutini, Simone; Lisi, Matteo; Zorzi, Marco
2012-08-01
The posterior parietal cortex (PPC) is fundamental for sensorimotor transformations because it combines multiple sensory inputs and posture signals into different spatial reference frames that drive motor programming. Here, we present a computational model mimicking the sensorimotor transformations occurring in the PPC. A recurrent neural network with one layer of hidden neurons (restricted Boltzmann machine) learned a stochastic generative model of the sensory data without supervision. After the unsupervised learning phase, the activity of the hidden neurons was used to compute a motor program (a population code on a bidimensional map) through a simple linear projection and delta rule learning. The average motor error, calculated as the difference between the expected and the computed output, was less than 3°. Importantly, analyses of the hidden neurons revealed gain-modulated visual receptive fields, thereby showing that space coding for sensorimotor transformations similar to that observed in the PPC can emerge through unsupervised learning. These results suggest that gain modulation is an efficient coding strategy to integrate visual and postural information toward the generation of motor commands.
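A minimal sketch of the described architecture on synthetic data, assuming scikit-learn's BernoulliRBM; the dimensions, learning rates, and toy targets are invented, not the authors' settings:

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(0)
X = (rng.random((1000, 40)) > 0.5).astype(float)    # stand-in sensory vectors
targets = X @ rng.standard_normal((40, 2)) * 0.1    # stand-in 2-D motor targets

# unsupervised phase: learn a generative model of the sensory data
rbm = BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20, random_state=0)
H = rbm.fit_transform(X)                            # hidden-unit activities

# supervised readout: linear projection trained with the delta rule
W = np.zeros((H.shape[1], 2))
for _ in range(200):
    error = targets - H @ W
    W += 0.01 * H.T @ error / len(H)
print("mean motor error:", np.mean(np.abs(targets - H @ W)))
```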
Analysis of JT-60SA operational scenarios
NASA Astrophysics Data System (ADS)
Garzotti, L.; Barbato, E.; Garcia, J.; Hayashi, N.; Voitsekhovitch, I.; Giruzzi, G.; Maget, P.; Romanelli, M.; Saarelma, S.; Stankiewitz, R.; Yoshida, M.; Zagórski, R.
2018-02-01
Reference scenarios for the JT-60SA tokamak have been simulated with one-dimensional transport codes to assess the stationary state of the flat-top phase and provide a profile database for further physics studies (e.g. MHD stability, gyrokinetic analysis) and diagnostics design. The types of scenario considered vary from pulsed standard H-mode to advanced non-inductive steady-state plasmas. In this paper we present the results obtained with the ASTRA, CRONOS, JINTRAC and TOPICS codes equipped with the Bohm/gyro-Bohm, CDBM and GLF23 transport models. The scenarios analysed here are: a standard ELMy H-mode, a hybrid scenario and a non-inductive steady state plasma, with operational parameters from the JT-60SA research plan. Several simulations of the scenarios under consideration have been performed with the above mentioned codes and transport models. The results from the different codes are in broad agreement and the main plasma parameters generally agree well with the zero dimensional estimates reported previously. The sensitivity of the results to different transport models and, in some cases, to the ELM/pedestal model has been investigated.
Meydan, Chanan; Bekenstein, Uriya; Soreq, Hermona
2018-01-01
Sepsis and metabolic syndrome (MetS) are both inflammation-related entities with high impact on human health and on the consequences of concussions. Both involve an imbalanced parasympathetic/cholinergic response to insulting triggers and variably uncontrolled inflammation, pointing to shared upstream regulators, including short microRNAs (miRs) and long non-coding RNAs (lncRNAs). These may cross-talk across multiple systems, leading to complex molecular and clinical outcomes. Notably, biomedical and RNA-sequencing-based analyses both highlight new links between the acquired and inherited pathogenic, cardiac and inflammatory traits of sepsis/MetS. These include the HOTAIR and MIAT lncRNAs and their targets, such as miR-122, -150, -155, -182, -197, -375, -608 and HLA-DRA. Implicating non-coding RNA regulators in sepsis and MetS may delineate novel high-value biomarkers and targets for intervention.
Experimental and analytical comparison of flowfields in a 110 N (25 lbf) H2/O2 rocket
NASA Technical Reports Server (NTRS)
Reed, Brian D.; Penko, Paul F.; Schneider, Steven J.; Kim, Suk C.
1991-01-01
A gaseous hydrogen/gaseous oxygen 110 N (25 lbf) rocket was examined with the RPLUS code, which solves the full Navier-Stokes equations with finite-rate chemistry. Performance tests were conducted on the rocket in an altitude test facility. Preliminary parametric analyses were performed for a range of mixture ratios and fuel film cooling percentages. It is shown that the computed values of specific impulse and characteristic exhaust velocity follow the trend of the experimental data. Specific impulse computed by the code is lower than the comparable test values by about two to three percent. The computed characteristic exhaust velocity values are lower than the comparable test values by three to four percent. Thrust coefficients computed by the code are found to be within two percent of the measured values. It is concluded that the discrepancy between computed and experimental performance values could not be attributed to experimental uncertainty.
A qualitative study on physicians' perceptions of specialty characteristics.
Park, Kwi Hwa; Jun, Soo-Koung; Park, Ie Byung
2016-09-01
There has been limited research on physicians' perceptions of the specialty characteristics needed to sustain a successful career in the medical specialties in Korea. Instruments such as the Medical Specialty Preference Inventory in the United States and the SCI59 (Specialty Choice Inventory) in the United Kingdom are used to help medical students plan their careers. The purpose of this study was to explore the characteristics of the major specialties in Korea. Twelve physicians from different specialties participated in an exploratory study consisting of qualitative interviews about the personal ability and emotional characteristics and the job attributes of each specialty. The collected data were analysed with content analysis methods. Twelve codes were extracted for ability and skill attributes, 23 codes for emotion and attitude attributes, and 12 codes for job attributes. Each specialty shows a distinct profile of characteristic attributes. The findings have implications for the design of career-planning programs for medical students.
Multidisciplinary Aerospace Systems Optimization: Computational AeroSciences (CAS) Project
NASA Technical Reports Server (NTRS)
Kodiyalam, S.; Sobieski, Jaroslaw S. (Technical Monitor)
2001-01-01
The report describes a method for optimizing a system whose analysis is so expensive that it is impractical to let the optimization code invoke it directly, because excessive computational cost and elapsed time might result. In such a situation it is imperative that the user control the number of times the analysis is invoked. The reported method achieves this with two techniques from the Design of Experiments category: a uniform dispersal of trial design points over an n-dimensional hypersphere combined with response-surface fitting, and the technique of kriging. Analyses of all the trial designs, whose number may be set by the user, are performed before activation of the optimization code, and the results are stored as a database. The optimization code is then executed against that database. Two applications, one of an airborne laser system and one of aircraft optimization, illustrate the method's application.
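The dispersal step can be illustrated with a standard construction for spreading points uniformly over an n-dimensional hypersphere, normalizing standard-normal draws; the report's exact sampling scheme is not given here, so this construction is an assumption:

```python
import numpy as np

def sphere_points(n_points, n_dims, seed=0):
    """Points uniformly distributed on the unit (n_dims-1)-sphere."""
    g = np.random.default_rng(seed).standard_normal((n_points, n_dims))
    return g / np.linalg.norm(g, axis=1, keepdims=True)

trial_designs = sphere_points(50, 6)     # 50 trial designs, 6 design variables
# analyses would be run at these points and a response surface / kriging model
# fitted to the results before the optimizer is started
print(np.linalg.norm(trial_designs, axis=1).round(6))   # all 1.0: on the sphere
```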
Category-dependent and category-independent goal-value codes in human ventromedial prefrontal cortex
McNamee, Daniel; Rangel, Antonio; O’Doherty, John P
2013-01-01
To choose between manifestly distinct options, it is suggested that the brain assigns values to goals using a common currency. Although previous studies have reported activity in ventromedial prefrontal cortex (vmPFC) correlating with the value of different goal stimuli, it remains unclear whether such goal-value representations are independent of the associated stimulus categorization, as required by a common currency. Using multivoxel pattern analyses on functional magnetic resonance imaging (fMRI) data, we found a region of medial prefrontal cortex to contain a distributed goal-value code that is independent of stimulus category. More ventrally in the vmPFC, we found spatially distinct areas of the medial orbitofrontal cortex to contain unique category-dependent distributed value codes for food and consumer items. These results implicate the medial prefrontal cortex in the implementation of a common currency and suggest a ventral versus dorsal topographical organization of value signals in the vmPFC. PMID:23416449
Finite element analysis of wirelessly interrogated implantable bio-MEMS
NASA Astrophysics Data System (ADS)
Dissanayake, Don W.; Al-Sarawi, Said F.; Lu, Tien-Fu; Abbott, Derek
2008-12-01
Wirelessly interrogated bio-MEMS devices are becoming more popular as a means of addressing challenges such as improved diagnosis, monitoring, and patient wellbeing. The authors present a passive, low-power and small-area device that can be interrogated wirelessly using a uniquely coded signal for secure and reliable operation. The proposed approach relies on converting the interrogating coded signal to a surface acoustic wave that is then correlated with an embedded code. The method is applied to operate a micropump, which consists of a specially designed corrugated microdiaphragm that modulates the fluid flow in microchannels. Finite element analysis of the micropump operation is presented and its performance analysed. Design parameters of the diaphragm were fine-tuned for optimal performance, and different polymer-based materials were used in various parts of the micropump to allow for better flexibility and high reliability.
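The code-correlation step can be sketched abstractly (a toy signal model, not the device's SAW physics; the code word and threshold are invented): the received interrogation signal is matched against the embedded code by correlation, and actuation is allowed only when the normalized peak clears a threshold.

```python
import numpy as np

embedded_code = np.array([1, -1, 1, 1, -1, 1, -1, -1], dtype=float)  # assumed code word

def interrogate(received, code, threshold=0.9):
    corr = np.correlate(received, code, mode="valid")
    peak = np.max(np.abs(corr)) / np.dot(code, code)   # normalized correlation peak
    return peak >= threshold                           # True -> secure actuation

rng = np.random.default_rng(1)
signal = np.concatenate([rng.normal(0, 0.1, 20),
                         embedded_code + rng.normal(0, 0.1, 8),
                         rng.normal(0, 0.1, 20)])
print(interrogate(signal, embedded_code))  # True: code recognized despite noise
```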
NASA Astrophysics Data System (ADS)
Litts, Breanne K.; Kafai, Yasmin B.; Lui, Debora A.; Walker, Justice T.; Widman, Sari A.
2017-10-01
Learning about circuitry by connecting a battery, light bulb, and wires is a common activity in many science classrooms. In this paper, we expand students' learning about circuitry with electronic textiles, which use conductive thread instead of wires and sewable LEDs instead of lightbulbs, by integrating programming sensor inputs and light outputs and examining how the two domains interact. We implemented an electronic textiles unit with 23 high school students ages 16-17 years who learned how to craft and code circuits with the LilyPad Arduino, an electronic textile construction kit. Our analyses not only confirm significant increases in students' understanding of functional circuits but also showcase students' ability in designing and remixing program code for controlling circuits. In our discussion, we address opportunities and challenges of introducing codeable circuit design for integrating maker activities that include engineering and computing into classrooms.
A flooding induced station blackout analysis for a pressurized water reactor using the RISMC toolkit
Mandelli, Diego; Prescott, Steven; Smith, Curtis; ...
2015-05-17
In this paper we evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., component/system activation) and to perform statistical analyses. In our case, the simulation of the flooding is performed using an advanced smoothed particle hydrodynamics code called NEUTRINO. The obtained results allow the user to investigate and quantify the impact of timing and sequencing of events on system safety. The impact of the power uprate is determined in terms of both core damage probability and safety margins.
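A schematic of the RISMC-style statistical treatment, with all distributions and timings invented for illustration (not plant data or the paper's RAVEN/RELAP-7 models): sample the timing of the flooding-induced station blackout and of power recovery, and estimate core damage probability as the fraction of runs in which recovery arrives after the coping time is exhausted.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 100_000
coping_time_h = 8.0                               # assumed battery/coping time (h)
flood_arrival_h = rng.uniform(0.0, 2.0, n_runs)   # assumed arrival of flooding-induced SBO (h)
recovery_h = rng.lognormal(2.0, 0.5, n_runs)      # assumed time to recover AC power (h)

core_damage = recovery_h > flood_arrival_h + coping_time_h
print(f"core damage probability ~ {core_damage.mean():.4f}")
```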
Blurring the Inputs: A Natural Language Approach to Sensitivity Analysis
NASA Technical Reports Server (NTRS)
Kleb, William L.; Thompson, Richard A.; Johnston, Christopher O.
2007-01-01
To document model parameter uncertainties and to automate sensitivity analyses for numerical simulation codes, a natural-language-based method to specify tolerances has been developed. With this new method, uncertainties are expressed in a natural manner, i.e., as one would on an engineering drawing, namely, 5.25 +/- 0.01. This approach is robust and readily adapted to various application domains because it does not rely on parsing the particular structure of input file formats. Instead, tolerances of a standard format are added to existing fields within an input file. As a demonstration of the power of this simple, natural language approach, a Monte Carlo sensitivity analysis is performed for three disparate simulation codes: fluid dynamics (LAURA), radiation (HARA), and ablation (FIAT). Effort required to harness each code for sensitivity analysis was recorded to demonstrate the generality and flexibility of this new approach.
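A minimal sketch of the idea, assuming a simple input-deck format and a uniform interpretation of the tolerance (the actual LAURA/HARA/FIAT file formats and sampling choices are not specified here): a regex finds every "value +/- tolerance" field and emits one Monte Carlo realization of the deck.

```python
import re
import numpy as np

# Matches fields like "5.25 +/- 0.01", including scientific notation.
TOL = re.compile(r"(-?\d+\.?\d*(?:[eE][+-]?\d+)?)\s*\+/-\s*(\d+\.?\d*(?:[eE][+-]?\d+)?)")
rng = np.random.default_rng()

def blur(text):
    def sample(m):
        nominal, tol = float(m.group(1)), float(m.group(2))
        return f"{rng.uniform(nominal - tol, nominal + tol):.6g}"  # assumed uniform tolerance
    return TOL.sub(sample, text)

deck = "wall_temperature = 5.25 +/- 0.01\nemissivity = 0.89 +/- 0.02\n"
print(blur(deck))   # one random realization of the input deck
```

Because the substitution operates on any text field of this form, the same script works unchanged across input formats, which is the portability argument the abstract makes.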
NASA Technical Reports Server (NTRS)
Bogert, Philip B.; Satyanarayana, Arunkumar; Chunchu, Prasad B.
2006-01-01
Splitting, ultimate failure load and the damage path in center-notched composite specimens subjected to in-plane tension loading are predicted using a progressive failure analysis methodology. A 2-D Hashin-Rotem failure criterion is used in determining intra-laminar fiber and matrix failures. This progressive failure methodology has been implemented in the Abaqus/Explicit and Abaqus/Standard finite element codes through the user-written subroutines "VUMAT" and "USDFLD", respectively. A 2-D finite element model is used for predicting the intra-laminar damage. Analysis results obtained from the Abaqus/Explicit and Abaqus/Standard codes show good agreement with experimental results. The importance of modeling delamination in progressive failure analysis methodology is recognized for future studies. The use of an explicit integration dynamics code for simple specimen geometry and static loading establishes a foundation for future analyses where complex loading and nonlinear dynamic interactions of damage and structure will necessitate it.
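For reference, a plane-stress Hashin-Rotem check of the kind a VUMAT/USDFLD routine evaluates at each integration point can be written as below; the strength values are placeholders, not the paper's material data.

```python
def hashin_rotem_2d(s11, s22, t12, Xt, Xc, Yt, Yc, S):
    """Plane-stress Hashin-Rotem failure indices for one ply (indices >= 1 mean failure)."""
    fiber = (s11 / Xt) ** 2 if s11 >= 0 else (s11 / Xc) ** 2       # fiber tension/compression
    if s22 >= 0:
        matrix = (s22 / Yt) ** 2 + (t12 / S) ** 2                  # matrix tension + shear
    else:
        matrix = (s22 / Yc) ** 2 + (t12 / S) ** 2                  # matrix compression + shear
    return {"fiber_failed": fiber >= 1.0, "matrix_failed": matrix >= 1.0}

# Example ply stresses (MPa) against placeholder strengths:
print(hashin_rotem_2d(s11=1400.0, s22=30.0, t12=60.0,
                      Xt=1500.0, Xc=1200.0, Yt=40.0, Yc=150.0, S=70.0))
```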
Yang, Cheng-Huei; Luo, Ching-Hsing; Yang, Cheng-Hong; Chuang, Li-Yeh
2004-01-01
Morse code is now being harnessed for use in rehabilitation applications of augmentative-alternative communication and assistive technology, including mobility, environmental control and adapted worksite access. In this paper, Morse code is selected as an adaptive communication device for disabled persons who suffer from muscle atrophy, cerebral palsy or other severe handicaps. A stable typing rate is strictly required for Morse code to be effective as a communication tool, and this restriction is a major hindrance. Therefore, a switch-adaptive automatic recognition method with a high recognition rate is needed. The proposed system combines counter-propagation networks with a variable-degree variable-step-size LMS algorithm. It is divided into five stages: space recognition, tone recognition, learning process, adaptive processing, and character recognition. Statistical analyses demonstrated that the proposed method elicited a better recognition rate in comparison to alternative methods in the literature.
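A toy variable-step-size LMS update in the spirit of the adaptive-processing stage is sketched below; the adaptation rule, parameters, and test signal are illustrative stand-ins, not the paper's tuned algorithm. The step size grows with recent error energy (to track a drifting typing rate) and shrinks as the error settles.

```python
import numpy as np

def vss_lms(x, d, n_taps=4, mu_min=0.05, mu_max=1.0, alpha=0.9):
    w = np.zeros(n_taps)
    mu = mu_min
    errs = []
    for i in range(n_taps - 1, len(x)):
        u = x[i - n_taps + 1:i + 1]                    # most recent n_taps samples
        e = d[i] - w @ u
        mu = np.clip(alpha * mu + (1 - alpha) * e * e, mu_min, mu_max)  # error-driven step size
        w += mu * e * u / (u @ u + 1e-8)               # normalized update for stability
        errs.append(e)
    return w, np.asarray(errs)

rng = np.random.default_rng(3)
x = rng.normal(size=2000)
d = np.convolve(x, [0.5, -0.3, 0.2, 0.1])[:2000]       # "unknown" system to identify
w, errs = vss_lms(x, d)
print(w.round(2), f"final error ~ {np.abs(errs[-100:]).mean():.3f}")
```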
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyack, B.E.; Dhir, V.K.; Gieseke, J.A.
1992-03-01
MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. The newest version of MELCOR is Version 1.8.1, July 1991. MELCOR development has reached the point that the United States Nuclear Regulatory Commission sponsored a broad technical review by recognized experts to determine or confirm the technical adequacy of the code for the serious and complex analyses it is expected to perform. For this purpose, an eight-member MELCOR Peer Review Committee was organized. The Committee has completed its review of the MELCOR code: the review process and findings of the MELCOR Peer Review Committee are documented in this report. The Committee has determined that recommendations in five areas are appropriate: (1) MELCOR numerics, (2) models missing from MELCOR Version 1.8.1, (3) existing MELCOR models needing revision, (4) the need for expanded MELCOR assessment, and (5) documentation.
NASA Technical Reports Server (NTRS)
Stagliano, T. R.; Witmer, E. A.; Rodal, J. J. A.
1979-01-01
Finite element modeling alternatives as well as the utility and limitations of the two dimensional structural response computer code CIVM-JET 4B for predicting the transient, large deflection, elastic plastic, structural responses of two dimensional beam and/or ring structures which are subjected to rigid fragment impact were investigated. The applicability of the CIVM-JET 4B analysis and code for the prediction of steel containment ring response to impact by complex deformable fragments from a trihub burst of a T58 turbine rotor was studied. Dimensional analysis considerations were used in a parametric examination of data from engine rotor burst containment experiments and data from sphere beam impact experiments. The use of the CIVM-JET 4B computer code for making parametric structural response studies on both fragment-containment structure and fragment-deflector structure was illustrated. Modifications to the analysis/computation procedure were developed to alleviate restrictions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, A. J.; Fanning, T. H.
The United States has extensive experience with the design, construction, and operation of sodium cooled fast reactors (SFRs) over the last six decades. Despite the closure of various facilities, the U.S. continues to dedicate research and development (R&D) efforts to the design of innovative experimental, prototype, and commercial facilities. Accordingly, in support of the rich operating history and ongoing design efforts, the U.S. has been developing and maintaining a series of tools with capabilities that envelop all facets of SFR design and safety analyses. This paper provides an overview of the current U.S. SFR analysis toolset, including codes such as SAS4A/SASSYS-1, MC2-3, SE2-ANL, PERSENT, NUBOW-3D, and LIFE-METAL, as well as the higher-fidelity tools (e.g. PROTEUS) being integrated into the toolset. Current capabilities of the codes are described and key ongoing development efforts are highlighted for some codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Malley, Daniel; Vesselinov, Velimir V.
MADSpython (Model analysis and decision support tools in Python) is a Python code that streamlines the process of using data and models for analysis and decision support with the code MADS. MADS is open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). MADS can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. The Python scripts in MADSpython facilitate the generation of the input and output files needed by MADS as well as by the external simulators, which include FEHM and PFLOTRAN. MADSpython enables a number of data- and model-based analyses including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. MADSpython will be released under the GPL V3 license. MADSpython will be distributed as a Git repo at gitlab.com and github.com. The MADSpython manual and documentation will be posted at http://madspy.lanl.gov.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sesigur, Haluk; Cili, Feridun
Seismic isolation is an effective design strategy to mitigate seismic hazard, wherein the structure and its contents are protected from the damaging effects of an earthquake. This paper presents the Hangar Project at Sabiha Goekcen Airport in Istanbul, Turkey. A seismic isolation system in which the isolation layer is arranged at the top of the columns was selected. The seismic hazard analysis, superstructure design, isolator design and testing were based on the Uniform Building Code (1997) and met all requirements of the Turkish Earthquake Code (2007). The substructure, which has steel vertical trusses on the facades and RC H-shaped columns along the middle axis of the building, was designed with an R factor limited to 2.0 in accordance with the Turkish Earthquake Code. In order to verify the effectiveness of the isolation system, nonlinear static and dynamic analyses were performed. The analyses revealed that the isolated building has approximately one-quarter the base shear of the non-isolated structure.
CUDA Fortran acceleration for the finite-difference time-domain method
NASA Astrophysics Data System (ADS)
Hadi, Mohammed F.; Esmaeili, Seyed A.
2013-05-01
A detailed description of programming the three-dimensional finite-difference time-domain (FDTD) method to run on graphical processing units (GPUs) using CUDA Fortran is presented. Two FDTD-to-CUDA thread-block mapping designs are investigated and their performances compared. Comparative assessment of trade-offs between GPU's shared memory and L1 cache is also discussed. This presentation is for the benefit of FDTD programmers who work exclusively with Fortran and are reluctant to port their codes to C in order to utilize GPU computing. The derived CUDA Fortran code is compared with an optimized CPU version that runs on a workstation-class CPU to present a realistic GPU to CPU run time comparison and thus help in making better informed investment decisions on FDTD code redesigns and equipment upgrades. All analyses are mirrored with CUDA C simulations to put in perspective the present state of CUDA Fortran development.
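The stencil that such GPU thread-block mappings parallelize is compact; a plain NumPy statement of the 1-D FDTD update (normalized units, invented soft source, not the paper's 3-D CUDA Fortran kernels) makes the data dependencies explicit.

```python
import numpy as np

nz, nt = 200, 500
ez = np.zeros(nz)        # electric field on integer grid points
hy = np.zeros(nz - 1)    # magnetic field on staggered half-grid points
for n in range(nt):
    hy += 0.5 * (ez[1:] - ez[:-1])                 # update H from the curl of E
    ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])           # update E from the curl of H
    ez[nz // 2] += np.exp(-((n - 30) / 10) ** 2)   # soft Gaussian source at midpoint
print(float(ez.max()))
```

In a GPU port, each array element of these two update lines becomes one thread's work item; the mapping of elements to thread blocks is exactly the design trade-off the paper investigates.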
A Simple Secure Hash Function Scheme Using Multiple Chaotic Maps
NASA Astrophysics Data System (ADS)
Ahmad, Musheer; Khurana, Shruti; Singh, Sushmita; AlSharari, Hamed D.
2017-06-01
Chaotic maps possess high parameter sensitivity, random-like behavior and one-way computations, which favor the construction of cryptographic hash functions. In this paper, we present a novel hash function scheme which uses multiple chaotic maps to generate efficient variable-sized hash functions. The message is divided into four parts; each part is processed by a different 1D chaotic map unit, yielding an intermediate hash code. The four codes are concatenated into two blocks, then each block is processed through a 2D chaotic map unit separately. The final hash value is generated by combining the two partial hash codes. Simulation analyses such as distribution of hashes, statistical properties of confusion and diffusion, message and key sensitivity, collision resistance and flexibility are performed. The results reveal that the proposed hash scheme is simple, efficient and holds comparable capabilities when compared with some recent chaos-based hash algorithms.
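A toy rendition of the scheme's structure, with the specific maps, parameters, and mixing chosen here purely for illustration (the paper's construction differs): four message quarters drive 1-D logistic-map units, the intermediate codes seed two 2-D chaotic-map units (an Arnold cat map stands in), and the partial outputs are combined into the digest.

```python
def logistic_unit(part, r=3.99, x=0.5, rounds=4):
    for b in part:
        x = (x + (b + 1) / 257.0) / 2.0          # inject byte, keep x strictly in (0,1)
        for _ in range(rounds):
            x = r * x * (1.0 - x)                # 1-D chaotic iteration
    return x

def cat_unit(x, y, n_out):
    out = bytearray()
    for _ in range(n_out):
        x, y = (x + y) % 1.0, (x + 2.0 * y) % 1.0   # Arnold cat map (2-D, stays in [0,1))
        out.append(int(x * 256) % 256)
    return bytes(out)

def chaotic_hash(msg: bytes, out_bytes=16):
    quarters = [msg[i::4] or b"\x00" for i in range(4)]    # divide message into 4 parts
    c = [logistic_unit(q) for q in quarters]               # four intermediate codes
    block_a = cat_unit(c[0], c[1], out_bytes)              # first 2-D unit
    block_b = cat_unit(c[2], c[3], out_bytes)              # second 2-D unit
    return bytes(a ^ b for a, b in zip(block_a, block_b))  # combine partial hashes

print(chaotic_hash(b"hello world").hex())
print(chaotic_hash(b"hello worlD").hex())   # a one-byte change flips the digest
```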
Henry, Kenneth S.; Kale, Sushrut; Heinz, Michael G.
2014-01-01
While changes in cochlear frequency tuning are thought to play an important role in the perceptual difficulties of people with sensorineural hearing loss (SNHL), the possible role of temporal processing deficits remains less clear. Our knowledge of temporal envelope coding in the impaired cochlea is limited to two studies that examined auditory-nerve fiber responses to narrowband amplitude modulated stimuli. In the present study, we used Wiener-kernel analyses of auditory-nerve fiber responses to broadband Gaussian noise in anesthetized chinchillas to quantify changes in temporal envelope coding with noise-induced SNHL. Temporal modulation transfer functions (TMTFs) and temporal windows of sensitivity to acoustic stimulation were computed from 2nd-order Wiener kernels and analyzed to estimate the temporal precision, amplitude, and latency of envelope coding. Noise overexposure was associated with slower (less negative) TMTF roll-off with increasing modulation frequency and reduced temporal window duration. The results show that at equal stimulus sensation level, SNHL increases the temporal precision of envelope coding by 20–30%. Furthermore, SNHL increased the amplitude of envelope coding by 50% in fibers with CFs from 1–2 kHz and decreased mean response latency by 0.4 ms. While a previous study of envelope coding demonstrated a similar increase in response amplitude, the present study is the first to show enhanced temporal precision. This new finding may relate to the use of a more complex stimulus with broad frequency bandwidth and a dynamic temporal envelope. Exaggerated neural coding of fast envelope modulations may contribute to perceptual difficulties in people with SNHL by acting as a distraction from more relevant acoustic cues, especially in fluctuating background noise. Finally, the results underscore the value of studying sensory systems with more natural, real-world stimuli. PMID:24596545
Cenik, Can; Chua, Hon Nian; Singh, Guramrit; Akef, Abdalla; Snyder, Michael P; Palazzo, Alexander F; Moore, Melissa J; Roth, Frederick P
2017-03-01
Introns are found in 5' untranslated regions (5'UTRs) for 35% of all human transcripts. These 5'UTR introns are not randomly distributed: genes that encode secreted, membrane-bound and mitochondrial proteins are less likely to have them. Curiously, transcripts lacking 5'UTR introns tend to harbor specific RNA sequence elements in their early coding regions. To model and understand the connection between coding-region sequence and 5'UTR intron status, we developed a classifier that can predict 5'UTR intron status with >80% accuracy using only sequence features in the early coding region. Thus, the classifier identifies transcripts with 5'-proximal-intron-minus-like coding regions ("5IM" transcripts). Unexpectedly, we found that the early coding sequence features defining 5IM transcripts are widespread, appearing in 21% of all human RefSeq transcripts. The 5IM class of transcripts is enriched for non-AUG start codons, more extensive secondary structure both preceding the start codon and near the 5' cap, greater dependence on eIF4E for translation, and association with ER-proximal ribosomes. 5IM transcripts are bound by the exon junction complex (EJC) at noncanonical 5'-proximal positions. Finally, N1-methyladenosines are specifically enriched in the early coding regions of 5IM transcripts. Taken together, our analyses point to the existence of a distinct 5IM class comprising ∼20% of human transcripts. This class is defined by depletion of 5'-proximal introns, presence of specific RNA sequence features associated with low translation efficiency, N1-methyladenosines in the early coding region, and enrichment for noncanonical binding by the EJC. © 2017 Cenik et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
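The classification idea can be sketched with toy data (the feature set, sequences, and model here are stand-ins, not the authors'): represent each early coding region by k-mer counts and train a classifier to predict 5'UTR intron status.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def kmers(seq, k=3):
    """Overlapping k-mers as whitespace-separated tokens."""
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

early_cds = ["ATGGCTGCTGCC", "ATGAAAGGTCGT", "ATGGCCGCCGCA", "ATGCGTAAAGAA"]  # toy sequences
has_5utr_intron = [0, 1, 0, 1]                                               # toy labels

clf = make_pipeline(CountVectorizer(), LogisticRegression())
clf.fit([kmers(s) for s in early_cds], has_5utr_intron)
print(clf.predict([kmers("ATGGCAGCTGCG")]))   # predicted intron status of a new transcript
```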
Designing and Assessing Learning
ERIC Educational Resources Information Center
Quan, Hong; Liu, Dandan; Cun, Xiangqin; Lu, Yingchun
2009-01-01
This paper analyses the design, implementation and assessment of a level 2 module for non-English major students in higher vocational and professional education. 1132001 is the code of a module that uses active methods to teach college English in China. It specifically reflects on the module's advantages and defects for developing and improving learning…
Second Language Socialization and Learner Agency: Adoptive Family Talk
ERIC Educational Resources Information Center
Fogle, Lyn Wright
2012-01-01
This book examines how Russian-speaking adoptees in three US families actively shape opportunities for language learning and identity construction in everyday interactions. By focusing on a different practice in each family (i.e. narrative talk about the day, metalinguistic discourse or languaging, and code-switching), the analyses uncover…
The Relationship between Simultaneous-Successive Processing and Academic Achievement.
ERIC Educational Resources Information Center
Merritt, Frank M.; McCallum, Steve
The Luria-Das Information Processing Model of human learning holds that information is analysed and coded within the brain in either a simultaneous or a successive fashion. Simultaneous integration refers to the synthesis of separate elements into groups, often with spatial characteristics; successive integration means that information is…
NASA Technical Reports Server (NTRS)
Hulka, J. R.; Jones, G. W.
2010-01-01
Liquid rocket engines using oxygen and methane propellants are being considered by the National Aeronautics and Space Administration (NASA) for in-space vehicles. This propellant combination has not been previously used in a flight-qualified engine system, so limited test data and analysis results are available at this stage of early development. NASA has funded several hardware-oriented activities with oxygen and methane propellants over the past several years with the Propulsion and Cryogenic Advanced Development (PCAD) project, under the Exploration Technology Development Program. As part of this effort, the NASA Marshall Space Flight Center has conducted combustion, performance, and combustion stability analyses of several of the configurations. This paper summarizes the analyses of combustion and performance as a follow-up to a paper published in the 2008 JANNAF/LPS meeting. Combustion stability analyses are presented in a separate paper. The current paper includes test and analysis results of coaxial element injectors using liquid oxygen and liquid methane or gaseous methane propellants. Several thrust chamber configurations have been modeled, including thrust chambers with multi-element swirl coax element injectors tested at the NASA MSFC, and a uni-element chamber with shear and swirl coax injectors tested at The Pennsylvania State University. Configurations were modeled with two one-dimensional liquid rocket combustion analysis codes, the Rocket Combustor Interaction Design and Analysis (ROCCID) and the Coaxial Injector Combustion Model (CICM). Significant effort was applied to show how these codes can be used to model combustion and performance with oxygen/methane propellants a priori, and what anchoring or calibrating features need to be applied or developed in the future. This paper describes the test hardware configurations, presents the results of all the analyses, and compares the results from the two analytical methods.
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
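A minimal response-surface metamodel of the kind surveyed here: fit a quadratic polynomial to a handful of runs of a stand-in "expensive" code, then query the cheap fit in its place. The test function and sample counts are illustrative only.

```python
import numpy as np

def expensive_code(x1, x2):                 # stand-in for the analysis code
    return np.sin(x1) + 0.5 * x2 ** 2

rng = np.random.default_rng(7)
X = rng.uniform(-1, 1, size=(30, 2))        # design of experiments: 30 sample runs
y = expensive_code(X[:, 0], X[:, 1])

# Quadratic basis: 1, x1, x2, x1^2, x1*x2, x2^2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 0]*X[:, 1], X[:, 1]**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def metamodel(x1, x2):
    return coef @ np.array([1.0, x1, x2, x1*x1, x1*x2, x2*x2])

print(metamodel(0.2, 0.4), expensive_code(0.2, 0.4))   # fit vs. truth
```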
NASA Astrophysics Data System (ADS)
Papadimitriou, P.; Skorek, T.
THESUS is a thermohydraulic code for the calculation of steady state and transient processes of two-phase cryogenic flows. The physical model is based on four conservation equations with separate liquid and gas phase mass conservation equations. The thermohydraulic non-equilibrium is calculated by means of evaporation and condensation models. The mechanical non-equilibrium is modeled by a full-range drift-flux model. Also heat conduction in solid structures and heat exchange for the full spectrum of heat transfer regimes can be simulated. Test analyses of two-channel chilldown experiments and comparisons with the measured data have been performed.
Probabilistic evaluation of fuselage-type composite structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1992-01-01
A methodology is developed to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, natural frequencies, displacements, stress/strain etc., which are the consequences of the random variation (scatter) of the primitive (independent random) variables in the constituent, ply, laminate and structural levels. This methodology is implemented in the IPACS (Integrated Probabilistic Assessment of Composite Structures) computer code. A fuselage-type composite structure is analyzed to demonstrate the code's capability. The probability distribution functions of the buckling loads, natural frequency, displacement, strain and stress are computed. The sensitivity of each primitive (independent random) variable to a given structural response is also identified from the analyses.
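The probabilistic idea can be miniaturized as follows, with closed-form Euler column buckling standing in for the real laminate/structural analysis and invented distributions for the primitive variables: scatter at the input level propagates to a distribution of the structural response, from which percentiles and sensitivities can be read off.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 50_000
E = rng.normal(70e9, 3.5e9, n)          # modulus scatter (Pa), assumed 5% c.o.v.
t = rng.normal(2e-3, 1e-4, n)           # wall thickness scatter (m), assumed
L, b = 0.5, 0.05                        # deterministic length and width (m)

I = b * t ** 3 / 12.0                   # second moment of area per sample
P_cr = np.pi ** 2 * E * I / L ** 2      # Euler buckling load per sample

print(f"mean {P_cr.mean():.1f} N, 1st percentile {np.percentile(P_cr, 1):.1f} N")
```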
NASA Technical Reports Server (NTRS)
Wilson, R. B.; Banerjee, P. K.
1987-01-01
This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Sections Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of computer codes that permit more accurate and efficient three-dimensional analyses of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components.
Requirements for Next Generation Comprehensive Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne; Datta, Anubhav
2008-01-01
The unique demands of rotorcraft aeromechanics analysis have led to the development of software tools that are described as comprehensive analyses. The next generation of rotorcraft comprehensive analyses will be driven and enabled by the tremendous capabilities of high performance computing, particularly modular and scalable software executed on multiple cores. Development of a comprehensive analysis based on high performance computing both demands and permits a new analysis architecture. This paper describes a vision of the requirements for this next generation of comprehensive analyses of rotorcraft. The requirements are described, with substantiation for what must be included and justification for what should be excluded. With this guide, a path to the next generation code can be found.
Detecting well-being via computerized content analysis of brief diary entries.
Tov, William; Ng, Kok Leong; Lin, Han; Qiu, Lin
2013-12-01
Two studies evaluated the correspondence between self-reported well-being and codings of emotion and life content by the Linguistic Inquiry and Word Count (LIWC; Pennebaker, Booth, & Francis, 2011). Open-ended diary responses were collected from 206 participants daily for 3 weeks (Study 1) and from 139 participants twice a week for 8 weeks (Study 2). LIWC negative emotion consistently correlated with self-reported negative emotion. LIWC positive emotion correlated with self-reported positive emotion in Study 1 but not in Study 2. No correlations were observed with global life satisfaction. Using a co-occurrence coding method to combine LIWC emotion codings with life-content codings, we estimated the frequency of positive and negative events in 6 life domains (family, friends, academics, health, leisure, and money). Domain-specific event frequencies predicted self-reported satisfaction in all domains in Study 1 but not consistently in Study 2. We suggest that the correspondence between LIWC codings and self-reported well-being is affected by the number of writing samples collected per day as well as the target period (e.g., past day vs. past week) assessed by the self-report measure. Extensions and possible implications for the analyses of similar types of open-ended data (e.g., social media messages) are discussed. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
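A simplified version of the co-occurrence coding described above, using tiny ad hoc word lists rather than the LIWC dictionaries: a diary sentence counts as a positive or negative event in a domain when an emotion word and a domain word co-occur in it.

```python
import re

POS = {"happy", "great", "enjoyed", "fun"}                      # toy emotion lexicons
NEG = {"sad", "awful", "failed", "tired"}
DOMAINS = {"friends": {"friend", "friends"},                    # toy life-content lexicons
           "academics": {"exam", "class", "study"}}

def code_entry(entry):
    counts = {(d, s): 0 for d in DOMAINS for s in ("pos", "neg")}
    for sent in re.split(r"[.!?]+", entry.lower()):
        words = set(re.findall(r"[a-z]+", sent))
        for d, dwords in DOMAINS.items():
            if words & dwords:                                  # domain word present
                counts[(d, "pos")] += bool(words & POS)         # co-occurring emotion word
                counts[(d, "neg")] += bool(words & NEG)
    return counts

print(code_entry("Enjoyed dinner with friends. The exam went awful."))
```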
JASMIN: Japanese-American study of muon interactions and neutron detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakashima, Hiroshi (JAEA, Ibaraki); Mokhov, N.V.
Experimental studies of shielding and radiation effects at Fermi National Accelerator Laboratory (FNAL) have been carried out under a collaboration between FNAL and Japan, aiming at benchmarking of simulation codes and study of irradiation effects for upgrade and design of new high-energy accelerator facilities. The purposes of this collaboration are (1) acquisition of shielding data in a proton beam energy domain above 100 GeV; (2) further evaluation of the predictive accuracy of the PHITS and MARS codes; (3) modification of physics models and data in these codes if needed; (4) establishment of an irradiation field for radiation effect tests; and (5) development of a code module for improved description of radiation effects. A series of experiments has been performed at the Pbar target station and the NuMI facility, using irradiation of targets with 120 GeV protons for antiproton and neutrino production, as well as the M-test beam line (M-test) for measuring nuclear data and detector responses. Various nuclear and shielding data have been measured by activation methods with chemical separation techniques as well as by other detectors such as a Bonner ball counter. Analyses of the experimental data are in progress for benchmarking the PHITS and MARS15 codes. In this presentation recent activities and results are reviewed.
Seligman, Sarah C; Giovannetti, Tania; Sestito, John; Libon, David J
2014-01-01
Mild functional difficulties have been associated with early cognitive decline in older adults and increased risk for conversion to dementia in mild cognitive impairment, but our understanding of this decline has been limited by a dearth of objective methods. This study evaluated the reliability and validity of a new system to code subtle errors on an established performance-based measure of everyday action and described preliminary findings within the context of a theoretical model of action disruption. Here 45 older adults completed the Naturalistic Action Test (NAT) and neuropsychological measures. NAT performance was coded for overt errors, and subtle action difficulties were scored using a novel coding system. An inter-rater reliability coefficient was calculated. Validity of the coding system was assessed using a repeated-measures ANOVA with NAT task (simple versus complex) and error type (overt versus subtle) as within-group factors. Correlation/regression analyses were conducted among overt NAT errors, subtle NAT errors, and neuropsychological variables. The coding of subtle action errors was reliable and valid, and episodic memory breakdown predicted subtle action disruption. Results suggest that the NAT can be useful in objectively assessing subtle functional decline. Treatments targeting episodic memory may be most effective in addressing early functional impairment in older age.
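Inter-rater reliability of a coding system of this kind is typically summarized with Cohen's kappa; a minimal check on made-up ratings (two raters coding presence/absence of a subtle error on 12 NAT steps) looks like this.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical presence/absence codings of subtle errors by two independent raters
rater_a = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
rater_b = [1, 0, 1, 1, 1, 0, 1, 0, 0, 1, 0, 0]
print(f"Cohen's kappa = {cohen_kappa_score(rater_a, rater_b):.2f}")
```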
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, T.L.
1993-01-01
This report discusses probabilistic fracture mechanics (PFM) analysis, which is a major element of the comprehensive probabilistic methodology endorsed by the NRC for evaluation of the integrity of Pressurized Water Reactor (PWR) pressure vessels subjected to pressurized-thermal-shock (PTS) transients. It is anticipated that there will be an increasing need for an improved and validated PTS PFM code which is accepted by the NRC and utilities, as more plants approach the PTS screening criteria and are required to perform plant-specific analyses. The NRC-funded Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratory is currently developing the FAVOR (Fracture Analysis of Vessels: Oak Ridge) PTS PFM code, which is intended to meet this need. The FAVOR code incorporates the most important features of both OCA-P and VISA-II and contains some new capabilities such as PFM global modeling methodology, the capability to approximate the effects of thermal streaming on circumferential flaws located inside a plume region created by fluid and thermal stratification, a library of stress intensity factor influence coefficients, generated by the NQA-1 certified ABAQUS computer code, for an adequate range of two- and three-dimensional inside surface flaws, the flexibility to generate a variety of output reports, and user friendliness.
Causes of Death Data in the Global Burden of Disease Estimates for Ischemic and Hemorrhagic Stroke
Truelsen, Thomas; Krarup, Lars-Henrik; Iversen, Helle; Mensah, George A.; Feigin, Valery; Sposato, Luciano; Naghavi, Mohsen
2015-01-01
Background: Stroke mortality estimates in the Global Burden of Disease (GBD) study are based on routine mortality statistics and redistribution of ill-defined codes that cannot be a cause of death, the so-called “garbage codes”. This study describes the contribution of these codes to stroke mortality estimates. Methods: All available mortality data were compiled and non-specific cause codes were redistributed based on literature review and statistical methods. Ill-defined codes were redistributed to their specific cause of disease by age, sex, country, and year. The reassignment was done based on the international classification of diseases and the pathology behind each code by checking multiple causes of death and literature review. Results: Unspecified stroke, and primary and secondary hypertension are the leading contributing “garbage codes” in stroke mortality estimates for intracranial hemorrhagic stroke and ischemic stroke. There were marked differences in the fraction of deaths assigned to ischemic stroke and hemorrhagic stroke for unspecified stroke and hypertension between GBD regions and between age groups. Conclusions: A large proportion of stroke fatalities is derived from the redistribution of “unspecified stroke” and “hypertension”, with marked regional differences. Future advancements in stroke certification, data collection, and statistical analyses may improve the estimation of the global stroke burden. PMID:26505189
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, Myra L.; LaFleur, Chris Bensdotter; Muna, Alice Baca
Safety standards development for maintenance facilities of liquid and compressed natural gas fueled vehicles is required to ensure proper facility design and operating procedures. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase II work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A hazard and operability (HAZOP) study was performed to identify key scenarios of interest using risk ranking. Detailed simulations and modeling were performed to estimate the location and behavior of natural gas releases based on these scenarios. Specific code conflicts were identified, and ineffective code requirements were highlighted and resolutions proposed. These include the ventilation rate basis on area or volume, as well as a ceiling offset that seems ineffective at protecting against flammable gas concentrations. Acknowledgements: The authors gratefully acknowledge Bill Houf (SNL, retired) for his assistance with the set-up and post-processing of the numerical simulations. The authors also acknowledge Doug Horne (retired) for his helpful discussions. We would also like to acknowledge the support from the Clean Cities program of DOE's Vehicle Technology Office.
Standard terminology and labeling of ocular tissue for transplantation.
Armitage, W John; Ashford, Paul; Crow, Barbara; Dahl, Patricia; DeMatteo, Jennifer; Distler, Pat; Gopinathan, Usha; Madden, Peter W; Mannis, Mark J; Moffatt, S Louise; Ponzin, Diego; Tan, Donald
2013-06-01
To develop an internationally agreed terminology for describing ocular tissue grafts to improve the accuracy and reliability of information transfer, to enhance tissue traceability, and to facilitate the gathering of comparative global activity data, including denominator data for use in biovigilance analyses. ICCBBA, the international standards organization for terminology, coding, and labeling of blood, cells, and tissues, approached the major Eye Bank Associations to form an expert advisory group. The group met by regular conference calls to develop a standard terminology, which was released for public consultation and amended accordingly. The terminology uses broad definitions (Classes) with modifying characteristics (Attributes) to define each ocular tissue product. The terminology may be used within the ISBT 128 system to label tissue products with standardized bar codes enabling the electronic capture of critical data in the collection, processing, and distribution of tissues. Guidance on coding and labeling has also been developed. The development of a standard terminology for ocular tissue marks an important step for improving traceability and reducing the risk of mistakes due to transcription errors. ISBT 128 computer codes have been assigned and may now be used to label ocular tissues. Eye banks are encouraged to adopt this standard terminology and move toward full implementation of ISBT 128 nomenclature, coding, and labeling.
Reducing the genetic code induces massive rearrangement of the proteome
O’Donoghue, Patrick; Prat, Laure; Kucklick, Martin; Schäfer, Johannes G.; Riedel, Katharina; Rinehart, Jesse; Söll, Dieter; Heinemann, Ilka U.
2014-01-01
Expanding the genetic code is an important aim of synthetic biology, but some organisms developed naturally expanded genetic codes long ago over the course of evolution. Less than 1% of all sequenced genomes encode an operon that reassigns the stop codon UAG to pyrrolysine (Pyl), a genetic code variant that results from the biosynthesis of Pyl-tRNAPyl. To understand the selective advantage of genetically encoding more than 20 amino acids, we constructed a markerless tRNAPyl deletion strain of Methanosarcina acetivorans (ΔpylT) that cannot decode UAG as Pyl or grow on trimethylamine. Phenotypic defects in the ΔpylT strain were evident in minimal medium containing methanol. Proteomic analyses of wild type (WT) M. acetivorans and ΔpylT cells identified 841 proteins from >7,000 significant peptides detected by MS/MS. Protein production from UAG-containing mRNAs was verified for 19 proteins. Translation of UAG codons was verified by MS/MS for eight proteins, including identification of a Pyl residue in PylB, which catalyzes the first step of Pyl biosynthesis. Deletion of tRNAPyl globally altered the proteome, leading to >300 differentially abundant proteins. Reduction of the genetic code from 21 to 20 amino acids led to significant down-regulation in translation initiation factors, amino acid metabolism, and methanogenesis from methanol, which was offset by a compensatory (100-fold) up-regulation in dimethyl sulfide metabolic enzymes. The data show how a natural proteome adapts to genetic code reduction and indicate that the selective value of an expanded genetic code is related to carbon source range and metabolic efficiency. PMID:25404328
Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.
Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen
2014-02-01
The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards that were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., the relatively static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are first classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely, the background reference prediction (BRP) that uses the background modeled from the original input frames as the long-term reference, and the background difference prediction (BDP) that predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency using the higher quality background as the reference; whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, yet with only a slight additional encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
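The block-classification step can be sketched as follows (the mean-difference background model and thresholds are placeholders, not the paper's algorithm): each block is labeled background, foreground, or hybrid from its difference against the modeled background, and the label would then select the prediction mode (BRP, BDP, or conventional).

```python
import numpy as np

def classify_blocks(frame, background, block=16, t_bg=4.0, t_fg=20.0):
    h, w = frame.shape
    labels = {}
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            diff = np.abs(frame[y:y+block, x:x+block].astype(float)
                          - background[y:y+block, x:x+block])
            m = diff.mean()                       # mean absolute background difference
            labels[(y, x)] = ("background" if m < t_bg
                              else "foreground" if m > t_fg else "hybrid")
    return labels

rng = np.random.default_rng(5)
bg = rng.integers(0, 255, (64, 64)).astype(np.uint8)   # toy modeled background
frame = bg.copy()
frame[16:32, 16:32] = 255                               # a "moving object"
print(set(classify_blocks(frame, bg).values()))
```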
Quality of head injury coding from autopsy reports with AIS © 2005 update 2008.
Schick, Sylvia; Humrich, Anton; Graw, Matthias
2018-02-28
Objective: Coding injuries from autopsy reports of traffic accident victims according to the Abbreviated Injury Scale AIS © 2005 update 2008 [1] is quite time consuming. The suspicion arose that many issues leading to discussion between coder and control reader were based on information required by the AIS that was not documented in the autopsy reports. To quantify this suspicion, we introduced an AIS-detail-indicator (AIS-DI). To each injury in the AIS Codebook one letter from A to N was assigned, indicating the level of detail. Rules were formulated to obtain repeatable assignments. This scheme was applied to a selection of 149 multiply injured traffic fatalities. The frequencies of "not A" codes were calculated for each body region, and it was analysed why the most detailed level A had not been coded. As a first finding, the results for the head region are presented. 747 AIS head injury codes were found in 137 traffic fatalities, and 60% of these injuries were coded with an AIS-DI of level A. There are three different explanations for codes of AIS-DI "not A": Group 1, "Missing information in autopsy report" (5%); Group 2, "Clinical data required by AIS" (20%); and Group 3, "AIS system determined" (15%). Groups 1 and 2 have consequences for the ISS in 25 cases. Other body regions might perform differently. The AIS-DI can indicate the quality of the underlying data basis and, depending on the aims of different AIS users, it can be a helpful tool for quality checks.
Hjerpe, Per; Boström, Kristina Bengtsson; Lindblad, Ulf; Merlo, Juan
2012-01-01
Objective: To investigate the impact on ICD coding behaviour of a new case-mix reimbursement system based on coded patient diagnoses. The main hypothesis was that after the introduction of the new system the coding of chronic diseases like hypertension and cancer would increase and the variance in propensity for coding would decrease on both physician and health care centre (HCC) levels. Design: Cross-sectional multilevel logistic regression analyses were performed in periods covering the time before and after the introduction of the new reimbursement system. Setting: Skaraborg primary care, Sweden. Subjects: All patients (n = 76 546 to 79 826) 50 years of age and older visiting 468 to 627 physicians at the 22 public HCCs in five consecutive time periods of one year each. Main outcome measures: Registered codes for hypertension and cancer diseases in the Skaraborg primary care database (SPCD). Results: After the introduction of the new reimbursement system the adjusted prevalence of hypertension and cancer in the SPCD increased from 17.4% to 32.2% and from 0.79% to 2.32%, respectively, probably partly due to increased diagnosis coding of indirect patient contacts. The total variance in the propensity for coding declined simultaneously at the physician level for both diagnosis groups. Conclusions: Changes in the healthcare reimbursement system may directly influence the contents of a research database that retrieves data from clinical practice. This should be taken into account when using such a database for research purposes, and the data should be validated for each diagnosis. PMID:23130878
Multilevel modelling: Beyond the basic applications.
Wright, Daniel B; London, Kamala
2009-05-01
Over the last 30 years statistical algorithms have been developed to analyse datasets that have a hierarchical/multilevel structure. Particularly within developmental and educational psychology these techniques have become common where the sample has an obvious hierarchical structure, like pupils nested within a classroom. We describe two areas beyond the basic applications of multilevel modelling that are important to psychology: modelling the covariance structure in longitudinal designs and using generalized linear multilevel modelling as an alternative to methods from signal detection theory (SDT). Detailed code for all analyses is described using packages for the freeware R.
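The paper documents its code in R packages; as a rough Python analogue of the basic random-intercept case (pupils nested within classrooms, data simulated here), statsmodels fits the same kind of model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n_class, n_pupil = 20, 25
classroom = np.repeat(np.arange(n_class), n_pupil)
class_effect = rng.normal(0, 2, n_class)[classroom]        # level-2 (classroom) variance
hours = rng.uniform(0, 10, n_class * n_pupil)
score = 50 + 1.5 * hours + class_effect + rng.normal(0, 5, n_class * n_pupil)
df = pd.DataFrame({"score": score, "hours": hours, "classroom": classroom})

# Random-intercept multilevel model: score ~ hours, intercepts varying by classroom
model = smf.mixedlm("score ~ hours", df, groups=df["classroom"]).fit()
print(model.summary())
```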
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swinstrom, Kirsten; Caldwell, Roy; Fourcade, H. Matthew
2005-09-07
We report the first complete mitochondrial genome sequences of stomatopods and compare their features to each other and to those of other crustaceans. Phylogenetic analyses of the concatenated mitochondrial protein-coding sequences were used to explore relationships within the Stomatopoda, within the malacostracan crustaceans, and among crustaceans and insects. Although these analyses support the monophyly of both Malacostraca and, within it, Stomatopoda, it also confirms the view of a paraphyletic Crustacea, with Malacostraca being more closely related to insects than to the branchiopod crustaceans.
Chemical and Solar Electric Propulsion Systems Analyses for Mars Sample Return Missions
NASA Technical Reports Server (NTRS)
Donahue, Benjamin B.; Green, Shaun E.; Coverstone, Victoria L.; Woo, Byoungsam
2004-01-01
Conceptual in-space transfer stages, including those utilizing solar electric propulsion, chemical propulsion, and chemical propulsion with aerobraking or aerocapture assist at Mars, were evaluated. Roundtrip Mars sample return mission vehicles were analyzed to determine how specific system technology selections influence payload delivery capability. Results show how specific engine, thruster, propellant, capture mode, trip time and launch vehicle technology choices would contribute to increasing payload or decreasing the size of the required launch vehicles. Heliocentric low-thrust trajectory analyses for Solar Electric Transfer were generated with the SEPTOP code.
2014-01-01
Background Next-generation sequencing has provided a wealth of plastid genome sequence data from an increasingly diverse set of green plants (Viridiplantae). Although these data have helped resolve the phylogeny of numerous clades (e.g., green algae, angiosperms, and gymnosperms), their utility for inferring relationships across all green plants is uncertain. Viridiplantae originated 700-1500 million years ago and may comprise as many as 500,000 species. This clade represents a major source of photosynthetic carbon and contains an immense diversity of life forms, including some of the smallest and largest eukaryotes. Here we explore the limits and challenges of inferring a comprehensive green plant phylogeny from available complete or nearly complete plastid genome sequence data. Results We assembled protein-coding sequence data for 78 genes from 360 diverse green plant taxa with complete or nearly complete plastid genome sequences available from GenBank. Phylogenetic analyses of the plastid data recovered well-supported backbone relationships and strong support for relationships that were not observed in previous analyses of major subclades within Viridiplantae. However, there also is evidence of systematic error in some analyses. In several instances we obtained strongly supported but conflicting topologies from analyses of nucleotides versus amino acid characters, and the considerable variation in GC content among lineages and within single genomes affected the phylogenetic placement of several taxa. Conclusions Analyses of the plastid sequence data recovered a strongly supported framework of relationships for green plants. This framework includes: i) the placement of Zygnematophyceace as sister to land plants (Embryophyta), ii) a clade of extant gymnosperms (Acrogymnospermae) with cycads + Ginkgo sister to remaining extant gymnosperms and with gnetophytes (Gnetophyta) sister to non-Pinaceae conifers (Gnecup trees), and iii) within the monilophyte clade (Monilophyta), Equisetales + Psilotales are sister to Marattiales + leptosporangiate ferns. Our analyses also highlight the challenges of using plastid genome sequences in deep-level phylogenomic analyses, and we provide suggestions for future analyses that will likely incorporate plastid genome sequence data for thousands of species. We particularly emphasize the importance of exploring the effects of different partitioning and character coding strategies. PMID:24533922
Coleman, Craig I; Vaitsiakhovich, Tatsiana; Nguyen, Elaine; Weeda, Erin R; Sood, Nitesh A; Bunz, Thomas J; Schaefer, Bernhard; Meinecke, Anna-Katharina; Eriksson, Daniel
2018-01-01
Schemas to identify bleeding-related hospitalizations in claims data differ in billing codes used and coding positions allowed. We assessed agreement across bleeding-related hospitalization coding schemas for claims analyses of nonvalvular atrial fibrillation (NVAF) patients on oral anticoagulation (OAC). We hypothesized that prior coding schemas used to identify bleeding-related hospitalizations in claims database studies would provide varying levels of agreement in incidence rates. Within MarketScan data, we identified adults newly started on OAC for NVAF from January 2012 to June 2015. Billing code schemas developed by Cunningham et al., the US Food and Drug Administration (FDA) Mini-Sentinel program, and Yao et al. were used to identify bleeding-related hospitalizations as a surrogate for major bleeding. Bleeds were subcategorized as intracranial hemorrhage (ICH), gastrointestinal (GI), or other. Schema agreement was assessed by comparing incidence, rates of events/100 person-years (PYs), and Cohen's kappa statistic. We identified 151 738 new users of OAC with NVAF (median CHA2DS2-VASc score = 3 [interquartile range = 2-4] and median HAS-BLED score = 3 [interquartile range = 2-3]). The Cunningham, FDA Mini-Sentinel, and Yao schemas identified any bleeding-related hospitalization in 1.87% (95% confidence interval [CI]: 1.81-1.94), 2.65% (95% CI: 2.57-2.74), and 4.66% (95% CI: 4.55-4.76) of patients (corresponding rates = 3.45, 4.90, and 8.65 events/100 PYs). Kappa agreement across schemas was weak-to-moderate (κ = 0.47-0.66) for any bleeding hospitalization. Near-perfect agreement (κ = 0.99) was observed between the FDA Mini-Sentinel and Yao schemas for ICH-related hospitalizations, but agreement was weak when comparing Cunningham to FDA Mini-Sentinel or Yao (κ = 0.52-0.53). FDA Mini-Sentinel and Yao agreement was moderate (κ = 0.62) for GI bleeding, but agreement was weak when comparing Cunningham to FDA Mini-Sentinel or Yao (κ = 0.44-0.56). For other bleeds, agreement across schemas was minimal (κ = 0.14-0.38). We observed varying levels of agreement among 3 bleeding-related hospitalization schemas in NVAF patients. © 2018 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arnold, Bill Walter; Chang, Fu-lin; Mattie, Patrick D.
2006-02-01
Sandia National Laboratories (SNL) and Taiwan's Institute for Nuclear Energy Research (INER) have teamed up to evaluate several candidate sites for Low-Level Radioactive Waste (LLW) disposal in Taiwan. Taiwan currently has three nuclear power plants, with another under construction. Taiwan also has a research reactor, as well as medical and industrial wastes to contend with. Eventually the reactors will be decommissioned. Operational and decommissioning wastes will need to be disposed of in a licensed disposal facility starting in 2014. Taiwan has adopted regulations similar to the US Nuclear Regulatory Commission's (NRC's) low-level radioactive waste rules (10 CFR 61) to govern the disposal of LLW. Taiwan has proposed several potential sites for the final disposal of LLW that is now in temporary storage on Lanyu Island and on-site at operating nuclear power plants, and for waste generated in the future through 2045. The planned final disposal facility will have a capacity of approximately 966,000 55-gallon drums. Taiwan is in the process of evaluating the best candidate site to pursue for licensing. Among these proposed sites there are basically two disposal concepts: shallow land burial and cavern disposal. A representative potential site for shallow land burial is located on a small island in the Taiwan Strait with basalt bedrock and interbedded sedimentary rocks. An engineered cover system would be constructed to limit infiltration for shallow land burial. A representative potential site for cavern disposal is located along the southeastern coast of Taiwan in a tunnel system that would be about 500 to 800 m below the surface. Bedrock at this site consists of argillite and meta-sedimentary rocks. Performance assessment analyses will be performed to evaluate future performance of the facility and the potential dose/risk to exposed populations. Preliminary performance assessment analyses will be used in the site-selection process and to aid in design of the disposal system. Final performance assessment analyses will be used in the regulatory process of licensing a site. The SNL/INER team has developed a performance assessment methodology that is used to simulate processes associated with the potential release of radionuclides to evaluate these sites. The following software codes are utilized in the performance assessment methodology: GoldSim (to implement a probabilistic analysis that will explicitly address uncertainties); the NRC's Breach, Leach, and Transport - Multiple Species (BLT-MS) code (to simulate waste-container degradation, waste-form leaching, and transport through the host rock); the Finite Element Heat and Mass Transfer code (FEHM) (to simulate groundwater flow and estimate flow velocities); the Hydrologic Evaluation of Landfill performance Model (HELP) code (to evaluate infiltration through the disposal cover); the AMBER code (to evaluate human health exposures); and the NRC's Disposal Unit Source Term -- Multiple Species (DUST-MS) code (to screen applicable radionuclides). Preliminary results of the evaluations of the two disposal concept sites are presented.