Development of PRIME for irradiation performance analysis of U-Mo/Al dispersion fuel
NASA Astrophysics Data System (ADS)
Jeong, Gwan Yoon; Kim, Yeon Soo; Jeong, Yong Jin; Park, Jong Man; Sohn, Dong-Seong
2018-04-01
A prediction code for the thermo-mechanical performance of research reactor fuel (PRIME) has been developed, implementing newly developed models to analyze the irradiation behavior of U-Mo dispersion fuel. The code predicts the two-dimensional thermal and mechanical performance of U-Mo dispersion fuel during irradiation. A finite element method was employed to solve the governing equations for thermal and mechanical equilibria. Temperature- and burnup-dependent material properties of the fuel meat constituents and cladding were used. The numerical solution schemes in PRIME were verified by benchmarking against solutions obtained with a commercial finite element analysis program (ABAQUS). The code was validated using irradiation data from the RERTR, HAMP-1, and E-FUTURE tests. The measured data used in the validation were interaction layer (IL) thickness and volume fractions of fuel meat constituents for the thermal analysis, and profiles of plate thickness change and fuel meat swelling for the mechanical analysis. The predictions were in good agreement with the measurements for both thermal and mechanical analyses, confirming the validity of the code.
Progressive fracture of fiber composites
NASA Technical Reports Server (NTRS)
Irvin, T. B.; Ginty, C. A.
1983-01-01
Refined models and procedures are described for determining progressive composite fracture in graphite/epoxy angleplied laminates. Lewis Research Center capabilities are utilized including the Real Time Ultrasonic C Scan (RUSCAN) experimental facility and the Composite Durability Structural Analysis (CODSTRAN) computer code. The CODSTRAN computer code is used to predict the fracture progression based on composite mechanics, finite element stress analysis, and fracture criteria modules. The RUSCAN facility, CODSTRAN computer code, and scanning electron microscope are used to determine durability and identify failure mechanisms in graphite/epoxy composites.
Multi-Region Boundary Element Analysis for Coupled Thermal-Fracturing Processes in Geomaterials
NASA Astrophysics Data System (ADS)
Shen, Baotang; Kim, Hyung-Mok; Park, Eui-Seob; Kim, Taek-Kon; Wuttke, Manfred W.; Rinne, Mikael; Backers, Tobias; Stephansson, Ove
2013-01-01
This paper describes a boundary element code development on coupled thermal-mechanical processes of rock fracture propagation. The code development was based on the fracture mechanics code FRACOD that has previously been developed by Shen and Stephansson (Int J Eng Fracture Mech 47:177-189, 1993) and FRACOM (A fracture propagation code—FRACOD, User's manual. FRACOM Ltd. 2002) and simulates complex fracture propagation in rocks governed by both tensile and shear mechanisms. For the coupled thermal-fracturing analysis, an indirect boundary element method, namely the fictitious heat source method, was implemented in FRACOD to simulate the temperature change and thermal stresses in rocks. This indirect method is particularly suitable for the thermal-fracturing coupling in FRACOD where the displacement discontinuity method is used for mechanical simulation. The coupled code was also extended to simulate multiple region problems in which rock mass, concrete linings and insulation layers with different thermal and mechanical properties were present. Both verification and application cases were presented where a point heat source in a 2D infinite medium and a pilot LNG underground cavern were solved and studied using the coupled code. Good agreement was observed between the simulation results, analytical solutions and in situ measurements which validates an applicability of the developed coupled code.
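The point-heat-source verification case mentioned in this abstract has a classical closed-form solution: for a continuous line source (a point source in 2D) in an infinite medium, the temperature rise is proportional to the exponential integral E1. The following Python sketch of that analytical benchmark is illustrative only; the function names and parameter values are not taken from FRACOD.

```python
import math

EULER_GAMMA = 0.5772156649015329

def exp_integral_e1(x):
    """E1(x) for x > 0 via its convergent series:
    E1(x) = -gamma - ln(x) - sum_{n>=1} (-x)^n / (n * n!).
    Accurate enough for the moderate arguments used here."""
    total = -EULER_GAMMA - math.log(x)
    term = 1.0
    for n in range(1, 60):
        term *= -x / n          # term is now (-x)^n / n!
        total -= term / n
    return total

def line_source_temperature(q, k, alpha, r, t):
    """Temperature rise at radius r (m) and time t (s) for a continuous
    line heat source of strength q (W/m) in an infinite 2-D medium with
    conductivity k (W/m-K) and thermal diffusivity alpha (m^2/s)."""
    return q / (4.0 * math.pi * k) * exp_integral_e1(r ** 2 / (4.0 * alpha * t))
```

A coupled code like the one described can be checked against this solution point by point: the computed temperature should fall off with radius exactly as E1 does.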
Evaluation of the finite element fuel rod analysis code (FRANCO)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, K.; Feltus, M.A.
1994-12-31
Knowledge of the temperature distribution in a nuclear fuel rod is required to predict the behavior of fuel elements during operating conditions. The thermal and mechanical properties and performance characteristics are strongly dependent on the temperature, which can vary greatly inside the fuel rod. A detailed model of fuel rod behavior can be described by various numerical methods, including the finite element approach. The finite element method has been successfully used in many engineering applications, including nuclear piping and reactor component analysis. However, fuel pin analysis has traditionally been carried out with finite difference codes, with the exception of the Electric Power Research Institute's FREY code, which was developed for mainframe execution. This report describes FRANCO, a finite element fuel rod analysis code capable of computing the temperature distribution and mechanical deformation of a single light water reactor fuel rod.
Fracture Analysis of Welded Type 304 Stainless Steel Pipe
1986-11-01
American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code. In order to accomplish these objectives, a series of seven full... Mechanical Engineers Boiler and Pressure Vessel Code, Section XI IWB-3640 (Winter Addenda 1983). 5. Ranganath, S., and U.S. Mehta, "Engineering Methods for
On 3-D inelastic analysis methods for hot section components (base program)
NASA Technical Reports Server (NTRS)
Wilson, R. B.; Bak, M. J.; Nakazawa, S.; Banerjee, P. K.
1986-01-01
A 3-D Inelastic Analysis Method program is described. This program consists of a series of new computer codes embodying a progression of mathematical models (mechanics of materials, special finite element, boundary element) for streamlined analysis of: (1) combustor liners, (2) turbine blades, and (3) turbine vanes. These models address the effects of high temperatures and thermal/mechanical loadings on the local (stress/strain) and global (dynamics, buckling) structural behavior of the three selected components. Three computer codes, referred to as MOMM (Mechanics of Materials Model), MHOST (Marc-Hot Section Technology), and BEST (Boundary Element Stress Technology), have been developed and are briefly described in this report.
Application of the DART Code for the Assessment of Advanced Fuel Behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.; Totev, T.
2007-07-01
The Dispersion Analysis Research Tool (DART) code is a dispersion fuel analysis code that contains mechanistically-based fuel and reaction-product swelling models, a one-dimensional heat transfer analysis, and mechanical deformation models. DART has been used to simulate the irradiation behavior of uranium oxide, uranium silicide, and uranium molybdenum aluminum dispersion fuels, as well as their monolithic counterparts. The thermal-mechanical DART code has been validated against RERTR tests performed in the ATR for irradiation data on interaction thickness; fuel, matrix, and reaction product volume fractions; and plate thickness changes. The DART fission gas behavior model has been validated against UO₂ fission gas release data as well as measured fission gas-bubble size distributions. Here DART is utilized to analyze various aspects of the observed bubble growth in U-Mo/Al interaction product. (authors)
NASA Astrophysics Data System (ADS)
Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.
2018-01-01
The genetic code is degenerate, and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and other properties. GCAT comes with a feature-rich editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
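As an illustration of one of the tests mentioned above, comma-freeness of a codon set can be checked directly from its definition: no codon of the set may appear in a frame-shifted position inside any concatenation of two codons. The following is a minimal Python sketch of that definition (a hypothetical helper, not part of the Java-based GCAT):

```python
def is_comma_free(codons):
    """Check comma-freeness of a set of 3-letter codons: for every pair
    of codons x, y, neither shifted reading frame of x+y may itself be
    a codon of the set."""
    for x in codons:
        for y in codons:
            pair = x + y
            for offset in (1, 2):  # the two frame-shifted positions
                if pair[offset:offset + 3] in codons:
                    return False
    return True
```

For example, {"AAT", "ATT"} passes the test, while any set containing "AAA" fails, because "AAAAAA" contains "AAA" in both shifted frames.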
Michael Frei, Dominik; Hodneland, Erlend; Rios-Mondragon, Ivan; Burtey, Anne; Neumann, Beate; Bulkescher, Jutta; Schölermann, Julia; Pepperkok, Rainer; Gerdes, Hans-Hermann; Kögel, Tanja
2015-01-01
Contact-dependent intercellular transfer (codeIT) of cellular constituents can have functional consequences for recipient cells, such as enhanced survival and drug resistance. Pathogenic viruses, prions and bacteria can also utilize this mechanism to spread to adjacent cells and potentially evade immune detection. However, little is known about the molecular mechanism underlying this intercellular transfer process. Here, we present a novel microscopy-based screening method to identify regulators and cargo of codeIT. Single donor cells, carrying fluorescently labelled endocytic organelles or proteins, are co-cultured with excess acceptor cells. CodeIT is quantified by confocal microscopy and image analysis in 3D, preserving spatial information. An siRNA-based screen using this method revealed the involvement of several myosins and small GTPases as codeIT regulators. Our data indicate that cellular protrusions and tubular recycling endosomes are important for codeIT. We automated image acquisition and analysis to facilitate large-scale chemical and genetic screening efforts to identify key regulators of codeIT. PMID:26271723
Jet-A reaction mechanism study for combustion application
NASA Technical Reports Server (NTRS)
Lee, Chi-Ming; Kundu, Krishna; Acosta, Waldo
1991-01-01
Simplified chemical kinetic reaction mechanisms for the combustion of Jet A fuel were studied. Initially, 40 reacting species and 118 elementary chemical reactions were chosen based on a literature review. Through a sensitivity analysis using the LSENS general kinetics and sensitivity analysis code, the mechanism was reduced to 16 species and 21 elementary chemical reactions. This mechanism is first justified by comparison of calculated ignition delay times with available shock tube data, and then validated by comparison of emissions calculated with a plug flow reactor code against in-house flame tube data.
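The ignition-delay comparison described above can be caricatured with a toy one-step global mechanism: fuel is consumed at an Arrhenius rate, the heat released raises the temperature, and the delay is the time to a fixed temperature rise. All parameter values below are invented for illustration; this is not the 16-species mechanism of the study.

```python
import math

def ignition_delay(T0, A=1e9, Ea=1.5e5, R=8.314, q=2.0e3, dt=1e-6, t_max=1.0):
    """Toy one-step global mechanism integrated with explicit Euler.
    T0: initial temperature (K); A: pre-exponential factor (1/s);
    Ea: activation energy (J/mol); q: heat release per unit fuel (K).
    Returns the time (s) for a 400 K temperature rise, or None."""
    T, Y, t = T0, 1.0, 0.0          # temperature, fuel mass fraction, time
    while t < t_max:
        w = A * Y * math.exp(-Ea / (R * T))  # Arrhenius reaction rate, 1/s
        Y = max(Y - w * dt, 0.0)             # fuel depletion
        T += q * w * dt                      # heat release raises T
        if T >= T0 + 400.0:
            return t
        t += dt
    return None
```

Even this crude sketch reproduces the qualitative shock-tube trend the abstract relies on: a hotter initial state ignites sooner.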
Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS
NASA Astrophysics Data System (ADS)
Barani, T.; Bruschi, E.; Pizzocri, D.; Pastore, G.; Van Uffelen, P.; Williamson, R. L.; Luzzi, L.
2017-04-01
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel performance analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. In particular, experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of the burst release process in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of conventional diffusion-based models to introduce the burst release effect. The concept and governing equations of the model are presented, and the sensitivity of results to the newly introduced parameters is evaluated through an analytic sensitivity analysis. The model is assessed for application to integral fuel rod analysis by implementation in two structurally different fuel performance codes: BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D code). Model assessment is based on the analysis of 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the quantitative predictions of integral fuel rod FGR and the qualitative representation of the FGR kinetics with the transient model relative to the canonical, purely diffusion-based models of the codes. The overall quantitative improvement of the integral FGR predictions in the two codes is comparable. Moreover, calculated radial profiles of xenon concentration after irradiation are investigated and compared to experimental data, illustrating the underlying representation of the physical mechanisms of burst release.
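The burst-release idea described above can be caricatured in a few lines: diffusion slowly feeds gas to the grain boundaries, and a rapid temperature excursion vents a fixed fraction of that accumulated inventory at once through micro-cracking. The sketch below is purely illustrative (all names and values invented); it is not the model implemented in BISON or TRANSURANUS.

```python
def fgr_with_burst(times, temps, D=2e-4, burst_fraction=0.7, dT_trigger=250.0):
    """Illustrative transient FGR model. times: time points (arbitrary
    units); temps: fuel temperature (K) at those points. Gas reaches
    grain boundaries at a slow diffusion-like rate; a temperature jump
    larger than dT_trigger between steps releases burst_fraction of
    the grain-boundary inventory. Returns total released gas."""
    retained_gb, released = 0.0, 0.0
    production = 1.0  # arbitrary gas production rate per unit time
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        retained_gb += production * dt * D          # diffusional arrival
        if temps[i] - temps[i - 1] > dT_trigger:    # burst condition
            released += burst_fraction * retained_gb
            retained_gb *= 1.0 - burst_fraction
    return released
```

With a steady temperature history the model releases nothing, while a rapid transient produces a step in release, which is the qualitative "burst" behaviour the abstract describes.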
ERIC Educational Resources Information Center
Jackson, Suzanne F.; Kolla, Gillian
2012-01-01
In attempting to use a realistic evaluation approach to explore the role of Community Parents in early parenting programs in Toronto, a novel technique was developed to analyze the links between contexts (C), mechanisms (M) and outcomes (O) directly from experienced practitioner interviews. Rather than coding the interviews into themes in terms of…
TRANSURANUS: a fuel rod analysis code ready for use
NASA Astrophysics Data System (ADS)
Lassmann, K.
1992-06-01
TRANSURANUS is a computer program for the thermal and mechanical analysis of fuel rods in nuclear reactors; it was developed at the European Institute for Transuranium Elements (TUI). The TRANSURANUS code consists of a clearly defined mechanical-mathematical framework into which physical models can easily be incorporated. Besides its flexibility for different fuel rod designs, the TRANSURANUS code can deal with very different situations, as given for instance in an experiment, under normal, off-normal and accident conditions. The time scale of the problems to be treated may range from milliseconds to years. The code has a comprehensive material data bank for oxide, mixed oxide, carbide and nitride fuels, Zircaloy and steel claddings, and different coolants. During its development, great effort was spent on obtaining an extremely flexible tool that is easy to handle and exhibits very fast running times. The total development effort is approximately 40 man-years. In recent years, interest in using the code has grown, and it is now in use in several organisations, in both research and private industry. The code is available to all interested parties. The paper outlines the main features and capabilities of the TRANSURANUS code and its validation, and also treats some practical aspects.
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.
1995-01-01
NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.
MODELLING OF FUEL BEHAVIOUR DURING LOSS-OF-COOLANT ACCIDENTS USING THE BISON CODE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pastore, G.; Novascone, S. R.; Williamson, R. L.
2015-09-01
This work presents recent developments to extend the BISON code to enable fuel performance analysis during LOCAs. This newly developed capability accounts for the main physical phenomena involved, as well as the interactions among them and with the global fuel rod thermo-mechanical analysis. Specifically, new multiphysics models are incorporated in the code to describe (1) transient fission gas behaviour, (2) rapid steam-cladding oxidation, (3) Zircaloy solid-solid phase transition, (4) hydrogen generation and transport through the cladding, and (5) Zircaloy high-temperature non-linear mechanical behaviour and failure. Basic model characteristics are described, and a demonstration BISON analysis of an LWR fuel rod undergoing a LOCA is presented. Also, as a first step of validation, the code with the new capability is applied to the simulation of experiments investigating cladding behaviour under LOCA conditions. The comparison of the results with the available experimental data on cladding failure due to burst is presented.
Metal Matrix Laminate Tailoring (MMLT) code: User's manual
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Morel, M. R.; Saravanos, D. A.
1993-01-01
The User's Manual for the Metal Matrix Laminate Tailoring (MMLT) program is presented. The code is capable of tailoring the fabrication process, constituent characteristics, and laminate parameters (individually or concurrently) for a wide variety of metal matrix composite (MMC) materials, to improve the performance and identify trends or behavior of MMC's under different thermo-mechanical loading conditions. This document is meant to serve as a guide in the use of the MMLT code. Detailed explanations of the composite mechanics and tailoring analysis are beyond the scope of this document, and may be found in the references. MMLT was developed by the Structural Mechanics Branch at NASA Lewis Research Center (LeRC).
Validation of Intensive Care and Mechanical Ventilation Codes in Medicare Data.
Wunsch, Hannah; Kramer, Andrew; Gershengorn, Hayley B
2017-07-01
To assess the reliability of codes relevant to critically ill patients in administrative data. Retrospective cohort study linking data from Acute Physiology and Chronic Health Evaluation Outcomes, a clinical database of ICU patients, with data from Medicare Provider Analysis and Review. We linked data based on matching for sex, date of birth, hospital, and date of admission to hospital. Forty-six hospitals in the United States participating in Acute Physiology and Chronic Health Evaluation Outcomes. All patients in Acute Physiology and Chronic Health Evaluation Outcomes greater than or equal to 65 years old who could be linked with hospitalization records in Medicare Provider Analysis and Review from January 1, 2009, through September 30, 2012. Of 62,451 patients in the Acute Physiology and Chronic Health Evaluation Outcomes dataset, 80.1% were matched with data in Medicare Provider Analysis and Review. All but 2.7% of Acute Physiology and Chronic Health Evaluation Outcomes ICU patients had either an ICU or coronary care unit charge in Medicare Provider Analysis and Review. In Acute Physiology and Chronic Health Evaluation Outcomes, 37.0% received mechanical ventilation during the ICU stay versus 24.1% in Medicare Provider Analysis and Review. The Medicare Provider Analysis and Review procedure codes for mechanical ventilation had high specificity (96.0%; 95% CI, 95.8-96.2) but only moderate sensitivity (58.4%; 95% CI, 57.7-59.1), with a positive predictive value of 89.6% (95% CI, 89.1-90.1) and a negative predictive value of 79.7% (95% CI, 79.4-80.1). For patients with mechanical ventilation codes, Medicare Provider Analysis and Review overestimated the percentage with a duration greater than 96 hours (36.6% vs 27.3% in Acute Physiology and Chronic Health Evaluation Outcomes). There was discordance in hospital discharge status (alive or dead) for only 0.47% of all linked records (κ = 1.00).
Medicare Provider Analysis and Review data contain robust information on hospital mortality for patients admitted to the ICU but have limited ability to identify all patients who received mechanical ventilation during a critical illness. Estimates of use of mechanical ventilation in the United States should likely be revised upward.
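The sensitivity, specificity, PPV and NPV reported above come from a standard 2x2 comparison of the claims code against the clinical reference. A minimal Python sketch of those definitions (the counts used in the example are made up, chosen only to land near the reported figures):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 validation metrics: tp/fp/fn/tn are the counts of
    true positives, false positives, false negatives and true negatives
    for a claims-data code versus the clinical reference standard."""
    return {
        "sensitivity": tp / (tp + fn),  # fraction of true cases coded
        "specificity": tn / (tn + fp),  # fraction of non-cases not coded
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```

For instance, tp=90, fp=10, fn=60, tn=240 yields sensitivity 0.60, specificity 0.96, PPV 0.90 and NPV 0.80, illustrating how a code can be highly specific yet miss a large share of true cases.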
Multiphysics Code Demonstrated for Propulsion Applications
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Melis, Matthew E.
1998-01-01
The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.
Multi-scale modeling of irradiation effects in spallation neutron source materials
NASA Astrophysics Data System (ADS)
Yoshiie, T.; Ito, T.; Iwase, H.; Kaneko, Y.; Kawai, M.; Kishida, I.; Kunieda, S.; Sato, K.; Shimakawa, S.; Shimizu, F.; Hashimoto, S.; Hashimoto, N.; Fukahori, T.; Watanabe, Y.; Xu, Q.; Ishino, S.
2011-07-01
Changes in the mechanical properties of Ni under irradiation by 3 GeV protons were estimated by multi-scale modeling. The code consisted of four parts. The first part was based on the Particle and Heavy-Ion Transport code System (PHITS) for nuclear reactions, and modeled the interactions between high-energy protons and nuclei in the target. The second part covered atomic collisions by particles without nuclear reactions; because the energy of the particles was high, subcascade analysis was employed. The direct formation of clusters and the number of mobile defects were estimated using molecular dynamics (MD) and kinetic Monte Carlo (kMC) methods in each subcascade. The third part considered damage structure evolution estimated by reaction kinetics analysis. The fourth part estimated the mechanical property changes using three-dimensional discrete dislocation dynamics (DDD). Using this four-part code, stress-strain curves for high-energy proton-irradiated Ni were obtained.
Network analysis of human diseases using Korean nationwide claims data.
Kim, Jin Hee; Son, Ki Young; Shin, Dong Wook; Kim, Sang Hyuk; Yun, Jae Won; Shin, Jung Hyun; Kang, Mi So; Chung, Eui Heon; Yoo, Kyoung Hun; Yun, Jae Moon
2016-06-01
To investigate disease-disease associations by conducting a network analysis using Korean nationwide claims data. We used the claims data from the Health Insurance Review and Assessment Service-National Patient Sample for the year 2011. Among the 2049 disease codes in the claims data, 1154 specific disease codes were used and combined into 795 representative disease codes. We analyzed the 381 representative codes that had a prevalence of >0.1%. For each pair of these 381 representative disease codes, P values were calculated using the χ(2) test, and the degrees of association were expressed as odds ratios (ORs). For the 5515 (7.62%) statistically significant disease-disease associations with a large effect size (OR>5), we constructed a human disease network consisting of 369 nodes and 5515 edges. The network shows the distribution of diseases and the relationships between diseases or disease groups, demonstrating that diseases are associated with each other in a complex disease network. We reviewed the 5515 disease-disease associations and classified them according to underlying mechanisms. Several disease-disease associations were identified, but the evidence for these associations is not sufficient, and the mechanisms underlying them have not yet been clarified. Further research is needed to investigate these associations and their underlying mechanisms. Human disease network analysis using claims data enriches the understanding of human diseases and provides new insights into disease-disease associations that can be useful in future research. Copyright © 2016 Elsevier Inc. All rights reserved.
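The pairwise association test described above reduces, for each disease pair, to a 2x2 contingency table: an odds ratio for effect size and a Pearson chi-square statistic for significance. A minimal sketch (function and variable names are illustrative, not from the study):

```python
def pair_association(n11, n10, n01, n00):
    """Odds ratio and Pearson chi-square for one disease pair.
    n11: patients with both codes; n10/n01: patients with exactly one
    of the two codes; n00: patients with neither."""
    odds_ratio = (n11 * n00) / (n10 * n01)
    n = n11 + n10 + n01 + n00
    # Pearson chi-square statistic for the 2x2 contingency table
    chi2 = n * (n11 * n00 - n10 * n01) ** 2 / (
        (n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00))
    return odds_ratio, chi2
```

Pairs with a significant chi-square and OR>5 would become edges of the disease network; each disease code is a node.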
Modeling of rolling element bearing mechanics. Computer program user's manual
NASA Technical Reports Server (NTRS)
Greenhill, Lyn M.; Merchant, David H.
1994-01-01
This report provides the user's manual for the Rolling Element Bearing Analysis System (REBANS) analysis code which determines the quasistatic response to external loads or displacement of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the defects of bearing ring and support structure flexibility. It is comprised of two main programs: the Preprocessor for Bearing Analysis (PREBAN) which creates the input files for the main analysis program, and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. This report addresses input instructions for and features of the computer codes. A companion report addresses the theoretical basis for the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.
Stress Analysis and Fracture in Nanolaminate Composites
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2008-01-01
A stress analysis is performed on a nanolaminate subjected to bending. A composite mechanics computer code that is based on constituent properties and nanoelement formulation is used to evaluate the nanolaminate stresses. The results indicate that the computer code is sufficient for the analysis. The results also show that when a stress concentration is present, the nanolaminate stresses exceed their corresponding matrix-dominated strengths and the nanofiber fracture strength.
NASA Technical Reports Server (NTRS)
Bittker, David A.
1996-01-01
A generalized version of the NASA Lewis general kinetics code, LSENS, is described. The new code allows the use of global reactions as well as molecular processes in a chemical mechanism. The code also incorporates the capability of performing sensitivity analysis calculations for a perfectly stirred reactor rapidly and conveniently at the same time that the main kinetics calculations are being done. The GLSENS code has been extensively tested and has been found to be accurate and efficient. Nine example problems are presented and complete user instructions are given for the new capabilities. This report is to be used in conjunction with the documentation for the original LSENS code.
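The sensitivity coefficients such a code reports are, in essence, logarithmic derivatives of model outputs with respect to rate parameters. A generic finite-difference check of that quantity is sketched below; this is an illustration of the concept, not the algorithm used inside LSENS/GLSENS.

```python
import math

def log_sensitivity(model, k, rel=1e-5):
    """Normalized first-order sensitivity S = d ln(y) / d ln(k),
    estimated by a central finite difference in log space. model is
    any positive-valued function of the rate parameter k."""
    up = model(k * (1.0 + rel))
    dn = model(k * (1.0 - rel))
    return (math.log(up) - math.log(dn)) / (math.log(1.0 + rel) - math.log(1.0 - rel))
```

As a sanity check, a model output proportional to k squared has a normalized sensitivity of exactly 2, independent of k.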
Computational Methods for Structural Mechanics and Dynamics
NASA Technical Reports Server (NTRS)
Stroud, W. Jefferson (Editor); Housner, Jerrold M. (Editor); Tanner, John A. (Editor); Hayduk, Robert J. (Editor)
1989-01-01
Topics addressed include: transient dynamics; transient finite element method; transient analysis in impact and crash dynamic studies; multibody computer codes; dynamic analysis of space structures; multibody mechanics and manipulators; spatial and coplanar linkage systems; flexible body simulation; multibody dynamics; dynamical systems; and nonlinear characteristics of joints.
Computer codes developed and under development at Lewis
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1992-01-01
The objective of this summary is to provide a brief description of: (1) codes developed or under development at LeRC; and (2) the development status of IPACS with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes: (1) the code acronym; (2) select physics descriptors; (3) current enhancements; and (4) present (9/91) code status with respect to its availability and documentation. The computer codes list is grouped by related functions such as: (1) composite mechanics; (2) composite structures; (3) integrated and 3-D analysis; (4) structural tailoring; and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base-readiness) for assessing the structural integrity/durability/reliability of propulsion systems. These codes serve two other very important functions: they provide an effective means of technology transfer; and they constitute a depository of corporate memory.
2015-11-01
induced residual stresses and distortions from weld simulations in the SYSWELD software code in structural Finite Element Analysis (FEA) simulations... performed in the Abaqus FEA code is presented. The translation of these results is accomplished using a newly developed Python script. Full details of... Local Weld Model in Structural FEA... CONCLUSIONS
electromagnetics, eddy current, computer codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gartling, David
TORO Version 4 is designed for finite element analysis of steady, transient and time-harmonic, multi-dimensional, quasi-static problems in electromagnetics. The code allows simulation of electrostatic fields, steady current flows, magnetostatics and eddy current problems in plane or axisymmetric, two-dimensional geometries. TORO is easily coupled to heat conduction and solid mechanics codes to allow multi-physics simulations to be performed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barani, T.; Bruschi, E.; Pizzocri, D.
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. Experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of burst release in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of diffusion-based models to allow for the burst release effect. The concept and governing equations of the model are presented, and the effect of the newly introduced parameters is evaluated through an analytic sensitivity analysis. Then, the model is assessed for application to integral fuel rod analysis. The approach that we take for model assessment involves implementation in two structurally different fuel performance codes, namely, BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D semi-analytic code). The model is validated against 19 Light Water Reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the qualitative representation of the FGR kinetics and the quantitative predictions of integral fuel rod FGR, relative to the canonical, purely diffusion-based models, with both codes. The overall quantitative improvement of the FGR predictions in the two codes is comparable. Furthermore, calculated radial profiles of xenon concentration are investigated and compared to experimental data, demonstrating the representation of the underlying mechanisms of burst release by the new model.
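A minimal sketch can illustrate how a burst term extends a diffusion-based release model of the kind described above. All parameter values and the burst trigger are hypothetical, for illustration only; the actual BISON/TRANSURANUS implementation differs:

```python
import math

def diffusional_release(D, a, t):
    """Booth-type approximation for fractional release from a spherical
    grain of radius a (m) after time t (s), for modest D*t/a^2."""
    tau = D * t / a**2
    return 6.0 * math.sqrt(tau / math.pi) - 3.0 * tau

def burst_release(f_diff, dT_dt, threshold=50.0, fraction=0.3):
    """Add a non-diffusional burst term when the heating rate (K/s)
    exceeds a micro-cracking threshold (threshold and fraction are
    hypothetical illustrative values)."""
    if dT_dt > threshold:
        # A fixed fraction of the gas still retained is released at once.
        return f_diff + fraction * (1.0 - f_diff)
    return f_diff

f = diffusional_release(D=1e-19, a=5e-6, t=3.15e7)  # roughly one year
print(f)
print(burst_release(f, dT_dt=120.0))  # transient: burst term added
print(burst_release(f, dT_dt=10.0))   # slow heating: diffusion only
```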
Electro-Thermal-Mechanical Simulation Capability Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, D
This is the Final Report for LDRD 04-ERD-086, 'Electro-Thermal-Mechanical Simulation Capability'. The accomplishments are well documented in five peer-reviewed publications and six conference presentations and hence will not be detailed here. The purpose of this LDRD was to research and develop numerical algorithms for three-dimensional (3D) Electro-Thermal-Mechanical simulations. LLNL has long been a world leader in the area of computational mechanics, and recently several mechanics codes have become 'multiphysics' codes with the addition of fluid dynamics, heat transfer, and chemistry. However, these multiphysics codes do not incorporate the electromagnetics that is required for a coupled Electro-Thermal-Mechanical (ETM) simulation. There are numerous applications for an ETM simulation capability, such as explosively-driven magnetic flux compressors, electromagnetic launchers, inductive heating and mixing of metals, and MEMS. A robust ETM simulation capability will enable LLNL physicists and engineers to better support current DOE programs, and will prepare LLNL for some very exciting long-term DoD opportunities. We define a coupled Electro-Thermal-Mechanical (ETM) simulation as a simulation that solves, in a self-consistent manner, the equations of electromagnetics (primarily statics and diffusion), heat transfer (primarily conduction), and non-linear mechanics (elastic-plastic deformation, and contact with friction). There is no existing parallel 3D code for simulating ETM systems at LLNL or elsewhere. While there are numerous magnetohydrodynamic codes, these codes are designed for astrophysics, magnetic fusion energy, laser-plasma interaction, etc. and do not attempt to accurately model electromagnetically driven solid mechanics.
This project responds to the Engineering R&D Focus Areas of Simulation and Energy Manipulation, and addresses the specific problem of Electro-Thermal-Mechanical simulation for design and analysis of energy manipulation systems such as magnetic flux compression generators and railguns. This project complements ongoing DNT projects that have an experimental emphasis. Our research efforts have been encapsulated in the Diablo and ALE3D simulation codes. This new ETM capability already has both internal and external users, and has spawned additional research in plasma railgun technology. By developing this capability Engineering has become a world leader in ETM design, analysis, and simulation. This research has positioned LLNL to be able to compete for new business opportunities with the DoD in the area of railgun design. We currently have a three-year $1.5M project with the Office of Naval Research to apply our ETM simulation capability to railgun bore life issues and we expect to be a key player in the railgun community.
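A self-consistent ETM solve of the kind defined above is often realized by operator splitting: within each timestep the electromagnetic, thermal, and mechanical solves alternate until the shared fields stop changing. A schematic sketch with scalar stand-ins for the three fields (the coupling coefficients are invented for illustration; real codes solve full PDE systems):

```python
def etm_step(J, T, u, tol=1e-10, max_iters=100):
    """Fixed-point iteration coupling three scalar stand-ins:
    current density J, temperature T, displacement u (all illustrative)."""
    for _ in range(max_iters):
        # EM solve: conductivity (and hence current) drops as T rises.
        J_new = 1.0 / (1.0 + 0.001 * (T - 300.0))
        # Thermal solve: Joule heating raises the temperature.
        T_new = 300.0 + 50.0 * J_new**2
        # Mechanical solve: thermal expansion drives displacement.
        u_new = 1e-5 * (T_new - 300.0)
        if abs(J_new - J) + abs(T_new - T) + abs(u_new - u) < tol:
            return J_new, T_new, u_new
        J, T, u = J_new, T_new, u_new
    return J, T, u

J, T, u = etm_step(1.0, 300.0, 0.0)
print(J, T, u)  # self-consistent coupled state
```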
Beyond Molecular Codes: Simple Rules to Wire Complex Brains
Hassan, Bassem A.; Hiesinger, P. Robin
2015-01-01
Molecular codes, like postal zip codes, are generally considered a robust way to ensure the specificity of neuronal target selection. However, a code capable of unambiguously generating complex neural circuits is difficult to conceive. Here, we re-examine the notion of molecular codes in the light of developmental algorithms. We explore how molecules and mechanisms that have been considered part of a code may alternatively implement simple pattern formation rules sufficient to ensure wiring specificity in neural circuits. This analysis delineates a pattern-based framework for circuit construction that may contribute to our understanding of brain wiring.
Computational fluid mechanics utilizing the variational principle of modeling damping seals
NASA Technical Reports Server (NTRS)
Abernathy, J. M.; Farmer, R.
1985-01-01
An analysis for modeling damping seals for use in Space Shuttle main engine turbomachinery is being produced. Development of a computational fluid mechanics code for turbulent, incompressible flow is required.
Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS
Barani, T.; Bruschi, E.; Pizzocri, D.; ...
2017-01-03
NASA Technical Reports Server (NTRS)
Nguyen, H. L.; Ying, S.-J.
1990-01-01
Jet-A spray combustion has been evaluated in gas turbine combustion with the use of propane chemical kinetics as the first approximation for the chemical reactions. Here, the numerical solutions are obtained by using the KIVA-2 computer code. The KIVA-2 code is the most developed of the available multidimensional combustion computer programs for application to the in-cylinder combustion dynamics of internal combustion engines. The released version of KIVA-2 assumes that 12 chemical species are present; the code uses an Arrhenius kinetic-controlled combustion model governed by a four-step global chemical reaction and six equilibrium reactions. The researchers' efforts involve the addition of Jet-A thermophysical properties and the implementation of detailed reaction mechanisms for propane oxidation. Three different detailed reaction mechanism models are considered. The first model consists of 131 reactions and 45 species. This is considered the full mechanism, developed through the study of the chemical kinetics of propane combustion in an enclosed chamber. The full mechanism is evaluated by comparing calculated ignition delay times with available shock tube data. However, these detailed reactions occupy too much computer memory and CPU time for the computation. Therefore, it only serves as a benchmark case by which to evaluate other simplified models. Two possible simplified models were tested in the existing computer code KIVA-2 for the same conditions as used with the full mechanism. One model is obtained through a sensitivity analysis using LSENS, the general kinetics and sensitivity analysis program code of D. A. Bittker and K. Radhakrishnan. This model consists of 45 chemical reactions and 27 species. The other model is based on the work published by C. K. Westbrook and F. L. Dryer.
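The Arrhenius kinetic-controlled model referenced above evaluates rate constants of the form k = A exp(-Ea/RT); a minimal sketch of that evaluation (the coefficients below are hypothetical illustrative values, not KIVA-2's actual one-step propane coefficients):

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def arrhenius_rate(A, Ea, T):
    """Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    return A * math.exp(-Ea / (R * T))

# Hypothetical one-step propane oxidation coefficients, for illustration:
A, Ea = 8.6e11, 1.256e5  # pre-exponential [1/s], activation energy [J/mol]
for T in (900.0, 1200.0, 1500.0):
    print(T, arrhenius_rate(A, Ea, T))  # rate rises steeply with T
```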
Di Giulio, Massimo
2016-06-21
I analyze the mechanism on which the majority of theories are based that place the physico-chemical properties of amino acids at the center of the origin of the genetic code. As this mechanism relies on an excessive number of mutational steps, I conclude that it could not have been operative, or, if operative, it would not have allowed a full realization of the predictions of these theories, because the mechanism evidently contained a high degree of indeterminacy. On this basis I dispute the four-column theory of the origin of the genetic code (Higgs, 2009) and reply to the criticism that was directed at the coevolution theory of the origin of the genetic code. In this context, I suggest a new hypothesis that clarifies the mechanism by which the codon domains of the precursor amino acids would have evolved, as predicted by the coevolution theory. This mechanism would have used particular elongation factors that constrained the evolution of all amino acids belonging to a given biosynthetic family to the progenitor pre-tRNA that first recognized the earliest codons to evolve in the codon domain of a given precursor amino acid. This happened because the elongation factors recognized two characteristics of the progenitor pre-tRNAs of precursor amino acids, which prevented the elongation factors from recognizing pre-tRNAs belonging to the biosynthetic families of different precursor amino acids. Finally, I analyze, by means of Fisher's exact test, the distribution within the genetic code of the biosynthetic classes of amino acids and of the classes of amino acid polarity values. This analysis appears to favor the biosynthetic classes of amino acids over the polarity classes as the main factor that led to the structuring of the genetic code, with the physico-chemical properties of amino acids playing only a subsidiary role in this evolution.
As a whole, the full analysis leads to the conclusion that the coevolution theory of the origin of the genetic code is a highly corroborated theory. Copyright © 2016 Elsevier Ltd. All rights reserved.
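The Fisher's exact test mentioned above, applied to a 2x2 contingency table, can be computed directly from the hypergeometric distribution. A stdlib-only sketch with invented counts (not the paper's actual data):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins that are no more probable than the observed table."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p(x):  # probability of the table whose top-left cell is x
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p(a)
    lo = max(0, row1 + col1 - n)
    hi = min(row1, col1)
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs + 1e-12)

# Hypothetical counts: 8 of 10 amino acids in one biosynthetic class fall
# in a given codon block, versus 1 of 10 in another class.
print(fisher_exact_two_sided(8, 2, 1, 9))  # small p: association likely
```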
NASA Technical Reports Server (NTRS)
Wilson, R. B.; Bak, M. J.; Nakazawa, S.; Banerjee, P. K.
1984-01-01
A 3-D inelastic analysis methods program consists of a series of computer codes embodying a progression of mathematical models (mechanics of materials, special finite element, boundary element) for streamlined analysis of combustor liners, turbine blades, and turbine vanes. These models address the effects of high temperatures and thermal/mechanical loadings on the local (stress/strain) and global (dynamics, buckling) structural behavior of the three selected components. These models are used to solve 3-D inelastic problems using linear approximations in the sense that stresses/strains and temperatures in generic modeling regions are linear functions of the spatial coordinates, and solution increments for load, temperature and/or time are extrapolated linearly from previous information. Three linear formulation computer codes, referred to as MOMM (Mechanics of Materials Model), MHOST (MARC-Hot Section Technology), and BEST (Boundary Element Stress Technology), were developed and are described.
Nonlinear systems, linear systems, subroutines, soil mechanics, interfaces, dynamics, loads (forces), force (mechanics), damping, acceleration, elastic properties, plastic properties, cracks, reinforcing materials, composite materials, failure (mechanics), mechanical properties, instruction manuals, digital computers, stresses, computer programs, structures, data processing, structural properties, soils, strain (mechanics), mathematical models
Modeling of rolling element bearing mechanics. Theoretical manual
NASA Technical Reports Server (NTRS)
Merchant, David H.; Greenhill, Lyn M.
1994-01-01
This report documents the theoretical basis for the Rolling Element Bearing Analysis System (REBANS) analysis code which determines the quasistatic response to external loads or displacement of three types of high-speed rolling element bearings: angular contact ball bearings; duplex angular contact ball bearings; and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It comprises two main programs: the Preprocessor for Bearing Analysis (PREBAN), which creates the input files for the main analysis program; and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. A companion report addresses the input instructions for and features of the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.
Computer models and output, Spartan REM: Appendix B
NASA Technical Reports Server (NTRS)
Marlowe, D. S.; West, E. J.
1984-01-01
A computer model of the Spartan Release Engagement Mechanism (REM) is presented in a series of numerical charts and engineering drawings. A crack growth analysis code is used to predict the fracture mechanics of critical components.
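Crack growth predictions of the kind mentioned above are commonly based on Paris-law integration; a minimal sketch (the material constants below are hypothetical, and this is not the Spartan REM analysis code itself):

```python
import math

def paris_law_cycles(a0, af, C, m, delta_sigma, Y=1.12):
    """Numerically integrate da/dN = C * (dK)^m from initial crack size
    a0 to final size af (meters), with dK = Y * delta_sigma * sqrt(pi*a)."""
    n_steps = 10000
    da = (af - a0) / n_steps
    cycles = 0.0
    a = a0
    for _ in range(n_steps):
        dK = Y * delta_sigma * math.sqrt(math.pi * a)  # stress intensity range
        cycles += da / (C * dK**m)                     # dN = da / (C * dK^m)
        a += da
    return cycles

# Hypothetical constants for illustration only:
N1 = paris_law_cycles(a0=1e-3, af=1e-2, C=1e-11, m=3.0, delta_sigma=100e6)
print(N1)  # predicted cycles to grow the crack from a0 to af
```

Doubling the stress range sharply reduces the predicted life, as expected from the power-law exponent m.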
Neural Coding Mechanisms in Gustation.
1980-09-15
world is composed of four primary tastes (sweet, sour, salty, and bitter), and that each of these is carried by a separate and private neural line, thus...ted sweet-sour-salty-bitter types. The mathematical method of analysis was hierarchical cluster analysis based on the responses of many neurons (20 to... Keywords: taste, neural coding, neural organization, stimulus organization, olfaction
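Hierarchical cluster analysis of the sort applied above to multi-neuron response profiles can be sketched with a stdlib-only single-linkage implementation; the firing-rate profiles below are invented for illustration, not the study's data:

```python
import math

def euclid(u, v):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(u, v)))

def single_linkage(points, n_clusters):
    """Agglomerative clustering: repeatedly merge the two clusters with
    the smallest minimum inter-point (single-linkage) distance."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(euclid(points[p], points[q])
                        for p in clusters[i] for q in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # j > i, so index i stays valid
    return clusters

# Hypothetical firing rates to (sweet, sour, salty, bitter) stimuli:
responses = [(9, 1, 1, 0), (8, 2, 1, 1),   # "sweet-best" neurons
             (1, 8, 7, 1), (0, 9, 8, 2)]   # "sour/salty-best" neurons
print(single_linkage(responses, 2))  # groups neurons by response profile
```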
ERIC Educational Resources Information Center
Bantum, Erin O'Carroll; Owen, Jason E.
2009-01-01
Psychological interventions provide linguistic data that are particularly useful for testing mechanisms of action and improving intervention methodologies. For this study, emotional expression in an Internet-based intervention for women with breast cancer (n = 63) was analyzed via rater coding and 2 computerized coding methods (Linguistic Inquiry…
Prevalence of transcription promoters within archaeal operons and coding sequences
Koide, Tie; Reiss, David J; Bare, J Christopher; Pang, Wyming Lee; Facciotti, Marc T; Schmid, Amy K; Pan, Min; Marzolf, Bruz; Van, Phu T; Lo, Fang-Yin; Pratap, Abhishek; Deutsch, Eric W; Peterson, Amelia; Martin, Dan; Baliga, Nitin S
2009-01-01
Despite the knowledge of complex prokaryotic-transcription mechanisms, generalized rules, such as the simplified organization of genes into operons with well-defined promoters and terminators, have had a significant role in systems analysis of regulatory logic in both bacteria and archaea. Here, we have investigated the prevalence of alternate regulatory mechanisms through genome-wide characterization of transcript structures of ∼64% of all genes, including putative non-coding RNAs in Halobacterium salinarum NRC-1. Our integrative analysis of transcriptome dynamics and protein–DNA interaction data sets showed widespread environment-dependent modulation of operon architectures, transcription initiation and termination inside coding sequences, and extensive overlap in 3′ ends of transcripts for many convergently transcribed genes. A significant fraction of these alternate transcriptional events correlate to binding locations of 11 transcription factors and regulators (TFs) inside operons and annotated genes—events usually considered spurious or non-functional. Using experimental validation, we illustrate the prevalence of overlapping genomic signals in archaeal transcription, casting doubt on the general perception of rigid boundaries between coding sequences and regulatory elements.
Tang, Wan; Chen, Min; Ni, Jin; Yang, Ximin
2011-01-01
The traditional Radio Frequency Identification (RFID) system, in which the information maintained in tags is passive and static, has no intelligent decision-making ability to suit application and environment dynamics. The Second-Generation RFID (2G-RFID) system, referred to as 2G-RFID-sys, is an evolution of the traditional RFID system to ensure better quality of service in future networks. Due to the openness of the active mobile codes in the 2G-RFID system, the realization of conveying intelligence brings a critical issue: how can we make sure the backend system will interpret and execute mobile codes in the right way without misuse so as to avoid malicious attacks? To address this issue, this paper expands the concept of Role-Based Access Control (RBAC) by introducing context-aware computing, and then designs a secure middleware for backend systems, named Two-Level Security Enhancement Mechanism or 2L-SEM, in order to ensure the usability and validity of the mobile code through contextual authentication and role analysis. According to the given contextual restrictions, 2L-SEM can filter out the illegal and invalid mobile codes contained in tags. Finally, a reference architecture and its typical application are given to illustrate the implementation of 2L-SEM in a 2G-RFID system, along with the simulation results to evaluate how the proposed mechanism can guarantee secure execution of mobile codes for the system.
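The two-level screening described above (contextual authentication first, then role analysis) can be sketched as a hypothetical middleware check. The context names, roles, and policy rules below are invented for illustration and are not the actual 2L-SEM API:

```python
# Hypothetical two-level screening of a tag's mobile code:
# level 1 checks the execution context, level 2 checks role permissions.

ALLOWED_CONTEXTS = {"warehouse", "checkout"}          # assumed policy
ROLE_PERMISSIONS = {"reader": {"query"},              # assumed roles
                    "admin": {"query", "update"}}

def screen_mobile_code(code, context, role):
    if context not in ALLOWED_CONTEXTS:               # level 1: context
        return False
    allowed_ops = ROLE_PERMISSIONS.get(role, set())   # level 2: role
    return all(op in allowed_ops for op in code["ops"])

tag_code = {"ops": ["query", "update"]}
print(screen_mobile_code(tag_code, "warehouse", "admin"))   # permitted
print(screen_mobile_code(tag_code, "warehouse", "reader"))  # rejected
```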
Modeling of thermo-mechanical and irradiation behavior of mixed oxide fuel for sodium fast reactors
NASA Astrophysics Data System (ADS)
Karahan, Aydın; Buongiorno, Jacopo
2010-01-01
An engineering code to model the irradiation behavior of UO2-PuO2 mixed oxide fuel pins in sodium-cooled fast reactors was developed. The code was named fuel engineering and structural analysis tool (FEAST-OXIDE). FEAST-OXIDE has several modules working in coupled form with an explicit numerical algorithm. These modules describe: (1) fission gas release and swelling, (2) fuel chemistry and restructuring, (3) temperature distribution, (4) fuel-clad chemical interaction and (5) fuel-clad mechanical analysis. Given the fuel pin geometry, composition and irradiation history, FEAST-OXIDE can analyze fuel and cladding thermo-mechanical behavior in both steady-state and design-basis transient scenarios. The code was written in the Fortran 90 programming language. The mechanical analysis module implements the LIFE algorithm. Fission gas release and swelling behavior is described by the OGRES and NEFIG models. However, the original OGRES model has been extended to include the effects of joint oxyde-gaine (JOG) formation on fission gas release and swelling. A detailed fuel chemistry model has been included to describe the cesium radial migration and JOG formation, the oxygen and plutonium radial distribution and the axial migration of cesium. The fuel restructuring model includes the effects of as-fabricated porosity migration, irradiation-induced fuel densification, grain growth, hot pressing and fuel cracking and relocation. Finally, a kinetics model is included to predict the clad wastage formation. FEAST-OXIDE predictions have been compared to the available FFTF, EBR-II and JOYO databases, as well as the LIFE-4 code predictions. The agreement was found to be satisfactory for steady-state and slow-ramp over-power accidents.
Grizzly Usage and Theory Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, B. W.; Backman, M.; Chakraborty, P.
2016-03-01
Grizzly is a multiphysics simulation code for characterizing the behavior of nuclear power plant (NPP) structures, systems and components (SSCs) subjected to a variety of age-related degradation mechanisms. Grizzly simulates both the progression of aging processes and the capacity of aged components to perform safely. This initial beta release of Grizzly includes capabilities for engineering-scale thermo-mechanical analysis of reactor pressure vessels (RPVs). Grizzly will ultimately include capabilities for a wide range of components and materials. Grizzly is in a state of constant development, and future releases will broaden the capabilities of this code for RPV analysis, as well as expand it to address degradation in other critical NPP components.
NASA Astrophysics Data System (ADS)
Porter, Ian Edward
A nuclear reactor systems code has the ability to model the system response in an accident scenario based on known initial conditions at the onset of the transient. However, there has been a tendency for these codes to lack the detailed thermo-mechanical fuel rod response models needed for accurate prediction of fuel rod failure. This proposed work will couple today's most widely used steady-state (FRAPCON) and transient (FRAPTRAN) fuel rod models with the systems code TRACE for best-estimate modeling of system response in accident scenarios such as a loss of coolant accident (LOCA). In doing so, code modifications will be made to model gamma heating in LWRs during steady-state and accident conditions and to improve fuel rod thermal/mechanical analysis by allowing axial nodalization of burnup-dependent phenomena such as swelling, cladding creep and oxidation. With the ability to model both burnup-dependent parameters and transient fuel rod response, a fuel dispersal study will be conducted using a hypothetical accident scenario under both PWR and BWR conditions to determine the amount of fuel dispersed under varying conditions. Because the fuel fragmentation size and internal rod pressure are both dependent on burnup, this analysis will be conducted at beginning, middle and end of cycle to examine the effects that cycle time can have on fuel rod failure and dispersal. Current fuel rod and system codes used by the Nuclear Regulatory Commission (NRC) are compilations of legacy codes supporting only commonly used light water reactor materials: uranium dioxide (UO2), mixed oxide (U/PuO2) and zirconium alloys. However, the accidents at Fukushima Daiichi and Three Mile Island have shown the need for exploration into advanced materials possessing improved accident tolerance. This work looks to further modify the NRC codes to include silicon carbide (SiC), an advanced cladding material proposed by current DOE-funded research on accident tolerant fuels (ATF).
Several additional fuels will also be analyzed, including uranium nitride (UN), uranium carbide (UC) and uranium silicide (U3Si2). Focusing on the system response in an accident scenario, an emphasis is placed on the fracture mechanics of the ceramic cladding by designing the fuel rods to eliminate pellet-cladding mechanical interaction (PCMI). The time to failure, and how much of the fuel in the reactor fails with an advanced fuel design, will be analyzed and compared to the current UO2/Zircaloy design using a full-scale reactor model.
Java Source Code Analysis for API Migration to Embedded Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winter, Victor; McCoy, James A.; Guerrero, Jonathan
Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.
Bobrova, E V; Liakhovetskiĭ, V A; Borshchevskaia, E R
2011-01-01
The dependence of errors during reproduction of a sequence of hand movements without visual feedback on the previous right- and left-hand performance ("prehistory") and on positions in space of sequence elements (random or ordered by the explicit rule) was analyzed. It was shown that the preceding information about the ordered positions of the sequence elements was used during right-hand movements, whereas left-hand movements were performed with involvement of the information about the random sequence. The data testify to a central mechanism of the analysis of spatial structure of sequence elements. This mechanism activates movement coding specific for the left hemisphere (vector coding) in case of an ordered sequence structure and positional coding specific for the right hemisphere in case of a random sequence structure.
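The two coding schemes contrasted above can be illustrated numerically: vector coding stores successive displacements, so per-step errors accumulate along the sequence, while positional coding stores absolute positions, so each recall carries only a local error. A minimal sketch with a hypothetical constant drift (this is an illustration of the distinction, not a model of the experiment):

```python
def reproduce_vector(positions, drift):
    """Vector coding: replay stored displacements; a small per-step
    drift accumulates over the sequence."""
    out, cur = [positions[0]], positions[0]
    for prev, nxt in zip(positions, positions[1:]):
        cur = cur + (nxt - prev) + drift   # displacement plus error
        out.append(cur)
    return out

def reproduce_positional(positions, drift):
    """Positional coding: recall absolute positions; each recall has
    the same local error, with no accumulation."""
    return [positions[0]] + [p + drift for p in positions[1:]]

seq = [0.0, 2.0, 5.0, 9.0]                 # hypothetical target positions
vec = reproduce_vector(seq, drift=0.1)
pos = reproduce_positional(seq, drift=0.1)
print(vec[-1] - seq[-1])   # accumulated final error (vector coding)
print(pos[-1] - seq[-1])   # constant final error (positional coding)
```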
Progressive Fracture of Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Minnetyan, Levon
2008-01-01
A new approach is described for evaluating fracture in composite structures. This approach is independent of classical fracture mechanics parameters like fracture toughness. It relies on computational simulation and is programmed in a stand-alone integrated computer code. It is multiscale and multifunctional because it includes composite mechanics for the composite behavior and finite element analysis for predicting the structural response. It contains seven modules: layered composite mechanics (micro, macro, laminate); finite element analysis; an updating scheme; local fracture; global fracture; stress-based failure modes; and fracture progression. The computer code is called CODSTRAN (Composite Durability Structural ANalysis). It is used in the present paper to evaluate the global fracture of four composite shell problems and one composite built-up structure. Results show that the global fracture of the composite shells and of the built-up composite structure is enhanced when internal pressure is combined with shear loads.
NASA Astrophysics Data System (ADS)
Urquiza, Eugenio
This work presents a comprehensive thermal hydraulic analysis of a compact heat exchanger using offset strip fins. The thermal hydraulics analysis in this work is followed by a finite element analysis (FEA) to predict the mechanical stresses experienced by an intermediate heat exchanger (IHX) during steady-state operation and selected flow transients. In particular, the scenario analyzed involves a gas-to-liquid IHX operating between high pressure helium and liquid or molten salt. In order to estimate the stresses in compact heat exchangers a comprehensive thermal and hydraulic analysis is needed. Compact heat exchangers require very small flow channels and fins to achieve high heat transfer rates and thermal effectiveness. However, studying such small features computationally contributes little to the understanding of component level phenomena and requires prohibitive computational effort using computational fluid dynamics (CFD). To address this issue, the analysis developed here uses an effective porous media (EPM) approach; this greatly reduces the computation time and produces results with the appropriate resolution [1]. This EPM fluid dynamics and heat transfer computational code has been named the Compact Heat Exchanger Explicit Thermal and Hydraulics (CHEETAH) code. CHEETAH solves for the two-dimensional steady-state and transient temperature and flow distributions in the IHX including the complicating effects of temperature-dependent fluid thermo-physical properties. Temperature- and pressure-dependent fluid properties are evaluated by CHEETAH and the thermal effectiveness of the IHX is also calculated. Furthermore, the temperature distribution can then be imported into a finite element analysis (FEA) code for mechanical stress analysis using the EPM methods developed earlier by the University of California, Berkeley, for global and local stress analysis [2]. 
These simulation tools will also allow the heat exchanger design to be improved through an iterative design process which will lead to a design with a reduced pressure drop, increased thermal effectiveness, and improved mechanical performance as it relates to creep deformation and transient thermal stresses.
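The thermal effectiveness that CHEETAH reports for the IHX can be illustrated with the standard effectiveness-NTU relation for a counterflow exchanger. This is a generic textbook relation, not code from CHEETAH itself; the function name and inputs are illustrative.

```python
import math

def effectiveness_counterflow(ntu, cr):
    """Thermal effectiveness of a counterflow heat exchanger.

    ntu: number of transfer units, UA / C_min
    cr:  heat capacity rate ratio, C_min / C_max (0 <= cr <= 1)
    """
    if abs(cr - 1.0) < 1e-12:
        # balanced-flow limit of the general expression
        return ntu / (1.0 + ntu)
    e = math.exp(-ntu * (1.0 - cr))
    return (1.0 - e) / (1.0 - cr * e)
```

For cr = 0 (one stream effectively isothermal) the relation reduces to 1 - exp(-NTU), which is a quick sanity check on any implementation.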
Statistical mechanics of broadcast channels using low-density parity-check codes.
Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David
2003-03-01
We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.
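Gallager codes are decoded in practice by message passing on the parity-check matrix. As a minimal illustration of the underlying idea, the sketch below runs hard-decision bit-flipping (a simpler relative of the belief propagation algorithm mentioned above) on a toy parity-check matrix; the matrix and all names are illustrative, not from the paper.

```python
import numpy as np

# Toy parity-check matrix (3 checks, 6 bits) -- not a real LDPC code
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def bit_flip_decode(y, H, max_iter=10):
    """Hard-decision bit-flipping decoding of a received binary word y."""
    c = y.copy()
    for _ in range(max_iter):
        syndrome = H @ c % 2
        if not syndrome.any():
            return c  # all parity checks satisfied
        # flip the bit participating in the most unsatisfied checks
        counts = H.T @ syndrome
        c[np.argmax(counts)] ^= 1
    return c
```

A single flipped bit in the all-zeros codeword is corrected in one iteration, since that bit violates more checks than any other.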
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, T.L.
1993-01-01
This report discusses probabilistic fracture mechanics (PFM) analysis, which is a major element of the comprehensive probabilistic methodology endorsed by the NRC for evaluating the integrity of Pressurized Water Reactor (PWR) pressure vessels subjected to pressurized-thermal-shock (PTS) transients. It is anticipated that there will be an increasing need for an improved and validated PTS PFM code that is accepted by the NRC and utilities, as more plants approach the PTS screening criteria and are required to perform plant-specific analyses. The NRC-funded Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratory is currently developing the FAVOR (Fracture Analysis of Vessels: Oak Ridge) PTS PFM code, which is intended to meet this need. The FAVOR code incorporates the most important features of both OCA-P and VISA-II and adds new capabilities: a PFM global modeling methodology; the ability to approximate the effects of thermal streaming on circumferential flaws located inside a plume region created by fluid and thermal stratification; a library of stress intensity factor influence coefficients, generated by the NQA-1 certified ABAQUS computer code, for an adequate range of two- and three-dimensional inside-surface flaws; the flexibility to generate a variety of output reports; and user friendliness.
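PFM codes of this kind estimate a conditional probability of flaw extension by sampling flaw and toughness distributions and comparing the applied stress intensity factor with the sampled toughness. The sketch below is a toy Monte Carlo with entirely hypothetical distributions and scalings, not the FAVOR methodology.

```python
import random

def conditional_prob_initiation(n=20_000, seed=1):
    """Toy Monte Carlo estimate of P(K_applied > K_Ic) over a flaw population.

    All distributions and the K_I scaling below are hypothetical placeholders.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        a = rng.lognormvariate(-5.0, 0.8)       # flaw depth in m (hypothetical)
        k_applied = 400.0 * a ** 0.5            # crude K_I ~ sigma * sqrt(pi * a) scaling
        k_ic = rng.normalvariate(60.0, 10.0)    # sampled toughness, MPa*sqrt(m) (hypothetical)
        if k_applied > k_ic:
            failures += 1
    return failures / n
```

Fixing the seed makes the estimate reproducible, which is useful when benchmarking one PFM implementation against another as described in the report.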
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, T.L.
1993-04-01
This report discusses probabilistic fracture mechanics (PFM) analysis, which is a major element of the comprehensive probabilistic methodology endorsed by the NRC for evaluating the integrity of Pressurized Water Reactor (PWR) pressure vessels subjected to pressurized-thermal-shock (PTS) transients. It is anticipated that there will be an increasing need for an improved and validated PTS PFM code that is accepted by the NRC and utilities, as more plants approach the PTS screening criteria and are required to perform plant-specific analyses. The NRC-funded Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratory is currently developing the FAVOR (Fracture Analysis of Vessels: Oak Ridge) PTS PFM code, which is intended to meet this need. The FAVOR code incorporates the most important features of both OCA-P and VISA-II and adds new capabilities: a PFM global modeling methodology; the ability to approximate the effects of thermal streaming on circumferential flaws located inside a plume region created by fluid and thermal stratification; a library of stress intensity factor influence coefficients, generated by the NQA-1 certified ABAQUS computer code, for an adequate range of two- and three-dimensional inside-surface flaws; the flexibility to generate a variety of output reports; and user friendliness.
3D Material Response Analysis of PICA Pyrolysis Experiments
NASA Technical Reports Server (NTRS)
Oliver, A. Brandon
2017-01-01
The PICA decomposition experiments of Bessire and Minton are investigated using 3D material response analysis. The steady thermoelectric equations have been added to the CHAR code to enable analysis of the Joule-heated experiments, and the DAKOTA optimization code is used to define the voltage boundary condition that yields the experimentally observed temperature response. This analysis has identified a potential spatial non-uniformity in the PICA sample temperature driven by the cooled copper electrodes and thermal radiation from the surface of the test article (Figure 1). The non-uniformity leads to a variable heating rate throughout the sample volume that has an effect on the quantitative results of the experiment. Averaging the results of integrating a kinetic reaction mechanism with the heating rates seen across the sample volume yields a shift of peak species production to lower temperatures that is more significant for higher heating rates (Figure 2) when compared to integrating the same mechanism at the reported heating rate. The analysis supporting these conclusions will be presented along with a proposed analysis procedure that permits quantitative use of the existing data. Time permitting, a status on the in-development kinetic decomposition mechanism based on this data will be presented as well.
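The heating-rate sensitivity of peak species production described above can be reproduced with a first-order Arrhenius decomposition integrated at a constant heating rate: higher rates push the peak of the reaction-rate curve to higher temperatures. The kinetic parameters below are placeholders for illustration, not the PICA mechanism.

```python
import math

def peak_temperature(beta, A=1e12, E=180e3, R=8.314):
    """Temperature (K) of maximum reaction rate for a first-order Arrhenius
    decomposition at constant heating rate beta (K/s).

    Integrates d(alpha)/dT = (A / beta) * exp(-E / (R * T)) * (1 - alpha)
    with forward Euler steps; A, E are hypothetical kinetic parameters.
    """
    alpha, T, dT = 0.0, 300.0, 0.1
    best_T, best_rate = T, 0.0
    while alpha < 0.999 and T < 2000.0:
        rate = (A / beta) * math.exp(-E / (R * T)) * (1.0 - alpha)
        if rate > best_rate:
            best_rate, best_T = rate, T
        alpha = min(1.0, alpha + rate * dT)
        T += dT
    return best_T
```

Comparing a slow and a fast heating rate shows the familiar Kissinger-style shift of the rate peak toward higher temperature at higher beta, which is the effect the non-uniform heating in the experiment convolves over the sample volume.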
Sierra/Solid Mechanics 4.48 User's Guide.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merewether, Mark Thomas; Crane, Nathan K; de Frias, Gabriel Jose
Sierra/SolidMechanics (Sierra/SM) is a Lagrangian, three-dimensional code for finite element analysis of solids and structures. It provides capabilities for explicit dynamic, implicit quasistatic, and dynamic analyses. The explicit dynamics capabilities allow for the efficient and robust solution of models with extensive contact subjected to large, suddenly applied loads. For implicit problems, Sierra/SM uses a multi-level iterative solver, which enables it to effectively solve problems with large deformations, nonlinear material behavior, and contact. Sierra/SM has a versatile library of continuum and structural elements, and a large library of material models. The code is written for parallel computing environments, enabling scalable solutions of extremely large problems for both implicit and explicit analyses. It is built on the SIERRA Framework, which facilitates coupling with other SIERRA mechanics codes. This document describes the functionality and input syntax for Sierra/SM.
Simulation Studies for Inspection of the Benchmark Test with PATRASH
NASA Astrophysics Data System (ADS)
Shimosaki, Y.; Igarashi, S.; Machida, S.; Shirakata, M.; Takayama, K.; Noda, F.; Shigaki, K.
2002-12-01
In order to delineate the halo-formation mechanisms in a typical FODO lattice, a 2-D simulation code, PATRASH (PArticle TRAcking in a Synchrotron for Halo analysis), has been developed. The electric field originating from the space charge is calculated by the Hybrid Tree code method. Benchmark tests utilizing the three simulation codes ACCSIM, PATRASH, and SIMPSONS were carried out, and the results were confirmed to be in fair agreement with each other. The details of the PATRASH simulation are discussed with some examples.
TEMPEST code simulations of hydrogen distribution in reactor containment structures. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trent, D.S.; Eyler, L.L.
The mass transport version of the TEMPEST computer code was used to simulate hydrogen distribution in geometric configurations relevant to reactor containment structures. Predicted results of Battelle-Frankfurt hydrogen distribution tests 1 to 6, and 12 are presented. Agreement between predictions and experimental data is good. Best agreement is obtained using the k-epsilon turbulence model in TEMPEST in flow cases where turbulent diffusion and stable stratification are dominant mechanisms affecting transport. The code's general analysis capabilities are summarized.
Scalable Implementation of Finite Elements by NASA - Implicit (ScIFEi)
NASA Technical Reports Server (NTRS)
Warner, James E.; Bomarito, Geoffrey F.; Heber, Gerd; Hochhalter, Jacob D.
2016-01-01
Scalable Implementation of Finite Elements by NASA (ScIFEN) is a parallel finite element analysis code written in C++. ScIFEN is designed to provide scalable solutions to computational mechanics problems. It supports a variety of finite element types, nonlinear material models, and boundary conditions. This report provides an overview of ScIFEi ("Sci-Fi"), the implicit solid mechanics driver within ScIFEN. A description of ScIFEi's capabilities is provided, including an overview of the tools and features that accompany the software as well as a description of the input and output file formats. Results from several problems are included, demonstrating the efficiency and scalability of ScIFEi by comparison with finite element analysis using a commercial code.
Multiscale Multifunctional Progressive Fracture of Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Minnetyan, L.
2012-01-01
A new approach is described for evaluating fracture in composite structures. This approach is independent of classical fracture mechanics parameters like fracture toughness. It relies on computational simulation and is programmed in a stand-alone integrated computer code. It is multiscale and multifunctional because it includes composite mechanics for the composite behavior and finite element analysis for predicting the structural response. It contains seven modules: layered composite mechanics (micro, macro, laminate), finite element, updating scheme, local fracture, global fracture, stress-based failure modes, and fracture progression. The computer code is called CODSTRAN (Composite Durability Structural ANalysis). It is used in the present paper to evaluate the global fracture of four composite shell problems and one composite built-up structure. Results show that the global fracture of the composite shells and the built-up composite structure is enhanced when internal pressure is combined with shear loads.
Multidisciplinary System Reliability Analysis
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)
2001-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.
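The disciplinary equivalence exploited in the study comes from the fact that, after discretization, steady heat conduction and linear elasticity yield the same element matrices up to the physical coefficient. The sketch below assembles a 1D two-node element system whose element matrix (coeff/h)*[[1,-1],[-1,1]] serves as either a bar stiffness (coeff = EA) or a conduction matrix (coeff = kA); everything here is a generic FEM illustration, not NESSUS code.

```python
import numpy as np

def solve_1d_field(n_elem, coeff, length, left_bc, right_flux):
    """Solve a 1D two-node element system K u = f.

    Interpreted structurally: coeff = EA, left_bc = fixed displacement,
    right_flux = end force. Interpreted thermally: coeff = kA, left_bc =
    fixed temperature, right_flux = end heat input. Same code either way.
    """
    h = length / n_elem
    n = n_elem + 1
    K = np.zeros((n, n))
    for e in range(n_elem):
        ke = (coeff / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
        K[e:e + 2, e:e + 2] += ke          # assemble element matrix
    f = np.zeros(n)
    f[-1] = right_flux                     # Neumann condition at the right end
    # eliminate the Dirichlet condition at node 0
    K_red, f_red = K[1:, 1:], f[1:] - K[1:, 0] * left_bc
    u = np.empty(n)
    u[0] = left_bc
    u[1:] = np.linalg.solve(K_red, f_red)
    return u
```

For constant coefficient the exact solution is linear, u(x) = left_bc + right_flux * x / coeff, so linear elements reproduce it exactly in either physical interpretation.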
Multi-Disciplinary System Reliability Analysis
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Han, Song
1997-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.
Peter, Frank J.; Dalton, Larry J.; Plummer, David W.
2002-01-01
A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.
Sasaki, Akinori; Hiraoka, Eiji; Homma, Yosuke; Takahashi, Osamu; Norisue, Yasuhiro; Kawai, Koji; Fujitani, Shigeki
2017-01-01
Code status discussion is associated with a decrease in invasive procedures among terminally ill cancer patients. We investigated the association between code status discussion on admission and the incidence of invasive procedures, cardiopulmonary resuscitation (CPR), and opioid use among inpatients with advanced stages of cancer and noncancer diseases. We performed a retrospective cohort study in a single center, Ito Municipal Hospital, Japan. Participants were patients with advanced-stage cancer and noncancer diseases who were admitted to the Department of Internal Medicine between October 1, 2013 and August 30, 2015. We collected demographic data and recorded whether code status was discussed within 24 hours of admission, whether invasive procedures (central venous catheter placement, intubation with mechanical ventilation, and CPR for cardiac arrest) were performed, and whether opioid treatment was given. We investigated the factors associated with CPR events by using multivariate logistic regression analysis. Among the total of 232 patients, code status was discussed with 115 patients on admission, of whom 114 (99.1%) had do-not-resuscitate (DNR) orders. Code status was not discussed with the remaining 117 patients on admission, of whom 69 (59%) had a subsequent code status discussion with resultant DNR orders. Code status discussion on admission decreased the incidence of central venous catheter placement, intubation with mechanical ventilation, and CPR in both cancer and noncancer patients. It tended to increase the rate of opioid use. Code status discussion on admission was the only factor associated with decreased use of CPR (P < 0.001, odds ratio = 0.03, 95% CI = 0.004-0.21) in multivariate logistic regression analysis. Code status discussion on admission is associated with a decrease in invasive procedures and CPR in cancer and noncancer patients.
Physicians should be educated about code status discussion to improve end-of-life care.
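The reported association (odds ratio = 0.03, 95% CI = 0.004-0.21) is the kind of estimate derivable from a 2x2 exposure-outcome table. The sketch below computes an odds ratio with a Woolf (log-scale) confidence interval using illustrative counts, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf confidence interval from a 2x2 table.

    exposed:   a events, b non-events
    unexposed: c events, d non-events
    z = 1.96 gives an approximate 95% interval.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

With hypothetical counts of 10/90 events among discussed patients versus 30/70 among non-discussed, the point estimate is below 1 and the whole interval excludes 1, the same qualitative pattern the study reports.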
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balkey, K.; Witt, F.J.; Bishop, B.A.
1995-06-01
Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology was established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine the conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC VISA and Westinghouse (W) PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.
Current Research on Non-Coding Ribonucleic Acid (RNA).
Wang, Jing; Samuels, David C; Zhao, Shilin; Xiang, Yu; Zhao, Ying-Yong; Guo, Yan
2017-12-05
Non-coding ribonucleic acid (RNA) has without a doubt captured the interest of biomedical researchers. The ability to screen the entire human genome with high-throughput sequencing technology has greatly enhanced the identification, annotation, and prediction of the functionality of non-coding RNAs. In this review, we discuss the current landscape of non-coding RNA research and quantitative analysis. Non-coding RNAs are categorized into two major groups by size: long non-coding RNAs and small RNAs. Among long non-coding RNAs, we discuss regular long non-coding RNAs, pseudogenes, and circular RNA. Among small RNAs, we discuss miRNA, transfer RNA, piwi-interacting RNA, small nucleolar RNA, small nuclear RNA, Y RNA, signal recognition particle RNA, and 7SK RNA. We elaborate on the origin, detection methods, potential disease associations, putative functional mechanisms, and public resources for these non-coding RNAs. We aim to provide readers with a complete overview of non-coding RNAs and incite additional interest in non-coding RNA research.
Changing Beliefs about Trauma: A Qualitative Study of Cognitive Processing Therapy.
Price, Jennifer L; MacDonald, Helen Z; Adair, Kathryn C; Koerner, Naomi; Monson, Candice M
2016-03-01
Controlled qualitative methods complement quantitative treatment outcome research and enable a more thorough understanding of the effects of therapy and the suspected mechanisms of action. Thematic analyses were used to examine outcomes of cognitive processing therapy (CPT) for posttraumatic stress disorder (PTSD) in a randomized controlled trial of individuals diagnosed with military-related PTSD (n = 15). After sessions 1 and 11, participants wrote "impact statements" describing their appraisals of their trauma and beliefs potentially impacted by traumatic events. Trained raters coded each of these statements using a thematic coding scheme. An analysis of thematic coding revealed positive changes over the course of therapy in participants' perspective on their trauma and their future, supporting the purported mechanisms of CPT. Implications of this research for theory and clinical practice are discussed.
Structural design, analysis, and code evaluation of an odd-shaped pressure vessel
NASA Astrophysics Data System (ADS)
Rezvani, M. A.; Ziada, H. H.
1992-12-01
An effort to design, analyze, and evaluate a rectangular pressure vessel is described. Normally pressure vessels are designed in circular or spherical shapes to prevent stress concentrations. In this case, because of operational limitations, the choice of vessels was limited to a rectangular pressure box with a removable cover plate. The American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code is used as a guideline for pressure containments whose width or depth exceeds 15.24 cm (6.0 in.) and whose pressures exceed 103.4 kPa (15.0 lbf/in²). This evaluation used Section VIII of this Code, hereafter referred to as the Code. The dimensions and working pressure of the subject vessel fall within the pressure vessel category of the Code. The Code design guidelines and rules do not directly apply to this vessel. Therefore, finite-element methodology was used to analyze the pressure vessel, and the Code was then used in qualifying the vessel to be stamped to the Code. Section VIII, Division 1 of the Code was used for evaluation. This action was justified by selecting a material for which fatigue damage would not be a concern. The stress analysis results were then checked against the Code, and the thicknesses adjusted to satisfy Code requirements. Although not directly applicable, the Code design formulas for rectangular vessels were also considered and presented.
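For a rectangular vessel wall treated as a uniformly loaded flat plate, a Roark-style closed form gives the maximum bending stress as sigma = beta * q * (b/t)^2, where beta depends on aspect ratio and edge fixity. The coefficient and inputs below are illustrative placeholders, not values from the analysis above.

```python
def plate_bending_stress(q, b, t, beta=0.5):
    """Maximum bending stress in a uniformly loaded rectangular plate.

    q:    pressure (e.g. MPa)
    b:    short-span plate dimension (same length unit as t)
    t:    plate thickness
    beta: dimensionless coefficient from plate tables (aspect ratio and
          edge-fixity dependent; 0.5 here is a placeholder, not a Code value)
    """
    return beta * q * (b / t) ** 2
```

The quadratic dependence on b/t is why the finite-element results drive thickness adjustments: halving the span-to-thickness ratio cuts the bending stress by a factor of four.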
A Clustering-Based Approach to Enriching Code Foraging Environment.
Niu, Nan; Jin, Xiaoyu; Niu, Zhendong; Cheng, Jing-Ru C; Li, Ling; Kataev, Mikhail Yu
2016-09-01
Developers often spend valuable time navigating and seeking relevant code in software maintenance. Currently, there is a lack of theoretical foundations to guide tool design and evaluation in how best to present the code base to developers. This paper contributes a unified code navigation theory in light of optimal food-foraging principles. We further develop a novel framework for automatically assessing the foraging mechanisms in the context of program investigation. We use the framework to examine to what extent the clustering of software entities affects code foraging. Our quantitative analysis of long-lived open-source projects suggests that clustering enriches the software environment and improves foraging efficiency. Our qualitative inquiry reveals concrete insights into real developers' behavior. Our research opens the avenue toward building a new set of ecologically valid code navigation tools.
Mechanical Design and Analysis of LCLS II 2 K Cold Box
NASA Astrophysics Data System (ADS)
Yang, S.; Dixon, K.; Laverdure, N.; Rath, D.; Bevins, M.; Bai, H.; Kaminski, S.; Ravindranath, V.
2017-12-01
The mechanical design and analysis of the LCLS II 2 K cold box are presented. Its features and functionality are discussed. ASME B31.3 was used to design its internal piping, and compliance with the piping code was ensured through flexibility analysis. The 2 K cold box was analyzed using ANSYS 17.2; the requirements of the applicable codes (ASME Section VIII, Division 2 and ASCE 7-10) were satisfied. Seismic load was explicitly considered in both analyses.
1991-08-01
specifications are taken primarily from the 1983 version of the ASME Boiler and Pressure Vessel Code. Other design requirements were developed from standard safe... rules and practices of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code to provide a safe and reliable system
REASSESSING MECHANISM AS A PREDICTOR OF PEDIATRIC INJURY MORTALITY
Beck, Haley; Mittal, Sushil; Madigan, David; Burd, Randall S.
2015-01-01
Background The use of mechanism of injury as a predictor of injury outcome presents practical challenges because this variable may be missing or inaccurate in many databases. The purpose of this study was to determine the importance of mechanism of injury as a predictor of mortality among injured children. Methods The records of children (<15 years old) sustaining a blunt injury were obtained from the National Trauma Data Bank. Models predicting injury mortality were developed using mechanism of injury together with injury coding based on either Abbreviated Injury Scale post-dot values (low-dimensional injury coding) or ICD-9 injury codes and their two-way interactions (high-dimensional injury coding). Model performance with and without inclusion of mechanism of injury was compared for both coding schemes, and the relative importance of mechanism of injury as a variable in each model type was evaluated. Results Among 62,569 records, a mortality rate of 0.9% was observed. Inclusion of mechanism of injury improved model performance when using low-dimensional injury coding but was associated with no improvement when using high-dimensional injury coding. Mechanism of injury contributed to 28% of model variance when low-dimensional injury coding was used and <1% when high-dimensional injury coding was used. Conclusions Although mechanism of injury may be an important predictor of injury mortality among children sustaining blunt trauma, its importance as a predictor of mortality depends on the approach used for injury coding. Mechanism of injury is not an essential predictor of outcome after injury when coding schemes are used that better characterize the injuries sustained after blunt pediatric trauma. PMID:26197948
NASA Astrophysics Data System (ADS)
Davis, S. J.; Egolf, T. A.
1980-07-01
Acoustic characteristics predicted using a recently developed computer code were correlated with measured acoustic data for two helicopter rotors. The analysis is based on a solution of the Ffowcs Williams-Hawkings (FW-H) equation and includes terms accounting for both the thickness and loading components of the rotational noise. Computations are carried out in the time domain and assume free-field conditions. Results of the correlation show that the Farrassat/Nystrom analysis, when using predicted airload data as input, yields fair but encouraging correlation for the first 6 harmonics of blade passage. It also suggests that although the analysis represents a valuable first step toward developing a truly comprehensive helicopter rotor noise prediction capability, further work remains to be done identifying and incorporating additional noise mechanisms into the code.
Clean Energy in City Codes: A Baseline Analysis of Municipal Codification across the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Jeffrey J.; Aznar, Alexandra; Dane, Alexander
Municipal governments in the United States are well positioned to influence clean energy (energy efficiency and alternative energy) and transportation technology and strategy implementation within their jurisdictions through planning, programs, and codification. Municipal governments are leveraging planning processes and programs to shape their energy futures. There is limited understanding in the literature of codification, the primary way that municipal governments enact enforceable policies. The authors fill this gap in the literature by documenting the status of municipal codification of clean energy and transportation across the United States. More directly, we leverage online databases of municipal codes to develop national and state-specific representative samples of municipal governments by population size. Our analysis finds that municipal governments with the authority to set residential building energy codes within their jurisdictions frequently do so. In some cases, communities set codes higher than their respective state governments. Examination of codes across the nation indicates that municipal governments are employing their code as a policy mechanism to address clean energy and transportation.
Analysis of internal flows relative to the space shuttle main engine
NASA Technical Reports Server (NTRS)
1987-01-01
Cooperative efforts between the Lockheed-Huntsville Computational Mechanics Group and the NASA-MSFC Computational Fluid Dynamics staff have resulted in improved capabilities for numerically simulating incompressible flows generic to the Space Shuttle Main Engine (SSME). A well established and documented CFD code was obtained, modified, and applied to laminar and turbulent flows of the type occurring in the SSME Hot Gas Manifold. The INS3D code was installed on the NASA-MSFC CRAY-XMP computer system and is currently being used by NASA engineers. Studies to perform a transient analysis of the fuel preburner (FPB) were conducted. The COBRA/TRAC code is recommended for simulating the transient flow of oxygen into the LOX manifold. Property data for modifying the code to represent LOX/GOX flow were collected. The ALFA code was developed and recommended for representing the transient combustion in the preburner. These two codes will couple through the transient boundary conditions to simulate the startup and/or shutdown of the fuel preburner. A study, NAS8-37461, is currently being conducted to implement this modeling effort.
Mechanisms of Temporal Pattern Discrimination by Human Observers
1994-02-15
Research Center, Department of Psychology, University of Florida, Gainesville, Florida 32611. 15 February 1994. Final Technical Report for Period 1 October 1990... contrasting novice and experienced performance. Journal of Experimental Psychology: Human Perception and Performance, 18, 50-71. Berg, B. G. (1989). Analysis
Finite-block-length analysis in classical and quantum information theory.
Hayashi, Masahito
2017-01-01
Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects.
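A central finite-block-length result reviewed in this line of work is the normal approximation to the maximal channel coding rate, R(n, eps) ≈ C - sqrt(V/n) * Qinv(eps) + log2(n)/(2n), where C is capacity and V the channel dispersion. The sketch below evaluates it for a binary symmetric channel; the third-order log term and the bisection-based inverse Q-function are standard choices, used here only for illustration.

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def q_inv(eps):
    """Inverse of the Gaussian tail Q(x) = 0.5 * erfc(x / sqrt(2)), by bisection."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if 0.5 * math.erfc(mid / math.sqrt(2.0)) > eps:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def bsc_finite_rate(p, n, eps):
    """Normal-approximation achievable rate for a BSC(p) at block length n,
    target error probability eps."""
    c = 1.0 - h2(p)                                  # capacity
    v = p * (1 - p) * (math.log2((1 - p) / p)) ** 2  # channel dispersion
    return c - math.sqrt(v / n) * q_inv(eps) + math.log2(n) / (2 * n)
```

The sqrt(V/n) penalty quantifies the finite size effect the review discusses: the achievable rate approaches capacity from below as the block length grows.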
Finite-block-length analysis in classical and quantum information theory
HAYASHI, Masahito
2017-01-01
Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects. PMID:28302962
Progressive Failure And Life Prediction of Ceramic and Textile Composites
NASA Technical Reports Server (NTRS)
Xue, David Y.; Shi, Yucheng; Katikala, Madhu; Johnston, William M., Jr.; Card, Michael F.
1998-01-01
An engineering approach to predict the fatigue life and progressive failure of multilayered composite and textile laminates is presented. Analytical models which account for matrix cracking, statistical fiber failures and nonlinear stress-strain behavior have been developed for both composites and textiles. The analysis method is based on a combined micromechanics, fracture mechanics and failure statistics analysis. Experimentally derived empirical coefficients are used to account for the interface of fiber and matrix, fiber strength, and fiber-matrix stiffness reductions. Similar approaches were applied to textiles using Repeating Unit Cells. In composite fatigue analysis, Walker's equation is applied for matrix fatigue cracking and Heywood's formulation is used for fiber strength fatigue degradation. The analysis has been compared with experiment with good agreement. Comparisons were made with Graphite-Epoxy, C/SiC and Nicalon/CAS composite materials. For textile materials, comparisons were made with triaxial braided and plain weave materials under biaxial or uniaxial tension. Fatigue predictions were compared with test data obtained from plain weave C/SiC materials tested at AS&M. Computer codes were developed to perform the analysis. Composite Progressive Failure Analysis for Laminates is contained in the code CPFail. Micromechanics Analysis for Textile Composites is contained in the code MicroTex. Both codes were adapted to run as subroutines for the finite element code ABAQUS and CPFail-ABAQUS and MicroTex-ABAQUS. Graphic user interface (GUI) was developed to connect CPFail and MicroTex with ABAQUS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milani, Gabriele, E-mail: milani@stru.polimi.it; Valente, Marco
2014-10-06
This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the Emilia-Romagna (Italy) seismic events of May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models, and two different horizontal load distributions; (b) FE kinematic limit analyses performed using non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with collapse is far lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial in reducing the seismic vulnerability of this kind of historical structure.
NASA Technical Reports Server (NTRS)
Davis, S. J.; Egolf, T. A.
1980-01-01
Acoustic characteristics predicted using a recently developed computer code were correlated with measured acoustic data for two helicopter rotors. The analysis is based on a solution of the Ffowcs-Williams-Hawkings (FW-H) equation and includes terms accounting for both the thickness and loading components of the rotational noise. Computations are carried out in the time domain and assume free-field conditions. Results of the correlation show that the Farassat/Nystrom analysis, when using predicted airload data as input, yields fair but encouraging correlation for the first six harmonics of blade passage. It also suggests that although the analysis represents a valuable first step towards developing a truly comprehensive helicopter rotor noise prediction capability, further work remains to be done in identifying and incorporating additional noise mechanisms into the code.
3D measurement using combined Gray code and dual-frequency phase-shifting approach
NASA Astrophysics Data System (ADS)
Yu, Shuang; Zhang, Jing; Yu, Xiaoyang; Sun, Xiaoming; Wu, Haibin; Liu, Xin
2018-04-01
The combined Gray code and phase-shifting approach is a commonly used 3D measurement technique. In this technique, an error that equals integer multiples of the phase-shifted fringe period, i.e. period jump error, often exists in the absolute analog code, which can lead to gross measurement errors. To overcome this problem, the present paper proposes 3D measurement using a combined Gray code and dual-frequency phase-shifting approach. Based on 3D measurement using the combined Gray code and phase-shifting approach, one set of low-frequency phase-shifted fringe patterns with an odd-numbered multiple of the original phase-shifted fringe period is added. Thus, the absolute analog code measured value can be obtained by the combined Gray code and phase-shifting approach, and the low-frequency absolute analog code measured value can also be obtained by adding low-frequency phase-shifted fringe patterns. Then, the corrected absolute analog code measured value can be obtained by correcting the former by the latter, and the period jump errors can be eliminated, resulting in reliable analog code unwrapping. For the proposed approach, we established its measurement model, analyzed its measurement principle, expounded the mechanism of eliminating period jump errors by error analysis, and determined its applicable conditions. Theoretical analysis and experimental results show that the proposed approach can effectively eliminate period jump errors, reliably perform analog code unwrapping, and improve the measurement accuracy.
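The dual-frequency correction step described above can be sketched numerically. The following is a minimal numpy illustration of period-jump correction, assuming the low-frequency absolute phase is already jump-free; the function and parameter names are illustrative, not from the paper:

```python
import numpy as np

def correct_period_jumps(phi_high, abs_low, ratio):
    """Remove period-jump errors from a high-frequency absolute phase.

    phi_high : wrapped high-frequency phase, in [-pi, pi)
    abs_low  : absolute low-frequency phase (free of period jumps)
    ratio    : odd-integer ratio of the low to high fringe periods
    """
    # The low-frequency phase predicts the high-frequency phase; any
    # disagreement, rounded to a whole number of periods, is the jump.
    predicted = abs_low * ratio
    jumps = np.round((predicted - phi_high) / (2.0 * np.pi))
    return phi_high + 2.0 * np.pi * jumps
```

Because the jump count is rounded to the nearest integer, moderate noise in the low-frequency phase is tolerated as long as its error stays below half a high-frequency period, which is the applicability condition the paper analyzes.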
Gschwind, Michael K
2013-07-23
Mechanisms for aggressively optimizing computer code are provided. With these mechanisms, a compiler determines an optimization to apply to a portion of source code and determines if the optimization as applied to the portion of source code will result in unsafe optimized code that introduces a new source of exceptions being generated by the optimized code. In response to a determination that the optimization is an unsafe optimization, the compiler generates an aggressively compiled code version, in which the unsafe optimization is applied, and a conservatively compiled code version in which the unsafe optimization is not applied. The compiler stores both versions and provides them for execution. Mechanisms are provided for switching between these versions during execution in the event of a failure of the aggressively compiled code version. Moreover, predictive mechanisms are provided for predicting whether such a failure is likely.
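The version-switching mechanism the abstract describes can be illustrated with a small Python sketch; the function names and the exception-based failure model here are assumptions for illustration, not the patented implementation:

```python
def with_fallback(aggressive, conservative):
    """Pair an aggressively optimized routine with a conservative one;
    on the first failure of the aggressive version, switch permanently
    to the conservative version for all subsequent calls."""
    state = {"aggressive_ok": True}

    def run(*args, **kwargs):
        if state["aggressive_ok"]:
            try:
                return aggressive(*args, **kwargs)
            except Exception:
                # The unsafe optimization introduced a new exception
                # here: fall back to the safely compiled version.
                state["aggressive_ok"] = False
        return conservative(*args, **kwargs)

    return run
```

A predictive layer, as the abstract mentions, could additionally consult profiling data to decide which version to dispatch before any failure occurs.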
Guidelines for VCCT-Based Interlaminar Fatigue and Progressive Failure Finite Element Analysis
NASA Technical Reports Server (NTRS)
Deobald, Lyle R.; Mabson, Gerald E.; Engelstad, Steve; Prabhakar, M.; Gurvich, Mark; Seneviratne, Waruna; Perera, Shenal; O'Brien, T. Kevin; Murri, Gretchen; Ratcliffe, James;
2017-01-01
This document is intended to detail the theoretical basis, equations, references and data that are necessary to enhance the functionality of commercially available Finite Element codes, with the objective of having functionality better suited for the aerospace industry in the area of composite structural analysis. The specific area of focus will be improvements to composite interlaminar fatigue and progressive interlaminar failure. Suggestions are biased towards codes that perform interlaminar Linear Elastic Fracture Mechanics (LEFM) using Virtual Crack Closure Technique (VCCT)-based algorithms [1,2]. All aspects of the science associated with composite interlaminar crack growth are not fully developed and the codes developed to predict this mode of failure must be programmed with sufficient flexibility to accommodate new functional relationships as the science matures.
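For reference, the core quantity the VCCT-based algorithms compute reduces, in 2-D, to a pair of one-line formulas relating crack-tip nodal forces to the relative displacements of the node pair behind the tip. A sketch in the standard form (variable names are illustrative, not from the guidelines):

```python
def vcct_2d(Fx, Fy, du, dv, thickness, da):
    """2-D Virtual Crack Closure Technique.

    Fx, Fy    : shear and opening nodal forces at the crack tip
    du, dv    : relative sliding and opening displacements of the
                node pair immediately behind the tip
    thickness : out-of-plane thickness
    da        : crack-tip element length (virtual crack extension)

    Returns (G_I, G_II), the mode-I and mode-II energy release rates.
    """
    area = 2.0 * thickness * da
    G_I = Fy * dv / area
    G_II = Fx * du / area
    return G_I, G_II
```

The computed rates are then compared against a mixed-mode fracture-toughness criterion to decide whether the interlaminar crack advances.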
NASA Technical Reports Server (NTRS)
Amundsen, R. M.; Feldhaus, W. S.; Little, A. D.; Mitchum, M. V.
1995-01-01
Electronic integration of design and analysis processes was achieved and refined at Langley Research Center (LaRC) during the development of an optical bench for a laser-based aerospace experiment. Mechanical design has been integrated with thermal, structural, and optical analyses. Electronic import of the model geometry eliminates the repetitive steps of geometry input needed to develop each analysis model, leading to faster and more accurate analyses. Guidelines for integrated model development are given. This integrated analysis process has been built around software that was already in use by designers and analysts at LaRC. The process as currently implemented uses Pro/Engineer for design, Pro/Manufacturing for fabrication, PATRAN for solid modeling, NASTRAN for structural analysis, SINDA-85 and P/Thermal for thermal analysis, and Code V for optical analysis. Currently, the only analysis model that must be built manually is the Code V model; all others can be imported from the Pro/E geometry. The translator from PATRAN results to Code V optical analysis (PATCOD) was developed and tested at LaRC. Directions for use of the translator and the other models are given.
Computational simulation of progressive fracture in fiber composites
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
Computational methods for simulating and predicting progressive fracture in fiber composite structures are presented. These methods are integrated into a computer code of modular form. The modules include composite mechanics, finite element analysis, and fracture criteria. The code is used to computationally simulate progressive fracture in composite laminates with and without defects. The simulation tracks the fracture progression in terms of modes initiating fracture, damage growth, and imminent global (catastrophic) laminate fracture.
NASA Technical Reports Server (NTRS)
Hou, Gene
1998-01-01
Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among the many methods available for sensitivity analysis, automatic differentiation has been proven through many applications in fluid dynamics and structural mechanics to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require substantial memory. This project will apply an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order thermal derivatives. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.
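The forward-mode automatic differentiation that tools like ADIFOR perform can be illustrated with dual numbers: each value carries its derivative, and arithmetic propagates both exactly. This toy Python sketch is an illustration of the principle, not ADIFOR itself:

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers."""

    def __init__(self, val, der=0.0):
        self.val = val  # function value
        self.der = der  # derivative carried alongside it

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule, applied automatically at every operation.
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__


def derivative(f, x):
    """Exact derivative of f at x (for f built from +, *)."""
    return f(Dual(x, 1.0)).der
```

For example, `derivative(lambda x: 3 * x * x + 2 * x, 2.0)` propagates the seed derivative 1.0 through the product and sum rules, yielding the exact derivative with no finite-difference truncation error.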
Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method
NASA Astrophysics Data System (ADS)
Yuan, Zhe; Zhang, Yiming; Zheng, Qijia
2018-02-01
An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional manner with a square-wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, with the identified Earth impulse response obtained by measuring the system output with the input of the voltage response. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by field experiment. The quantitative analysis method proposed in this paper provides a new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
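An m-sequence is produced by a maximal-length linear-feedback shift register (LFSR). A minimal Python sketch of the generator, with illustrative tap positions and register length (not the paper's specific coding parameters):

```python
def m_sequence(taps, n_stages, length, seed=1):
    """Generate an m-sequence with a Fibonacci LFSR.

    taps     : feedback stage positions (1-indexed), taken from a
               primitive polynomial so the period is 2**n_stages - 1
    n_stages : number of shift-register stages
    seed     : any nonzero initial register state
    """
    assert seed != 0, "the all-zero state never leaves zero"
    state = [(seed >> i) & 1 for i in range(n_stages)]
    out = []
    for _ in range(length):
        out.append(state[-1])            # output the last stage
        feedback = 0
        for t in taps:                   # XOR the tapped stages
            feedback ^= state[t - 1]
        state = [feedback] + state[:-1]  # shift the register
    return out
```

With taps (4, 3) on a 4-stage register (a primitive-polynomial tap set), the period is 15 and each period contains eight ones and seven zeros; this balance property underlies the flat spectrum and good autocorrelation that give the m-sequence its anti-noise advantage.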
78 FR 37721 - Approval of American Society of Mechanical Engineers' Code Cases
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-24
...-0359] RIN 3150-AI72 Approval of American Society of Mechanical Engineers' Code Cases AGENCY: Nuclear... mandatory American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel (BPV) Code and... Guide'' series. In a notice of proposed rulemaking, ``Approval of American Society of Mechanical...
NASA Astrophysics Data System (ADS)
Karahan, Aydın; Buongiorno, Jacopo
2010-01-01
An engineering code to predict the irradiation behavior of U-Zr and U-Pu-Zr metallic alloy fuel pins and UO2-PuO2 mixed oxide fuel pins in sodium-cooled fast reactors was developed. The code was named Fuel Engineering and Structural analysis Tool (FEAST). FEAST has several modules working in coupled form with an explicit numerical algorithm. These modules describe fission gas release and fuel swelling, fuel chemistry and restructuring, temperature distribution, fuel-clad chemical interaction, and fuel and clad mechanical analysis including transient creep-fracture for the clad. Given the fuel pin geometry, composition, and irradiation history, FEAST can analyze fuel and clad thermo-mechanical behavior in both steady-state and design-basis (non-disruptive) transient scenarios. FEAST was written in FORTRAN-90 and has a simple input file similar to that of the LWR fuel code FRAPCON. The metal-fuel version is called FEAST-METAL, and is described in this paper. The oxide-fuel version, FEAST-OXIDE, is described in a companion paper. Compared with the old Argonne National Laboratory code LIFE-METAL and other codes of the same generation, FEAST-METAL emphasizes mechanistic over empirical models whenever available. Specifically, fission gas release and swelling are modeled with the GRSIS algorithm, which is based on detailed tracking of fission gas bubbles within the metal fuel. Migration of the fuel constituents is modeled by means of thermo-transport theory. Fuel-clad chemical interaction models based on precipitation kinetics were developed for steady-state operation and transients. Finally, a transient intergranular creep-fracture model for the clad, which tracks the nucleation and growth of the cavities at the grain boundaries, was developed for and implemented in the code. Reducing the empiricism in the constitutive models should make it more acceptable to extrapolate FEAST-METAL to new fuel compositions and higher burnup, as envisioned in advanced sodium reactors.
FEAST-METAL was benchmarked against the open-literature EBR-II database for steady-state operation and furnace tests (transients). The results show that the code satisfactorily predicts important phenomena such as clad strain, fission gas release, clad wastage, clad failure time, axial fuel slug deformation, and fuel constituent redistribution.
Computational mechanics analysis tools for parallel-vector supercomputers
NASA Technical Reports Server (NTRS)
Storaasli, Olaf O.; Nguyen, Duc T.; Baddourah, Majdi; Qin, Jiangning
1993-01-01
Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization search analysis and domain decomposition. The source code for many of these algorithms is available.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-23
... NUCLEAR REGULATORY COMMISSION 10 CFR Part 50 [NRC-2008-0554] RIN 3150-AI35 American Society of Mechanical Engineers (ASME) Codes and New and Revised ASME Code Cases; Corrections AGENCY: Nuclear Regulatory... the American Society of Mechanical Engineers, Three Park Avenue, New York, NY 10016, phone (800) 843...
24 CFR 200.925c - Model codes.
Code of Federal Regulations, 2011 CFR
2011-04-01
... DEVELOPMENT GENERAL INTRODUCTION TO FHA PROGRAMS Minimum Property Standards § 200.925c Model codes. (a... Plumbing Code, 1993 Edition, and the BOCA National Mechanical Code, 1993 Edition, excluding Chapter I, Administration, for the Building, Plumbing and Mechanical Codes and the references to fire retardant treated wood...
24 CFR 200.925c - Model codes.
Code of Federal Regulations, 2010 CFR
2010-04-01
... DEVELOPMENT GENERAL INTRODUCTION TO FHA PROGRAMS Minimum Property Standards § 200.925c Model codes. (a... Plumbing Code, 1993 Edition, and the BOCA National Mechanical Code, 1993 Edition, excluding Chapter I, Administration, for the Building, Plumbing and Mechanical Codes and the references to fire retardant treated wood...
High compression image and image sequence coding
NASA Technical Reports Server (NTRS)
Kunt, Murat
1989-01-01
The digital representation of an image requires a very large number of bits. This number is even larger for an image sequence. The goal of image coding is to reduce this number, as much as possible, and reconstruct a faithful duplicate of the original picture or image sequence. Early efforts in image coding, solely guided by information theory, led to a plethora of methods. The compression ratio reached a plateau around 10:1 a couple of years ago. Recent progress in the study of the brain mechanism of vision and scene analysis has opened new vistas in picture coding. Directional sensitivity of the neurones in the visual pathway combined with the separate processing of contours and textures has led to a new class of coding methods capable of achieving compression ratios as high as 100:1 for images and around 300:1 for image sequences. Recent progress on some of the main avenues of object-based methods is presented. These second generation techniques make use of contour-texture modeling, new results in neurophysiology and psychophysics and scene analysis.
Round Robin Propellant Testing for Development of AOP-4717
2015-09-23
Air Force dynamic mechanical analysis of NATO round robin propellant testing; only report-documentation-page fragments of this record survive.
Energy and technology review: Engineering modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cabayan, H.S.; Goudreau, G.L.; Ziolkowski, R.W.
1986-10-01
This report presents information concerning: Modeling Canonical Problems in Electromagnetic Coupling Through Apertures; Finite-Element Codes for Computing Electrostatic Fields; Finite-Element Modeling of Electromagnetic Phenomena; Modeling Microwave-Pulse Compression in a Resonant Cavity; Lagrangian Finite-Element Analysis of Penetration Mechanics; Crashworthiness Engineering; Computer Modeling of Metal-Forming Processes; Thermal-Mechanical Modeling of Tungsten Arc Welding; Modeling Air Breakdown Induced by Electromagnetic Fields; Iterative Techniques for Solving Boltzmann's Equations for p-Type Semiconductors; Semiconductor Modeling; and Improved Numerical-Solution Techniques in Large-Scale Stress Analysis.
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali
1993-01-01
A two-dimensional finite element fracture mechanics analysis of a space shuttle main engine (SSME) turbine blade firtree was performed using the MARC finite element code. The analysis was conducted under combined effects of thermal and mechanical loads at steady-state conditions. Data from a typical engine stand cycle of the SSME were used to run a heat transfer analysis and, subsequently, a thermal structural fracture mechanics analysis. Temperature and stress contours for the firtree under these operating conditions were generated. High stresses were found at the firtree lobes where crack initiation was triggered. A life assessment of the firtree was done by assuming an initial and a final crack size.
NASA Technical Reports Server (NTRS)
Moncada, Albert M.; Chattopadhyay, Aditi; Bednarcyk, Brett A.; Arnold, Steven M.
2008-01-01
Predicting failure in a composite can be done with ply-level and/or micro-level mechanisms. This paper uses the Generalized Method of Cells and High-Fidelity Generalized Method of Cells micromechanics theories, coupled with classical lamination theory, as implemented within NASA's Micromechanics Analysis Code with Generalized Method of Cells. The code is able to implement different failure theories on the level of both the fiber and the matrix constituents within a laminate. A comparison is made among the maximum stress, maximum strain, Tsai-Hill, and Tsai-Wu failure theories. To verify the failure theories, experiments from the Worldwide Failure Exercise (WWFE) were used. The WWFE is a comprehensive study that covers a wide range of polymer matrix composite laminates. The numerical results indicate good correlation with the experimental results for most of the composite layups, but also point to the need for more accurate resin damage progression models.
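As an illustration of the simplest of the compared criteria, a maximum-stress check for a unidirectional ply can be sketched as follows; the allowable names follow common textbook notation, and this is an illustrative sketch, not MAC/GMC code:

```python
def max_stress_failed(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Maximum-stress failure criterion for an orthotropic ply.

    s1, s2, t12 : fiber-direction, transverse, and in-plane shear stress
    Xt, Xc      : fiber-direction tensile/compressive strengths (> 0)
    Yt, Yc      : transverse tensile/compressive strengths (> 0)
    S           : in-plane shear strength (> 0)
    """
    # Each component is checked independently against its allowable.
    return (s1 > Xt or s1 < -Xc or
            s2 > Yt or s2 < -Yc or
            abs(t12) > S)
```

Tsai-Hill and Tsai-Wu differ in that they combine the stress components into a single quadratic interaction function rather than checking each component independently, which is why the theories can disagree under combined loading.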
Interfacing modules for integrating discipline specific structural mechanics codes
NASA Technical Reports Server (NTRS)
Endres, Ned M.
1989-01-01
An outline of the organization and capabilities of the Engine Structures Computational Simulator (Simulator) at NASA Lewis Research Center is given. One of the goals of the research at Lewis is to integrate various discipline-specific structural mechanics codes into a software system which can be brought to bear effectively on a wide range of engineering problems. This system must be effective and efficient while remaining user-friendly. The Simulator was initially designed for the finite element simulation of gas jet engine components. Currently, the Simulator is restricted to the analysis of high-pressure turbine blades and the accompanying rotor assembly, although the current installation can be expanded for other applications. The Simulator presently assists the user throughout its procedures by performing information management tasks, executing external support tasks, organizing analysis modules, and executing these modules in the user-defined order while maintaining processing continuity.
NASA Astrophysics Data System (ADS)
Mosunova, N. A.
2018-05-01
The article describes the basic models included in the EUCLID/V1 integrated code intended for safety analysis of liquid metal (sodium, lead, and lead-bismuth) cooled fast reactors using fuel rods with a gas gap and pellet dioxide, mixed oxide or nitride uranium-plutonium fuel under normal operation, under anticipated operational occurrences and accident conditions by carrying out interconnected thermal-hydraulic, neutronics, and thermal-mechanical calculations. Information about the Russian and foreign analogs of the EUCLID/V1 integrated code is given. Modeled objects, equation systems in differential form solved in each module of the EUCLID/V1 integrated code (the thermal-hydraulic, neutronics, fuel rod analysis module, and the burnup and decay heat calculation modules), the main calculated quantities, and also the limitations on application of the code are presented. The article also gives data on the scope of functions performed by the integrated code's thermal-hydraulic module, using which it is possible to describe both one- and two-phase processes occurring in the coolant. It is shown that, owing to the availability of the fuel rod analysis module in the integrated code, it becomes possible to estimate the performance of fuel rods in different regimes of the reactor operation. It is also shown that the models implemented in the code for calculating neutron-physical processes make it possible to take into account the neutron field distribution over the fuel assembly cross section as well as other features important for the safety assessment of fast reactors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donnelly, H.; Fullwood, R.; Glancy, J.
This is the second volume of a two-volume report on the VISA method for evaluating safeguards at fixed-site facilities. This volume contains appendices that support the description of the VISA concept and the initial working version of the method, VISA-1, presented in Volume I. The information is separated into four appendices, each describing details of one of the four analysis modules that comprise the analysis sections of the method. The first appendix discusses Path Analysis methodology, applies it to a Model Fuel Facility, and describes the computer codes that are being used. Introductory material on Path Analysis is given in Chapters 3.2.1 and 4.2.1 of Volume I. The second appendix deals with Detection Analysis, specifically the schemes used in VISA-1 for classifying adversaries and the methods proposed for evaluating individual detection mechanisms in order to build the data base required for detection analysis. Examples of evaluations on identity-access systems, SNM portal monitors, and intrusion devices are provided. The third appendix describes the Containment Analysis overt-segment path ranking, the Monte Carlo engagement model, the network simulation code, the delay mechanism data base, and the results of a sensitivity analysis. The last appendix presents general equations used in Interruption Analysis for combining covert-overt segments and compares them with equations given in Volume I, Chapter 3.
Lu, Xiangfeng; Peloso, Gina M; Liu, Dajiang J; Wu, Ying; Zhang, He; Zhou, Wei; Li, Jun; Tang, Clara Sze-Man; Dorajoo, Rajkumar; Li, Huaixing; Long, Jirong; Guo, Xiuqing; Xu, Ming; Spracklen, Cassandra N; Chen, Yang; Liu, Xuezhen; Zhang, Yan; Khor, Chiea Chuen; Liu, Jianjun; Sun, Liang; Wang, Laiyuan; Gao, Yu-Tang; Hu, Yao; Yu, Kuai; Wang, Yiqin; Cheung, Chloe Yu Yan; Wang, Feijie; Huang, Jianfeng; Fan, Qiao; Cai, Qiuyin; Chen, Shufeng; Shi, Jinxiu; Yang, Xueli; Zhao, Wanting; Sheu, Wayne H-H; Cherny, Stacey Shawn; He, Meian; Feranil, Alan B; Adair, Linda S; Gordon-Larsen, Penny; Du, Shufa; Varma, Rohit; Chen, Yii-Der Ida; Shu, Xiao-Ou; Lam, Karen Siu Ling; Wong, Tien Yin; Ganesh, Santhi K; Mo, Zengnan; Hveem, Kristian; Fritsche, Lars G; Nielsen, Jonas Bille; Tse, Hung-Fat; Huo, Yong; Cheng, Ching-Yu; Chen, Y Eugene; Zheng, Wei; Tai, E Shyong; Gao, Wei; Lin, Xu; Huang, Wei; Abecasis, Goncalo; Kathiresan, Sekar; Mohlke, Karen L; Wu, Tangchun; Sham, Pak Chung; Gu, Dongfeng; Willer, Cristen J
2017-12-01
Most genome-wide association studies have been conducted in European individuals, even though most genetic variation in humans is seen only in non-European samples. To search for novel loci associated with blood lipid levels and clarify the mechanism of action at previously identified lipid loci, we used an exome array to examine protein-coding genetic variants in 47,532 East Asian individuals. We identified 255 variants at 41 loci that reached chip-wide significance, including 3 novel loci and 14 East Asian-specific coding variant associations. After a meta-analysis including >300,000 European samples, we identified an additional nine novel loci. Sixteen genes were identified by protein-altering variants in both East Asians and Europeans, and thus are likely to be functional genes. Our data demonstrate that most of the low-frequency or rare coding variants associated with lipids are population specific, and that examining genomic data across diverse ancestries may facilitate the identification of functional genes at associated loci.
SurfKin: an ab initio kinetic code for modeling surface reactions.
Le, Thong Nguyen-Minh; Liu, Bin; Huynh, Lam K
2014-10-05
In this article, we describe a C/C++ program called SurfKin (Surface Kinetics) to construct microkinetic mechanisms for modeling gas-surface reactions. Thermodynamic properties of reaction species are estimated based on density functional theory calculations and statistical mechanics. Rate constants for elementary steps (including adsorption, desorption, and chemical reactions on surfaces) are calculated using the classical collision theory and transition state theory. Methane decomposition and water-gas shift reaction on Ni(111) surface were chosen as test cases to validate the code implementations. The good agreement with literature data suggests this is a powerful tool to facilitate the analysis of complex reactions on surfaces, and thus it helps to effectively construct detailed microkinetic mechanisms for such surface reactions. SurfKin also opens a possibility for designing nanoscale model catalysts. Copyright © 2014 Wiley Periodicals, Inc.
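The transition-state-theory rate constants mentioned above follow the Eyring form. A minimal sketch of that computation (illustrative only, not SurfKin's actual implementation):

```python
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
H   = 6.62607015e-34  # Planck constant, J*s
R   = 8.314462618     # gas constant, J/(mol*K)

def tst_rate_constant(delta_g_act, temperature):
    """Eyring (transition-state theory) rate constant, in 1/s.

    delta_g_act : activation free energy, J/mol
    temperature : absolute temperature, K
    """
    prefactor = K_B * temperature / H  # universal attempt frequency
    return prefactor * math.exp(-delta_g_act / (R * temperature))
```

The activation free energy would come from the DFT and statistical-mechanics step the abstract describes, and the resulting elementary-step rate constants feed the microkinetic mechanism.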
Deciphering Neural Codes of Memory during Sleep.
Chen, Zhe; Wilson, Matthew A
2017-05-01
Memories of experiences are stored in the cerebral cortex. Sleep is critical for the consolidation of hippocampal memory of wake experiences into the neocortex. Understanding representations of neural codes of hippocampal-neocortical networks during sleep would reveal important circuit mechanisms in memory consolidation and provide novel insights into memory and dreams. Although sleep-associated ensemble spike activity has been investigated, identifying the content of memory in sleep remains challenging. Here we revisit important experimental findings on sleep-associated memory (i.e., neural activity patterns in sleep that reflect memory processing) and review computational approaches to the analysis of sleep-associated neural codes (SANCs). We focus on two analysis paradigms for sleep-associated memory and propose a new unsupervised learning framework ('memory first, meaning later') for unbiased assessment of SANCs. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Thompson, E.
1979-01-01
A finite element computer code for the analysis of mantle convection is described. The coupled equations for creeping viscous flow and heat transfer can be solved for either a transient analysis or steady-state analysis. For transient analyses, either a control volume or a control mass approach can be used. Non-Newtonian fluids with viscosities which have thermal and spatial dependencies can be easily incorporated. All material parameters may be written as function statements by the user or simply specified as constants. A wide range of boundary conditions, both for the thermal analysis and the viscous flow analysis, can be specified. For steady-state analyses, elastic strain rates can be included. Although this manual was specifically written for users interested in mantle convection, the code is equally well suited for analysis in a number of other areas including metal forming, glacial flows, and creep of rock and soil.
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Putt, Charles W.
1997-01-01
The Visual Computing Environment (VCE) is a NASA Lewis Research Center project to develop a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis. The objectives of VCE are to (1) develop a visual computing environment for controlling the execution of individual simulation codes that are running in parallel and are distributed on heterogeneous host machines in a networked environment, (2) develop numerical coupling algorithms for interchanging boundary conditions between codes with arbitrary grid matching and different levels of dimensionality, (3) provide a graphical interface for simulation setup and control, and (4) provide tools for online visualization and plotting. VCE was designed to provide a distributed, object-oriented environment. Mechanisms are provided for creating and manipulating objects, such as grids, boundary conditions, and solution data. This environment includes parallel virtual machine (PVM) for distributed processing. 
Users can interactively select and couple any set of codes that have been modified to run in a parallel distributed fashion on a cluster of heterogeneous workstations. A scripting facility allows users to dictate the sequence of events that make up the particular simulation.
Rhodes, Gillian; Nishimura, Mayu; de Heering, Adelaide; Jeffery, Linda; Maurer, Daphne
2017-05-01
Faces are adaptively coded relative to visual norms that are updated by experience, and this adaptive coding is linked to face recognition ability. Here we investigated whether adaptive coding of faces is disrupted in individuals (adolescents and adults) who experience face recognition difficulties following visual deprivation from congenital cataracts in infancy. We measured adaptive coding using face identity aftereffects, where smaller aftereffects indicate less adaptive updating of face-coding mechanisms by experience. We also examined whether the aftereffects increase with adaptor identity strength, consistent with norm-based coding of identity, as in typical populations, or whether they show a different pattern indicating some more fundamental disruption of face-coding mechanisms. Cataract-reversal patients showed significantly smaller face identity aftereffects than did controls (Experiments 1 and 2). However, their aftereffects increased significantly with adaptor strength, consistent with norm-based coding (Experiment 2). Thus we found reduced adaptability but no fundamental disruption of norm-based face-coding mechanisms in cataract-reversal patients. Our results suggest that early visual experience is important for the normal development of adaptive face-coding mechanisms. © 2016 John Wiley & Sons Ltd.
DARKDROID: Exposing the Dark Side of Android Marketplaces
2016-06-01
Moreover, our approaches can detect apps containing both intentional and unintentional vulnerabilities, such as unsafe code loading mechanisms. Keywords: Security, Static Analysis, Dynamic Analysis, Malware Detection, Vulnerability Scanning. Stated objectives include evaluating applications in a DoD context and developing sophisticated whole-system static analyses to detect malicious Android applications.
The College Football Student-Athlete's Academic Experience: Network Analysis and Model Development
ERIC Educational Resources Information Center
Young, Kyle McLendon
2010-01-01
A grounded theory research study employing network analysis as a means of facilitating the latter stages of the coding process was conducted at a selective university that competes at the highest level of college football. The purpose of the study was to develop a better understanding of how interactive dynamics and controlling mechanisms, such as…
NASA Technical Reports Server (NTRS)
Hofmann, R.
1980-01-01
The STEALTH code system, which solves large strain, nonlinear continuum mechanics problems, was rigorously structured in both overall design and programming standards. The design is based on the theoretical elements of analysis while the programming standards attempt to establish a parallelism between physical theory, programming structure, and documentation. These features have made it easy to maintain, modify, and transport the codes. It has also guaranteed users a high level of quality control and quality assurance.
Numerical Analysis in Fracture Mechanics.
1983-01-20
…pressurization has also been solved [66] by the HEMP code. The advantage of such a supercode, however, lies in its ability to analyze elastic-plastic … analyzing the elasto-dynamic and elastic-plastic dynamic states in fracturing 2- and 3-D problems. The use of a super finite difference code to study … the finite difference elastic-plastic result of Jacobs in 1950 [2], which was followed by others in the 1960s [3-5]. Swedlow et al. [6], on the other …
Structural mechanics simulations
NASA Technical Reports Server (NTRS)
Biffle, Johnny H.
1992-01-01
Sandia National Laboratories has a very broad structural capability. Work has been performed in support of reentry vehicles, nuclear reactor safety, weapons systems and components, nuclear waste transport, the strategic petroleum reserve, nuclear waste storage, wind and solar energy, drilling technology, and submarine programs. The analysis environment contains both commercial and internally developed software. Included are mesh generation capabilities, structural simulation codes, and visualization codes for examining simulation results. To effectively simulate a wide variety of physical phenomena, a large number of constitutive models have been developed.
Lotrecchiano, Gaetano R
2013-08-01
Since the concept of team science gained recognition among biomedical researchers, social scientists have been challenged with investigating evidence of team mechanisms and functional dynamics within transdisciplinary teams. Identification of these mechanisms has lacked substantial research using grounded theory models to adequately describe their dynamical qualities. Research trends continue to favor the measurement of teams by isolating occurrences of production over relational mechanistic team tendencies. This study uses a social constructionist-grounded multilevel mixed methods approach to identify social dynamics and mechanisms within a transdisciplinary team. A National Institutes of Health-funded research team served as a sample. Data from observations, interviews, and focus groups were qualitatively coded to generate micro/meso level analyses. Social mechanisms operative within this biomedical scientific team were identified. Dynamics that support such mechanisms were documented and explored. Through theoretical and emergent coding, four social mechanisms dominated in the analysis-change, kinship, tension, and heritage. Each contains relational social dynamics. This micro/meso level study suggests such mechanisms and dynamics are key features of team science and as such can inform problems of integration, praxis, and engagement in teams. © 2013 Wiley Periodicals, Inc.
Prediction of properties of intraply hybrid composites
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.
1979-01-01
Equations based on the mixtures rule are presented for predicting the physical, thermal, hygral, and mechanical properties of unidirectional intraply hybrid composites (UIHC) from the corresponding properties of their constituent composites. Bounds were derived for uniaxial longitudinal strengths, tension, compression, and flexure of UIHC. The equations predict shear and flexural properties which agree with experimental data from UIHC. Use of these equations in a composites mechanics computer code predicted flexural moduli which agree with experimental data from various intraply hybrid angleplied laminates (IHAL). It is indicated, briefly, how these equations can be used in conjunction with composite mechanics and structural analysis during the analysis/design process.
A Sequential Fluid-mechanic Chemical-kinetic Model of Propane HCCI Combustion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aceves, S M; Flowers, D L; Martinez-Frias, J
2000-11-29
We have developed a methodology for predicting combustion and emissions in a Homogeneous Charge Compression Ignition (HCCI) Engine. This methodology combines a detailed fluid mechanics code with a detailed chemical kinetics code. Instead of directly linking the two codes, which would require an extremely long computational time, the methodology consists of first running the fluid mechanics code to obtain temperature profiles as a function of time. These temperature profiles are then used as input to a multi-zone chemical kinetics code. The advantage of this procedure is that a small number of zones (10) is enough to obtain accurate results. This procedure achieves the benefits of linking the fluid mechanics and the chemical kinetics codes with a great reduction in the computational effort, to a level that can be handled with current computers. The success of this procedure is in large part a consequence of the fact that for much of the compression stroke the chemistry is inactive and thus has little influence on fluid mechanics and heat transfer. Then, when chemistry is active, combustion is rather sudden, leaving little time for interaction between chemistry and fluid mixing and heat transfer. This sequential methodology has been capable of explaining the main characteristics of HCCI combustion that have been observed in experiments. In this paper, we use our model to explore an HCCI engine running on propane. The paper compares experimental and numerical pressure traces, heat release rates, and hydrocarbon and carbon monoxide emissions. The results show an excellent agreement, even in parameters that are difficult to predict, such as chemical heat release rates. Carbon monoxide emissions are reasonably well predicted, even though it is intrinsically difficult to make good predictions of CO emissions in HCCI engines. The paper includes a sensitivity study on the effect of the heat transfer correlation on the results of the analysis.
Importantly, the paper also shows a numerical study on how parameters such as swirl rate, crevices, and ceramic walls could help in reducing HC and CO emissions from HCCI engines.
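The zoning step at the heart of this sequential methodology can be sketched in a few lines. This is a minimal illustration of the idea only, not the authors' implementation; the function name `make_zones` and the equal-mass binning strategy are assumptions for demonstration.

```python
import numpy as np

def make_zones(cell_temps, cell_masses, n_zones=10):
    """Group CFD cells into temperature zones (coldest to hottest).

    Returns the mass-averaged temperature and total mass of each zone;
    each zone would then drive a separate chemical-kinetics integration.
    """
    order = np.argsort(cell_temps)            # coldest -> hottest
    t = np.asarray(cell_temps, dtype=float)[order]
    m = np.asarray(cell_masses, dtype=float)[order]
    # split the sorted cells into n_zones groups of roughly equal mass
    cum = np.cumsum(m) / m.sum()
    zone_id = np.minimum((cum * n_zones).astype(int), n_zones - 1)
    zone_T = np.array([np.average(t[zone_id == k], weights=m[zone_id == k])
                       for k in range(n_zones)])
    zone_m = np.array([m[zone_id == k].sum() for k in range(n_zones)])
    return zone_T, zone_m
```

With ten zones, as in the paper, each zone's temperature history becomes the input to the kinetics code, avoiding a per-cell chemistry solve.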
Research on offense and defense technology for iOS kernel security mechanism
NASA Astrophysics Data System (ADS)
Chu, Sijun; Wu, Hao
2018-04-01
iOS is a robust and widely used mobile device system; its annual profits make up about 90% of the total profits of all mobile phone brands. Though it is famous for its security, there have been many attacks on the iOS operating system, such as the Trident APT attack in 2016. It is therefore important to study the iOS security mechanism, understand its weaknesses, and put forward a targeted protection and security check framework. By studying these attacks and previous jailbreak tools, we can see that after exploiting kernel- and user-layer vulnerabilities, an attacker can only run ROP code and gain kernel read and write permissions based on the ROP. However, the iOS operating system is still protected by the code signing mechanism, the sandbox mechanism, and the not-writable mechanism of the system's disk area. This is far from the steady, long-lasting control that attackers expect. Before iOS 9, breaking these security mechanisms was usually done by modifying the kernel's important data structures and security mechanism code logic. After iOS 9, however, a kernel integrity protection mechanism was added to the 64-bit operating system, and none of the previous methods work on the new versions of iOS [1]. But this does not mean that attackers cannot break through. Therefore, based on an analysis of the vulnerability of the KPP security mechanism, this paper implements two possible breakthrough methods against the kernel security mechanism for iOS 9 and iOS 10. Meanwhile, we propose a defense method based on kernel integrity detection and sensitive API call detection to counter the breakthrough methods mentioned above, and experiments show that this method can prevent and detect attack attempts or invaders effectively and in a timely manner.
Coupled multi-disciplinary composites behavior simulation
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.; Murthy, Pappu L. N.; Chamis, Christos C.
1993-01-01
The capabilities of the computer code CSTEM (Coupled Structural/Thermal/Electro-Magnetic Analysis) are discussed and demonstrated. CSTEM computationally simulates the coupled response of layered multi-material composite structures subjected to simultaneous thermal, structural, vibration, acoustic, and electromagnetic loads and includes the effect of aggressive environments. The composite material behavior and structural response are determined at its various inherent scales: constituents (fiber/matrix), ply, laminate, and structural component. The thermal and mechanical properties of the constituents are considered to be nonlinearly dependent on various parameters such as temperature and moisture. The acoustic and electromagnetic properties also include dependence on vibration and electromagnetic wave frequencies, respectively. The simulation is based on a three-dimensional finite element analysis in conjunction with composite mechanics and structural tailoring codes, and with acoustic and electromagnetic analysis methods. An aircraft engine composite fan blade is selected as a typical structural component to demonstrate the CSTEM capabilities. Results of various coupled multi-disciplinary heat transfer, structural, vibration, acoustic, and electromagnetic analyses for temperature distribution, stress and displacement response, deformed shape, vibration frequencies, mode shapes, acoustic noise, and electromagnetic reflection from the fan blade are discussed for their coupled effects in hot and humid environments. Collectively, these results demonstrate the effectiveness of the CSTEM code in capturing the coupled effects on the various responses of composite structures subjected to simultaneous multiple real-life loads.
Psychometric Properties of the System for Coding Couples’ Interactions in Therapy - Alcohol
Owens, Mandy D.; McCrady, Barbara S.; Borders, Adrienne Z.; Brovko, Julie M.; Pearson, Matthew R.
2014-01-01
Few systems are available for coding in-session behaviors for couples in therapy. Alcohol Behavior Couples Therapy (ABCT) is an empirically supported treatment, but little is known about its mechanisms of behavior change. In the current study, an adapted version of the Motivational Interviewing for Significant Others coding system was developed into the System for Coding Couples’ Interactions in Therapy – Alcohol (SCCIT-A), which was used to code couples’ interactions and behaviors during ABCT. Results showed good inter-rater reliability of the SCCIT-A and provided evidence that the SCCIT-A may be a promising measure for understanding couples in therapy. A three factor model of the SCCIT-A was examined (Positive, Negative, and Change Talk/Counter-Change Talk) using a confirmatory factor analysis, but model fit was poor. Due to poor model fit, ratios were computed for Positive/Negative ratings and for Change Talk/Counter-Change Talk codes based on previous research in the couples and Motivational Interviewing literature. Post-hoc analyses examined correlations between specific SCCIT-A codes and baseline characteristics and indicated some concurrent validity. Correlations were run between ratios and baseline characteristics; ratios may be an alternative to using the factors from the SCCIT-A. Reliability and validity analyses suggest that the SCCIT-A has the potential to be a useful measure for coding in-session behaviors of both partners in couples therapy and could be used to identify mechanisms of behavior change for ABCT. Additional research is needed to improve the reliability of some codes and to further develop the SCCIT-A and other measures of couples’ interactions in therapy. PMID:25528049
Computational mechanics analysis tools for parallel-vector supercomputers
NASA Technical Reports Server (NTRS)
Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.; Qin, J.
1993-01-01
Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigen-solution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization algorithms, and domain decomposition. The source code for many of these algorithms is available from NASA Langley.
CACTI: free, open-source software for the sequential coding of behavioral interactions.
Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B
2012-01-01
The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.
NASA Technical Reports Server (NTRS)
Mcbeath, Giorgio; Ghorashi, Bahman; Chun, Kue
1993-01-01
A thermal NO(x) prediction model is developed to interface with a CFD, k-epsilon based code. A converged solution from the CFD code is the input to the postprocessing model for prediction of thermal NO(x). The model uses a decoupled analysis to estimate the equilibrium level (NO(x))e, which is the constant-rate limit. This value is used to estimate the flame NO(x) and in turn predict the rate of formation at each node using a two-step Zeldovich mechanism. The rate is fixed on the NO(x) production rate plot by estimating the time to reach equilibrium by a differential analysis based on the reaction O + N2 = NO + N. The rate is integrated in the nonequilibrium time space based on the residence time at each node in the computational domain. The sum of all nodal predictions yields the total NO(x) level.
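The nodal rate-and-residence-time idea above can be sketched as follows. This is a hedged illustration, not the paper's model: the Arrhenius fit for the rate-limiting step is a commonly quoted textbook value, and the function names and the simple equilibrium cap are assumptions for demonstration.

```python
import math

def thermal_no_rate(T, O_conc, N2_conc):
    """Thermal NO formation rate from the rate-limiting Zeldovich step
    O + N2 -> NO + N, assuming the second step (N + O2 -> NO + O) is fast,
    so each rate-limiting event ultimately yields two NO molecules.

    T in K, concentrations in mol/cm^3. The rate constant below is an
    illustrative textbook Arrhenius fit, not necessarily the paper's.
    """
    k1 = 1.8e14 * math.exp(-38370.0 / T)    # cm^3/(mol*s), assumed fit
    return 2.0 * k1 * O_conc * N2_conc       # mol NO / (cm^3 * s)

def nodal_no(T, O_conc, N2_conc, residence_time, NO_eq):
    """Accumulate NO over the nodal residence time, capped at the
    equilibrium level (the constant-rate limit described above)."""
    no = thermal_no_rate(T, O_conc, N2_conc) * residence_time
    return min(no, NO_eq)
```

Summing `nodal_no` over all nodes in the computational domain mirrors the paper's total NO(x) estimate.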
Bao, Yi; Chen, Yizheng; Hoehler, Matthew S; Smith, Christopher M; Bundy, Matthew; Chen, Genda
2017-01-01
This paper presents high temperature measurements using a Brillouin scattering-based fiber optic sensor and the application of the measured temperatures and building code recommended material parameters into enhanced thermomechanical analysis of simply supported steel beams subjected to combined thermal and mechanical loading. The distributed temperature sensor captures detailed, nonuniform temperature distributions that are compared locally with thermocouple measurements with less than 4.7% average difference at a 95% confidence level. The simulated strains and deflections are validated using measurements from a second distributed fiber optic (strain) sensor and two linear potentiometers, respectively. The results demonstrate that the temperature-dependent material properties specified in the four investigated building codes lead to strain predictions with less than 13% average error at a 95% confidence level and that the European building code provided the best predictions. However, the implicit consideration of creep in the European code is insufficient when the beam temperature exceeds 800°C.
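Temperature-dependent material properties of the kind the codes tabulate are typically applied by interpolating a reduction-factor table. The sketch below is patterned after the European fire-design style of table; the listed values and the function name are assumptions for illustration, not the figures used in the paper.

```python
import bisect

# Illustrative yield-strength reduction factors for structural steel at
# elevated temperature (deg C). Values are assumptions for demonstration,
# patterned after European fire-design tables, not copied from any code.
TEMPS = [20, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200]
K_Y   = [1.00, 1.00, 1.00, 1.00, 1.00, 0.78, 0.47, 0.23, 0.11, 0.06, 0.04, 0.02, 0.00]

def yield_reduction(temp_c):
    """Linearly interpolate the yield-strength reduction factor at temp_c."""
    if temp_c <= TEMPS[0]:
        return K_Y[0]
    if temp_c >= TEMPS[-1]:
        return K_Y[-1]
    i = bisect.bisect_right(TEMPS, temp_c)
    t0, t1 = TEMPS[i - 1], TEMPS[i]
    k0, k1 = K_Y[i - 1], K_Y[i]
    return k0 + (k1 - k0) * (temp_c - t0) / (t1 - t0)
```

Because such tables handle creep only implicitly, a purely interpolated strength can overstate capacity at the very high temperatures the abstract flags (above roughly 800°C).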
Enhancing the ABAQUS Thermomechanics Code to Simulate Steady and Transient Fuel Rod Behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. L. Williamson; D. A. Knoll
2009-09-01
A powerful multidimensional fuels performance capability, applicable to both steady and transient fuel behavior, is developed based on enhancements to the commercially available ABAQUS general-purpose thermomechanics code. Enhanced capabilities are described, including: UO2 temperature and burnup dependent thermal properties, solid and gaseous fission product swelling, fuel densification, fission gas release, cladding thermal and irradiation creep, cladding irradiation growth, gap heat transfer, and gap/plenum gas behavior during irradiation. The various modeling capabilities are demonstrated using a 2D axisymmetric analysis of the upper section of a simplified multi-pellet fuel rod, during both steady and transient operation. Computational results demonstrate the importance of a multidimensional fully-coupled thermomechanics treatment. Interestingly, many of the inherent deficiencies in existing fuel performance codes (e.g., 1D thermomechanics, loose thermo-mechanical coupling, separate steady and transient analysis, cumbersome pre- and post-processing) are, in fact, ABAQUS strengths.
Snipelisky, David; Dumitrascu, Adrian; Ray, Jordan; Roy, Archana; Matcha, Gautam; Harris, Dana; Vadeboncoeur, Tyler; Kusumoto, Fred; Burton, M Caroline
2017-12-06
Guidelines recommend discussing code status with patients on hospital admission. No study has evaluated the feasibility of a full code with do not intubate (DNI) status. A retrospective analysis of patients who experienced a cardiopulmonary arrest was performed between May 1, 2008 and June 20, 2014. A descriptive analysis was created based on whether patients required mechanical ventilatory support during the hospitalization and comparisons were made between both patient subsets. A total of 239 patients were included. Almost all (n = 218, 91.2%) required intubation during the hospitalization. Over half (n = 117, 53.7%) were intubated on the same day as the cardiopulmonary arrest and 91 patients (41.7%) were intubated at the time of arrest. Comparisons between intubated and non-intubated patients showed little differences in clinical characteristics, except for a higher proportion of medical cardiac etiology for admission in patients who did not require intubation (n = 10, 47.6% versus n = 55, 25.2%; p = 0.18) and initial arrest rhythm of ventricular tachycardia/fibrillation (n = 8, 38.1% versus n = 50, 22.9%; p = 0.37). No differences in 24-hour and posthospital survivals were present. Mechanical ventilatory support is commonly utilized in patients who experience a cardiopulmonary arrest. The DNI status may not be a feasible code status option for most patients.
ICAN Computer Code Adapted for Building Materials
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.
1997-01-01
The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties - including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that is characteristic of continuous-fiber-reinforced composites. For example, particle-reinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other high-temperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as a part of the cooperative activity.
A phase code for memory could arise from circuit mechanisms in entorhinal cortex
Hasselmo, Michael E.; Brandon, Mark P.; Yoshida, Motoharu; Giocomo, Lisa M.; Heys, James G.; Fransen, Erik; Newman, Ehren L.; Zilli, Eric A.
2009-01-01
Neurophysiological data reveals intrinsic cellular properties that suggest how entorhinal cortical neurons could code memory by the phase of their firing. Potential cellular mechanisms for this phase coding in models of entorhinal function are reviewed. This mechanism for phase coding provides a substrate for modeling the responses of entorhinal grid cells, as well as the replay of neural spiking activity during waking and sleep. Efforts to implement these abstract models in more detailed biophysical compartmental simulations raise specific issues that could be addressed in larger scale population models incorporating mechanisms of inhibition. PMID:19656654
Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes
NASA Astrophysics Data System (ADS)
Harrington, James William
Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g., factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models.
I also present a local classical processing scheme for correcting errors on toric codes, which demonstrates that quantum information can be maintained in two dimensions by purely local (quantum and classical) resources.
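The local error-correction property of toric codes rests on stabilizer syndromes: each bit-flip error lights up the two plaquettes at the ends of its error chain. A minimal sketch of that syndrome computation follows; the lattice convention and the function name are assumptions for illustration, not the thesis's implementation.

```python
import numpy as np

def plaquette_syndrome(x_errors_h, x_errors_v):
    """Z-plaquette syndromes of an L x L toric code under X (bit-flip) errors.

    x_errors_h[r, c] = 1 if the horizontal edge at (r, c) carries an X error,
    x_errors_v[r, c] likewise for vertical edges; boundaries are periodic.
    A plaquette fires (syndrome 1) when an odd number of its four edges err.
    """
    h = np.asarray(x_errors_h) % 2
    v = np.asarray(x_errors_v) % 2
    top = h
    bottom = np.roll(h, -1, axis=0)   # horizontal edge of the row below
    left = v
    right = np.roll(v, -1, axis=1)    # vertical edge of the column to the right
    return (top + bottom + left + right) % 2
```

A single error flips exactly two plaquettes, so the total syndrome parity is always even; a decoder then pairs up fired plaquettes with correction chains.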
Development of the NASA/FLAGRO computer program for analysis of airframe structures
NASA Technical Reports Server (NTRS)
Forman, R. G.; Shivakumar, V.; Newman, J. C., Jr.
1994-01-01
The NASA/FLAGRO (NASGRO) computer program was developed for fracture control analysis of space hardware and is currently the standard computer code in NASA, the U.S. Air Force, and the European Space Agency (ESA) for this purpose. The significant attributes of the NASGRO program are the numerous crack case solutions, the large materials file, the improved growth rate equation based on crack closure theory, and the user-friendly promptive input features. In support of the National Aging Aircraft Research Program (NAARP), NASGRO is being further developed to provide advanced state-of-the-art capability for damage tolerance and crack growth analysis of aircraft structural problems, including mechanical systems and engines. The project currently involves a cooperative development effort by NASA, FAA, and ESA. The primary tasks underway are the incorporation of advanced methodology for crack growth rate retardation resulting from spectrum loading and improved analysis for determining crack instability. Also, the current weight function solutions in NASGRO for nonlinear stress gradient problems are being extended to more crack cases, and the 2-D boundary integral routine for stress analysis and stress-intensity factor solutions is being extended to 3-D problems. Lastly, effort is underway to enhance the program to operate on personal computers and workstations in a Windows environment. Because of the increasing and already wide usage of NASGRO, the code offers an excellent mechanism for technology transfer for new fatigue and fracture mechanics capabilities developed within NAARP.
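Crack growth life prediction of the kind NASGRO performs amounts to integrating a growth-rate law over crack length. As a simplified stand-in, the sketch below integrates the classic Paris law; NASGRO's actual relation adds closure, threshold, and instability terms, and every constant here is illustrative, not a NASGRO value.

```python
import math

def cycles_to_grow(a0, af, dsigma, C, m, Y=1.0, steps=10000):
    """Number of load cycles to grow a crack from length a0 to af (meters)
    under constant stress range dsigma, using the Paris law
    da/dN = C * (dK)^m with dK = Y * dsigma * sqrt(pi * a).

    Simplified stand-in for the fuller NASGRO growth-rate equation;
    C, m, and Y are illustrative material/geometry constants.
    """
    n_cycles = 0.0
    da = (af - a0) / steps
    a = a0
    for _ in range(steps):
        dk = Y * dsigma * math.sqrt(math.pi * (a + 0.5 * da))  # midpoint dK
        n_cycles += da / (C * dk ** m)                          # dN = da / (da/dN)
        a += da
    return n_cycles
```

Because dK grows with sqrt(a), most of the life is consumed while the crack is short, which is why initial flaw size dominates damage tolerance analysis.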
Aeropropulsion 1987. Session 3: Internal Fluid Mechanics Research
NASA Technical Reports Server (NTRS)
1987-01-01
Internal fluid mechanics research at Lewis is directed toward an improved understanding of the important flow physics affecting aerospace propulsion systems, and applying this improved understanding to formulate accurate predictive codes. To this end, research is conducted involving detailed experimentation and analysis. The presentations in this session summarize ongoing work and indicate future emphasis in three major research thrusts: namely, inlets, ducts, and nozzles; turbomachinery; and chemical reacting flows.
Method for calculating internal radiation and ventilation with the ADINAT heat-flow code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butkovich, T.R.; Montan, D.N.
1980-04-01
One objective of the spent fuel test in Climax Stock granite (SFTC) is to correctly model the thermal transport and the changes in the stress field and accompanying displacements from the application of the thermal loads. We have chosen the ADINA and ADINAT finite element codes to do these calculations. ADINAT is a heat transfer code compatible with the ADINA displacement and stress analysis code. The heat flow problem encountered at SFTC requires a code with conduction, radiation, and ventilation capabilities, which the present version of ADINAT does not have. We have devised a method for calculating internal radiation and ventilation with the ADINAT code. This method effectively reproduces the results from the TRUMP multi-dimensional finite difference code, which correctly models radiative heat transport between drift surfaces, conductive and convective thermal transport to and through air in the drifts, and mass flow of air in the drifts. The temperature histories for each node in the finite element mesh calculated with ADINAT using this method can be used directly in the ADINA thermal-mechanical calculation.
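A minimal sketch of the kind of radiative link such a method has to emulate inside a conduction-only code: gray-body exchange between two drift surfaces, plus the linearized conductance that a conduction-only finite element code could carry as an equivalent "radiation element" between two surface nodes. The function name and the emissivity, area, and temperature values are illustrative assumptions, not the actual ADINAT implementation.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiative_link(t1_k, t2_k, area_m2, view_factor, emissivity=0.9):
    """Net gray-body radiative exchange Q [W] between two surfaces and an
    equivalent linearized conductance h_r [W/K] evaluated at the mean
    temperature, suitable as a conduction-element stand-in (illustrative)."""
    q_w = emissivity * SIGMA * view_factor * area_m2 * (t1_k**4 - t2_k**4)
    t_mean = 0.5 * (t1_k + t2_k)
    h_r = 4.0 * emissivity * SIGMA * view_factor * area_m2 * t_mean**3
    return q_w, h_r

# two drift surfaces 50 K apart (illustrative numbers)
q, h_r = radiative_link(350.0, 300.0, area_m2=1.0, view_factor=1.0)
```

For moderate temperature differences the linearized form h_r * (T1 - T2) tracks the exact fourth-power exchange closely, which is what makes the conduction-element substitution workable.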
Comprehensive analysis of transport aircraft flight performance
NASA Astrophysics Data System (ADS)
Filippone, Antonio
2008-04-01
This paper reviews the state of the art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper critically discusses the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise.
A variety of results is shown, including specific air range charts, take-off weight-altitude charts, payload-range performance, atmospheric effects, economic Mach number and noise trajectories at F.A.R. landing points.
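The specific air range charts mentioned above rest on a simple cruise balance: in steady level flight thrust equals drag W / (L/D), fuel flow is TSFC times thrust, and SAR is true airspeed divided by fuel flow. A hedged sketch with illustrative numbers (not Boeing data):

```python
def specific_air_range(tas_m_s, mass_kg, lift_to_drag, tsfc_kg_n_s):
    """Specific air range [m/kg]: distance flown per kg of fuel in steady
    level cruise. SAR = V / (TSFC * W / (L/D)). Inputs are illustrative."""
    g = 9.80665
    thrust_n = mass_kg * g / lift_to_drag   # thrust balances drag
    fuel_flow_kg_s = tsfc_kg_n_s * thrust_n
    return tas_m_s / fuel_flow_kg_s

# rough long-haul twin-jet cruise point (all values assumed for illustration)
sar = specific_air_range(tas_m_s=250.0, mass_kg=250e3,
                         lift_to_drag=19.0, tsfc_kg_n_s=1.6e-5)
```

The result lands in the neighbourhood of 0.12 km per kg of fuel, the order of magnitude such charts plot against weight and altitude.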
Rhodes, Gillian; Ewing, Louise; Jeffery, Linda; Avard, Eleni; Taylor, Libby
2014-09-01
Faces are adaptively coded relative to visual norms that are updated by experience. This coding is compromised in autism and the broader autism phenotype, suggesting that atypical adaptive coding of faces may be an endophenotype for autism. Here we investigate the nature of this atypicality, asking whether adaptive face-coding mechanisms are fundamentally altered, or simply less responsive to experience, in autism. We measured adaptive coding, using face identity aftereffects, in cognitively able children and adolescents with autism and neurotypical age- and ability-matched participants. We asked whether these aftereffects increase with adaptor identity strength as in neurotypical populations, or whether they show a different pattern indicating a more fundamental alteration in face-coding mechanisms. As expected, face identity aftereffects were reduced in the autism group, but they nevertheless increased with adaptor strength, like those of our neurotypical participants, consistent with norm-based coding of face identity. Moreover, their aftereffects correlated positively with face recognition ability, consistent with an intact functional role for adaptive coding in face recognition ability. We conclude that adaptive norm-based face-coding mechanisms are basically intact in autism, but are less readily calibrated by experience. Copyright © 2014 Elsevier Ltd. All rights reserved.
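A toy one-dimensional model shows why norm-based coding predicts aftereffects that grow with adaptor strength, and how reduced adaptability would scale them down without changing the pattern. Everything here (the linear update, the adaptability values) is an illustrative assumption, not the paper's psychophysical model.

```python
def aftereffect(adaptor_strength, adaptability=0.3):
    """Identity aftereffect in a one-dimensional norm-based coding toy
    model: adaptation shifts the norm a fraction `adaptability` of the
    way toward the adaptor, and identity is read out relative to the
    norm, so the aftereffect equals the norm shift."""
    norm = 0.0  # the pre-adaptation average face
    return adaptability * (adaptor_strength - norm)

strengths = [0.2, 0.4, 0.8]                                  # adaptor identity strength
typical = [aftereffect(s, adaptability=0.30) for s in strengths]
reduced = [aftereffect(s, adaptability=0.15) for s in strengths]  # less-tunable coding
```

Both groups show aftereffects increasing with adaptor strength (the signature of norm-based coding); the reduced-adaptability group shows uniformly smaller aftereffects, mirroring the pattern reported for the autism group.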
Application of bifurcation analysis for determining the mechanism of coding of nociceptive signals
NASA Astrophysics Data System (ADS)
Dik, O. E.; Shelykh, T. N.; Plakhova, V. B.; Nozdrachev, A. D.; Podzorova, S. A.; Krylov, B. V.
2015-10-01
The patch clamp method is used for studying the characteristics of slow sodium channels responsible for coding of nociceptive signals. Quantitative estimates of rate constants of transitions of "normal" and pharmacologically modified activation gating mechanisms of these channels are obtained. A Hodgkin-Huxley-type mathematical model of the nociceptive neuron membrane is constructed. Comenic acid, which is the drug substance of a new nonopioid analgesic, is used as the pharmacological agent. The application of bifurcation analysis makes it possible to outline the boundary of the region in which periodic impulse activity is generated. This boundary separates the set of values of the model parameter for which periodic pulsation is observed from the values for which such pulsations are absent or damped. The results show that even a subtle modulation of the physical characteristics of part of a protein molecule and of its effective charge suppresses the excitability of the nociceptive neuron membrane and, hence, leads to rapid reduction of pain.
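Bifurcation boundaries of this kind can be bracketed numerically by sweeping the control parameter and testing for sustained oscillation. The sketch below uses the FitzHugh-Nagumo model as a compact stand-in for the paper's Hodgkin-Huxley-type membrane model; the parameter values are the textbook ones, not the paper's fitted rate constants.

```python
import numpy as np

def spike_count(i_ext, t_total=600.0, dt=0.05):
    """Upward zero-crossings of the membrane variable in the
    FitzHugh-Nagumo model (a compact stand-in for a Hodgkin-Huxley-type
    membrane model), counted after discarding the first half of the run
    as a transient. Sustained crossings indicate periodic activity."""
    a, b, eps = 0.7, 0.8, 0.08      # textbook FitzHugh-Nagumo parameters
    v, w = -1.2, -0.6               # start near the resting state
    n = int(t_total / dt)
    vs = np.empty(n)
    for k in range(n):              # forward-Euler integration
        v += dt * (v - v**3 / 3.0 - w + i_ext)
        w += dt * eps * (v + a - b * w)
        vs[k] = v
    tail = vs[n // 2:]
    return int(np.sum((tail[:-1] < 0.0) & (tail[1:] >= 0.0)))

# sweep the external current to bracket the onset of periodic activity
currents = np.linspace(0.0, 0.6, 13)
firing = [float(i) for i in currents if spike_count(i) > 3]
```

The sweep partitions parameter values into a quiescent set and a periodically firing set, which is exactly the boundary the bifurcation analysis in the paper characterizes analytically.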
NASA Technical Reports Server (NTRS)
Book, W. J.
1974-01-01
The Flexible Manipulator Analysis Program (FMAP) is a collection of FORTRAN code for easy analysis of the flexible dynamics of mechanical arms. The user specifies the arm configuration and parameters and any or all of several frequency-domain analyses to be performed; the time-domain impulse response is obtained by inverse Fourier transformation of the frequency response. A detailed explanation of how to use FMAP is provided.
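The frequency-to-time-domain step can be illustrated on a single mode: sample a receptance, inverse-FFT it, and compare against the closed-form impulse response. This is a sketch of the idea with illustrative oscillator parameters, not FMAP itself.

```python
import numpy as np

# Receptance of one lightly damped mode, standing in for a single term of
# a flexible arm's frequency response (all parameter values illustrative).
m, c, k = 1.0, 2.0, (2.0 * np.pi * 2.0) ** 2   # f_n = 2 Hz, zeta ~ 0.08
N, dt = 4096, 0.01                             # 100 Hz sampling
f = np.fft.rfftfreq(N, dt)
w = 2.0 * np.pi * f
H = 1.0 / (k - m * w**2 + 1j * c * w)          # frequency response

# impulse response via inverse FFT of the sampled frequency response;
# continuous-FT scaling: h(t_n) ~ N * df * irfft(H) = irfft(H) / dt
h = np.fft.irfft(H, n=N) / dt

# closed-form impulse response of the same oscillator, for comparison
t = np.arange(N) * dt
wn = np.sqrt(k / m)
zeta = c / (2.0 * np.sqrt(k * m))
wd = wn * np.sqrt(1.0 - zeta**2)
h_exact = np.exp(-zeta * wn * t) * np.sin(wd * t) / (m * wd)
```

Because the response decays well within the record length, wraparound is negligible and the inverse FFT reproduces the damped sinusoid closely; FMAP applies the same transformation to the full multi-mode frequency response of the arm.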
A Mechanical Power Flow Capability for the Finite Element Code NASTRAN
1989-07-01
Methods considered for computing mechanical power flow include experimental methods, statistical energy analysis (SEA), the finite element method, and a finite element analogy using heat conduction equations. Experimentally, the weights and inertias of the transducers attached to a structure may produce accuracy problems. Cited references include Lyon, Statistical Energy Analysis of Dynamical Systems, The M.I.T. Press (1975), and Mickol and Bernhard.
NASA Technical Reports Server (NTRS)
Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)
2000-01-01
This report describes work performed on Contract NAS3-27720, AoI 13, as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semi-empirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.
NASA Technical Reports Server (NTRS)
Baker, A. J.; Iannelli, G. S.; Manhardt, Paul D.; Orzechowski, J. A.
1993-01-01
This report documents the user input and output data requirements for the FEMNAS finite element Navier-Stokes code for real-gas simulations of external aerodynamics flowfields. This code was developed for the configuration aerodynamics branch of NASA ARC, under SBIR Phase 2 contract NAS2-124568 by Computational Mechanics Corporation (COMCO). This report is in two volumes. Volume 1 contains the theory for the derived finite element algorithm and describes the test cases used to validate the computer program described in the Volume 2 user guide.
An arbitrary grid CFD algorithm for configuration aerodynamics analysis. Volume 2: FEMNAS user guide
NASA Technical Reports Server (NTRS)
Manhardt, Paul D.; Orzechowski, J. A.; Baker, A. J.
1992-01-01
This report documents the user input and output data requirements for the FEMNAS finite element Navier-Stokes code for real-gas simulations of external aerodynamics flowfields. This code was developed for the configuration aerodynamics branch of NASA ARC, under SBIR Phase 2 contract NAS2-124568 by Computational Mechanics Corporation (COMCO). This report is in two volumes. Volume 1 contains the theory for the derived finite element algorithm and describes the test cases used to validate the computer program described in the Volume 2 user guide.
Approximate Micromechanics Treatise of Composite Impact
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Handler, Louis M.
2005-01-01
A formalism is described for micromechanical impact analysis of composites. The formalism consists of numerous equations that describe all aspects of impact, from impactor and composite conditions to impact contact, damage progression, and penetration or containment. It is based on a through-the-thickness displacement-increment simulation, which makes it convenient to track local damage in terms of microfailure modes and their respective characteristics. A flow chart is provided to cast the formalism (numerous equations) into a computer code for embedment in composite mechanics codes and/or finite element composite structural analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin Whiting; Crane, Nathan K.; Heinstein, Martin W.
2011-03-01
Adagio is a Lagrangian, three-dimensional, implicit code for the analysis of solids and structures. It uses a multi-level iterative solver, which enables it to solve problems with large deformations, nonlinear material behavior, and contact. It also has a versatile library of continuum and structural elements, and an extensive library of material models. Adagio is written for parallel computing environments, and its solvers allow for scalable solutions of very large problems. Adagio uses the SIERRA Framework, which allows for coupling with other SIERRA mechanics codes. This document describes the functionality and input structure for Adagio.
Lery, Letícia M S; Bitar, Mainá; Costa, Mauricio G S; Rössle, Shaila C S; Bisch, Paulo M
2010-12-22
G. diazotrophicus and A. vinelandii are aerobic nitrogen-fixing bacteria. Although oxygen is essential for the survival of these organisms, it irreversibly inhibits nitrogenase, the complex responsible for nitrogen fixation. Both microorganisms deal with this paradox through compensatory mechanisms. In A. vinelandii, a conformational protection mechanism occurs through the interaction between the nitrogenase complex and the FeSII protein. Previous studies suggested the existence of a similar system in G. diazotrophicus, but the putative protein involved had not yet been described. This study intends to identify the protein coding gene in the recently sequenced genome of G. diazotrophicus and also provide detailed structural information on nitrogenase conformational protection in both organisms. Genomic analysis of G. diazotrophicus sequences revealed a protein coding ORF (Gdia0615) enclosing a conserved "fer2" domain, typical of the ferredoxin family and found in A. vinelandii FeSII. Comparative models of both FeSII and Gdia0615 disclosed a conserved beta-grasp fold. Cysteine residues that coordinate the 2[Fe-S] cluster are in conserved positions towards the metallocluster. Analysis of solvent accessible residues and electrostatic surfaces unveiled a hydrophobic dimerization interface. Dimers assembled by molecular docking presented stable behaviour and proper accommodation of regions possibly involved in binding of FeSII to nitrogenase throughout molecular dynamics simulations in aqueous solution. Molecular modeling of the nitrogenase complex of G. diazotrophicus was performed and models were compared to the crystal structure of A. vinelandii nitrogenase. Docking experiments of FeSII and Gdia0615 with the corresponding nitrogenase complex pointed out in both systems a putative binding site presenting shape and charge complementarities at the Fe-protein/MoFe-protein complex interface. The identification of the putative FeSII coding gene in G.
diazotrophicus genome represents a large step towards the understanding of the conformational protection mechanism of nitrogenase against oxygen. In addition, this is the first study regarding the structural complementarities of FeSII-nitrogenase interactions in diazotrophic bacteria. The combination of bioinformatic tools for genome analysis, comparative protein modeling, docking calculations and molecular dynamics provided a powerful strategy for the elucidation of molecular mechanisms and structural features of FeSII-nitrogenase interaction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Philip, Bobby
2012-06-01
The Advanced Multi-Physics (AMP) code, in its present form, will allow a user to build a multi-physics application code for existing mechanics and diffusion operators and extend them with user-defined material models and new physics operators. There are examples that demonstrate mechanics, thermo-mechanics, coupled diffusion, and mechanical contact. The AMP code is designed to leverage a variety of mathematical solvers (PETSc, Trilinos, SUNDIALS, and AMP solvers) and mesh databases (LibMesh and AMP) in a consistent interchangeable approach.
2015-11-01
Simulation of Weld Mechanical Behavior to Include Welding-Induced Residual Stress and Distortion: Coupling of SYSWELD and Abaqus Codes, by Charles R. Fisher. Technical report covering December 2013 – July 2015.
ERIC Educational Resources Information Center
Rogowski, Steve
1982-01-01
A problem is detailed which has a solution that embodies geometry, trigonometry, ballistics, projectile mechanics, vector analysis, and elementary computer graphics. It is felt that the information and sample computer programs can be a useful starting point for a user written code that involves missiles and other projectiles. (MP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raboin, P J
1998-01-01
The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.
Fracture Mechanics Analysis for Short Cracks
1990-02-01
Monitoring organization: United Technologies Research Center. The report traces elastic-plastic fracture mechanics (EPFM) to the pioneering works of Hult and McClintock (Ref. 3), Dugdale (Ref. 4), Barenblatt (Ref. 5), Bilby, Cottrell, and Swinden (Refs. 6 and 7), Rice (Ref. 8), and Hutchinson (Ref. 9). EPFM is applicable, and needed especially, for high-toughness and low-strength materials.
Rhodes, Gillian; Burton, Nichola; Jeffery, Linda; Read, Ainsley; Taylor, Libby; Ewing, Louise
2018-05-01
Individuals with autism spectrum disorder (ASD) can have difficulty recognizing emotional expressions. Here, we asked whether the underlying perceptual coding of expression is disrupted. Typical individuals code expression relative to a perceptual (average) norm that is continuously updated by experience. This adaptability of face-coding mechanisms has been linked to performance on various face tasks. We used an adaptation aftereffect paradigm to characterize expression coding in children and adolescents with autism. We asked whether face expression coding is less adaptable in autism and whether there is any fundamental disruption of norm-based coding. If expression coding is norm-based, then the face aftereffects should increase with adaptor expression strength (distance from the average expression). We observed this pattern in both autistic and typically developing participants, suggesting that norm-based coding is fundamentally intact in autism. Critically, however, expression aftereffects were reduced in the autism group, indicating that expression-coding mechanisms are less readily tuned by experience. Reduced adaptability has also been reported for coding of face identity and gaze direction. Thus, there appears to be a pervasive lack of adaptability in face-coding mechanisms in autism, which could contribute to face processing and broader social difficulties in the disorder. © 2017 The British Psychological Society.
Mechanism on brain information processing: Energy coding
NASA Astrophysics Data System (ADS)
Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa
2006-09-01
According to the experimental finding that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, the authors present a new scientific theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Because the energy coding model reveals mechanisms of brain information processing based upon known biophysical properties, it can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, the authors estimate that the theory has very important consequences for quantitative research of cognitive function.
Progress of Stirling cycle analysis and loss mechanism characterization
NASA Technical Reports Server (NTRS)
Tew, R. C., Jr.
1986-01-01
An assessment of Stirling engine thermodynamic modeling and design codes shows a general deficiency; this deficiency is due to poor understanding of the fluid flow and heat transfer phenomena that occur in the oscillating flow and pressure level environment within the engines. Stirling engine thermodynamic loss mechanisms are listed. Several experimental and computational research efforts now underway to characterize various loss mechanisms are reviewed. The need for additional experimental rigs and rig upgrades is discussed. Recent developments and current efforts in Stirling engine thermodynamic modeling are also reviewed.
Deciphering Neural Codes of Memory during Sleep
Chen, Zhe; Wilson, Matthew A.
2017-01-01
Memories of experiences are stored in the cerebral cortex. Sleep is critical for consolidating hippocampal memory of wake experiences into the neocortex. Understanding representations of neural codes of hippocampal-neocortical networks during sleep would reveal important circuit mechanisms of memory consolidation, and provide novel insights into memory and dreams. Although sleep-associated ensemble spike activity has been investigated, identifying the content of memory in sleep remains challenging. Here, we revisit important experimental findings on sleep-associated memory (i.e., neural activity patterns in sleep that reflect memory processing) and review computational approaches for analyzing sleep-associated neural codes (SANC). We focus on two analysis paradigms for sleep-associated memory, and propose a new unsupervised learning framework (“memory first, meaning later”) for unbiased assessment of SANC. PMID:28390699
Introduction to the internal fluid mechanics research session
NASA Technical Reports Server (NTRS)
Miller, Brent A.; Povinelli, Louis A.
1990-01-01
Internal fluid mechanics research at LeRC is directed toward an improved understanding of the important flow physics affecting aerospace propulsion systems, and applying this improved understanding to formulate accurate predictive codes. To this end, research is conducted involving detailed experimentation and analysis. The following three papers summarize ongoing work and indicate future emphasis in three major research thrusts: inlets, ducts, and nozzles; turbomachinery; and chemical reacting flows. The underlying goal of the research in each of these areas is to bring internal computational fluid mechanics to a state of practical application for aerospace propulsion systems. Achievement of this goal requires that carefully planned and executed experiments be conducted in order to develop and validate useful codes. It is critical that numerical code development work and experimental work be closely coupled. The insights gained are represented by mathematical models that form the basis for code development. The resultant codes are then tested by comparing them with appropriate experiments in order to ensure their validity and determine their applicable range. The ultimate user community must be a part of this process to assure relevancy of the work and to hasten its practical application. Propulsion systems are characterized by highly complex and dynamic internal flows. Many complex, 3-D flow phenomena may be present, including unsteadiness, shocks, and chemical reactions. By focusing on specific portions of a propulsion system, it is often possible to identify the dominant phenomena that must be understood and modeled for obtaining accurate predictive capability. The three major research thrusts serve as a focus leading to greater understanding of the relevant physics and to an improvement in analytic tools. This in turn will hasten continued advancements in propulsion system performance and capability.
Trading Speed and Accuracy by Coding Time: A Coupled-circuit Cortical Model
Standage, Dominic; You, Hongzhi; Wang, Da-Hui; Dorris, Michael C.
2013-01-01
Our actions take place in space and time, but despite the role of time in decision theory and the growing acknowledgement that the encoding of time is crucial to behaviour, few studies have considered the interactions between neural codes for objects in space and for elapsed time during perceptual decisions. The speed-accuracy trade-off (SAT) provides a window into spatiotemporal interactions. Our hypothesis is that temporal coding determines the rate at which spatial evidence is integrated, controlling the SAT by gain modulation. Here, we propose that local cortical circuits are inherently suited to the relevant spatial and temporal coding. In simulations of an interval estimation task, we use a generic local-circuit model to encode time by ‘climbing’ activity, seen in cortex during tasks with a timing requirement. The model is a network of simulated pyramidal cells and inhibitory interneurons, connected by conductance synapses. A simple learning rule enables the network to quickly produce new interval estimates, which show signature characteristics of estimates by experimental subjects. Analysis of network dynamics formally characterizes this generic, local-circuit timing mechanism. In simulations of a perceptual decision task, we couple two such networks. Network function is determined only by spatial selectivity and NMDA receptor conductance strength; all other parameters are identical. To trade speed and accuracy, the timing network simply learns longer or shorter intervals, driving the rate of downstream decision processing by spatially non-selective input, an established form of gain modulation. Like the timing network's interval estimates, decision times show signature characteristics of those by experimental subjects. Overall, we propose, demonstrate and analyse a generic mechanism for timing, a generic mechanism for modulation of decision processing by temporal codes, and we make predictions for experimental verification. PMID:23592967
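A toy race model captures the proposed mechanism: a spatially non-selective climbing signal drives two evidence accumulators toward a fixed threshold, so a steeper climb produces earlier but less accurate decisions. Everything here (the linear ramp, the parameter values) is an illustrative simplification of the paper's spiking-network model.

```python
import numpy as np

def race_with_urgency(urgency_slope, n_trials=4000, drift=0.08, noise=1.0,
                      threshold=30.0, t_max=600, seed=7):
    """Race between a 'correct' accumulator (with drift) and an
    'incorrect' one (noise only), both receiving a common climbing
    urgency signal. Returns (accuracy, mean reaction time in steps)."""
    rng = np.random.default_rng(seed)
    t = np.arange(1, t_max + 1)
    urgency = np.cumsum(urgency_slope * t)        # non-selective additive drive
    A = np.cumsum(drift + noise * rng.standard_normal((n_trials, t_max)), axis=1) + urgency
    B = np.cumsum(noise * rng.standard_normal((n_trials, t_max)), axis=1) + urgency
    crossed = (A >= threshold) | (B >= threshold)
    done = crossed.any(axis=1)
    rt = np.argmax(crossed, axis=1)               # first threshold crossing
    idx = np.arange(n_trials)
    correct = A[idx, rt] >= B[idx, rt]            # which accumulator won
    return correct[done].mean(), rt[done].mean()

acc_slow, rt_slow = race_with_urgency(0.0)        # no urgency: slow, accurate
acc_fast, rt_fast = race_with_urgency(5e-4)       # steep climb: fast, error-prone
```

Because the common urgency cancels in the difference between accumulators but brings both to threshold sooner, decisions under a steep climb are made on less accumulated evidence — the speed-accuracy trade-off by gain-like modulation that the paper proposes.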
Wind turbine design codes: A comparison of the structural response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buhl, M.L. Jr.; Wright, A.D.; Pierce, K.G.
2000-03-01
The National Wind Technology Center (NWTC) of the National Renewable Energy Laboratory is continuing a comparison of several computer codes used in the design and analysis of wind turbines. The second part of this comparison determined how well the programs predict the structural response of wind turbines. In this paper, the authors compare the structural response for four programs: ADAMS, BLADED, FAST_AD, and YawDyn. ADAMS is a commercial, multibody-dynamics code from Mechanical Dynamics, Inc. BLADED is a commercial, performance and structural-response code from Garrad Hassan and Partners Limited. FAST_AD is a structural-response code developed by Oregon State University and the University of Utah for the NWTC. YawDyn is a structural-response code developed by the University of Utah for the NWTC. ADAMS, FAST_AD, and YawDyn use the University of Utah's AeroDyn subroutine package for calculating aerodynamic forces. Although errors were found in all the codes during this study, once they were fixed, the codes agreed surprisingly well for most of the cases and configurations that were evaluated. One unresolved discrepancy between BLADED and the AeroDyn-based codes was when there was blade and/or teeter motion in addition to a large yaw error.
Rhodes, Gillian; Jeffery, Linda; Taylor, Libby; Hayward, William G; Ewing, Louise
2014-06-01
Despite their similarity as visual patterns, we can discriminate and recognize many thousands of faces. This expertise has been linked to 2 coding mechanisms: holistic integration of information across the face and adaptive coding of face identity using norms tuned by experience. Recently, individual differences in face recognition ability have been discovered and linked to differences in holistic coding. Here we show that they are also linked to individual differences in adaptive coding of face identity, measured using face identity aftereffects. Identity aftereffects correlated significantly with several measures of face-selective recognition ability. They also correlated marginally with own-race face recognition ability, suggesting a role for adaptive coding in the well-known other-race effect. More generally, these results highlight the important functional role of adaptive face-coding mechanisms in face expertise, taking us beyond the traditional focus on holistic coding mechanisms. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Evaluation of flaws in carbon steel piping. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zahoor, A.; Gamble, R.M.; Mehta, H.S.
1986-10-01
The objective of this program was to develop flaw evaluation procedures and allowable flaw sizes for ferritic piping used in light water reactor (LWR) power generation facilities. The program results provide relevant ASME Code groups with the information necessary to define flaw evaluation procedures, allowable flaw sizes, and their associated bases for Section XI of the Code. Because there are several possible flaw-related failure modes for ferritic piping over the LWR operating temperature range, three analysis methods were employed to develop the evaluation procedures. These include limit load analysis for plastic collapse, elastic-plastic fracture mechanics (EPFM) analysis for ductile tearing, and linear elastic fracture mechanics (LEFM) analysis for nonductile crack extension. To ensure the appropriate analysis method is used in an evaluation, a step-by-step procedure also is provided to identify the relevant acceptance standard or procedure on a case-by-case basis. The tensile strength and toughness properties required to complete the flaw evaluation for any of the three analysis methods are included in the evaluation procedure. The flaw evaluation standards are provided in tabular form for the plastic collapse and ductile tearing modes, where the allowable part-through flaw depth is defined as a function of load and flaw length. For nonductile crack extension, linear elastic fracture mechanics analysis methods, similar to those in Appendix A of Section XI, are defined. Evaluation flaw sizes and procedures are developed for both longitudinal and circumferential flaw orientations and normal/upset and emergency/faulted operating conditions. The tables are based on margins on load of 2.77 and 1.39 for circumferential flaws and 3.0 and 1.5 for longitudinal flaws for normal/upset and emergency/faulted conditions, respectively.
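How a load margin enters such a check can be seen in an elementary LEFM screen, K_I = Y * (SF * sigma) * sqrt(pi * a) <= K_Ic. This is an illustrative simplification only; the actual evaluation standards are the tabulated Section XI procedures described above, and the input values are assumed.

```python
import math

def allowable_flaw_depth(k_ic, stress, margin, geometry_factor=1.12):
    """Allowable flaw depth a [m] from an elementary LEFM check:
    Y * (SF * sigma) * sqrt(pi * a) = K_Ic, solved for a.
    k_ic in MPa*sqrt(m), stress in MPa; all numbers illustrative."""
    return (k_ic / (geometry_factor * margin * stress)) ** 2 / math.pi

# the margins on load quoted in the abstract for circumferential flaws
a_normal = allowable_flaw_depth(k_ic=150.0, stress=100.0, margin=2.77)   # normal/upset
a_faulted = allowable_flaw_depth(k_ic=150.0, stress=100.0, margin=1.39)  # emergency/faulted
```

Because the margin multiplies the stress inside a squared term, moving from the 1.39 to the 2.77 margin shrinks the allowable depth by the factor (1.39/2.77)^2, roughly a quarter.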
CACTI: Free, Open-Source Software for the Sequential Coding of Behavioral Interactions
Glynn, Lisa H.; Hallgren, Kevin A.; Houck, Jon M.; Moyers, Theresa B.
2012-01-01
The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery. PMID:22815713
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, T.L.; Simonen, F.A.
1992-05-01
Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.
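The kind of probabilistic fracture calculation performed by codes like OCA-P and VISA-II can be illustrated with a toy Monte Carlo estimate: sample a fracture toughness and an applied stress intensity, and count the fraction of samples where the load exceeds the toughness. The distributions and numbers below are invented for illustration and are not from either code.

```python
import random

def failure_probability(n_samples=100_000, seed=1):
    """Toy Monte Carlo estimate of vessel failure probability.

    A sample 'fails' when the applied stress intensity factor K_I
    exceeds the sampled fracture toughness K_Ic.  Both are drawn from
    illustrative normal distributions (units: MPa*sqrt(m)); a real
    analysis would also model irradiation embrittlement shifting the
    toughness distribution over the vessel's life.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        k_ic = rng.gauss(150.0, 30.0)  # fracture toughness
        k_i = rng.gauss(60.0, 15.0)    # applied stress intensity
        if k_i > k_ic:
            failures += 1
    return failures / n_samples
```

Embrittlement over the vessel life would be modeled by lowering the toughness mean with accumulated fluence, raising the estimated failure probability.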
Spriggs, M J; Sumner, R L; McMillan, R L; Moran, R J; Kirk, I J; Muthukumaraswamy, S D
2018-04-30
The Roving Mismatch Negativity (MMN) and Visual LTP paradigms are widely used as independent measures of sensory plasticity. However, the paradigms are built upon fundamentally different (and seemingly opposing) models of perceptual learning; namely, Predictive Coding (MMN) and Hebbian plasticity (LTP). The aim of the current study was to compare the generative mechanisms of the MMN and visual LTP, thereby assessing whether Predictive Coding and Hebbian mechanisms co-occur in the brain. Forty participants were presented with both paradigms during EEG recording. Consistent with Predictive Coding and Hebbian predictions, Dynamic Causal Modelling revealed that the generation of the MMN modulates forward and backward connections in the underlying network, while visual LTP only modulates forward connections. These results suggest that both Predictive Coding and Hebbian mechanisms are utilized by the brain under different task demands. This therefore indicates that both tasks provide unique insight into plasticity mechanisms, which has important implications for future studies of aberrant plasticity in clinical populations. Copyright © 2018 Elsevier Inc. All rights reserved.
Non-coding variants contribute to the clinical heterogeneity of TTR amyloidosis.
Iorio, Andrea; De Lillo, Antonella; De Angelis, Flavio; Di Girolamo, Marco; Luigetti, Marco; Sabatelli, Mario; Pradotto, Luca; Mauro, Alessandro; Mazzeo, Anna; Stancanelli, Claudia; Perfetto, Federico; Frusconi, Sabrina; My, Filomena; Manfellotto, Dario; Fuciarelli, Maria; Polimanti, Renato
2017-09-01
Coding mutations in TTR gene cause a rare hereditary form of systemic amyloidosis, which has a complex genotype-phenotype correlation. We investigated the role of non-coding variants in regulating TTR gene expression and consequently amyloidosis symptoms. We evaluated the genotype-phenotype correlation considering the clinical information of 129 Italian patients with TTR amyloidosis. Then, we conducted a re-sequencing of TTR gene to investigate how non-coding variants affect TTR expression and, consequently, phenotypic presentation in carriers of amyloidogenic mutations. Polygenic scores for genetically determined TTR expression were constructed using data from our re-sequencing analysis and the GTEx (Genotype-Tissue Expression) project. We confirmed a strong phenotypic heterogeneity across coding mutations causing TTR amyloidosis. Considering the effects of non-coding variants on TTR expression, we identified three patient clusters with specific expression patterns associated with certain phenotypic presentations, including late onset, autonomic neurological involvement, and gastrointestinal symptoms. This study provides novel data regarding the role of non-coding variation and the gene expression profiles in patients affected by TTR amyloidosis, also putting forth an approach that could be used to investigate the mechanisms at the basis of the genotype-phenotype correlation of the disease.
Bao, Yi; Chen, Yizheng; Hoehler, Matthew S.; Smith, Christopher M.; Bundy, Matthew; Chen, Genda
2016-01-01
This paper presents high temperature measurements using a Brillouin scattering-based fiber optic sensor and the application of the measured temperatures and building-code-recommended material parameters to enhanced thermomechanical analysis of simply supported steel beams subjected to combined thermal and mechanical loading. The distributed temperature sensor captures detailed, nonuniform temperature distributions that are compared locally with thermocouple measurements, with less than 4.7% average difference at the 95% confidence level. The simulated strains and deflections are validated using measurements from a second distributed fiber optic (strain) sensor and two linear potentiometers, respectively. The results demonstrate that the temperature-dependent material properties specified in the four investigated building codes lead to strain predictions with less than 13% average error at the 95% confidence level and that the European building code provided the best predictions. However, the implicit consideration of creep in the European code is insufficient when the beam temperature exceeds 800°C. PMID:28239230
Calculation of three-dimensional, inviscid, supersonic, steady flows
NASA Technical Reports Server (NTRS)
Moretti, G.
1981-01-01
A detailed description of a computational program for the evaluation of three-dimensional, supersonic, inviscid, steady flow past airplanes is presented. Emphasis was put on how a powerful, automatic mapping technique is coupled to the fluid mechanical analysis. Each of the three constituents of the analysis (body geometry, mapping technique, and gas dynamical effects) was carefully coded and described. Results of computations based on sample geometries, together with discussions, are also presented.
NASA Astrophysics Data System (ADS)
Dattoli, G.; Migliorati, M.; Schiavi, A.
2007-05-01
Coherent synchrotron radiation (CSR) is one of the main problems limiting the performance of high-intensity electron accelerators. The complexity of the physical mechanisms underlying the onset of instabilities due to CSR demands accurate descriptions, capable of including the large number of features of an actual accelerating device. A code devoted to the analysis of these types of problems should be fast and reliable, conditions that are rarely achieved at the same time. In the past, codes based on Lie algebraic techniques have been very efficient for treating transport problems in accelerators. The extension of these methods to the non-linear case is ideally suited to treat CSR instability problems. We report on the development of a numerical code, based on the solution of the Vlasov equation, with the inclusion of non-linear contributions due to wake field effects. The proposed solution method exploits an algebraic technique that uses exponential operators. We show that the integration procedure is capable of reproducing the onset of instability and the effects associated with bunching mechanisms leading to the growth of the instability itself. In addition, considerations on the threshold of the instability are also developed.
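The exponential-operator technique mentioned in the abstract can be illustrated on a trivial phase-space system: split the generator into two pieces whose exponential maps are exactly integrable and apply them in sequence (Lie-Trotter splitting). The toy harmonic oscillator below stands in for the full Vlasov solver and is purely illustrative.

```python
def split_step(x, p, dt):
    """One Lie-Trotter splitting step for the system x' = p, p' = -x.

    The full flow exp(dt*(A+B)) is approximated by applying the exact
    'drift' map exp(dt*A) (position update) followed by the exact
    'kick' map exp(dt*B) (momentum update).  Using the updated x in
    the kick makes this the symplectic Euler scheme, so the orbit
    stays bounded over long times instead of spiraling outward.
    """
    x = x + dt * p  # drift: position advanced by the free-streaming map
    p = p - dt * x  # kick: momentum advanced by the force map
    return x, p
```

The same idea, with more elaborate (non-linear, wake-field-dependent) factors, underlies the algebraic integration schemes described above.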
Advanced Pellet-Cladding Interaction Modeling using the US DOE CASL Fuel Performance Code: Peregrine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montgomery, Robert O.; Capps, Nathan A.; Sunderland, Dion J.
The US DOE’s Consortium for Advanced Simulation of LWRs (CASL) program has undertaken an effort to enhance and develop modeling and simulation tools for a virtual reactor application, including high fidelity neutronics, fluid flow/thermal hydraulics, and fuel and material behavior. The fuel performance analysis efforts aim to provide 3-dimensional capabilities for single and multiple rods to assess safety margins and the impact of plant operation and fuel rod design on the fuel thermo-mechanical-chemical behavior, including Pellet-Cladding Interaction (PCI) failures and CRUD-Induced Localized Corrosion (CILC) failures in PWRs. [1-3] The CASL fuel performance code, Peregrine, is an engineering scale code that is built upon the MOOSE/ELK/FOX computational FEM framework, which is also common to the fuel modeling framework, BISON [4,5]. Peregrine uses both 2-D and 3-D geometric fuel rod representations and contains a materials properties and fuel behavior model library for the UO2 and Zircaloy system common to PWR fuel, derived from both open literature sources and the FALCON code [6]. The primary purpose of Peregrine is to accurately calculate the thermal, mechanical, and chemical processes active throughout a single fuel rod during operation in a reactor, for both steady state and off-normal conditions.
Data engineering systems: Computerized modeling and data bank capabilities for engineering analysis
NASA Technical Reports Server (NTRS)
Kopp, H.; Trettau, R.; Zolotar, B.
1984-01-01
The Data Engineering System (DES) is a computer-based system that organizes technical data and provides automated mechanisms for storage, retrieval, and engineering analysis. The DES combines the benefits of a structured data base system with automated links to large-scale analysis codes. While the DES provides the user with many of the capabilities of a computer-aided design (CAD) system, the systems are actually quite different in several respects. A typical CAD system emphasizes interactive graphics capabilities and organizes data in a manner that optimizes these graphics. On the other hand, the DES is a computer-aided engineering system intended for the engineer who must operationally understand an existing or planned design or who desires to carry out additional technical analysis based on a particular design. The DES emphasizes data retrieval in a form that not only provides the engineer access to search and display the data but also links the data automatically with the computer analysis codes.
Extrusion Process by Finite Volume Method Using OpenFoam Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matos Martins, Marcelo; Tonini Button, Sergio; Divo Bressan, Jose
Computational codes are very important tools for solving engineering problems. In the analysis of metal forming processes such as extrusion this is no different, because computational codes allow the process to be analyzed at reduced cost. Traditionally, the Finite Element Method is used to solve solid mechanics problems; however, the Finite Volume Method (FVM) has been gaining ground in this field of application. This paper presents the velocity field and friction coefficient variation results, obtained by numerical simulation using the OpenFoam software and the FVM, for an aluminum direct cold extrusion process.
NASA Technical Reports Server (NTRS)
Veres, Joseph P.; Jorgenson, Philip C. E.; Wright, William B.
2011-01-01
The focus of this study is on utilizing a mean line compressor flow analysis code coupled to an engine system thermodynamic code, to estimate the effects of ice accretion on the low pressure compressor, and quantifying its effects on the engine system throughout a notional flight trajectory. In this paper a temperature range in which engine icing would occur was assumed. This provided a mechanism to locate potential component icing sites and allow the computational tools to add blockages due to ice accretion in a parametric fashion. Ultimately the location and level of blockage due to icing would be provided by an ice accretion code. To proceed, an engine system modeling code and a mean line compressor flow analysis code were utilized to calculate the flow conditions in the fan-core and low pressure compressor and to identify potential locations within the compressor where ice may accrete. In this study, an "additional blockage" due to the accretion of ice on the metal surfaces, has been added to the baseline aerodynamic blockage due to boundary layer, as well as the blade metal blockage. Once the potential locations of ice accretion are identified, the levels of additional blockage due to accretion were parametrically varied to estimate the effects on the low pressure compressor blade row performance operating within the engine system environment. This study includes detailed analysis of compressor and engine performance during cruise and descent operating conditions at several altitudes within the notional flight trajectory. The purpose of this effort is to develop the computer codes to provide a predictive capability to forecast the onset of engine icing events, such that they could ultimately help in the avoidance of these events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bass, B.R.; Bryan, R.H.; Bryson, J.W.
This paper summarizes the capabilities and applications of the general-purpose and special-purpose computer programs that have been developed for use in fracture mechanics analyses of HSST pressure vessel experiments. Emphasis is placed on the OCA/USA code, which is designed for analysis of pressurized-thermal-shock (PTS) conditions, and on the ORMGEN/ADINA/ORVIRT system which is used for more general analysis. Fundamental features of these programs are discussed, along with applications to pressure vessel experiments.
NASA Astrophysics Data System (ADS)
Watanabe, Junpei; Ishikawa, Hiroaki; Arouette, Xavier; Matsumoto, Yasuaki; Miki, Norihisa
2012-06-01
In this paper, we present a vibrational Braille code display with large-displacement micro-electro-mechanical systems (MEMS) actuator arrays. Tactile receptors are more sensitive to vibrational stimuli than to static ones. Therefore, when each cell of the Braille code vibrates at optimal frequencies, subjects can recognize the codes more efficiently. We fabricated a vibrational Braille code display that used actuators consisting of piezoelectric actuators and a hydraulic displacement amplification mechanism (HDAM) as cells. The HDAM that encapsulated incompressible liquids in microchambers with two flexible polymer membranes could amplify the displacement of the MEMS actuator. We investigated the voltage required for subjects to recognize Braille codes when each cell, i.e., the large-displacement MEMS actuator, vibrated at various frequencies. Lower voltages were required at vibration frequencies higher than 50 Hz than at vibration frequencies lower than 50 Hz, which verified that the proposed vibrational Braille code display is efficient by successfully exploiting the characteristics of human tactile receptors.
Stop Codon Reassignment in the Wild
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ivanova, Natalia; Schwientek, Patrick; Tripp, H. James
Since the discovery of the genetic code and protein translation mechanisms (1), a limited number of variations of the standard assignment between unique base triplets (codons) and their encoded amino acids and translational stop signals have been found in bacteria and phages (2-3). Given the apparent ubiquity of the canonical genetic code, the design of genomically recoded organisms with non-canonical codes has been suggested as a means to prevent horizontal gene transfer between laboratory and environmental organisms (4). It is also predicted that genomically recoded organisms are immune to infection by viruses, under the assumption that phages and their hosts must share a common genetic code (5). This paradigm is supported by the observation of increased resistance of genomically recoded bacteria to phages with a canonical code (4). Despite these assumptions and accompanying lines of evidence, it remains unclear whether differential and non-canonical codon usage represents an absolute barrier to phage infection and genetic exchange between organisms. Our knowledge of the diversity of genetic codes and their use by viruses and their hosts is primarily derived from the analysis of cultivated organisms. Advances in single-cell sequencing and metagenome assembly technologies have enabled the reconstruction of genomes of uncultivated bacterial and archaeal lineages (6). These initial findings suggest that large scale systematic studies of uncultivated microorganisms and viruses may reveal the extent and modes of divergence from the canonical genetic code operating in nature. To explore alternative genetic codes, we carried out a systematic analysis of stop codon reassignments from the canonical TAG amber, TGA opal, and TAA ochre codons in assembled metagenomes from environmental and host-associated samples, single-cell genomes of uncultivated bacteria and archaea, and a collection of phage sequences.
Understanding LiP Promoters from Phanerochaete chrysosporium: A Bioinformatic Analysis
Sergio Lobos; Rubén Polanco; Mario Tello; Dan Cullen; Daniela Seelenfreund; Rafael Vicuña
2011-01-01
DNA contains the coding information for the entire set of proteins produced by an organism. The specific combination of proteins synthesized varies with developmental, metabolic and environmental circumstances. This variation is generated by regulatory mechanisms that direct the production of messenger ribonucleic acid (mRNA) and subsequent translation of the...
High temperature composite analyzer (HITCAN) user's manual, version 1.0
NASA Technical Reports Server (NTRS)
Lackney, J. J.; Singhal, S. N.; Murthy, P. L. N.; Gotsis, P.
1993-01-01
This manual describes how to use the computer code HITCAN (HIgh Temperature Composite ANalyzer). HITCAN is a general purpose computer program for predicting nonlinear global structural and local stress-strain response of arbitrarily oriented, multilayered high temperature metal matrix composite structures. This code combines composite mechanics and laminate theory with an internal data base for material properties of the constituents (matrix, fiber, and interphase). The thermo-mechanical properties of the constituents are considered to be nonlinearly dependent on several parameters including temperature, stress, and stress rate. The computation procedure for the analysis of the composite structures uses the finite element method. HITCAN is written in the FORTRAN 77 computer language and at present has been configured and executed on the NASA Lewis Research Center CRAY XMP and YMP computers. This manual describes HITCAN's capabilities and limitations, followed by input/execution/output descriptions and example problems. The input is described in detail including (1) geometry modeling, (2) types of finite elements, (3) types of analysis, (4) material data, (5) types of loading, (6) boundary conditions, (7) output control, (8) program options, and (9) data bank.
Yang, Yiwei; Xu, Yuejin; Miu, Jichang; Zhou, Linghong; Xiao, Zhongju
2012-10-01
We applied the classic leaky integrate-and-fire model, based on the mechanism by which physiological auditory stimulation is generated, to the information coding of cochlear implants to improve auditory performance. The results of algorithm simulation in a digital signal processor (DSP) were imported into Matlab for comparative analysis. Compared with CIS coding, the membrane potential integrate-and-fire (MPIF) algorithm allowed more natural pulse discharge in a pseudo-random manner that better fits the physiological structures. The MPIF algorithm can effectively solve the problem of the dynamic structure of the delivered auditory information sequence issued in the auditory center, and allows integration of the stimulating pulses and time coding to ensure the coherence and relevance of the stimulating pulse timing.
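A minimal leaky integrate-and-fire membrane, sketched below, illustrates the class of model the MPIF algorithm builds on: the membrane potential leaks toward rest, integrates the input current, and emits a spike with reset on crossing threshold. The parameter values are arbitrary illustrative choices, and this sketch is not the authors' MPIF implementation.

```python
def lif_spike_times(input_current, dt=0.001, tau=0.02,
                    v_rest=0.0, v_threshold=1.0, v_reset=0.0):
    """Integrate dv/dt = (-(v - v_rest) + I(t)) / tau with forward
    Euler steps of size dt; whenever v crosses v_threshold, record a
    spike time and reset the membrane potential."""
    v = v_rest
    spike_times = []
    for step, current in enumerate(input_current):
        v += (dt / tau) * (-(v - v_rest) + current)
        if v >= v_threshold:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times
```

Driving the model with a suprathreshold constant current yields a regular spike train; a subthreshold current produces none, since the leak balances the input below threshold.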
Applications of Some New Ideas on Irreversible Processes to Particular Fluids.
1987-09-23
Truesdell, Clifford A. (Professor, Program in Rational Mechanics, Department of Rational Mechanics, Johns Hopkins University, Baltimore, MD 21218); sponsoring office at Bolling AFB, DC 20332.
Jet-A fuel evaporation analysis in conical tube injectors
NASA Technical Reports Server (NTRS)
Lai, M.-C.; Chue, T.-H.; Zhu, G.; Sun, H.; Tacina, R.; Chun, K.; Hicks, Y.
1991-01-01
A simple one-dimensional drop-life-history analysis and a multidimensional spray calculation using the KIVA-II code are applied to the vaporization of Jet-A fuel in multiple-tube injectors. Within the assumptions of the analysis, the one-dimensional results are useful for design purposes. The pressure-atomizer breakup models do not accurately predict the drop size measured experimentally or deduced from the one-dimensional analysis. Cold-flow visualization and drop size measurements show that a capillary-wave breakup mechanism plays an important role in the spray angle and droplet impingement on the tube wall.
Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures
NASA Technical Reports Server (NTRS)
Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.
1997-01-01
A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
Methods for simulation-based analysis of fluid-structure interaction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barone, Matthew Franklin; Payne, Jeffrey L.
2005-10-01
Methods for analysis of fluid-structure interaction using high fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations point to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extension of the methodology to allow construction of ROMs based on data generated from ALE simulations.
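The POD step of the POD/Galerkin approach can be sketched with a plain SVD of a snapshot matrix; the energy-based truncation criterion and the names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pod_basis(snapshots, energy_fraction=0.99):
    """Columns of `snapshots` are state snapshots (space x time).
    Return the leading left singular vectors that capture the
    requested fraction of snapshot 'energy' (cumulative sum of
    squared singular values)."""
    u, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    rank = int(np.searchsorted(cumulative, energy_fraction)) + 1
    return u[:, :rank]
```

A Galerkin ROM then projects the governing equations onto these few modes, replacing the full-order solve with a small dense system during parametric studies.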
Simulation of Laser Cooling and Trapping in Engineering Applications
NASA Technical Reports Server (NTRS)
Ramirez-Serrano, Jaime; Kohel, James; Thompson, Robert; Yu, Nan; Lunblad, Nathan
2005-01-01
An advanced computer code is undergoing development for numerically simulating laser cooling and trapping of large numbers of atoms. The code is expected to be useful in practical engineering applications and to contribute to understanding of the roles that light, atomic collisions, background pressure, and numbers of particles play in experiments using laser-cooled and -trapped atoms. The code is based on semiclassical theories of the forces exerted on atoms by magnetic and optical fields. Whereas computer codes developed previously for the same purpose account for only a few physical mechanisms, this code incorporates many more physical mechanisms (including atomic collisions, sub-Doppler cooling mechanisms, Stark and Zeeman energy shifts, gravitation, and evanescent-wave phenomena) that affect laser-matter interactions and the cooling of atoms to submillikelvin temperatures. Moreover, whereas the prior codes can simulate the interactions of at most a few atoms with a resonant light field, the number of atoms that can be included in a simulation by the present code is limited only by computer memory. Hence, the present code more completely represents the complex physics involved when using laser-cooled and -trapped atoms in engineering applications. Another advantage of the code is the ability to analyze the interaction between cold atoms of different atomic number. Properties of cold atoms of different atomic species, such as their collision cross sections and the particular excited states they can occupy when interacting with each other and with light fields, play important roles, not yet completely understood, in the new experiments under way in laboratories worldwide to form ultracold molecules. Other research efforts use cold atoms as holders of quantum information, and more recent developments in cavity quantum electrodynamics also use ultracold atoms to explore and expand new information-technology ideas.
These experiments hint at the wide range of applications and technology developments that can be tackled using cold atoms and light fields. From more precise atomic clocks and gravity sensors to the development of quantum computers, there will be a need to completely understand the whole ensemble of physical mechanisms at play in the development of such technologies. The code also permits the study of the dynamic and steady-state operation of technologies that use cold atoms. The physical characteristics of lasers and fields can be time-controlled to give a realistic simulation of the processes involved, such that the design process can determine the best control features to use. It is expected that, with the features incorporated into the code, it will become a tool for the useful application of ultracold atoms in engineering. Currently, the software is being used for the analysis and understanding of simple experiments using cold atoms, and for the design of a modular, compact source of cold atoms to be used in future research and development projects. The results so far indicate that the code is a useful design instrument that shows good agreement with experimental measurements (see figure), and a Windows-based user-friendly interface is also under development.
Advanced multiphysics coupling for LWR fuel performance analysis
Hales, J. D.; Tonks, M. R.; Gleicher, F. N.; ...
2015-10-01
Even the most basic nuclear fuel analysis is a multiphysics undertaking, as a credible simulation must consider at a minimum coupled heat conduction and mechanical deformation. The need for more realistic fuel modeling under a variety of conditions invariably leads to a desire to include coupling between a more complete set of the physical phenomena influencing fuel behavior, including neutronics, thermal hydraulics, and mechanisms occurring at lower length scales. This paper covers current efforts toward coupled multiphysics LWR fuel modeling in three main areas. The first area covered in this paper concerns thermomechanical coupling. The interaction of these two physics, particularly related to the feedback effect associated with heat transfer and mechanical contact at the fuel/clad gap, provides numerous computational challenges. An outline is provided of an effective approach used to manage the nonlinearities associated with an evolving gap in BISON, a nuclear fuel performance application. A second type of multiphysics coupling described here is that of coupling neutronics with thermomechanical LWR fuel performance. DeCART, a high-fidelity core analysis program based on the method of characteristics, has been coupled to BISON. DeCART provides sub-pin level resolution of the multigroup neutron flux, with resonance treatment, during a depletion or a fast transient simulation. Two-way coupling between these codes was achieved by mapping fission rate density and fast neutron flux fields from DeCART to BISON and the temperature field from BISON to DeCART while employing a Picard iterative algorithm. Finally, the need for multiscale coupling is considered. Fission gas production and evolution significantly impact fuel performance by causing swelling, a reduction in the thermal conductivity, and fission gas release. The mechanisms involved occur at the atomistic and grain scale and are therefore not the domain of a fuel performance code.
However, it is possible to use lower length scale models such as those used in the mesoscale MARMOT code to compute average properties, e.g. swelling or thermal conductivity. These may then be used by an engineering-scale model. Examples of this type of multiscale, multiphysics modeling are shown.
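The Picard iterative coupling described above (fields exchanged between DeCART and BISON until convergence) can be sketched generically as a fixed-point iteration. The two solver callables below are stand-ins for the neutronics and thermomechanics codes, and the scalar "field" is a deliberate simplification of the mapped spatial fields.

```python
def picard_couple(solve_neutronics, solve_thermomech, temperature0,
                  tol=1e-8, max_iters=100):
    """Alternate the two single-physics solves, feeding each code's
    output to the other, until the exchanged temperature field stops
    changing between successive iterations."""
    temperature = temperature0
    for iteration in range(1, max_iters + 1):
        power = solve_neutronics(temperature)  # T -> fission power
        updated = solve_thermomech(power)      # power -> T
        if abs(updated - temperature) < tol:
            return updated, iteration
        temperature = updated
    raise RuntimeError("Picard iteration failed to converge")
```

Convergence requires the composed update to be contractive; strong feedback (e.g. a rapidly closing fuel/clad gap) can slow or stall plain Picard iteration, which is why relaxation or tighter nonlinear solvers are often layered on top.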
Benchmark of FDNS CFD Code For Direct Connect RBCC Test Data
NASA Technical Reports Server (NTRS)
Ruf, J. H.
2000-01-01
Computational Fluid Dynamics (CFD) analysis results are compared with experimental data from the Pennsylvania State University's (PSU) Propulsion Engineering Research Center (PERC) rocket based combined cycle (RBCC) rocket-ejector experiments. The PERC RBCC experimental hardware was in a direct-connect configuration in diffusion and afterburning (DAB) operation. The objective of the present work was to validate the Finite Difference Navier Stokes (FDNS) CFD code for the rocket-ejector mode internal fluid mechanics and combustion phenomena. A second objective was to determine the best application procedures for using FDNS as a predictive/engineering tool. Three-dimensional CFD analysis was performed. Solution methodology and grid requirements are discussed. CFD results are compared to experimental data for static pressure, Raman spectroscopy species distribution data, and RBCC net thrust and specific impulse.
Reardon, Joseph M; Harmon, Katherine J; Schult, Genevieve C; Staton, Catherine A; Waller, Anna E
2016-02-08
Although fatal opioid poisonings tripled from 1999 to 2008, data describing nonfatal poisonings are rare. Public health authorities are in need of tools to track opioid poisonings in near real time. We determined the utility of ICD-9-CM diagnosis codes for identifying clinically significant opioid poisonings in a state-wide emergency department (ED) surveillance system. We sampled visits from four hospitals from July 2009 to June 2012 with diagnosis codes of 965.00, 965.01, 965.02 and 965.09 (poisoning by opiates and related narcotics) and/or an external cause of injury code of E850.0-E850.2 (accidental poisoning by opiates and related narcotics), and developed a novel case definition to determine in which cases opioid poisoning prompted the ED visit. We calculated the percentage of visits coded for opioid poisoning that were clinically significant and compared it to the percentage of visits coded for poisoning by non-opioid agents in which there was actually poisoning by an opioid agent. We created a multivariate regression model to determine if other collected triage data can improve the positive predictive value of diagnosis codes alone for detecting clinically significant opioid poisoning. 70.1 % of visits (Standard Error 2.4 %) coded for opioid poisoning were primarily prompted by opioid poisoning. The remainder of visits represented opioid exposure in the setting of other primary diseases. Among non-opioid poisoning codes reviewed, up to 36 % were reclassified as an opioid poisoning. In multivariate analysis, only naloxone use improved the positive predictive value of ICD-9-CM codes for identifying clinically significant opioid poisoning, but was associated with a high false negative rate. This surveillance mechanism identifies many clinically significant opioid overdoses with a high positive predictive value. With further validation, it may help target control measures such as prescriber education and pharmacy monitoring.
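The headline figure above is a positive predictive value: of the visits flagged by the ICD-9-CM code set, the fraction confirmed as clinically significant poisonings on review. A minimal sketch of that arithmetic follows; the counts are illustrative stand-ins, not the study's data.

```python
# Positive predictive value of a surveillance code set:
# PPV = confirmed true positives / all records the codes flagged.

def positive_predictive_value(true_positives, flagged_total):
    if flagged_total == 0:
        raise ValueError("no flagged records")
    return true_positives / flagged_total

# e.g. (hypothetical) 701 of 1000 code-flagged ED visits confirmed
# as clinically significant opioid poisonings on chart review
ppv = positive_predictive_value(701, 1000)  # -> 0.701, i.e. 70.1 %
```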
40 CFR 62.15265 - How do I monitor the load of my municipal waste combustion unit?
Code of Federal Regulations, 2013 CFR
2013-07-01
... Mechanical Engineers (ASME PTC 4.1—1964): Test Code for Steam Generating Units, Power Test Code 4.1-1964... of Mechanical Engineers, Service Center, 22 Law Drive, Post Office Box 2900, Fairfield, NJ 07007. You....archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html. (4) Design, construct...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillon, Heather E.; Antonopoulos, Chrissi A.; Solana, Amy E.
As the model energy codes are improved to reach efficiency levels 50 percent greater than current codes, use of on-site renewable energy generation is likely to become a code requirement. This requirement will be needed because traditional mechanisms for code improvement, including envelope, mechanical and lighting, have been pressed to the end of reasonable limits. Research has been conducted to determine the mechanism for implementing this requirement (Kaufman 2011). Kaufmann et al. determined that the most appropriate way to structure an on-site renewable requirement for commercial buildings is to define the requirement in terms of an installed power density per unit of roof area. This provides a mechanism that is suitable for the installation of photovoltaic (PV) systems on future buildings to offset electricity and reduce the total building energy load. Kaufmann et al. suggested that an appropriate maximum for the requirement in the commercial sector would be 4 W/ft² of roof area or 0.5 W/ft² of conditioned floor area. As with all code requirements, there must be an alternative compliance path for buildings that may not reasonably meet the renewables requirement. This might include conditions like shading (which makes rooftop PV arrays less effective), unusual architecture, undesirable roof pitch, unsuitable building orientation, or other issues. In the short term, alternative compliance paths including high performance mechanical equipment, dramatic envelope changes, or controls changes may be feasible. These options may be less expensive than many renewable systems, which will require careful balance of energy measures when setting the code requirement levels. As the stringency of the code continues to increase, however, efficiency trade-offs will be maximized, requiring alternative compliance options to be focused solely on renewable electricity trade-offs or equivalent programs.
One alternate compliance path includes purchase of Renewable Energy Credits (RECs). Each REC represents a specified amount of renewable electricity production and provides an offset of environmental externalities associated with non-renewable electricity production. The purpose of this paper is to explore the possible issues with RECs and comparable alternative compliance options. Existing codes have been examined to determine energy equivalence between the energy generation requirement and the RECs alternative over the life of the building. The price equivalence of the requirement and the alternative is determined to consider the economic drivers for a market decision. This research includes case studies that review how the few existing codes have incorporated RECs and some of the issues inherent in REC markets. Section 1 of the report reviews compliance options including RECs, green energy purchase programs, shared solar agreements and leases, and other options. Section 2 provides detailed case studies on codes that include RECs and community-based alternative compliance methods. The methods by which the existing code requirements structure alternative compliance options like RECs are the focus of the case studies. Section 3 explores the possible structure of the renewable energy generation requirement in the context of energy and price equivalence. The price of RECs has shown high variation by market and over time, which makes it critical for code language implementing a renewable energy generation requirement to be updated frequently, or the requirement will not remain price-equivalent over time. Section 4 of the report provides a maximum case estimate for impact to the PV market and the REC market based on the Kaufmann et al. proposed requirement levels.
If all new buildings in the commercial sector complied with the requirement to install rooftop PV arrays, nearly 4,700 MW of solar would be installed in 2012, a major increase from EIA estimates of 640 MW of solar generation capacity installed in 2009. The residential sector could contribute roughly an additional 2,300 MW based on the same code requirement levels of 4 W/ft² of roof area. Section 5 of the report provides a basic framework for draft code language recommendations based on the analysis of the alternative compliance levels.
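The capacity estimates above follow directly from the proposed power-density requirement. A back-of-envelope sketch of that scaling, with a purely illustrative roof-area input:

```python
# Installed-PV capacity implied by a W/ft^2-of-roof-area code requirement.
W_PER_SQFT_ROOF = 4.0  # proposed commercial-sector maximum (Kaufmann et al.)

def required_pv_mw(roof_area_sqft):
    """Convert new roof area (ft^2) to required PV capacity (MW)."""
    return roof_area_sqft * W_PER_SQFT_ROOF / 1.0e6

# Illustrative: 1 billion ft^2 of new commercial roof area
# -> 4,000 MW of required rooftop PV
mw = required_pv_mw(1.0e9)
```

Run backwards, the same relation implies the report's ~4,700 MW figure corresponds to on the order of 1.2 billion ft² of compliant new roof area.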
Snauwaert, Isabel; Stragier, Pieter; De Vuyst, Luc; Vandamme, Peter
2015-04-03
Pediococcus damnosus LMG 28219 is a lactic acid bacterium dominating the maturation phase of Flemish acid beer productions. It proved to be capable of growing in beer, thereby resisting this environment, which is unfavorable for microbial growth. The molecular mechanisms underlying its metabolic capabilities and niche adaptations were unknown up to now. In the present study, whole-genome sequencing and comparative genome analysis were used to investigate this strain's mechanisms to reside in the beer niche, with special focus on not only stress and hop resistances but also folate biosynthesis and exopolysaccharide (EPS) production. The draft genome sequence of P. damnosus LMG 28219 harbored 183 contigs, including an intact prophage region and several coding sequences involved in plasmid replication. The annotation of 2178 coding sequences revealed the presence of many transporters and transcriptional regulators and several genes involved in oxidative stress response, hop resistance, de novo folate biosynthesis, and EPS production. Comparative genome analysis of P. damnosus LMG 28219 with Pediococcus claussenii ATCC BAA-344(T) (beer origin) and Pediococcus pentosaceus ATCC 25745 (plant origin) revealed that various hop resistance genes and genes involved in de novo folate biosynthesis were unique to the strains isolated from beer. This contrasted with the genes related to osmotic stress responses, which were shared between the strains compared. Furthermore, transcriptional regulators were enriched in the genomes of bacteria capable of growth in beer, suggesting that those cause rapid up- or down-regulation of gene expression. Genome sequence analysis of P. damnosus LMG 28219 provided insights into the underlying mechanisms of its adaptation to the beer niche. The results presented will enable analysis of the transcriptome and proteome of P. damnosus LMG 28219, which will result in additional knowledge on its metabolic activities.
NASA Technical Reports Server (NTRS)
1994-01-01
General Purpose Boundary Element Solution Technology (GPBEST) software employs the boundary element method of mechanical engineering analysis, as opposed to finite element. It is, according to one of its developers, 10 times faster in data preparation and more accurate than other methods. Its use results in less expensive products because the time between design and manufacturing is shortened. A commercial derivative of a NASA-developed computer code, it is marketed by Best Corporation to solve problems in stress analysis, heat transfer, fluid analysis and yielding and cracking of solids. Other applications include designing tractor and auto parts, household appliances and acoustic analysis.
The solution of linear systems of equations with a structural analysis code on the NAS CRAY-2
NASA Technical Reports Server (NTRS)
Poole, Eugene L.; Overman, Andrea L.
1988-01-01
Two methods for solving linear systems of equations on the NAS Cray-2 are described. One is a direct method; the other is an iterative method. Both methods exploit the architecture of the Cray-2, particularly the vectorization, and are aimed at structural analysis applications. To demonstrate and evaluate the methods, they were installed in a finite element structural analysis code denoted the Computational Structural Mechanics (CSM) Testbed. A description of the techniques used to integrate the two solvers into the Testbed is given. Storage schemes, memory requirements, operation counts, and reformatting procedures are discussed. Finally, results from the new methods are compared with results from the initial Testbed sparse Choleski equation solver for three structural analysis problems. The new direct solvers described achieve the highest computational rates of the methods compared. The new iterative methods are not able to achieve computation rates as high as the vectorized direct solvers, but are best for well-conditioned problems, which require fewer iterations to converge to the solution.
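The direct-versus-iterative contrast drawn above can be sketched generically: a dense Cholesky factorization solve next to a plain conjugate-gradient loop on the same small symmetric positive-definite system. This is a textbook sketch, not the Testbed's actual sparse/skyline storage or vectorized kernels.

```python
import numpy as np

def solve_cholesky(A, b):
    """Direct method: factor A = L L^T, then two triangular solves."""
    L = np.linalg.cholesky(A)
    y = np.linalg.solve(L, b)       # forward substitution
    return np.linalg.solve(L.T, y)  # back substitution

def solve_cg(A, b, tol=1e-10, max_iter=1000):
    """Iterative method: conjugate gradients for SPD systems.
    Well-conditioned problems need few iterations, as the abstract notes."""
    x = np.zeros_like(b)
    r = b - A @ x                   # residual
    p = r.copy()                    # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # small SPD test matrix
b = np.array([1.0, 2.0])
x_direct = solve_cholesky(A, b)
x_iter = solve_cg(A, b)
```

Both routes reach the same solution; the trade-off the abstract describes is about cost per step versus number of steps on large sparse systems.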
NASA Technical Reports Server (NTRS)
McManus, Hugh L.; Chamis, Christos C.
1996-01-01
This report describes analytical methods for calculating stresses and damage caused by degradation of the matrix constituent in polymer matrix composite materials. Laminate geometry, material properties, and matrix degradation states are specified as functions of position and time. Matrix shrinkage and property changes are modeled as functions of the degradation states. The model is incorporated into an existing composite mechanics computer code. Stresses, strains, and deformations at the laminate, ply, and micro levels are calculated, and from these calculations it is determined if there is failure of any kind. The rationale for the model (based on published experimental work) is presented, its integration into the laminate analysis code is outlined, and example results are given, with comparisons to existing material and structural data. The mechanisms behind the changes in properties and in surface cracking during long-term aging of polyimide matrix composites are clarified. High-temperature-material test methods are also evaluated.
LSENS - GENERAL CHEMICAL KINETICS AND SENSITIVITY ANALYSIS CODE
NASA Technical Reports Server (NTRS)
Bittker, D. A.
1994-01-01
LSENS has been developed for solving complex, homogeneous, gas-phase, chemical kinetics problems. The motivation for the development of this program is the continuing interest in developing detailed chemical reaction mechanisms for complex reactions such as the combustion of fuels and pollutant formation and destruction. A reaction mechanism is the set of all elementary chemical reactions that are required to describe the process of interest. Mathematical descriptions of chemical kinetics problems constitute sets of coupled, nonlinear, first-order ordinary differential equations (ODEs). The number of ODEs can be very large because of the numerous chemical species involved in the reaction mechanism. Further complicating the situation are the many simultaneous reactions needed to describe the chemical kinetics of practical fuels. For example, the mechanism describing the oxidation of the simplest hydrocarbon fuel, methane, involves over 25 species participating in nearly 100 elementary reaction steps. Validating a chemical reaction mechanism requires repetitive solutions of the governing ODEs for a variety of reaction conditions. Analytical solutions to the systems of ODEs describing chemistry are not possible, except for the simplest cases, which are of little or no practical value. Consequently, there is a need for fast and reliable numerical solution techniques for chemical kinetics problems. In addition to solving the ODEs describing chemical kinetics, it is often necessary to know what effects variations in either initial condition values or chemical reaction mechanism parameters have on the solution. Such a need arises in the development of reaction mechanisms from experimental data. The rate coefficients are often not known with great precision and in general, the experimental data are not sufficiently detailed to accurately estimate the rate coefficient parameters. 
The development of a reaction mechanism is facilitated by a systematic sensitivity analysis which provides the relationships between the predictions of a kinetics model and the input parameters of the problem. LSENS provides for efficient and accurate chemical kinetics computations and includes sensitivity analysis for a variety of problems, including nonisothermal conditions. LSENS replaces the previous NASA general chemical kinetics codes GCKP and GCKP84. LSENS is designed for flexibility, convenience and computational efficiency. A variety of chemical reaction models can be considered. The models include static system, steady one-dimensional inviscid flow, reaction behind an incident shock wave including boundary layer correction, and the perfectly stirred (highly backmixed) reactor. In addition, computations of equilibrium properties can be performed for the following assigned states: enthalpy and pressure, temperature and pressure, internal energy and volume, and temperature and volume. For static problems LSENS computes sensitivity coefficients with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of each chemical reaction. To integrate the ODEs describing chemical kinetics problems, LSENS uses the packaged code LSODE, the Livermore Solver for Ordinary Differential Equations, because it has been shown to be the most efficient and accurate code for solving such problems. The sensitivity analysis computations use the decoupled direct method, as implemented by Dunker and modified by Radhakrishnan. This method has shown greater efficiency and stability with equal or better accuracy than other methods of sensitivity analysis. LSENS is written in FORTRAN 77 with the exception of the NAMELIST extensions used for input. While this makes the code fairly machine independent, execution times on IBM PC compatibles would be unacceptable to most users.
LSENS has been successfully implemented on a Sun4 running SunOS and a DEC VAX running VMS. With minor modifications, it should also be easily implemented on other platforms with FORTRAN compilers which support NAMELIST input. LSENS required 4Mb of RAM under SunOS 4.1.1 and 3.4Mb of RAM under VMS 5.5.1. The standard distribution medium for LSENS is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. It is also available on a 1600 BPI 9-track magnetic tape or a TK50 tape cartridge in DEC VAX BACKUP format. Alternate distribution media and formats are available upon request. LSENS was developed in 1992.
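The core numerical task LSENS performs, integrating coupled, often stiff kinetics ODEs, can be illustrated with a toy two-step mechanism A → B → C solved with SciPy's LSODA method, a modern descendant of the LSODE family named above. The rate constants are illustrative, and this is a generic sketch, not LSENS itself.

```python
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 5.0, 1.0  # illustrative rate coefficients (1/s)

def rhs(t, y):
    """Species rate equations for the mechanism A -> B -> C."""
    A, B, C = y
    return [-k1 * A,          # d[A]/dt: A consumed
            k1 * A - k2 * B,  # d[B]/dt: B formed from A, consumed to C
            k2 * B]           # d[C]/dt: C accumulates

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0],
                method="LSODA", rtol=1e-8, atol=1e-12)
final = sol.y[:, -1]  # by t = 10 s nearly all mass has reached C
```

Real combustion mechanisms simply scale this up: dozens of species and ~100 elementary steps, as the methane example in the abstract indicates, which is why stiff, variable-order integrators are essential.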
Thermomechanical analysis of fast-burst reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, J.D.
1994-08-01
Fast-burst reactors are designed to provide intense, short-duration pulses of neutrons. The fission reaction also produces extreme time-dependent heating of the nuclear fuel. An existing transient-dynamic finite element code was modified specifically to compute the time-dependent stresses and displacements due to thermal shock loads of reactors. Thermomechanical analysis was then applied to determine structural feasibility of various concepts for an EDNA-type reactor and to optimize the mechanical design of the new SPR III-M reactor.
Composite blade structural analyzer (COBSTRAN) user's manual
NASA Technical Reports Server (NTRS)
Aiello, Robert A.
1989-01-01
The installation and use of a computer code, COBSTRAN (COmposite Blade STRuctural ANalyzer), developed for the design and analysis of composite turbofan and turboprop blades and also for composite wind turbine blades, are described. This code combines composite mechanics and laminate theory with an internal data base of fiber and matrix properties. Inputs to the code are constituent fiber and matrix material properties, factors reflecting the fabrication process, composite geometry and blade geometry. COBSTRAN performs the micromechanics, macromechanics and laminate analyses of these fiber composites. COBSTRAN generates a NASTRAN model with equivalent anisotropic homogeneous material properties. Stress output from NASTRAN is used to calculate individual ply stresses, strains, interply stresses, thru-the-thickness stresses and failure margins. Curved panel structures may be modeled provided the curvature of a cross-section is defined by a single-value function. COBSTRAN is written in FORTRAN 77.
Entracking as a Brain Stem Code for Pitch: The Butte Hypothesis.
Joris, Philip X
2016-01-01
The basic nature of pitch is much debated. A robust code for pitch exists in the auditory nerve in the form of an across-fiber pooled interspike interval (ISI) distribution, which resembles the stimulus autocorrelation. An unsolved question is how this representation can be "read out" by the brain. A new view is proposed in which a known brain-stem property plays a key role in the coding of periodicity, which I refer to as "entracking", a contraction of "entrained phase-locking". It is proposed that a scalar rather than vector code of periodicity exists by virtue of coincidence detectors that code the dominant ISI directly into spike rate through entracking. Perfect entracking means that a neuron fires one spike per stimulus-waveform repetition period, so that firing rate equals the repetition frequency. Key properties are invariance with SPL and generalization across stimuli. The main limitation in this code is the upper limit of firing (~ 500 Hz). It is proposed that entracking provides a periodicity tag which is superimposed on a tonotopic analysis: at low SPLs and fundamental frequencies > 500 Hz, a spectral or place mechanism codes for pitch. With increasing SPL the place code degrades but entracking improves and first occurs in neurons with low thresholds for the spectral components present. The prediction is that populations of entracking neurons, extended across characteristic frequency, form plateaus ("buttes") of firing rate tied to periodicity.
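The entracking code proposed above has a simple idealized form: a perfectly entracking neuron fires one spike per stimulus period, so its rate equals the repetition frequency up to the firing limit (~500 Hz in the text). A minimal sketch of that rate code, purely illustrative:

```python
# Idealized perfectly-entracking neuron: firing rate tracks the stimulus
# fundamental frequency, saturating at the upper firing limit.
FIRING_LIMIT_HZ = 500.0  # approximate limit cited in the abstract

def entracking_rate(f0_hz):
    """Spike rate (spikes/s) of an idealized entracking neuron."""
    return min(f0_hz, FIRING_LIMIT_HZ)

rate = entracking_rate(220.0)  # a 220 Hz fundamental -> 220 spikes/s
```

The saturation branch is exactly the regime where, per the hypothesis, a spectral/place mechanism must take over from entracking.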
Non-coding RNAs and Berberine: A new mechanism of its anti-diabetic activities.
Chang, Wenguang
2017-01-15
Type 2 Diabetes (T2D) is a metabolic disease with high mortality and morbidity. Non-coding RNAs, including small and long non-coding RNAs, are a novel class of functional RNA molecules that regulate multiple biological functions through diverse mechanisms. Studies in the last decade have demonstrated that non-coding RNAs may represent compelling therapeutic targets and play important roles in regulating the course of insulin resistance and T2D. Berberine, a plant-based alkaloid, has shown promise as an anti-hyperglycaemic and anti-hyperlipidaemic agent against T2D. Previous studies have primarily focused on a diverse array of efficacy end points of berberine in the pathogenesis of metabolic syndromes and inflammation or oxidative stress. Currently, an increasing number of studies have revealed the importance of non-coding RNAs as regulators of the anti-diabetic effects of berberine. The regulation of non-coding RNAs has been associated with several therapeutic actions of berberine in T2D progression. This review therefore summarizes the anti-diabetic mechanisms of berberine by focusing on its role in regulating non-coding RNAs. It demonstrates that berberine exerts global anti-diabetic effects by targeting non-coding RNAs, and that these effects involve several miRNAs, lncRNAs and multiple signaling pathways, which may enhance the current understanding of berberine's anti-diabetic mechanisms of action and provide new pathological targets for the development of berberine-related drugs. Copyright © 2016 Elsevier B.V. All rights reserved.
COMBINE*: An integrated opto-mechanical tool for laser performance modeling
NASA Astrophysics Data System (ADS)
Rehak, M.; Di Nicola, J. M.
2015-02-01
Accurate modeling of thermal, mechanical and optical processes is important for achieving reliable, high-performance high energy lasers such as those at the National Ignition Facility [1] (NIF). The need for this capability is even more critical for high average power, high repetition rate applications. Modeling the effects of stresses and temperature fields on optical properties allows for optimal design of optical components and more generally of the architecture of the laser system itself. Stresses change the indices of refraction and induce inhomogeneities and anisotropy. We present a modern, integrated analysis tool that efficiently produces reliable results that are used in our laser propagation tools such as VBL [5]. COMBINE is built on and supplants the existing legacy tools developed for the previous generations of lasers at LLNL but also uses commercially available mechanical finite element codes ANSYS or COMSOL (including computational fluid dynamics). The COMBINE code computes birefringence and wave front distortions due to mechanical stresses on lenses and slabs of arbitrary geometry. The stresses calculated typically originate from mounting support, vacuum load, gravity, heat absorption and/or attendant cooling. Of particular importance are the depolarization and detuning effects of nonlinear crystals due to thermal loading. Results are given in the form of Jones matrices, depolarization maps and wave front distributions. An incremental evaluation of Jones matrices and ray propagation in a 3D mesh with a stress and temperature field is performed. Wavefront and depolarization maps are available at the optical aperture and at slices within the optical element. The suite is validated, user-friendly, supported, documented and amenable to collaborative development. * COMBINE stands for Code for Opto-Mechanical Birefringence Integrated Numerical Evaluations.
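The Jones-matrix outputs mentioned above can be illustrated with a textbook example: a retarder-like element whose stress-induced retardance and fast-axis angle determine how much power is coupled into the orthogonal polarization. This is a generic polarization-optics sketch, not COMBINE's actual formulation; the retardance and angle values are invented.

```python
import numpy as np

def jones_retarder(delta, theta):
    """Jones matrix of a linear retarder: retardance delta (rad),
    fast axis at angle theta (rad) from the x axis."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])                 # axis rotation
    D = np.array([[np.exp(-1j * delta / 2), 0.0],
                  [0.0, np.exp(1j * delta / 2)]])   # phase retardation
    return R @ D @ R.T

def depolarization_fraction(J, E_in):
    """Fraction of output power in the orthogonal (y) polarization."""
    E_out = J @ E_in
    power = abs(E_out[0]) ** 2 + abs(E_out[1]) ** 2
    return abs(E_out[1]) ** 2 / power

E_x = np.array([1.0, 0.0], dtype=complex)  # horizontally polarized input
# quarter-wave retardance with fast axis at 45 degrees:
loss = depolarization_fraction(jones_retarder(np.pi / 2, np.pi / 4), E_x)
```

A quarter-wave retardance at 45° converts linear to circular polarization, putting half the power in the crossed component; in a stressed optic, maps of this fraction over the aperture are exactly the depolarization maps the abstract describes.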
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.
2011-03-01
This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging.
The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most of the existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. The accessibility of the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements. Based on the gap analysis results, we have made the following recommendations for code selection and development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments.
The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.
Predictive coding in autism spectrum disorder and attention deficit hyperactivity disorder.
Gonzalez-Gadea, Maria Luz; Chennu, Srivas; Bekinschtein, Tristan A; Rattazzi, Alexia; Beraudi, Ana; Tripicchio, Paula; Moyano, Beatriz; Soffita, Yamila; Steinberg, Laura; Adolfi, Federico; Sigman, Mariano; Marino, Julian; Manes, Facundo; Ibanez, Agustin
2015-11-01
Predictive coding has been proposed as a framework to understand neural processes in neuropsychiatric disorders. We used this approach to describe mechanisms responsible for attentional abnormalities in autism spectrum disorder (ASD) and attention deficit hyperactivity disorder (ADHD). We monitored brain dynamics of 59 children (8-15 yr old) who had ASD or ADHD or who were control participants via high-density electroencephalography. We performed analysis at the scalp and source-space levels while participants listened to standard and deviant tone sequences. Through task instructions, we manipulated top-down expectation by presenting expected and unexpected deviant sequences. Children with ASD showed reduced superior frontal cortex (FC) responses to unexpected events but increased dorsolateral prefrontal cortex (PFC) activation to expected events. In contrast, children with ADHD exhibited reduced cortical responses in superior FC to expected events but strong PFC activation to unexpected events. Moreover, neural abnormalities were associated with specific control mechanisms, namely, inhibitory control in ASD and set-shifting in ADHD. Based on the predictive coding account, top-down expectation abnormalities could be attributed to a disproportionate reliance (precision) allocated to prior beliefs in ASD and to sensory input in ADHD. Copyright © 2015 the American Physiological Society.
Di Giulio, Massimo
2017-11-07
The coevolution theory of the origin of the genetic code suggests that the organization of the genetic code coevolved with the biosynthetic relationships between amino acids. The mechanism that allowed this coevolution was based on tRNA-like molecules, on which, according to this theory, the biosynthetic transformations between amino acids occurred. This mechanism makes a prediction about the role the aminoacyl-tRNA synthetases (ARSs) should have played in the origin of the genetic code. Indeed, if the biosynthetic transformations between amino acids occurred on tRNA-like molecules, then there was no need to link amino acids to these molecules because amino acids were already charged on tRNA-like molecules, as the coevolution theory suggests. Although the ARSs are responsible for the first interaction in the genetic code between a component of nucleic acids and one of proteins, for the coevolution theory the role of the ARSs should have been entirely marginal in the origin of the genetic code. Therefore, I have conducted a further analysis of the distribution of the two classes of ARSs and of their subclasses in the genetic code table in order to perform a falsification test of the coevolution theory. Indeed, if the distribution of ARSs within the genetic code were highly significant, the coevolution theory would be falsified, since the mechanism on which it is based does not predict a fundamental role of ARSs in the origin of the genetic code. I found that the statistical significance of the distribution of the two classes of ARSs in the table of the genetic code is low or marginal, whereas that of the subclasses of ARSs is statistically significant. However, this is in perfect agreement with the postulates of the coevolution theory.
Indeed, the only case of statistical significance regarding the classes of ARSs is appreciable for the CAG code, whereas for its complement, the UNN/NUN code, only a marginal significance is measurable. These two codes correspond roughly to the two ARS classes: the CAG code to class II and the UNN/NUN code to class I. Furthermore, the subclasses of ARSs show a statistically significant distribution in the genetic code table. Nevertheless, the most sensible explanation for these observations would be the following. The observation that would link the two classes of ARSs to the CAG and UNN/NUN codes, and the statistical significance of the distribution of the subclasses of ARSs in the genetic code table, would be only a secondary effect of the highly significant distribution of the polarity of amino acids and their biosynthetic relationships in the genetic code. That is to say, the polarity of amino acids and their biosynthetic relationships would have conditioned the evolution of ARSs so that their presence in the genetic code would be detectable, even though the ARSs would not, on their own, have directly influenced the evolutionary organization of the genetic code. In other words, the role that ARSs had in the origin of the genetic code would have been entirely marginal. This conclusion is in perfect accord with the predictions of the coevolution theory. Conversely, it is at least partially in contrast with the physicochemical theories of the origin of the genetic code, because they foresee a far more active role of ARSs in the origin of the organization of the genetic code. Copyright © 2017 Elsevier Ltd. All rights reserved.
Reducing EnergyPlus Run Time For Code Compliance Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Athalye, Rahul A.; Gowri, Krishnan; Schultz, Robert W.
2014-09-12
Integration of the EnergyPlus™ simulation engine into performance-based code compliance software raises a concern about simulation run time, which impacts timely feedback of compliance results to the user. EnergyPlus annual simulations for proposed and code baseline building models, and mechanical equipment sizing, result in simulation run times beyond acceptable limits. This paper presents a study that compares the results of a shortened simulation time period using 4 weeks of hourly weather data (one per quarter) to an annual simulation using the full 52 weeks of hourly weather data. Three representative building types based on DOE Prototype Building Models and three climate zones were used for determining the validity of using a shortened simulation run period. Further sensitivity analysis and run time comparisons were made to evaluate the robustness and run time savings of this approach. The results of this analysis show that the shortened simulation run period provides compliance index calculations within 1% of those predicted using annual simulation results, and typically saves about 75% of simulation run time.
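The scaling from four sampled weeks to a full year can be sketched as follows; the week-selection indices and the simple 52/4 scaling rule are illustrative assumptions, not the paper's actual compliance-index calculation.

```python
# Sketch of the shortened-run-period idea: estimate annual energy use from
# four representative weeks (one per quarter) instead of a 52-week run.
# The week selection and 52/4 scaling rule are illustrative assumptions.

def annual_estimate_from_quarters(hourly_loads, week_starts):
    """Scale the energy of 4 sampled weeks (one per quarter) to a year.

    hourly_loads: 8760 hourly load values (kWh)
    week_starts:  starting hour index of one representative week per quarter
    """
    week_hours = 24 * 7
    sampled = sum(sum(hourly_loads[s:s + week_hours]) for s in week_starts)
    return sampled * (52 / 4)  # 4 sampled weeks stand in for 52

# Toy check: with a flat 1 kWh/h load, the 4-week estimate should
# reproduce the 52-week total exactly.
loads = [1.0] * 8760
estimate = annual_estimate_from_quarters(loads, [0, 2184, 4368, 6552])
annual = sum(loads[:52 * 7 * 24])
error = abs(estimate - annual) / annual
```

Real loads vary seasonally, which is why the paper validates the approach across building types and climate zones rather than assuming the sampled weeks are representative.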
Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)
2002-01-01
In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: The OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads and the Paraver graphical user interface for inspection and analyses of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory
Progressive Failure Analysis Methodology for Laminated Composite Structures
NASA Technical Reports Server (NTRS)
Sleight, David W.
1999-01-01
A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C1 shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms, and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data except in structural applications where interlaminar stresses are important, which may cause failure mechanisms such as debonding or delaminations.
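As a sketch of how a maximum-strain check with property degradation can work, the toy loop below steps up a load, tests ply strains against allowables, and applies a ply-discount knockdown on first failure. The allowable strains, the assumed strain states, and the degradation factor are invented for illustration; COMET's implementation is far more general.

```python
# Minimal sketch of a progressive-failure loop using the maximum strain
# criterion with a simple ply-discount degradation rule. All numbers are
# illustrative assumptions, not values from the paper.

ALLOWABLE = {"e1": 0.010, "e2": 0.005, "g12": 0.015}  # failure strains

def max_strain_failed(strains, allowable=ALLOWABLE):
    """Return the first strain component exceeding its allowable, or None."""
    for comp, limit in allowable.items():
        if abs(strains[comp]) > limit:
            return comp
    return None

def degrade(props, factor=0.1):
    """Ply-discount: knock down all moduli of a failed ply by `factor`."""
    return {k: v * factor for k, v in props.items()}

# Walk normalized load steps on one ply; strains are assumed proportional
# to the applied load for this sketch.
props = {"E1": 140e9, "E2": 10e9, "G12": 5e9}
failed = False
for load in [0.2, 0.4, 0.6, 0.8, 1.0]:
    strains = {"e1": 0.008 * load, "e2": 0.006 * load, "g12": 0.004 * load}
    mode = max_strain_failed(strains)
    if mode and not failed:
        props = degrade(props)  # degrade moduli after the first failure
        failed = True
```

In a real progressive failure analysis the stresses would be recomputed after each degradation and the load incremented until final failure; the sketch only flags the first ply failure mode.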
Introduction to the computational structural mechanics testbed
NASA Technical Reports Server (NTRS)
Lotts, C. G.; Greene, W. H.; Mccleary, S. L.; Knight, N. F., Jr.; Paulson, S. S.; Gillian, R. E.
1987-01-01
The Computational Structural Mechanics (CSM) testbed software system, based on the SPAR finite element code and the NICE system, is described. This software is denoted NICE/SPAR. NICE was developed at Lockheed Palo Alto Research Laboratory and contains data management utilities, a command language interpreter, and a command language definition for integrating engineering computational modules. SPAR is a system of programs for finite element structural analysis developed for NASA by Lockheed and Engineering Information Systems, Inc. It includes many complementary structural analysis, thermal analysis, and utility functions that communicate through a common database. The work on NICE/SPAR was motivated by requirements for a highly modular and flexible structural analysis system to use as a tool in carrying out research in computational methods and exploring computer hardware. Analysis examples are presented which demonstrate the benefits gained from a combination of the NICE command language with SPAR computational modules.
Manikandan, Selvaraj; Balaji, Seetharaaman; Kumar, Anil; Kumar, Rita
2007-01-01
The molecular basis for the survival of bacteria under extreme conditions in which growth is inhibited is a question of great current interest. A preliminary study was carried out to determine residue pattern conservation among the antiporters of enteric bacteria responsible for extreme acid sensitivity, especially in Escherichia coli and Shigella flexneri. Here we found molecular evidence of the relationship between E. coli and S. flexneri. Multiple sequence alignment of the gadC coded acid-sensitive antiporter showed many conserved residue patterns at regular intervals in the N-terminal region. It was observed that as the alignment approaches the C-terminal, the number of conserved residues decreases, indicating that the N-terminal region of this protein has a more active role than the carboxyl terminal. The motif FHLVFFLLLGG is well conserved at the amino terminal within the entire gadC coded protein. The motif is also partially conserved among other antiporters (which are not coded by gadC) involved in acid sensitivity/resistance mechanisms. Phylogenetic cluster analysis confirms the relationship between Escherichia coli and Shigella flexneri. The gadC coded proteins converge as a clade and diverge from other antiporters belonging to the amino acid-polyamine-organocation (APC) superfamily. PMID:21670792
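A simple way to probe partial conservation of a motif such as FHLVFFLLLGG is a sliding-window scan that counts mismatches; the sketch below does this on toy sequences. The mismatch-count scoring is an illustrative assumption, not the multiple-alignment method used in the study.

```python
# Sketch of a motif scan for the conserved N-terminal motif FHLVFFLLLGG,
# allowing mismatches to capture "partial" conservation. Toy sequences;
# the scoring scheme is an illustrative assumption.

MOTIF = "FHLVFFLLLGG"

def best_motif_match(sequence, motif=MOTIF):
    """Slide the motif along the sequence; return (position, mismatches)
    of the best-matching window."""
    best = (None, len(motif) + 1)
    for i in range(len(sequence) - len(motif) + 1):
        window = sequence[i:i + len(motif)]
        mismatches = sum(a != b for a, b in zip(window, motif))
        if mismatches < best[1]:
            best = (i, mismatches)
    return best

# Toy sequences: one carries the exact motif, one has two substitutions.
exact = "MKT" + MOTIF + "AQR"
partial = "MKT" + "FHLVAFLLLGA" + "AQR"
pos1, mm1 = best_motif_match(exact)    # exact hit at position 3
pos2, mm2 = best_motif_match(partial)  # best window has 2 mismatches
```

A mismatch threshold (e.g. accept windows with at most 2 or 3 substitutions) would then decide whether a sequence counts as carrying the motif.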
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, P. T.; Dickson, T. L.; Yin, S.
The current regulations to ensure that nuclear reactor pressure vessels (RPVs) maintain their structural integrity when subjected to transients such as pressurized thermal shock (PTS) events were derived from computational models developed in the early-to-mid 1980s. Since that time, advancements and refinements in relevant technologies that impact RPV integrity assessment have led to an effort by the NRC to re-evaluate its PTS regulations. Updated computational methodologies have been developed through interactions between experts in the relevant disciplines of thermal hydraulics, probabilistic risk assessment, materials embrittlement, fracture mechanics, and inspection (flaw characterization). Contributors to the development of these methodologies include the NRC staff, their contractors, and representatives from the nuclear industry. These updated methodologies have been integrated into the Fracture Analysis of Vessels -- Oak Ridge (FAVOR, v06.1) computer code developed for the NRC by the Heavy Section Steel Technology (HSST) program at Oak Ridge National Laboratory (ORNL). The FAVOR, v04.1, code represents the baseline NRC-selected applications tool for re-assessing the current PTS regulations. This report is intended to document the technical bases for the assumptions, algorithms, methods, and correlations employed in the development of the FAVOR, v06.1, code.
Reed, Terrie L; Kaufman-Rivi, Diana
2010-01-01
The broad array of medical devices and the potential for device failures, malfunctions, and other adverse events associated with each device creates a challenge for public health device surveillance programs. Coding reported events by type of device problem provides one method for identifying a potential signal of a larger device issue. The Food and Drug Administration's (FDA) Center for Devices and Radiological Health (CDRH) Event Problem Codes that are used to report adverse events previously lacked a structured set of controls for code development and maintenance. Over time this led to inconsistent, ambiguous, and duplicative concepts being added to the code set on an ad-hoc basis. Recognizing the limitation of its coding system, the FDA set out to update the system to improve its usefulness within FDA and as the basis of a global standard to identify important patient and device outcomes throughout the medical community. In 2004, FDA and the National Cancer Institute (NCI) signed a Memorandum of Understanding (MOU) whereby NCI agreed to provide terminology development and maintenance services to all FDA Centers. Under this MOU, CDRH's Office of Surveillance and Biometrics (OSB) convened a cross-Center workgroup and collaborated with staff at the NCI Enterprise Vocabulary Service (EVS) to streamline the Patient and Device Problem Codes and integrate them into the NCI Thesaurus and Meta-Thesaurus. This initiative included many enhancements to the Event Problem Codes aimed at improving code selection as well as improving adverse event report analysis. LIMITATIONS & RECOMMENDATIONS: Staff resources, database concerns, and limited collaboration with external groups in the initial phases of the project are discussed. Adverse events associated with medical device use can be better understood when they are reported using a consistent and well-defined code set.
This FDA initiative was an attempt to improve the structure and add control mechanisms to an existing code set, improve analysis tools that will better identify device safety trends, and improve the ability to prevent or mitigate effects of adverse events associated with medical device use.
24 CFR 200.925c - Model codes.
Code of Federal Regulations, 2012 CFR
2012-04-01
... below. (1) Model Building Codes—(i) The BOCA National Building Code, 1993 Edition, The BOCA National..., Administration, for the Building, Plumbing and Mechanical Codes and the references to fire retardant treated wood... number 2 (Chapter 7) of the Building Code, but including the Appendices of the Code. Available from...
24 CFR 200.925c - Model codes.
Code of Federal Regulations, 2013 CFR
2013-04-01
... below. (1) Model Building Codes—(i) The BOCA National Building Code, 1993 Edition, The BOCA National..., Administration, for the Building, Plumbing and Mechanical Codes and the references to fire retardant treated wood... number 2 (Chapter 7) of the Building Code, but including the Appendices of the Code. Available from...
24 CFR 200.925c - Model codes.
Code of Federal Regulations, 2014 CFR
2014-04-01
... below. (1) Model Building Codes—(i) The BOCA National Building Code, 1993 Edition, The BOCA National..., Administration, for the Building, Plumbing and Mechanical Codes and the references to fire retardant treated wood... number 2 (Chapter 7) of the Building Code, but including the Appendices of the Code. Available from...
Buckling analysis and test correlation of hat stiffened panels for hypersonic vehicles
NASA Technical Reports Server (NTRS)
Percy, Wendy C.; Fields, Roger A.
1990-01-01
The paper discusses the design, analysis, and test of hat stiffened panels subjected to a variety of thermal and mechanical load conditions. The panels were designed using data from structural optimization computer codes and finite element analysis. Test methods included the grid shadow moire method and a single gage force stiffness method. The agreement between the test data and analysis provides confidence in the methods that are currently being used to design structures for hypersonic vehicles. The agreement also indicates that postbuckled strength may potentially be used to reduce the vehicle weight.
Bergin, Michael
2011-01-01
Qualitative data analysis is a complex process and demands clear thinking on the part of the analyst. However, a number of deficiencies may obstruct the research analyst during the process, leading to inconsistencies occurring. This paper is a reflection on the use of a qualitative data analysis program, NVivo 8, and its usefulness in identifying consistency and inconsistency during the coding process. The author was conducting a large-scale study of providers and users of mental health services in Ireland. He used NVivo 8 to store, code and analyse the data and this paper reflects some of his observations during the study. The demands placed on the analyst in trying to balance the mechanics of working through a qualitative data analysis program, while simultaneously remaining conscious of the value of all sources are highlighted. NVivo 8 as a qualitative data analysis program is a challenging but valuable means for advancing the robustness of qualitative research. Pitfalls can be avoided during analysis by running queries as the analyst progresses from tree node to tree node rather than leaving it to a stage whereby data analysis is well advanced.
A Mechanism to Avoid Collusion Attacks Based on Code Passing in Mobile Agent Systems
NASA Astrophysics Data System (ADS)
Jaimez, Marc; Esparza, Oscar; Muñoz, Jose L.; Alins-Delgado, Juan J.; Mata-Díaz, Jorge
Mobile agents are software entities consisting of code, data, state and itinerary that can migrate autonomously from host to host executing their code. Despite its benefits, security issues strongly restrict the use of code mobility. The protection of mobile agents against the attacks of malicious hosts is considered the most difficult security problem to solve in mobile agent systems. In particular, collusion attacks have been barely studied in the literature. This paper presents a mechanism that avoids collusion attacks based on code passing. Our proposal is based on a Multi-Code agent, which contains a different variant of the code for each host. A Trusted Third Party is responsible for providing the information to extract its own variant to the hosts, and for taking trusted timestamps that will be used to verify time coherence.
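As a rough illustration of the Multi-Code idea, the sketch below seals one code variant per host so that a host's key recovers only its own variant. The XOR keystream construction and the key names are toy assumptions; the paper's actual protocol, including the Trusted Third Party's role and trusted timestamps, is not reproduced here.

```python
# Toy illustration of a Multi-Code agent: one sealed code variant per host,
# each recoverable only with that host's key. The XOR "encryption" is a
# stand-in for real cryptography, not the paper's protocol.

import hashlib

def _keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic keystream from a key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def seal(variant: bytes, host_key: bytes) -> bytes:
    """XOR the variant with a key-derived stream."""
    ks = _keystream(host_key, len(variant))
    return bytes(a ^ b for a, b in zip(variant, ks))

def open_variant(sealed: bytes, host_key: bytes) -> bytes:
    return seal(sealed, host_key)  # XOR is its own inverse

# A TTP would distribute the per-host keys; each host can extract only the
# variant sealed for it.
agent = {
    "hostA": seal(b"print('variant for A')", b"key-A"),
    "hostB": seal(b"print('variant for B')", b"key-B"),
}
recovered = open_variant(agent["hostA"], b"key-A")       # correct key
wrong = open_variant(agent["hostB"], b"key-A")           # wrong key: garbage
```

In the scheme described above, colluding hosts gain nothing from sharing what they extract, since each holds a different variant of the code.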
Observations Regarding Use of Advanced CFD Analysis, Sensitivity Analysis, and Design Codes in MDO
NASA Technical Reports Server (NTRS)
Newman, Perry A.; Hou, Gene J. W.; Taylor, Arthur C., III
1996-01-01
Observations regarding the use of advanced computational fluid dynamics (CFD) analysis, sensitivity analysis (SA), and design codes in gradient-based multidisciplinary design optimization (MDO) reflect our perception of the interactions required of CFD and our experience in recent aerodynamic design optimization studies using CFD. Sample results from these latter studies are summarized for conventional optimization (analysis - SA codes) and simultaneous analysis and design optimization (design code) using both Euler and Navier-Stokes flow approximations. The amount of computational resources required for aerodynamic design using CFD via analysis - SA codes is greater than that required for design codes. Thus, an MDO formulation that utilizes the more efficient design codes where possible is desired. However, in the aerovehicle MDO problem, the various disciplines that are involved have different design points in the flight envelope; therefore, CFD analysis - SA codes are required at the aerodynamic 'off design' points. The suggested MDO formulation is a hybrid multilevel optimization procedure that consists of both multipoint CFD analysis - SA codes and multipoint CFD design codes that perform suboptimizations.
Toren, Dmitri; Barzilay, Thomer; Tacutu, Robi; Lehmann, Gilad; Muradian, Khachik K; Fraifeld, Vadim E
2016-01-04
Mitochondria are the only organelles in the animal cells that have their own genome. Due to a key role in energy production, generation of damaging factors (ROS, heat), and apoptosis, mitochondria and mtDNA in particular have long been considered one of the major players in the mechanisms of aging, longevity and age-related diseases. The rapidly increasing number of species with fully sequenced mtDNA, together with accumulated data on longevity records, provides a new fascinating basis for comparative analysis of the links between mtDNA features and animal longevity. To facilitate such analyses and to support the scientific community in carrying these out, we developed the MitoAge database containing calculated mtDNA compositional features of the entire mitochondrial genome, mtDNA coding (tRNA, rRNA, protein-coding genes) and non-coding (D-loop) regions, and codon usage/amino acids frequency for each protein-coding gene. MitoAge includes 922 species with fully sequenced mtDNA and maximum lifespan records. The database is available through the MitoAge website (www.mitoage.org or www.mitoage.info), which provides the necessary tools for searching, browsing, comparing and downloading the data sets of interest for selected taxonomic groups across the Kingdom Animalia. The MitoAge website assists in statistical analysis of different features of the mtDNA and their correlative links to longevity. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
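The compositional features MitoAge tabulates, such as GC content of mtDNA regions and per-gene codon usage, are standard quantities; a small sketch of how they can be computed on a toy sequence follows. This is illustrative code, not MitoAge's implementation.

```python
# Sketch of the kinds of mtDNA compositional features MitoAge tabulates:
# GC content of a region and codon usage of a protein-coding gene.
# Standard definitions; toy sequence; not MitoAge's code.

from collections import Counter

def gc_content(seq: str) -> float:
    """Fraction of G/C bases in a nucleotide sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def codon_usage(cds: str) -> dict:
    """Relative frequency of each codon in a coding sequence."""
    cds = cds.upper()
    codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
    counts = Counter(codons)
    total = sum(counts.values())
    return {codon: n / total for codon, n in counts.items()}

cds = "ATGGCCGCCTAA"       # toy 4-codon gene: ATG GCC GCC TAA
gc = gc_content(cds)       # 7 of the 12 bases are G or C
usage = codon_usage(cds)   # GCC appears in 2 of 4 codons
```

Computed across tRNA, rRNA, protein-coding, and D-loop regions for each species, features like these are what the database correlates against maximum lifespan records.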
Use of multiscale zirconium alloy deformation models in nuclear fuel behavior analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montgomery, Robert; Tomé, Carlos; Liu, Wenfeng
Accurate prediction of cladding mechanical behavior is a key aspect of modeling nuclear fuel behavior, especially for conditions of pellet-cladding interaction (PCI), reactivity-initiated accidents (RIA), and loss of coolant accidents (LOCA). Current approaches to fuel performance modeling rely on empirical models for cladding creep, growth, and plastic deformation, which are limited to the materials and conditions for which the models were developed. CASL has endeavored to improve upon this approach by incorporating a microstructurally-based, atomistically-informed, zirconium alloy mechanical deformation analysis capability into the BISON-CASL engineering scale fuel performance code. Specifically, the viscoplastic self-consistent (VPSC) polycrystal plasticity modeling approach, developed by Lebensohn and Tomé [2], has been coupled with BISON-CASL to represent the mechanistic material processes controlling the deformation behavior of the cladding. A critical component of VPSC is the representation of the crystallographic orientation of the grains within the matrix material and the ability to account for the role of texture on deformation. The multiscale modeling of cladding deformation mechanisms allowed by VPSC far exceeds the functionality of typical semi-empirical constitutive models employed in nuclear fuel behavior codes to model irradiation growth and creep, thermal creep, or plasticity. This paper describes the implementation of an interface between VPSC and BISON-CASL and provides initial results utilizing the coupled functionality.
Content Coding of Psychotherapy Transcripts Using Labeled Topic Models.
Gaut, Garren; Steyvers, Mark; Imel, Zac E; Atkins, David C; Smyth, Padhraic
2017-03-01
Psychotherapy represents a broad class of medical interventions received by millions of patients each year. Unlike most medical treatments, its primary mechanisms are linguistic; i.e., the treatment relies directly on a conversation between a patient and provider. However, the evaluation of patient-provider conversation suffers from critical shortcomings, including intensive labor requirements, coder error, nonstandardized coding systems, and inability to scale up to larger data sets. To overcome these shortcomings, psychotherapy analysis needs a reliable and scalable method for summarizing the content of treatment encounters. We used a publicly available psychotherapy corpus from Alexander Street press comprising a large collection of transcripts of patient-provider conversations to compare coding performance for two machine learning methods. We used the labeled latent Dirichlet allocation (L-LDA) model to learn associations between text and codes, to predict codes in psychotherapy sessions, and to localize specific passages of within-session text representative of a session code. We compared the L-LDA model to a baseline lasso regression model using predictive accuracy and model generalizability (measured by calculating the area under the curve (AUC) from the receiver operating characteristic curve). The L-LDA model outperforms the lasso logistic regression model at predicting session-level codes with average AUC scores of 0.79 and 0.70, respectively. For fine-grained level coding, L-LDA and logistic regression are able to identify specific talk-turns representative of symptom codes. However, model performance for talk-turn identification is not yet as reliable as human coders. We conclude that the L-LDA model has the potential to be an objective, scalable method for accurate automated coding of psychotherapy sessions that performs better than comparable discriminative methods at session-level coding and can also predict fine-grained codes.
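The model comparison above is scored by ROC AUC. As a reminder of what that number measures, here is a dependency-free sketch of AUC via the rank (Mann-Whitney) identity; the label and score vectors are fabricated for illustration and are not data from the study.

```python
# Minimal ROC AUC via the Mann-Whitney identity:
# AUC = P(score of a random positive > score of a random negative),
# counting ties as 1/2. Labels and scores below are fabricated.

def roc_auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]
strong = [0.9, 0.8, 0.6, 0.4, 0.3, 0.1]   # well-separated scores
weak = [0.9, 0.4, 0.35, 0.8, 0.3, 0.1]    # more confusable scores
auc_strong = roc_auc(labels, strong)       # perfect separation: 1.0
auc_weak = roc_auc(labels, weak)           # 7 of 9 pairs ordered correctly
```

An AUC of 0.79 versus 0.70, as reported for L-LDA versus the lasso baseline, means the L-LDA scores rank a random positive session-code instance above a random negative one more often.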
Content Coding of Psychotherapy Transcripts Using Labeled Topic Models
Gaut, Garren; Steyvers, Mark; Imel, Zac E; Atkins, David C; Smyth, Padhraic
2016-01-01
Psychotherapy represents a broad class of medical interventions received by millions of patients each year. Unlike most medical treatments, its primary mechanisms are linguistic; i.e., the treatment relies directly on a conversation between a patient and provider. However, the evaluation of patient-provider conversation suffers from critical shortcomings, including intensive labor requirements, coder error, non-standardized coding systems, and inability to scale up to larger data sets. To overcome these shortcomings, psychotherapy analysis needs a reliable and scalable method for summarizing the content of treatment encounters. We used a publicly-available psychotherapy corpus from Alexander Street press comprising a large collection of transcripts of patient-provider conversations to compare coding performance for two machine learning methods. We used the Labeled Latent Dirichlet Allocation (L-LDA) model to learn associations between text and codes, to predict codes in psychotherapy sessions, and to localize specific passages of within-session text representative of a session code. We compared the L-LDA model to a baseline lasso regression model using predictive accuracy and model generalizability (measured by calculating the area under the curve (AUC) from the receiver operating characteristic (ROC) curve). The L-LDA model outperforms the lasso logistic regression model at predicting session-level codes with average AUC scores of .79 and .70, respectively. For fine-grained level coding, L-LDA and logistic regression are able to identify specific talk-turns representative of symptom codes. However, model performance for talk-turn identification is not yet as reliable as human coders. We conclude that the L-LDA model has the potential to be an objective, scalable method for accurate automated coding of psychotherapy sessions that performs better than comparable discriminative methods at session-level coding and can also predict fine-grained codes. PMID:26625437
A systematic analysis of a mi-RNA inter-pathway regulatory motif
2013-01-01
Background The continuing discovery of new types and functions of small non-coding RNAs is suggesting the presence of regulatory mechanisms far more complex than the ones currently used to study and design Gene Regulatory Networks. Just focusing on the roles of micro RNAs (miRNAs), they have been found to be part of several intra-pathway regulatory motifs. However, inter-pathway regulatory mechanisms have been often neglected and require further investigation. Results In this paper we present the result of a systems biology study aimed at analyzing a high-level inter-pathway regulatory motif called Pathway Protection Loop, not previously described, in which miRNAs seem to play a crucial role in the successful behavior and activation of a pathway. Through the automatic analysis of a large set of publicly available databases, we found statistical evidence that this inter-pathway regulatory motif is very common in several classes of KEGG Homo Sapiens pathways and concurs in creating a complex regulatory network involving several pathways connected by this specific motif. The role of this motif seems also confirmed by a deeper review of other research activities on selected representative pathways. Conclusions Although previous studies suggested transcriptional regulation mechanisms at the pathway level such as the Pathway Protection Loop, a high-level analysis like the one proposed in this paper is still missing. The understanding of higher-level regulatory motifs could, for instance, lead to new approaches in the identification of therapeutic targets because it could unveil new and “indirect” paths to activate or silence a target pathway. However, a lot of work still needs to be done to better uncover this high-level inter-pathway regulation, including enlarging the analysis to other small non-coding RNA molecules. PMID:24152805
Development of a realistic stress analysis for fatigue analysis of notched composite laminates
NASA Technical Reports Server (NTRS)
Humphreys, E. A.; Rosen, B. W.
1979-01-01
A finite element stress analysis which consists of a membrane and interlaminar shear spring analysis was developed. This approach was utilized in order to model physically realistic failure mechanisms while maintaining a high degree of computational economy. The accuracy of the stress analysis predictions is verified through comparisons with other solutions to the composite laminate edge effect problem. The stress analysis model was incorporated into an existing fatigue analysis methodology and the entire procedure computerized. A fatigue analysis is performed upon a square laminated composite plate with a circular central hole. A complete description and user's guide for the computer code FLAC (Fatigue of Laminated Composites) is included as an appendix.
BNL severe-accident sequence experiments and analysis program. [PWR; BWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, G.A.; Ginsberg, T.; Tutu, N.K.
1983-01-01
In the analysis of degraded core accidents, the two major sources of pressure loading on light water reactor containments are: steam generation from core debris-water thermal interactions; and molten core-concrete interactions. Experiments are in progress at BNL in support of analytical model development related to aspects of the above containment loading mechanisms. The work supports development and evaluation of the CORCON (Muir, 1981) and MARCH (Wooton, 1980) computer codes. Progress in the two programs is described.
1998-01-01
its underlying mechanism. The morphologies and associated terminology of the ferrography wear atlas (13), have been adopted almost universally by...connected to the World-Wide Web (WWW). What has emerged from the more recent developments is that, whereas a universal atlas , coupled to a coding...D.W., ’Wear Particle Atlas ,(Revised)’ Naval Air Eng. Centre Report No. NAEC 92 163 (1982) 14. Ruff A.W. ’Characterisation of debris particles
Analysis of high velocity impact on hybrid composite fan blades
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.
1979-01-01
Recent developments in the analysis of high velocity impact of composite blades are described, using a computerized capability which consists of coupling a composites mechanics code with the direct-time integration features of NASTRAN. The application of the capability to determine the linear dynamic response of an interply hybrid composite aircraft engine fan blade is described in detail. The results also show that the impact stresses reach sufficiently high magnitudes to cause failures in the impact region at early times of the impact event.
Perspectives On Dilution Jet Mixing
NASA Technical Reports Server (NTRS)
Holdeman, J. D.; Srinivasan, R.
1990-01-01
NASA recently completed program of measurements and modeling of mixing of transverse jets with ducted crossflow, motivated by need to design or tailor temperature pattern at combustor exit in gas turbine engines. Objectives of program were to identify dominant physical mechanisms governing mixing, extend empirical models to provide near-term predictive capability, and compare numerical code calculations with data to guide future analysis improvement efforts.
Significance of Strain in Formulation in Theory of Solid Mechanics
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.
2003-01-01
The basic theory of solid mechanics was deemed complete circa 1860 when St. Venant provided the strain formulation, or the field compatibility condition. The strain formulation, however, was incomplete. The missing portion has been formulated and identified as the boundary compatibility condition (BCC). The BCC, derived through a variational formulation, has been verified through an integral theorem and the solution of problems. The BCC, unlike its field counterpart, does not trivialize when expressed in displacements. Navier's method and the stiffness formulation have to account for the extra conditions, especially at the inter-element boundaries in a finite element model. Completion of the strain formulation has led to the revival of the direct force calculation methods: the Integrated Force Method (IFM) and its dual (IFMD) for finite element analysis, and the completed Beltrami-Michell formulation (CBMF) in elasticity. The benefits from the new methods in elasticity, in finite element analysis, and in design optimization are discussed. Existing solutions and computer codes may have to be adjusted to comply with the new conditions. Complacency, simply because the discipline is over a century old and computer codes have been developed for half a century, can lead to stagnation of the discipline.
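As context for the field compatibility condition referenced above, its classical two-dimensional form (a standard result of elasticity, stated here for illustration rather than quoted from the paper) is:

```latex
\frac{\partial^2 \varepsilon_{xx}}{\partial y^2}
+ \frac{\partial^2 \varepsilon_{yy}}{\partial x^2}
= 2\,\frac{\partial^2 \varepsilon_{xy}}{\partial x\,\partial y}
```

Any strain field derived from a continuous displacement field satisfies this identity in the domain; the boundary compatibility condition discussed in the abstract supplies the analogous constraint on boundaries.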
Emmorey, Karen; Petrich, Jennifer; Gollan, Tamar H.
2012-01-01
Bilinguals who are fluent in American Sign Language (ASL) and English often produce code-blends - simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization times (Experiment 2) for code-blends versus ASL signs and English words produced alone. In production, code-blending did not slow lexical retrieval for ASL and actually facilitated access to low-frequency signs. However, code-blending delayed speech production because bimodal bilinguals synchronized English and ASL lexical onsets. In comprehension, code-blending speeded access to both languages. Bimodal bilinguals’ ability to produce code-blends without any cost to ASL implies that the language system either has (or can develop) a mechanism for switching off competition to allow simultaneous production of close competitors. Code-blend facilitation effects during comprehension likely reflect cross-linguistic (and cross-modal) integration at the phonological and/or semantic levels. The absence of any consistent processing costs for code-blending illustrates a surprising limitation on dual-task costs and may explain why bimodal bilinguals code-blend more often than they code-switch. PMID:22773886
Shape design sensitivity analysis and optimal design of structural systems
NASA Technical Reports Server (NTRS)
Choi, Kyung K.
1987-01-01
The material derivative concept of continuum mechanics and an adjoint variable method of design sensitivity analysis are used to relate variations in structural shape to measures of structural performance. A domain method of shape design sensitivity analysis is used to best exploit the basic character of the finite element method, which gives accurate information not on the boundary but in the domain. Implementation of shape design sensitivity analysis using finite element computer codes is discussed. Recent numerical results are used to demonstrate the accuracy obtainable using the method. The results of design sensitivity analysis are used to carry out design optimization of a built-up structure.
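The adjoint variable idea underlying such sensitivity analyses can be sketched for a generic discretized system K(p)u = f with performance measure J = cᵀu: one extra solve with Kᵀ yields the gradient of J with respect to all parameters at once. The two-spring chain below is a hypothetical illustration, not the paper's shape-sensitivity formulation.

```python
import numpy as np

def assemble_K(p):
    # Global stiffness of a two-spring chain fixed at the left end
    p1, p2 = p
    return np.array([[p1 + p2, -p2],
                     [-p2,      p2]])

def dK_dp(i):
    # Derivative of K with respect to spring stiffness p_i
    if i == 0:
        return np.array([[1.0, 0.0], [0.0, 0.0]])
    return np.array([[1.0, -1.0], [-1.0, 1.0]])

def solve_J(p, f, c):
    # Performance measure J = c^T u with K(p) u = f
    u = np.linalg.solve(assemble_K(p), f)
    return c @ u

def adjoint_grad(p, f, c):
    K = assemble_K(p)
    u = np.linalg.solve(K, f)      # primal solve
    lam = np.linalg.solve(K.T, c)  # single adjoint solve
    # dJ/dp_i = -lam^T (dK/dp_i) u  (f assumed independent of p)
    return np.array([-lam @ dK_dp(i) @ u for i in range(len(p))])

p = np.array([2.0, 3.0])           # spring stiffnesses (design variables)
f = np.array([0.0, 1.0])           # unit load at the free end
c = np.array([0.0, 1.0])           # J = tip displacement

g = adjoint_grad(p, f, c)
# Finite-difference check of the adjoint gradient
h = 1e-6
fd = np.array([(solve_J(p + h * np.eye(2)[i], f, c) - solve_J(p, f, c)) / h
               for i in range(2)])
```

The finite-difference check confirms the adjoint result; the practical advantage is that the gradient with respect to any number of design parameters costs only one additional linear solve.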
Numerical modeling of the fracture process in a three-unit all-ceramic fixed partial denture.
Kou, Wen; Kou, Shaoquan; Liu, Hongyuan; Sjögren, Göran
2007-08-01
The main objectives were to examine the fracture mechanism and process of a ceramic fixed partial denture (FPD) framework under simulated mechanical loading using a recently developed numerical modeling code, the R-T(2D) code, and also to evaluate the suitability of the R-T(2D) code as a tool for this purpose. Using the code, the fracture mechanism and process of a three-unit yttria-tetragonal zirconia polycrystal (Y-TZP) ceramic FPD framework were simulated under static loading. In addition, the fracture pattern obtained using the numerical simulation was compared with the fracture pattern obtained in a previous laboratory test. The results revealed that the framework fracture pattern obtained using the numerical simulation agreed with that observed in the previous laboratory test. Quasi-photoelastic stress fringe patterns and acoustic emission showed that the fracture mechanism was tensile failure and that the crack started at the lower boundary of the framework. The fracture process could be followed both step by step and within each step. Based on the findings of the current study, the R-T(2D) code seems suitable for use as a complement to other tests and clinical observations in studying stress distribution, fracture mechanisms and fracture processes in ceramic FPD frameworks.
Energy coding in biological neural networks
Zhang, Zhikang
2007-01-01
Motivated by experimental results showing that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, we present a new theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Because the energy coding model reveals mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results of neuro-electrophysiology, but also quantitatively explain recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we expect the theory to have important consequences for quantitative research on cognitive function. PMID:19003513
Software Developed for Analyzing High- Speed Rolling-Element Bearings
NASA Technical Reports Server (NTRS)
Fleming, David P.
2005-01-01
COBRA-AHS (Computer Optimized Ball & Roller Bearing Analysis--Advanced High Speed, J.V. Poplawski & Associates, Bethlehem, PA) is used for the design and analysis of rolling element bearings operating at high speeds under complex mechanical and thermal loading. The code estimates bearing fatigue life by calculating three-dimensional subsurface stress fields developed within the bearing raceways. It provides a state-of-the-art interactive design environment for bearing engineers within a single easy-to-use design-analysis package. The code analyzes flexible or rigid shaft systems containing up to five bearings acted upon by radial, thrust, and moment loads in 5 degrees of freedom. Bearing types include high-speed ball, cylindrical roller, and tapered roller bearings. COBRA-AHS is the first major upgrade in 30 years of such commercially available bearing software. The upgrade was developed under a Small Business Innovation Research contract from the NASA Glenn Research Center, and incorporates the results of 30 years of NASA and industry bearing research and technology.
NASA Technical Reports Server (NTRS)
Elfer, N.; Meibaum, R.; Olsen, G.
1995-01-01
A unique collection of computer codes, Space Debris Surfaces (SD_SURF), have been developed to assist in the design and analysis of space debris protection systems. SD_SURF calculates and summarizes a vehicle's vulnerability to space debris as a function of impact velocity and obliquity. An SD_SURF analysis will show which velocities and obliquities are the most probable to cause a penetration. This determination can help the analyst select a shield design that is best suited to the predominant penetration mechanism. The analysis also suggests the most suitable parameters for development or verification testing. The SD_SURF programs offer the option of either FORTRAN programs or Microsoft-EXCEL spreadsheets and macros. The FORTRAN programs work with BUMPERII. The EXCEL spreadsheets and macros can be used independently or with selected output from the SD_SURF FORTRAN programs. Examples will be presented of the interaction between space vehicle geometry, the space debris environment, and the penetration and critical damage ballistic limit surfaces of the shield under consideration.
Shenoy, Archana; Blelloch, Robert
2009-09-11
The Microprocessor, containing the RNA binding protein Dgcr8 and the RNase III enzyme Drosha, is responsible for processing primary microRNAs to precursor microRNAs. The Microprocessor regulates its own levels by cleaving hairpins in the 5'UTR and coding region of the Dgcr8 mRNA, thereby destabilizing the mature transcript. To determine whether the Microprocessor has a broader role in directly regulating other coding mRNA levels, we integrated results from expression profiling and ultra-high-throughput deep sequencing of small RNAs. Expression analysis of mRNAs in wild-type, Dgcr8 knockout, and Dicer knockout mouse embryonic stem (ES) cells uncovered mRNAs that were specifically upregulated in the Dgcr8 null background. A number of these transcripts had evolutionarily conserved predicted hairpin targets for the Microprocessor. However, analysis of deep sequencing data of 18- to 200-nt small RNAs in mouse ES, HeLa, and HepG2 cells indicates that exonic sequence reads that map in a pattern consistent with Microprocessor activity are unique to Dgcr8. We conclude that the Microprocessor's role in directly destabilizing coding mRNAs is likely specifically targeted to Dgcr8 itself, suggesting a specialized cellular mechanism for gene auto-regulation.
Quasi-Static Indentation Analysis of Carbon-Fiber Laminates.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Briggs, Timothy; English, Shawn Allen; Nelson, Stacy Michelle
2015-12-01
A series of quasi-static indentation experiments are conducted on carbon fiber reinforced polymer laminates with a systematic variation of thicknesses and fixture boundary conditions. Different deformation mechanisms and their resulting damage mechanisms are activated by changing the thickness and boundary conditions. The quasi-static indentation experiments have been shown to achieve damage mechanisms similar to impact and penetration, however without strain rate effects. The low rate allows for detailed analysis of the load response. Moreover, interrupted tests allow for the incremental analysis of various damage mechanisms and progressions. The experimentally tested specimens are non-destructively evaluated (NDE) with optical imaging, ultrasonics and computed tomography. The load displacement responses and the NDE are then utilized in numerical simulations for the purpose of model validation and vetting. The accompanying numerical simulation work serves two purposes. First, the results further reveal the time sequence of events and the meaning behind load drops not clear from NDE. Second, the simulations demonstrate insufficiencies in the code and can then direct future efforts for development.
Design and Analysis of Boiler Pressure Vessels based on IBR codes
NASA Astrophysics Data System (ADS)
Balakrishnan, B.; Kanimozhi, B.
2017-05-01
Pressure vessel components are widely used in thermal and nuclear power plants for generating steam based on the principles of heat transfer. In a thermal power plant, coal is burnt inside the boiler furnace to generate heat. The heat produced through the combustion of pulverized coal drives the phase change (i.e., water into superheated steam) in the pressure parts components. Pressure vessels are designed per the standards and codes of the country where the boiler is to be installed. One of the standards followed in designing pressure parts is that of ASME (American Society of Mechanical Engineers), whose mandatory requirements must be satisfied by the manufacturer. In our case, a shell/pipe manufactured per the ASME code had an issue during hole drilling: the drilled holes should match the drawing, but due to an error the diameter exceeded that in the approved design calculation. To rectify this error, we added a reinforcement pad to the drilled hole and modified the design of the header in accordance with the code requirements.
Muharam, Yuswan; Warnatz, Jürgen
2007-08-21
A mechanism generator code that automatically generates mechanisms for the oxidation of large hydrocarbons has been successfully modified and considerably expanded in this work. The modifications were (1) improvement of existing rules, such as cyclic-ether reactions and aldehyde reactions, (2) inclusion of additional rules in the code, such as ketone reactions, hydroperoxy cyclic-ether formations and additional reactions of alkenes, and (3) addition to the handwritten C(1)-C(4) sub-mechanism of small oxygenates that are produced by the code but were not yet included there. To evaluate mechanisms generated by the code, simulations of observed results in different experimental environments have been carried out. Experimentally derived and numerically predicted ignition delays of n-heptane-air and n-decane-air mixtures in high-pressure shock tubes agree very well over a wide range of temperatures, pressures and equivalence ratios. Concentration profiles of the main products and intermediates of n-heptane and n-decane oxidation in jet-stirred reactors over a wide range of temperatures and equivalence ratios are generally well reproduced. In addition, the ignition delay times of different normal alkanes were numerically studied.
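Ignition delay data of the kind described are commonly summarized with an Arrhenius-type correlation, tau = A·p^(−n)·exp(Ta/T), which is linear in its unknowns after taking logarithms. The sketch below fits such a correlation by least squares on synthetic data; the functional form and all parameter values are assumptions for illustration, not values from the paper.

```python
import numpy as np

# Assumed correlation form and parameters (illustrative only):
# tau = A * p**(-n) * exp(Ta / T)
A_true, n_true, Ta_true = 1.0e-6, 1.1, 15000.0

rng = np.random.default_rng(0)
T = rng.uniform(900.0, 1400.0, 40)   # temperature, K
p = rng.uniform(10.0, 50.0, 40)      # pressure, bar
tau = A_true * p**(-n_true) * np.exp(Ta_true / T)

# ln tau = ln A - n ln p + Ta * (1/T): linear in (ln A, -n, Ta),
# so an ordinary least-squares fit recovers the correlation parameters
X = np.column_stack([np.ones_like(T), np.log(p), 1.0 / T])
coef, *_ = np.linalg.lstsq(X, np.log(tau), rcond=None)
lnA, minus_n, Ta_fit = coef
```

Because the synthetic data follow the assumed model exactly, the fit recovers the assumed parameters; with real shock-tube data the residuals would quantify how well a single correlation spans the temperature, pressure and equivalence-ratio ranges.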
Qiu, Guo-Hua
2016-01-01
In this review, the protective function of the abundant non-coding DNA in the eukaryotic genome is discussed from the perspective of genome defense against exogenous nucleic acids. Peripheral non-coding DNA has been proposed to act as a bodyguard that protects the genome and the central protein-coding sequences from ionizing radiation (IR)-induced DNA damage. In the proposed mechanism of protection, the radicals generated by water radiolysis in the cytosol and the IR energy are absorbed, blocked and/or reduced by peripheral heterochromatin; then, the DNA damage sites in the heterochromatin are removed and expelled from the nucleus to the cytoplasm through nuclear pore complexes, most likely through the formation of extrachromosomal circular DNA. To strengthen this hypothesis, this review summarizes the experimental evidence supporting the protective function of non-coding DNA against exogenous nucleic acids. Based on these data, I hypothesize herein that small RNAs in the cytosol form an additional line of defense, complementing their bodyguard protection mechanism in the nucleus. Exogenous nucleic acids may thus be initially inactivated in the cytosol by small RNAs generated from non-coding DNA via mechanisms similar to the prokaryotic CRISPR-Cas system. Exogenous nucleic acids may enter the nucleus, where some are absorbed and/or blocked by heterochromatin and others integrate into chromosomes. The integrated fragments and the sites of DNA damage are removed by repetitive non-coding DNA elements in the heterochromatin and excluded from the nucleus. Therefore, the normal eukaryotic genome and the central protein-coding sequences are triply protected by non-coding DNA against invasion by exogenous nucleic acids. This review provides evidence supporting the protective role of non-coding DNA in genome defense. Copyright © 2016 Elsevier B.V. All rights reserved.
2011-01-01
Background: Human Rhinoviruses (HRVs) are well recognized viral pathogens associated with acute respiratory tract illnesses (RTIs) abundant worldwide. Although recent studies have phylogenetically identified the new HRV species (HRV-C), data on molecular epidemiology, genetic diversity, and clinical manifestation have been limited. Results: To gain new insight into HRV genetic diversity, we determined the complete coding sequences of putative new members of HRV species C (HRV-CU072, with 1% prevalence) and HRV-B (HRV-CU211) identified from clinical specimens collected from pediatric patients diagnosed with symptoms of acute lower RTI. Complete coding sequence and phylogenetic analysis revealed that the HRV-CU072 strain shared a recent common ancestor with the most closely related Chinese strain (N4). Comparative analysis at the protein level showed that HRV-CU072 might accumulate substitutional mutations in structural proteins, as well as in the nonstructural proteins 3C and 3D. Comparative analysis of all available HRVs and HEVs indicated that HRV-C contains a relatively high G+C content and is more closely related to HEV-D. This might be correlated with their replication and capability to adapt to the high temperature environment of the human lower respiratory tract. We herein report an infrequently occurring intra-species recombination event in HRV-B species (HRV-CU211), with a crossover having taken place at the boundary of the VP2 and VP3 genes. Moreover, we observed phylogenetic compatibility in all HRV species and suggest that dynamic mechanisms for HRV evolution seem to be related to recombination events. These findings indicated that the elementary units shaping the genetic diversity of HRV-C could be found in the nonstructural 2A and 3D genes. Conclusion: This study provides information for understanding HRV genetic diversity and insight into the role of selection pressure and recombination mechanisms influencing HRV evolution. PMID:21214911
Linsuwanon, Piyada; Payungporn, Sunchai; Suwannakarn, Kamol; Chieochansin, Thaweesak; Theamboonlers, Apiradee; Poovorawan, Yong
2011-01-07
Human Rhinoviruses (HRVs) are well recognized viral pathogens associated with acute respiratory tract illnesses (RTIs) abundant worldwide. Although recent studies have phylogenetically identified the new HRV species (HRV-C), data on molecular epidemiology, genetic diversity, and clinical manifestation have been limited. To gain new insight into HRV genetic diversity, we determined the complete coding sequences of putative new members of HRV species C (HRV-CU072, with 1% prevalence) and HRV-B (HRV-CU211) identified from clinical specimens collected from pediatric patients diagnosed with symptoms of acute lower RTI. Complete coding sequence and phylogenetic analysis revealed that the HRV-CU072 strain shared a recent common ancestor with the most closely related Chinese strain (N4). Comparative analysis at the protein level showed that HRV-CU072 might accumulate substitutional mutations in structural proteins, as well as in the nonstructural proteins 3C and 3D. Comparative analysis of all available HRVs and HEVs indicated that HRV-C contains a relatively high G+C content and is more closely related to HEV-D. This might be correlated with their replication and capability to adapt to the high temperature environment of the human lower respiratory tract. We herein report an infrequently occurring intra-species recombination event in HRV-B species (HRV-CU211), with a crossover having taken place at the boundary of the VP2 and VP3 genes. Moreover, we observed phylogenetic compatibility in all HRV species and suggest that dynamic mechanisms for HRV evolution seem to be related to recombination events. These findings indicated that the elementary units shaping the genetic diversity of HRV-C could be found in the nonstructural 2A and 3D genes. This study provides information for understanding HRV genetic diversity and insight into the role of selection pressure and recombination mechanisms influencing HRV evolution.
Li, Jia; Liu, Fei; Wang, Qi; Ge, Pupu; Woo, Patrick C. Y.; Yan, Jinghua; Zhao, Yanlin; Gao, George F.; Liu, Cui Hua; Liu, Changting
2014-01-01
The emergence and rapid spread of New Delhi Metallo-beta-lactamase-1 (NDM-1)-producing Klebsiella pneumoniae strains has caused great concern worldwide. To better understand the mechanisms underlying the environmental adaptation of these highly drug-resistant K. pneumoniae strains, we took advantage of China's Shenzhou 10 spacecraft mission to conduct comparative genomic and transcriptomic analyses of an NDM-1 K. pneumoniae strain (ATCC BAA-2146) cultivated under different conditions. The samples were recovered from semisolid medium placed on the ground (D strain), in a simulated space condition (M strain), or in the Shenzhou 10 spacecraft (T strain) for analysis. Our data revealed multiple variations underlying pathogen adaptation to different environments in terms of changes in morphology, H2O2 tolerance and biofilm formation ability, genomic stability, and regulation of metabolic pathways. Additionally, we found a few non-coding RNAs to be differentially regulated. The results are helpful for better understanding the adaptive mechanisms of drug-resistant bacterial pathogens. PMID:25163721
Analysis of woven and braided fabric reinforced composites
NASA Technical Reports Server (NTRS)
Naik, Rajiv A.
1994-01-01
A general purpose micromechanics analysis that discretely models the yarn architecture within the textile repeating unit cell was developed to predict overall, three dimensional, thermal and mechanical properties. This analytical technique was implemented in a user-friendly, personal computer-based, Windows-compatible code called Textile Composite Analysis for Design (TEXCAD). TEXCAD was used to analyze plain, 5-harness satin, and 8-harness satin weave composites along with 2-D braided and 2x2, 2-D triaxial braided composites. The calculated overall stiffnesses correlated well with available 3-D finite element results and test data for both the woven and the braided composites. Parametric studies were performed to investigate the effects of yarn size on the yarn crimp and the overall thermal and mechanical constants for plain weave composites. The effects of braid angle were investigated for the 2-D braided composites. Finally, the effects of fiber volume fraction on the yarn undulations and the thermal and mechanical properties of 2x2, 2-D triaxial braided composites were also investigated.
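For orientation, the simplest micromechanics estimates that codes like TEXCAD refine are the Voigt and Reuss rules of mixtures for a unidirectional yarn or ply. The sketch below uses assumed carbon-fiber and epoxy moduli and is illustrative only; TEXCAD models the discrete yarn architecture rather than these closed forms.

```python
# Voigt (longitudinal) and Reuss (transverse) rule-of-mixtures bounds
# for a unidirectional fiber/matrix mix; all property values are assumed.
def rule_of_mixtures(Ef, Em, Vf):
    E1 = Vf * Ef + (1.0 - Vf) * Em          # longitudinal modulus (Voigt)
    E2 = 1.0 / (Vf / Ef + (1.0 - Vf) / Em)  # transverse modulus (Reuss)
    return E1, E2

# Assumed values: carbon fiber 230 GPa, epoxy 3.5 GPa, 60% fiber volume
E1, E2 = rule_of_mixtures(Ef=230e9, Em=3.5e9, Vf=0.6)
```

The large gap between the two estimates (about 139 GPa longitudinal versus about 9 GPa transverse here) is why yarn crimp and undulation, which rotate fibers away from the load axis, matter so much in woven and braided composites.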
Benchmarking of Computational Models for NDE and SHM of Composites
NASA Technical Reports Server (NTRS)
Wheeler, Kevin; Leckey, Cara; Hafiychuk, Vasyl; Juarez, Peter; Timucin, Dogan; Schuet, Stefan; Hafiychuk, Halyna
2016-01-01
Ultrasonic wave phenomena constitute the leading physical mechanism for nondestructive evaluation (NDE) and structural health monitoring (SHM) of solid composite materials such as carbon-fiber-reinforced polymer (CFRP) laminates. Computational models of ultrasonic guided-wave excitation, propagation, scattering, and detection in quasi-isotropic laminates can be extremely valuable in designing practically realizable NDE and SHM hardware and software with desired accuracy, reliability, efficiency, and coverage. This paper presents comparisons of guided-wave simulations for CFRP composites implemented using three different simulation codes: two commercial finite-element analysis packages, COMSOL and ABAQUS, and a custom code implementing the Elastodynamic Finite Integration Technique (EFIT). Comparisons are also made to experimental laser Doppler vibrometry data and theoretical dispersion curves.
NASA Technical Reports Server (NTRS)
Moore, James; Marty, Dave; Cody, Joe
2000-01-01
SRS and NASA/MSFC have developed software with unique capabilities to couple bearing kinematic modeling with high-fidelity thermal modeling. The core thermomechanical modeling software was developed by SRS and others in the late 1980's and early 1990's under various contractual efforts. SRS originally developed software that enabled SHABERTH (Shaft Bearing Thermal Model) and SINDA (Systems Improved Numerical Differencing Analyzer) to exchange data autonomously, allowing bearing component temperature effects to propagate into the steady-state bearing mechanical model. A separate contract was issued in 1990 to create a personal computer version of the software. At that time SRS performed major improvements to the code. Both SHABERTH and SINDA were independently ported to the PC and compiled. SRS then integrated the two programs into a single program named SINSHA. This was a major code improvement.
NASA Technical Reports Server (NTRS)
Tescher, Andrew G. (Editor)
1989-01-01
Various papers on image compression and automatic target recognition are presented. Individual topics addressed include: target cluster detection in cluttered SAR imagery, model-based target recognition using laser radar imagery, Smart Sensor front-end processor for feature extraction of images, object attitude estimation and tracking from a single video sensor, symmetry detection in human vision, analysis of high resolution aerial images for object detection, obscured object recognition for an ATR application, neural networks for adaptive shape tracking, statistical mechanics and pattern recognition, detection of cylinders in aerial range images, moving object tracking using local windows, new transform method for image data compression, quad-tree product vector quantization of images, predictive trellis encoding of imagery, reduced generalized chain code for contour description, compact architecture for a real-time vision system, use of human visibility functions in segmentation coding, color texture analysis and synthesis using Gibbs random fields.
PetIGA: A framework for high-performance isogeometric analysis
Dalcin, Lisandro; Collier, Nathaniel; Vignal, Philippe; ...
2016-05-25
We present PetIGA, a code framework to approximate the solution of partial differential equations using isogeometric analysis. PetIGA can be used to assemble matrices and vectors which come from a Galerkin weak form, discretized with Non-Uniform Rational B-spline basis functions. We base our framework on PETSc, a high-performance library for the scalable solution of partial differential equations, which simplifies the development of large-scale scientific codes, provides a rich environment for prototyping, and separates parallelism from algorithm choice. We describe the implementation of PetIGA, and exemplify its use by solving a model nonlinear problem. To illustrate the robustness and flexibility of PetIGA, we solve some challenging nonlinear partial differential equations that include problems in both solid and fluid mechanics. Lastly, we show strong scaling results on up to 4096 cores, which confirm the suitability of PetIGA for large scale simulations.
NASA Technical Reports Server (NTRS)
Sang, Janche
2003-01-01
Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective by modeling, and through simulation to predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop a large-scale, detailed simulation capability for the analysis and design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling, namely the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity, require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's modeling and simulation for characterizing fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment for programmers to easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed for evaluation of this technology's use over the next two to three years.
Rodriguez-Horta, Edwin; Estevez-Rams, Ernesto; Lora-Serrano, Raimundo; Neder, Reinhard
2017-09-01
This is the second contribution in a series of papers dealing with dynamical models in equilibrium theories of polytypism. A Hamiltonian introduced by Ahmad & Khan [Phys. Status Solidi B (2000), 218, 425-430] avoids the unphysical assignment of interaction terms to fictitious entities given by spins in the Hägg coding of the stacking arrangement. In this paper an analysis of polytype generation and disorder in close-packed structures is made for such a Hamiltonian. Results are compared with a previous analysis using the Ising model. Computational mechanics is the framework under which the analysis is performed. The competing effects of disorder and structure, as given by entropy density and excess entropy, respectively, are discussed. It is argued that the Ahmad & Khan model is simpler and predicts a larger set of polytypes than previous treatments.
The accuracy of burn diagnosis codes in health administrative data: A validation study.
Mason, Stephanie A; Nathens, Avery B; Byrne, James P; Fowler, Rob; Gonzalez, Alejandro; Karanicolas, Paul J; Moineddin, Rahim; Jeschke, Marc G
2017-03-01
Health administrative databases may provide rich sources of data for the study of outcomes following burn. We aimed to determine the accuracy of International Classification of Diseases diagnoses codes for burn in a population-based administrative database. Data from a regional burn center's clinical registry of patients admitted between 2006-2013 were linked to administrative databases. Burn total body surface area (TBSA), depth, mechanism, and inhalation injury were compared between the registry and administrative records. The sensitivity, specificity, and positive and negative predictive values were determined, and coding agreement was assessed with the kappa statistic. 1215 burn center patients were linked to administrative records. TBSA codes were highly sensitive and specific for ≥10 and ≥20% TBSA (89/93% sensitive and 95/97% specific), with excellent agreement (κ, 0.85/κ, 0.88). Codes were weakly sensitive (68%) in identifying ≥10% TBSA full-thickness burn, though highly specific (86%) with moderate agreement (κ, 0.46). Codes for inhalation injury had limited sensitivity (43%) but high specificity (99%) with moderate agreement (κ, 0.54). Burn mechanism had excellent coding agreement (κ, 0.84). Administrative data diagnosis codes accurately identify burn by burn size and mechanism, while identification of inhalation injury or full-thickness burns is less sensitive but highly specific. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
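The agreement statistics reported here can all be computed from a two-by-two table of administrative code status against the registry gold standard. The sketch below uses hypothetical counts, not the study's data, to show how sensitivity, specificity, the predictive values and Cohen's kappa are derived.

```python
# Hypothetical 2x2 agreement table: administrative code vs. burn registry
# (illustrative counts only, not the study's data)
tp, fp, fn, tn = 85, 10, 12, 1108   # true/false positives, false/true negatives
n = tp + fp + fn + tn

sensitivity = tp / (tp + fn)        # fraction of registry cases the code finds
specificity = tn / (tn + fp)        # fraction of non-cases the code excludes
ppv = tp / (tp + fp)                # positive predictive value
npv = tn / (tn + fn)                # negative predictive value

# Cohen's kappa: agreement corrected for chance
po = (tp + tn) / n                                           # observed agreement
pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
kappa = (po - pe) / (1 - pe)
```

With these example counts the code is roughly 88% sensitive and 99% specific with kappa near 0.88, broadly the pattern the study reports for the large-TBSA codes.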
Villegas, Victoria E; Rahman, Mohammed Ferdous-Ur; Fernandez-Barrena, Maite G; Diao, Yumei; Liapi, Eleni; Sonkoly, Enikö; Ståhle, Mona; Pivarcsi, Andor; Annaratone, Laura; Sapino, Anna; Ramírez Clavijo, Sandra; Bürglin, Thomas R; Shimokawa, Takashi; Ramachandran, Saraswathi; Kapranov, Philipp; Fernandez-Zapico, Martin E; Zaphiropoulos, Peter G
2014-07-01
Non-coding RNAs are a complex class of nucleic acids, with growing evidence supporting regulatory roles in gene expression. Here we identify a non-coding RNA located head-to-head with the gene encoding the Glioma-associated oncogene 1 (GLI1), a transcriptional effector of multiple cancer-associated signaling pathways. The expression of this three-exon GLI1 antisense (GLI1AS) RNA in cancer cells was concordant with GLI1 levels. siRNA knockdown of GLI1AS up-regulated GLI1 and increased cellular proliferation and tumor growth in a xenograft model system. Conversely, GLI1AS overexpression decreased the levels of GLI1, its target genes PTCH1 and PTCH2, and cellular proliferation. Additionally, we demonstrate that GLI1 knockdown reduced GLI1AS, while GLI1 overexpression increased GLI1AS, supporting the role of GLI1AS as a target gene of the GLI1 transcription factor. Activation of TGFβ and Hedgehog signaling, two known regulators of GLI1 expression, conferred a concordant up-regulation of GLI1 and GLI1AS in cancer cells. Finally, analysis of the mechanism underlying the interplay between GLI1 and GLI1AS indicates that the non-coding RNA elicits a local alteration of chromatin structure by increasing the silencing mark H3K27me3 and decreasing the recruitment of RNA polymerase II to this locus. Taken together, the data demonstrate the existence of a novel non-coding RNA-based negative feedback loop controlling GLI1 levels, thus expanding the repertoire of mechanisms regulating the expression of this oncogenic transcription factor. Copyright © 2014 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
Zhou, Yingbiao; Zhu, Yueming; Dai, Longhai; Men, Yan; Wu, Jinhai; Zhang, Juankun; Sun, Yuanxia
2017-01-01
Melibiose is widely used as a functional carbohydrate. Whole-cell biocatalytic production of melibiose from raffinose could reduce its cost. However, the characteristics of strains suited to whole-cell biocatalysis and the mechanism of the process are unclear. We compared three different Saccharomyces cerevisiae strains (liquor, wine, and baker's yeasts) in terms of concentration variations of the substrate (raffinose), target product (melibiose), and by-products (fructose and galactose) during the whole-cell biocatalysis process. A distinct difference in whole-cell catalytic efficiency was observed among the three strains. Furthermore, the activities of key enzymes (invertase, α-galactosidase, and fructose transporter) involved in the process and the expression levels of their coding genes (suc2, mel1, and fsy1) were investigated. The conservation of the key genes among S. cerevisiae strains was also evaluated. The results show that the whole-cell catalytic efficiency of S. cerevisiae on the raffinose substrate was closely related to the activity of the key enzymes and the expression of their coding genes. Finally, we summarize the characteristics of producing strains that offer advantages, as well as the contributions of the key genes to superior strains. We also present a dynamic mechanism model to provide mechanistic insight into this whole-cell biocatalytic process. This pioneering study should contribute to the improvement of whole-cell biocatalytic production of melibiose from raffinose.
Fracture modes in notched angleplied composite laminates
NASA Technical Reports Server (NTRS)
Irvine, T. B.; Ginty, C. A.
1984-01-01
The Composite Durability Structural Analysis (CODSTRAN) computer code is used to determine composite fracture. Fracture modes in solid and notched, unidirectional and angleplied graphite/epoxy composites were determined by using CODSTRAN. Experimental verification included both nondestructive (ultrasonic C-Scanning) and destructive (scanning electron microscopy) techniques. The fracture modes were found to be a function of ply orientations and whether the composite is notched or unnotched. Delaminations caused by stress concentrations around notch tips were also determined. Results indicate that the composite mechanics, structural analysis, laminate analysis, and fracture criteria modules embedded in CODSTRAN are valid for determining composite fracture modes.
A Primer on Architectural Level Fault Tolerance
NASA Technical Reports Server (NTRS)
Butler, Ricky W.
2008-01-01
This paper introduces the fundamental concepts of fault tolerant computing. Key topics covered are voting, fault detection, clock synchronization, Byzantine Agreement, diagnosis, and reliability analysis. Low level mechanisms such as Hamming codes or low level communications protocols are not covered. The paper is tutorial in nature and does not cover any topic in detail. The focus is on rationale and approach rather than detailed exposition.
Guidelines and Metrics for Assessing Space System Cost Estimates
2008-01-01
[Fragmented OCR extract of report tables: cost-estimate factors (analysis time; reuse of tooling and models; mechanical ground-support equipment [MGSE]; high mass margin, with simplifying assumptions used to bound the solution; engineering environment changes; high reuse of architecture, design, tools, code, test scripts, and commercial real-time operating systems; simplified life...) and an acronym list (...Coronal Explorer; TWTA, traveling wave tube amplifier; USAF, U.S. Air Force; USCM, Unmanned Space Vehicle Cost Model; USN, U.S. Navy; UV, ultraviolet; UVOT, UV...)]
Asteroseismology and mass loss in Be stars. Study with CoRoT
NASA Astrophysics Data System (ADS)
Diago, P. D.
The general aim of this work is the study of Be stars with the CoRoT space mission. The mechanisms responsible for the production and dynamics of the circumstellar gas in Be stars are still not constrained. Observations of non-radial pulsation beating phenomena connected to outbursts point toward the relevance of pulsation, but this mechanism cannot be generalized. In this regard, the observation of classical Be stars with the high-precision CoRoT satellite is providing important keys to understanding the physics of these objects and the nature of the Be phenomenon. To study the light variations of the selected stars we use photometric and spectroscopic observations, which allow us to extract the frequencies, amplitudes, and phases of these variations. As we will show, these light variations can be connected with pulsations on the stellar surface. To carry out the frequency analysis we have developed a new code based on standard Fourier analysis. This code, called PASPER, allows the frequency analysis of large sets of light curves in an automatic mode. This Ph.D. thesis is arranged as follows. The first three chapters describe the scientific framework of the project, giving a brief description of asteroseismology, presenting the current status of Be star research, and describing the basics of Fourier analysis and the rudiments of time series analysis. At the beginning of this Ph.D. thesis, the CoRoT satellite was still on the ground being readied for launch. In this context, we performed a search for short-period B and Be star variables in the low-metallicity environment of the Magellanic Clouds; this study constitutes Part I of the thesis. It has a double goal: i) to test the frequency analysis codes; and ii) to detect observationally beta Cephei and SPB-like B-type pulsators in low-metallicity environments, which are not predicted by current pulsational theory and models.
Part II is devoted to the study of Be stars with the CoRoT space mission. Here we give a complete review of the CoRoT mission. We also describe the results of the analysis of three Be stars from the CoRoT exoplanet field. Finally, we present the results of the frequency analysis of the late Be star HD 50209, observed in the seismology field of the CoRoT satellite. The analysis of this Be star has revealed up to sixty frequencies, grouped in six different and separated sets, attributed to g-mode pulsations. We close by summarizing the main conclusions of the whole project, including prospects and future work. An addendum with all the published results derived from this project has been added at the end of Part II. Part III encloses the appendixes, providing a brief summary of this work in Spanish, a complete description of the basic equations of non-radial oscillation, the user guide of the PASPER code, and the user guide of the KURTZ_BOS code.
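The Fourier approach the thesis describes can be illustrated on synthetic data. The sketch below recovers a single pulsation frequency from an unevenly sampled light curve with a classical Deeming-style periodogram; all numbers are hypothetical, and this is not the actual PASPER code:

```python
import numpy as np

# Synthetic light curve: one sinusoidal pulsation plus noise (hypothetical values).
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 20.0, 400))   # uneven observation times, days
f_true = 1.37                              # pulsation frequency, cycles/day
y = 0.01 * np.sin(2 * np.pi * f_true * t) + 0.002 * rng.standard_normal(t.size)

# Classical (Deeming) periodogram: |sum_k y_k exp(2*pi*i*f*t_k)|^2 / N,
# evaluated on a dense frequency grid via broadcasting.
freqs = np.linspace(0.05, 5.0, 5000)
dft = np.exp(2j * np.pi * freqs[:, None] * t[None, :]) @ y
power = np.abs(dft) ** 2 / t.size

f_best = freqs[np.argmax(power)]           # recovered frequency, close to f_true
```

In an automated pipeline such as PASPER, a loop of this kind is typically iterated: the strongest frequency is found, a sinusoid is fitted and subtracted (prewhitening), and the search repeats until no significant peak remains.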
Thermo-hydro-mechanical-chemical processes in fractured-porous media: Benchmarks and examples
NASA Astrophysics Data System (ADS)
Kolditz, O.; Shao, H.; Görke, U.; Kalbacher, T.; Bauer, S.; McDermott, C. I.; Wang, W.
2012-12-01
The book comprises an assembly of benchmarks and examples for porous media mechanics collected over the last twenty years. Analysis of thermo-hydro-mechanical-chemical (THMC) processes is essential to many applications in environmental engineering, such as geological waste deposition, geothermal energy utilisation, carbon capture and storage, water resources management, hydrology, and even climate change. In order to assess the feasibility as well as the safety of geotechnical applications, process-based modelling is the only tool for putting numbers to, i.e. quantifying, future scenarios. This places a huge responsibility on the reliability of computational tools. Benchmarking is an appropriate methodology for verifying the quality of modelling tools based on best practices. Moreover, benchmarking and code comparison foster community efforts. The benchmark book is part of the OpenGeoSys initiative - an open source project to share knowledge and experience in environmental analysis and scientific computation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mcmanus, H.L.; Chamis, C.C.
1996-01-01
This report describes analytical methods for calculating stresses and damage caused by degradation of the matrix constituent in polymer matrix composite materials. Laminate geometry, material properties, and matrix degradation states are specified as functions of position and time. Matrix shrinkage and property changes are modeled as functions of the degradation states. The model is incorporated into an existing composite mechanics computer code. Stresses, strains, and deformations at the laminate, ply, and micro levels are calculated, and from these calculations it is determined if there is failure of any kind. The rationale for the model (based on published experimental work) is presented, its integration into the laminate analysis code is outlined, and example results are given, with comparisons to existing material and structural data. The mechanisms behind the changes in properties and in surface cracking during long-term aging of polyimide matrix composites are clarified. High-temperature-material test methods are also evaluated.
Gallistel, C. R.; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam
2014-01-01
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer. PMID:24637442
Gallistel, C R; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam
2014-02-26
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Data Documentation for Navy Civilian Manpower Study,
1986-09-01
Engineering; 0830 Mechanical Engineer; 0840 Nuclear Engineering; 0850 Electrical Engineering; 0855 Electronics Engineering; 0856 Electronics ... OCCUPATIONAL LEVEL (DONOL) CODES - DONOL code / Title: 1060 Engineering Drafting; 1061 Electronics Technician; 1062 Engineering Technician; 1063 Industrial ... Architect; 2314 Electrical Engineer; 2315 Electronic Engineer; 2316 Industrial Engineer; 2317 Mechanical Engineer; 2318
NASA Astrophysics Data System (ADS)
Uwaba, Tomoyuki; Ito, Masahiro; Nemoto, Junichi; Ichikawa, Shoichi; Katsuyama, Kozo
2014-09-01
The BAMBOO computer code was verified using results from an out-of-pile bundle compression test involving large-diameter pin bundle deformation under the bundle-duct interaction (BDI) condition. The pin diameters of the examined test bundles were 8.5 mm and 10.4 mm, which are targeted as preliminary fuel pin diameters for the upgraded core of the prototype fast breeder reactor (FBR) and for the demonstration and commercial FBRs studied in the FaCT project. In the bundle compression test, bundle cross-sectional views were obtained from X-ray computed tomography (CT) images, and local parameters of bundle deformation such as pin-to-duct and pin-to-pin clearances were measured by CT image analyses. In the verification, calculation results of bundle deformation obtained by the BAMBOO code analyses were compared with the experimental results from the CT image analyses. The comparison showed that the BAMBOO code reasonably predicts the deformation of large-diameter pin bundles under the BDI condition by assuming that pin bowing and cladding oval distortion are the major deformation mechanisms, the same as in the case of small-diameter pin bundles. In addition, the BAMBOO analysis results confirmed that cladding oval distortion effectively suppresses BDI in large-diameter pin bundles as well as in small-diameter ones.
Clinical evaluation of BrainTree, a motor imagery hybrid BCI speller
NASA Astrophysics Data System (ADS)
Perdikis, S.; Leeb, R.; Williamson, J.; Ramsay, A.; Tavella, M.; Desideri, L.; Hoogerwerf, E.-J.; Al-Khodairy, A.; Murray-Smith, R.; Millán, J. d. R.
2014-06-01
Objective. While brain-computer interfaces (BCIs) for communication have reached considerable technical maturity, there is still a great need for state-of-the-art evaluation by the end-users outside laboratory environments. To achieve this primary objective, it is necessary to augment a BCI with a series of components that allow end-users to type text effectively. Approach. This work presents the clinical evaluation of a motor imagery (MI) BCI text-speller, called BrainTree, by six severely disabled end-users and ten able-bodied users. Additionally, we define a generic model of code-based BCI applications, which serves as an analytical tool for evaluation and design. Main results. We show that all users achieved remarkable usability and efficiency outcomes in spelling. Furthermore, our model-based analysis highlights the added value of human-computer interaction techniques and hybrid BCI error-handling mechanisms, and reveals the effects of BCI performance on usability and efficiency in code-based applications. Significance. This study demonstrates the usability potential of code-based MI spellers, with BrainTree being the first to be evaluated by a substantial number of end-users, establishing them as a viable, competitive alternative to other popular BCI spellers. Another major outcome of our model-based analysis is the derivation of an 80% minimum command accuracy requirement for successful code-based application control, revising upward previous estimates in the literature.
NASA Technical Reports Server (NTRS)
Kiris, Cetin
1995-01-01
An incompressible Navier-Stokes solution procedure was developed for the analysis of liquid rocket engine pump components and mechanical heart assist devices. The solution procedure for the propulsion systems is applicable to incompressible Navier-Stokes flows in a steadily rotating frame of reference for any general complex configuration. The computer codes were tested on different complex configurations such as liquid rocket engine inducers and impellers. As a spin-off technology from the turbopump component simulations, a flow analysis for an axial heart pump was conducted. The baseline Left Ventricular Assist Device (LVAD) design was improved by adding an inducer geometry adapted from the liquid rocket engine pump. The time-accurate mode of the incompressible Navier-Stokes code was validated against the flapping foil experiment using different domain decomposition methods. In the flapping foil experiment, two upstream NACA 0025 foils perform high-frequency synchronized motion and generate unsteady flow conditions for a larger stationary foil downstream. Fairly good agreement was obtained between the unsteady experimental data and numerical results from two different moving boundary procedures. The incompressible Navier-Stokes code (INS3D) has been extended for heat transfer applications. The temperature equation was written for both forced and natural convection phenomena. Flow in a square duct was used for the validation of the code in both natural and forced convection.
Clinical evaluation of BrainTree, a motor imagery hybrid BCI speller.
Perdikis, S; Leeb, R; Williamson, J; Ramsay, A; Tavella, M; Desideri, L; Hoogerwerf, E-J; Al-Khodairy, A; Murray-Smith, R; Millán, J D R
2014-06-01
While brain-computer interfaces (BCIs) for communication have reached considerable technical maturity, there is still a great need for state-of-the-art evaluation by the end-users outside laboratory environments. To achieve this primary objective, it is necessary to augment a BCI with a series of components that allow end-users to type text effectively. This work presents the clinical evaluation of a motor imagery (MI) BCI text-speller, called BrainTree, by six severely disabled end-users and ten able-bodied users. Additionally, we define a generic model of code-based BCI applications, which serves as an analytical tool for evaluation and design. We show that all users achieved remarkable usability and efficiency outcomes in spelling. Furthermore, our model-based analysis highlights the added value of human-computer interaction techniques and hybrid BCI error-handling mechanisms, and reveals the effects of BCI performance on usability and efficiency in code-based applications. This study demonstrates the usability potential of code-based MI spellers, with BrainTree being the first to be evaluated by a substantial number of end-users, establishing them as a viable, competitive alternative to other popular BCI spellers. Another major outcome of our model-based analysis is the derivation of an 80% minimum command accuracy requirement for successful code-based application control, revising upward previous estimates in the literature.
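The intuition behind a minimum command-accuracy requirement can be shown with a toy model (a simplification of ours, not the paper's actual analytical model): if selecting one symbol in a code-based speller requires a sequence of correct BCI commands, and errors are independent with no error-handling mechanism, the success probability decays geometrically with code length.

```python
# Toy model: a symbol selection needs n_cmds consecutive correct BCI commands;
# with per-command accuracy p_cmd and no error handling, all must succeed.
# (Independence assumption for illustration only.)
def code_success_prob(p_cmd: float, n_cmds: int) -> float:
    return p_cmd ** n_cmds

# Success probability drops quickly as per-command accuracy falls.
for p in (0.7, 0.8, 0.9):
    print(f"p_cmd={p}: P(4-command code succeeds) = {code_success_prob(p, 4):.3f}")
```

This is why hybrid error-handling mechanisms matter in practice: they break the bare p^n decay by allowing wrong commands to be undone rather than invalidating the whole code.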
Estimation of the behavior factor of existing RC-MRF buildings
NASA Astrophysics Data System (ADS)
Vona, Marco; Mastroberti, Monica
2018-01-01
In recent years, several research groups have studied a new generation of analysis methods for the seismic response assessment of existing buildings. Nevertheless, many important developments are still needed in order to define more reliable and effective assessment procedures. Moreover, for existing buildings, the low knowledge level often means that linear elastic analysis is the only analysis method allowed. The codes themselves (such as NTC2008 and EC8) consider linear dynamic analysis with a behavior factor as the reference method for the evaluation of seismic demand. This type of analysis is based on a linear-elastic structural model subjected to a design spectrum, obtained by reducing the elastic spectrum through a behavior factor. The behavior factor (reduction factor, or q factor in some codes) is used to reduce the elastic spectrum ordinates, or the forces obtained from a linear analysis, in order to take into account the nonlinear structural capacity. Behavior factors should be defined based on the several parameters that influence the seismic nonlinear capacity, such as mechanical material characteristics, structural system, irregularity, and design procedures. In practical applications, there is still an evident lack of detailed rules and of accurate behavior factor values adequate for existing buildings. In this work, investigations of the seismic capacity of the main existing RC-MRF building types have been carried out. In order to make a correct evaluation of the seismic force demand, actual behavior factor values coherent with a force-based seismic safety assessment procedure have been proposed and compared with the values reported in the Italian seismic code, NTC08.
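The reduction described above amounts to dividing the elastic spectral ordinate by q, with a code-specified lower bound. A minimal sketch of an EC8-style reduction; the numerical values and the beta floor are illustrative assumptions, not values proposed by this study:

```python
# Sketch of force-based demand reduction: Sd(T) = Se(T) / q, with a lower
# bound beta * a_g as in EC8-type design spectra. Values are illustrative.
def design_ordinate(se, q, beta=0.2, ag=0.3):
    """Design spectral acceleration (in g) from elastic ordinate se (in g)."""
    return max(se / q, beta * ag)

se_elastic = 0.75   # hypothetical elastic spectral acceleration, g
q = 3.0             # hypothetical behavior factor for an existing RC-MRF
print(design_ordinate(se_elastic, q))  # 0.25
```

The study's point is that the single value of q hides many parameters (materials, structural system, irregularity, design era), so a q calibrated for new construction can misestimate the force demand on an existing RC-MRF building.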
Cenik, Can; Chua, Hon Nian; Zhang, Hui; Tarnawsky, Stefan P.; Akef, Abdalla; Derti, Adnan; Tasan, Murat; Moore, Melissa J.; Palazzo, Alexander F.; Roth, Frederick P.
2011-01-01
In higher eukaryotes, messenger RNAs (mRNAs) are exported from the nucleus to the cytoplasm via factors deposited near the 5′ end of the transcript during splicing. The signal sequence coding region (SSCR) can support an alternative mRNA export (ALREX) pathway that does not require splicing. However, most SSCR–containing genes also have introns, so the interplay between these export mechanisms remains unclear. Here we support a model in which the furthest upstream element in a given transcript, be it an intron or an ALREX–promoting SSCR, dictates the mRNA export pathway used. We also experimentally demonstrate that nuclear-encoded mitochondrial genes can use the ALREX pathway. Thus, ALREX can also be supported by nucleotide signals within mitochondrial-targeting sequence coding regions (MSCRs). Finally, we identified and experimentally verified novel motifs associated with the ALREX pathway that are shared by both SSCRs and MSCRs. Our results show strong correlation between 5′ untranslated region (5′UTR) intron presence/absence and sequence features at the beginning of the coding region. They also suggest that genes encoding secretory and mitochondrial proteins share a common regulatory mechanism at the level of mRNA export. PMID:21533221
Multiplexed Detection of Cytokines Based on Dual Bar-Code Strategy and Single-Molecule Counting.
Li, Wei; Jiang, Wei; Dai, Shuang; Wang, Lei
2016-02-02
Cytokines play important roles in the immune system and have been regarded as biomarkers. Since a single cytokine is not specific and accurate enough to meet strict diagnostic requirements in practice, in this work we constructed a multiplexed detection method for cytokines based on a dual bar-code strategy and single-molecule counting. Taking interferon-γ (IFN-γ) and tumor necrosis factor-α (TNF-α) as model analytes, first, a magnetic nanobead was functionalized with the second antibody and primary bar-code strands, forming a magnetic nanoprobe. Then, through the specific reaction of the second antibody with the antigen fixed by the primary antibody, a sandwich-type immunocomplex was formed on the substrate. Next, the primary bar-code strands, acting as amplification units, triggered a multibranched hybridization chain reaction (mHCR), producing nicked double-stranded polymers with multiple branched arms, which served as secondary bar-code strands. Finally, the secondary bar-code strands hybridized with multimolecule-labeled fluorescence probes, generating enhanced fluorescence signals. The fluorescence dots were counted one by one for quantification with an epi-fluorescence microscope. By integrating the primary and secondary bar-code-based amplification strategy and the multimolecule-labeled fluorescence probes, this method displayed excellent sensitivity, with detection limits of 5 fM for both analytes. Unlike the typical bar-code assay, in which the bar-code strands must be released and identified on a microarray, this method is more direct. Moreover, because of the selective immune reaction and the dual bar-code mechanism, the method can detect the two targets simultaneously. Analysis of multiple targets in human serum was also performed, suggesting that our strategy is reliable and has great potential for application in early clinical diagnosis.
NASA Astrophysics Data System (ADS)
Naumov, D.; Fischer, T.; Böttcher, N.; Watanabe, N.; Walther, M.; Rink, K.; Bilke, L.; Shao, H.; Kolditz, O.
2014-12-01
OpenGeoSys (OGS) is a scientific open source code for the numerical simulation of thermo-hydro-mechanical-chemical processes in porous and fractured media. Its basic concept is to provide a flexible numerical framework for solving multi-field problems for applications in geoscience and hydrology, e.g. CO2 storage applications, geothermal power plant forecast simulation, salt water intrusion, and water resources management. Advances in computational mathematics have revolutionized the variety and nature of the problems that environmental scientists and engineers can address today, and intensive code development in recent years now enables the solution of much larger numerical problems and applications. However, solving environmental processes along the water cycle at large scales, as for complete catchments or reservoirs, remains a computationally challenging task. Therefore, we started a new OGS code development with a focus on execution speed and parallelization. In the new version, a local data structure concept improves instruction and data cache performance through a tight bundling of data with an element-wise numerical integration loop. Dedicated analysis methods enable the investigation of memory-access patterns in the local and global assembler routines, which leads to further data structure optimization for an additional performance gain. The concept is presented together with a technical code analysis of the recent development and a large case study including transient flow simulation in the unsaturated/saturated zone of the Thuringian Syncline, Germany. The analysis is performed on a high-resolution mesh (up to 50M elements) with embedded fault structures.
NASA Astrophysics Data System (ADS)
Insulander Björk, Klara; Kekkonen, Laura
2015-12-01
Thorium-plutonium Mixed OXide (Th-MOX) fuel is being considered for use in light water reactors due to some inherent benefits over conventional fuel types in terms of neutronic properties. The good material properties of ThO2 also suggest benefits in terms of thermal-mechanical fuel performance, but the use of Th-MOX fuel for commercial power production demands that its thermal-mechanical behavior can be accurately predicted by a well-validated fuel performance code. Given the scant operational experience with Th-MOX fuel, no such code is available today. This article describes the first phase of the development of such a code, based on the well-established code FRAPCON 3.4, and in particular the correlations reviewed and chosen for the fuel material properties. The results of fuel temperature calculations with the code in its current state of development are shown and compared with data from a Th-MOX test irradiation campaign underway in the Halden research reactor. The results are good for fresh fuel, whereas experimental complications make it difficult to judge the adequacy of the code for simulations of irradiated fuel.
Unsteady Flow Interactions Between the LH2 Feed Line and SSME LPFP Inducer
NASA Technical Reports Server (NTRS)
Dorney, Dan; Griffin, Lisa; Marcu, Bogdan; Williams, Morgan
2006-01-01
An extensive computational effort has been performed in order to investigate the nature of unsteady flow in the fuel line supplying the three Space Shuttle Main Engines during flight. Evidence of high cycle fatigue (HCF) in the flow liner one diameter upstream of the Low Pressure Fuel Pump inducer has been observed in several locations. The analysis presented in this report has the objective of determining the driving mechanisms inducing HCF and the associated fluid flow phenomena. The simulations have been performed using two different computational codes, the NASA MSFC PHANTOM code and the Pratt and Whitney Rocketdyne ENIGMA code. The fuel flow through the flow liner and the pump inducer has been modeled in full three-dimensional geometry, and the results of the computations compared with test data taken during hot fire tests at NASA Stennis Space Center and cold-flow water test data obtained at NASA MSFC. The numerical results indicate that unsteady pressure fluctuations at specific frequencies develop in the duct at the flow-liner location. A detailed frequency analysis of the flow disturbances is presented. The unsteadiness is believed to be an important source of the fluctuating pressures generating high cycle fatigue.
Computational control of flexible aerospace systems
NASA Technical Reports Server (NTRS)
Sharpe, Lonnie, Jr.; Shen, Ji Yao
1994-01-01
The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years. The main accomplishments can be summarized as follows. A new version of the PDEMOD code has been completed based on several incomplete versions. Verification of the code has been conducted by comparing its results with examples for which exact theoretical solutions can be obtained. The theoretical background of the package and the verification examples have been reported in a technical paper submitted to the Joint Applied Mechanics & Materials Conference, ASME. A brief USER'S MANUAL has been compiled, which includes three parts: (1) input data preparation; (2) explanation of the subroutines; and (3) specification of control variables. Meanwhile, a theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modeling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future versions of PDEMOD.
Application of unsteady aeroelastic analysis techniques on the national aerospace plane
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.; Spain, Charles V.; Soistmann, David L.; Noll, Thomas E.
1988-01-01
A presentation provided at the Fourth National Aerospace Plane Technology Symposium held in Monterey, California, in February 1988 is discussed. The objective is to provide current results of ongoing investigations to develop a methodology for predicting the aerothermoelastic characteristics of NASP-type (hypersonic) flight vehicles. Several existing subsonic and supersonic unsteady aerodynamic codes applicable to the hypersonic class of flight vehicles that are generally available to the aerospace industry are described. These codes were evaluated by comparing calculated results with measured wind-tunnel aeroelastic data. The agreement was quite good in the subsonic speed range but mixed in the supersonic range. In addition, a future endeavor to extend the aeroelastic analysis capability to hypersonic speeds is outlined. An investigation to identify the critical parameters affecting the aeroelastic characteristics of a hypersonic vehicle, to define and understand the various flutter mechanisms, and to develop trends for the important parameters using a simplified finite element model of the vehicle is summarized. This study showed the value of performing inexpensive and timely aeroelastic wind-tunnel tests, using simple to complex models representative of NASP configurations and root boundary conditions, to expand the experimental data base required for code validation.
Genomic and Epigenomic Insights into Nutrition and Brain Disorders
Dauncey, Margaret Joy
2013-01-01
Considerable evidence links many neuropsychiatric, neurodevelopmental and neurodegenerative disorders with multiple complex interactions between genetics and environmental factors such as nutrition. Mental health problems, autism, eating disorders, Alzheimer’s disease, schizophrenia, Parkinson’s disease and brain tumours are related to individual variability in numerous protein-coding and non-coding regions of the genome. However, genotype does not necessarily determine neurological phenotype because the epigenome modulates gene expression in response to endogenous and exogenous regulators, throughout the life-cycle. Studies using both genome-wide analysis of multiple genes and comprehensive analysis of specific genes are providing new insights into genetic and epigenetic mechanisms underlying nutrition and neuroscience. This review provides a critical evaluation of the following related areas: (1) recent advances in genomic and epigenomic technologies, and their relevance to brain disorders; (2) the emerging role of non-coding RNAs as key regulators of transcription, epigenetic processes and gene silencing; (3) novel approaches to nutrition, epigenetics and neuroscience; (4) gene-environment interactions, especially in the serotonergic system, as a paradigm of the multiple signalling pathways affected in neuropsychiatric and neurological disorders. Current and future advances in these four areas should contribute significantly to the prevention, amelioration and treatment of multiple devastating brain disorders. PMID:23503168
Validation of CFD/Heat Transfer Software for Turbine Blade Analysis
NASA Technical Reports Server (NTRS)
Kiefer, Walter D.
2004-01-01
I am an intern in the Turbine Branch of the Turbomachinery and Propulsion Systems Division. The division is primarily concerned with experimental and computational methods of calculating heat transfer effects of turbine blades during operation in jet engines and land-based power systems. These include modeling flow in internal cooling passages and film cooling, as well as calculating heat flux and peak temperatures to ensure safe and efficient operation. The branch is research-oriented, emphasizing the development of tools that may be used by gas turbine designers in industry. The branch has been developing a computational fluid dynamics (CFD) and heat transfer code called GlennHT to achieve the computational end of this analysis. The code was originally written in FORTRAN 77 and run on Silicon Graphics machines. However, the code has been rewritten and compiled in FORTRAN 90 to take advantage of more modern computer memory systems. In addition, the branch has made a switch in system architectures from SGIs to Linux PCs. The newly modified code therefore needs to be tested and validated. This is the primary goal of my internship. To validate the GlennHT code, it must be run using benchmark fluid mechanics and heat transfer test cases, for which there are either analytical solutions or widely accepted experimental data. From the solutions generated by the code, comparisons can be made to the correct solutions to establish the accuracy of the code. To design and create these test cases, there are many steps and programs that must be used. Before a test case can be run, pre-processing steps must be accomplished. These include generating a grid to describe the geometry, using a software package called GridPro. Also, various files required by the GlennHT code must be created, including a boundary condition file, a file for multi-processor computing, and a file to describe problem and algorithm parameters.
A good deal of this internship will be spent becoming familiar with these programs and the structure of the GlennHT code. Additional information is included in the original extended abstract.
Comprehensive Analysis of Genome Rearrangements in Eight Human Malignant Tumor Tissues
Wang, Chong
2016-01-01
Carcinogenesis is a complex multifactorial, multistage process, but the precise mechanisms are not well understood. In this study, we performed a genome-wide analysis of copy number variation (CNV), breakpoint regions (BPRs) and fragile sites in 2,737 tumor samples from eight tumor entities and in 432 normal samples. CNV detection and BPR identification revealed that BPRs tended to accumulate in specific genomic regions in tumor samples while being dispersed genome-wide in the normal samples. Hotspots were observed, at which segments with similar alterations in copy number overlapped and BPRs clustered adjacently. Evaluation of BPR occurrence frequency showed that at least one BPR was detected in roughly 15% or more of the samples for each tumor entity, whereas BPRs appeared in at most 12% of the normal samples. 127 of 2,716 tumor-relevant BPRs (termed 'common BPRs') also exhibited a noticeable occurrence frequency in the normal samples. Colocalization assessment identified 20,077 CNV-affecting genes, 169 of which are known tumor-related genes. The most noteworthy genes are KIAA0513, important for immunologic, synaptic and apoptotic signaling pathways; the intergenic non-coding RNA RP11-115C21.2, possibly acting as an oncogene or tumor suppressor by changing the structure of chromatin; ADAM32, likely important in cancer cell proliferation and progression through ectodomain shedding of diverse growth factors; and the well-known tumor suppressor gene p53. The BPR distributions indicate that CNV mutations are likely non-random in tumor genomes. The marked recurrence of BPRs at specific regions supports common progression mechanisms in tumors. The presence of hotspots together with common BPRs, despite their small number, implies a relation between fragile sites and cancer-gene alteration. Our data further suggest that both protein-coding and non-coding genes possessing a range of biological functions might play a causative or functional role in tumor biology.
This research enhances our understanding of the mechanisms for tumorigenesis and progression. PMID:27391163
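The breakpoint-clustering idea above can be illustrated with a simple binning sketch: assign each breakpoint position to a fixed-size genomic window and flag windows hit repeatedly. The bin size, threshold, and positions below are illustrative assumptions, not the paper's actual method or data.

```python
from collections import Counter

def breakpoint_hotspots(breakpoints, bin_size=1_000_000, min_count=3):
    """Cluster breakpoint positions into fixed-size genomic bins and
    return the bins hit at least `min_count` times as (start, count)
    pairs -- a deliberately simple stand-in for BPR hotspot detection."""
    bins = Counter(pos // bin_size for pos in breakpoints)
    return sorted((b * bin_size, c) for b, c in bins.items()
                  if c >= min_count)

# Hypothetical breakpoint positions on one chromosome.
hits = breakpoint_hotspots(
    [1_200_000, 1_300_000, 1_900_000, 5_000_000, 5_400_000, 9_000_000])
```

Here only the 1-2 Mb window accumulates three breakpoints and is reported as a hotspot; the two other clusters fall below the threshold.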
Development of a Fatigue Crack Growth Coupon for Highly Plastic Stress Conditions
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Aggarwal, Pravin K.; Swanson, Gregory R.
2003-01-01
This paper presents an analytical approach used to develop a novel fatigue crack growth coupon for a highly plastic 3-D stress field condition. The flight hardware investigated in this paper is a large separation bolt that is fractured using pyrotechnics at the appointed time during the flight sequence. The separation bolt has a deep notch that produces a severe stress concentration and a large plastic zone when highly loaded. For this geometry, linear-elastic fracture mechanics (LEFM) techniques are not valid due to the large nonlinear stress field. Unfortunately, industry codes generally available for fracture mechanics analysis and fatigue crack growth (e.g., NASGRO) are limited to LEFM and to a limited number of geometries, and their results are questionable when used on geometries with significant plasticity. Therefore, elastic-plastic fracture mechanics (EPFM) techniques using the finite element method (FEM) were used to analyze the bolt and test coupons. Testing full-scale flight hardware is very costly in terms of assets, laboratory resources, and schedule. Therefore, to alleviate some of these problems, a series of novel test coupons was developed to simulate the elastic-plastic stress field present in the bolt.
DNA methylation of miRNA coding sequences putatively associated with childhood obesity.
Mansego, M L; Garcia-Lacarte, M; Milagro, F I; Marti, A; Martinez, J A
2017-02-01
Epigenetic mechanisms may be involved in obesity onset and its consequences. The aim of the present study was to evaluate whether DNA methylation status in microRNA (miRNA) coding regions is associated with childhood obesity. DNA isolated from white blood cells of 24 children (identification sample: 12 obese and 12 non-obese) from the Grupo Navarro de Obesidad Infantil study was hybridized in a 450 K methylation microarray. Several CpGs whose DNA methylation levels were statistically different between obese and non-obese were validated by MassArray® in 95 children (validation sample) from the same study. Microarray analysis identified 16 differentially methylated CpGs between both groups (6 hypermethylated and 10 hypomethylated). DNA methylation levels in miR-1203, miR-412 and miR-216A coding regions significantly correlated with body mass index standard deviation score (BMI-SDS) and explained up to 40% of the variation of BMI-SDS. The network analysis identified 19 well-defined obesity-relevant biological pathways from the KEGG database. MassArray® validation identified three regions located in or near miR-1203, miR-412 and miR-216A coding regions differentially methylated between obese and non-obese children. The current work identified three CpG sites located in coding regions of three miRNAs (miR-1203, miR-412 and miR-216A) that were differentially methylated between obese and non-obese children, suggesting a role of miRNA epigenetic regulation in childhood obesity. © 2016 World Obesity Federation.
Structural dynamics of shroudless, hollow fan blades with composite in-lays
NASA Technical Reports Server (NTRS)
Aiello, R. A.; Hirschbein, M. S.; Chamis, C. C.
1982-01-01
Structural and dynamic analyses are presented for a shroudless, hollow titanium fan blade proposed for future use in aircraft turbine engines. The blade was modeled and analyzed using the composite blade structural analysis computer program (COBSTRAN); an integrated program consisting of mesh generators, composite mechanics codes, NASTRAN, and pre- and post-processors. Vibration and impact analyses are presented. The vibration analysis was conducted with COBSTRAN. Results show the effect of the centrifugal force field on frequencies, twist, and blade camber. Bird impact analysis was performed with the multi-mode blade impact computer program. This program uses the geometric model and modal analysis from the COBSTRAN vibration analysis to determine the gross impact response of the fan blades to bird strikes. The structural performance of this blade is also compared to a blade of similar design but with composite in-lays on the outer surface. Results show that the composite in-lays can be selected (designed) to substantially modify the mechanical performance of the shroudless, hollow fan blade.
Digital video technologies and their network requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. P. Tsang; H. Y. Chen; J. M. Brandt
1999-11-01
Coded digital video signals are considered to be one of the most difficult data types to transport due to their real-time requirements and high bit rate variability. In this study, the authors discuss the coding mechanisms incorporated by the major compression standards bodies, i.e., JPEG and MPEG, as well as more advanced coding mechanisms such as wavelet and fractal techniques. The relationship between the applications which use these coding schemes and their network requirements is the major focus of this study. Specifically, the authors relate network latency, channel transmission reliability, random access speed, buffering and network bandwidth to the various coding techniques as a function of the applications which use them. Such applications include High-Definition Television, Video Conferencing, Computer-Supported Collaborative Work (CSCW), and Medical Imaging.
Codes of Ethics and Teachers' Professional Autonomy
ERIC Educational Resources Information Center
Schwimmer, Marina; Maxwell, Bruce
2017-01-01
This article considers the value of adopting a code of professional ethics for teachers. After having underlined how a code of ethics stands to benefits a community of educators--namely, by providing a mechanism for regulating autonomy and promoting a shared professional ethic--the article examines the principal arguments against codes of ethics.…
Do plant cell walls have a code?
Tavares, Eveline Q P; Buckeridge, Marcos S
2015-12-01
A code is a set of rules that establish correspondence between two worlds: signs (consisting of encrypted information) and meaning (of the decrypted message). A third element, the adaptor, connects both worlds, assigning meaning to a code. We propose that a Glycomic Code exists in plant cell walls, where signs are represented by monosaccharides and phenylpropanoids and meaning is cell wall architecture with its highly complex association of polymers. Cell wall biosynthetic mechanisms, structure, architecture and properties are addressed from a Code Biology perspective, focusing on how they resist cell wall deconstruction. Cell wall hydrolysis is treated mainly as a mechanism of decryption of the Glycomic Code. Evidence for encoded information in the fine structure of cell wall polymers is highlighted, and the implications of the existence of the Glycomic Code are discussed. Aspects of fine structure are responsible for polysaccharide packing and polymer-polymer interactions, affecting the final cell wall architecture. The question of whether polymer assembly within a wall displays similar properties to other biological macromolecules (i.e. proteins, DNA, histones) is addressed: do they display a code? Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Y. Q.; Shemon, E. R.; Thomas, J. W.
SHARP is an advanced modeling and simulation toolkit for the analysis of nuclear reactors. It is comprised of several components including physical modeling tools, tools to integrate the physics codes for multi-physics analyses, and a set of tools to couple the codes within the MOAB framework. Physics modules currently include the neutronics code PROTEUS, the thermal-hydraulics code Nek5000, and the structural mechanics code Diablo. This manual focuses on performing multi-physics calculations with the SHARP ToolKit. Manuals for the three individual physics modules are available with the SHARP distribution to help the user to either carry out the primary multi-physics calculation with basic knowledge or perform further advanced development with in-depth knowledge of these codes. This manual provides step-by-step instructions on employing SHARP, including how to download and install the code, how to build the drivers for a test case, how to perform a calculation and how to visualize the results. Since SHARP has some specific library and environment dependencies, it is highly recommended that the user read this manual prior to installing SHARP. Verification test cases are included to check proper installation of each module. It is suggested that the new user first follow the step-by-step instructions provided for a test problem in this manual to understand the basic procedure of using SHARP before using SHARP for his/her own analysis. Both reference output and scripts are provided along with the test cases in order to verify correct installation and execution of the SHARP package. At the end of this manual, detailed instructions are provided on how to create a new test case so that the user can perform novel multi-physics calculations with SHARP. Frequently asked questions are listed at the end of this manual to help the user troubleshoot issues.
Space Debris Surfaces - Probability of no penetration versus impact velocity and obliquity
NASA Technical Reports Server (NTRS)
Elfer, N.; Meibaum, R.; Olsen, G.
1992-01-01
A collection of computer codes called Space Debris Surfaces (SD-SURF), have been developed to assist in the design and analysis of space debris protection systems. An SD-SURF analysis will show which obliquities and velocities are most likely to cause a penetration to help the analyst select a shield design best suited to the predominant penetration mechanism. Examples of the interaction between space vehicle geometry, the space debris environment, and the penetration and critical damage ballistic limit surfaces of the shield under consideration are presented.
Analysis for Building Envelopes and Mechanical Systems Using 2012 CBECS Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winiarski, David W.; Halverson, Mark A.; Butzbaugh, Joshua B.
This report describes the aggregation and mapping of certain building characteristics data available in the most recent Commercial Building Energy Consumption Survey (CBECS) (DOE EIA 2012) to describe the most typical building construction practices. It provides summary data for potential use in support of modifications to the Pacific Northwest National Laboratory's commercial building prototypes used for building energy code analysis. Specifically, this report outlines findings and the most typical design choices for certain building envelope and heating, ventilating, and air-conditioning (HVAC) systems.
Fracture Mechanics for Composites: State of the Art and Challenges
NASA Technical Reports Server (NTRS)
Krueger, Ronald
2006-01-01
Interlaminar fracture mechanics has proven useful for characterizing the onset of delaminations in composites and has been used, with limited success, primarily to investigate onset in fracture toughness specimens and laboratory-size coupon specimens. Future acceptance of the methodology by industry and certification authorities, however, requires successful demonstration of the methodology at the structural level. In this paper, the state of the art in fracture toughness characterization and interlaminar fracture mechanics analysis tools is described. To demonstrate application at the structural level, a stringer-reinforced panel was selected. Full implementation of interlaminar fracture mechanics in design, however, remains a challenge and requires continued development of codes to calculate energy release rates and advancements in delamination onset and growth criteria under mixed-mode conditions.
Zipf's Law in Short-Time Timbral Codings of Speech, Music, and Environmental Sound Signals
Haro, Martín; Serrà, Joan; Herrera, Perfecto; Corral, Álvaro
2012-01-01
Timbre is a key perceptual feature that allows discrimination between different sounds. Timbral sensations are highly dependent on the temporal evolution of the power spectrum of an audio signal. In order to quantitatively characterize such sensations, the shape of the power spectrum has to be encoded in a way that preserves certain physical and perceptual properties. Therefore, it is common practice to encode short-time power spectra using psychoacoustical frequency scales. In this paper, we study and characterize the statistical properties of such encodings, here called timbral code-words. In particular, we report on rank-frequency distributions of timbral code-words extracted from 740 hours of audio coming from disparate sources such as speech, music, and environmental sounds. Analogously to text corpora, we find a heavy-tailed Zipfian distribution with exponent close to one. Importantly, this distribution is found independently of different encoding decisions and regardless of the audio source. Further analysis of the intrinsic characteristics of the most and least frequent code-words reveals that the most frequent code-words tend to have a more homogeneous structure. We also find that the speech and music databases have specific, distinctive code-words while, in the case of the environmental sounds, these database-specific code-words are not present. Finally, we find that a Yule-Simon process with memory provides a reasonable quantitative approximation for our data, suggesting the existence of a common simple generative mechanism for all considered sound sources. PMID:22479497
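The rank-frequency analysis described above can be sketched in a few lines: count code-word occurrences, rank them by frequency, and fit the Zipf exponent by least squares on log-log axes. The synthetic token stream below is an illustrative assumption, not the timbral data from the study.

```python
import math
from collections import Counter

def zipf_exponent(tokens):
    """Estimate the Zipf exponent from a token sequence by ranking
    code-words by frequency and fitting log(freq) ~ -s*log(rank)
    with ordinary least squares; s near 1 indicates a Zipfian law."""
    counts = sorted(Counter(tokens).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(counts) + 1)]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic Zipfian sample: code-word i appears roughly 1000/i times.
tokens = [i for i in range(1, 51) for _ in range(1000 // i)]
s = zipf_exponent(tokens)
```

For this constructed sample the fitted exponent comes out close to one, mirroring the heavy-tailed behavior the paper reports for real code-word corpora.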
Long non-coding RNAs and mRNAs profiling during spleen development in pig.
Che, Tiandong; Li, Diyan; Jin, Long; Fu, Yuhua; Liu, Yingkai; Liu, Pengliang; Wang, Yixin; Tang, Qianzi; Ma, Jideng; Wang, Xun; Jiang, Anan; Li, Xuewei; Li, Mingzhou
2018-01-01
Genome-wide transcriptomic studies in humans and mice have become extensive and mature. However, a comprehensive and systematic understanding of the protein-coding genes and long non-coding RNAs (lncRNAs) expressed during pig spleen development has not been achieved. LncRNAs are known to participate in regulatory networks for an array of biological processes. Here, we constructed 18 RNA libraries from developing fetal pig spleen (55 days before birth), postnatal pig spleens (0, 30, 180 days and 2 years after birth), and samples from 2-year-old wild boar. A total of 15,040 lncRNA transcripts were identified among these samples. We found that the temporal expression pattern of lncRNAs was more restricted than that observed for protein-coding genes. Time-series analysis showed two large modules for protein-coding genes and lncRNAs. The up-regulated module was enriched for genes related to immune and inflammatory function, while the down-regulated module was enriched for cell proliferation processes such as cell division and DNA replication. Co-expression networks indicated the functional relatedness between protein-coding genes and lncRNAs, which were enriched for similar functions over the series of time points examined. We identified numerous differentially expressed protein-coding genes and lncRNAs in all five developmental stages. Notably, ceruloplasmin precursor (CP), a protein-coding gene participating in antioxidant and iron transport processes, was differentially expressed in all stages. This study provides the first catalog of the developing pig spleen, and contributes to a fuller understanding of the molecular mechanisms underpinning mammalian spleen development.
X-Antenna: A graphical interface for antenna analysis codes
NASA Technical Reports Server (NTRS)
Goldstein, B. L.; Newman, E. H.; Shamansky, H. T.
1995-01-01
This report serves as the user's manual for the X-Antenna code. X-Antenna is intended to simplify the analysis of antennas by giving the user graphical interfaces in which to enter all relevant antenna and analysis code data. Essentially, X-Antenna creates a Motif interface to the user's antenna analysis codes. A command-file allows new antennas and codes to be added to the application. The menu system and graphical interface screens are created dynamically to conform to the data in the command-file. Antenna data can be saved and retrieved from disk. X-Antenna checks all antenna and code values to ensure they are of the correct type, writes an output file, and runs the appropriate antenna analysis code. Volumetric pattern data may be viewed in 3D space with an external viewer run directly from the application. Currently, X-Antenna includes analysis codes for thin wire antennas (dipoles, loops, and helices), rectangular microstrip antennas, and thin slot antennas.
Kozlik, Julia; Neumann, Roland; Lozo, Ljubica
2015-01-01
Several emotion theorists suggest that valenced stimuli automatically trigger motivational orientations and thereby facilitate corresponding behavior. Positive stimuli were thought to activate approach motivational circuits which in turn primed approach-related behavioral tendencies whereas negative stimuli were supposed to activate avoidance motivational circuits so that avoidance-related behavioral tendencies were primed (motivational orientation account). However, recent research suggests that typically observed affective stimulus–response compatibility phenomena might be entirely explained in terms of theories accounting for mechanisms of general action control instead of assuming motivational orientations to mediate the effects (evaluative coding account). In what follows, we explore to what extent this notion is applicable. We present literature suggesting that evaluative coding mechanisms indeed influence a wide variety of affective stimulus–response compatibility phenomena. However, the evaluative coding account does not seem to be sufficient to explain affective S–R compatibility effects. Instead, several studies provide clear evidence in favor of the motivational orientation account that seems to operate independently of evaluative coding mechanisms. Implications for theoretical developments and future research designs are discussed. PMID:25983718
Dynamic analysis of flexible mechanical systems using LATDYN
NASA Technical Reports Server (NTRS)
Wu, Shih-Chin; Chang, Che-Wei; Housner, Jerrold M.
1989-01-01
A 3-D, finite element based simulation tool for flexible multibody systems is presented. Hinge degrees of freedom are built into the equations of motion to reduce geometric constraints. The approach avoids the difficulty of selecting deformation modes for flexible components that arises in the assumed-mode method. The tool is applied to simulate a practical space structure deployment problem. Results of the examples demonstrate the capability of the code and the approach.
Cancer communication science funding trends, 2000-2012.
Ramírez, A Susana; Galica, Kasia; Blake, Kelly D; Chou, Wen-Ying Sylvia; Hesse, Bradford W
2013-12-01
Since 2000, the field of health communication has grown tremendously, owing largely to research funding by the National Cancer Institute (NCI). This study provides an overview of cancer communication science funding trends in the past decade. We conducted an analysis of communication-related grant applications submitted to the NCI in fiscal years 2000-2012. Using 103 keywords related to health communication, data were extracted from the Portfolio Management Application, a grants management application used at NCI. Automated coding described key grant characteristics such as mechanism and review study section. Manual coding determined funding across the cancer control continuum, by cancer site, and by cancer risk factors. A total of 3307 unique grant applications met initial inclusion criteria; 1013 of these were funded over the 12-year period. The top funded grant mechanisms were the R01, R21, and R03. Applications were largely investigator-initiated proposals as opposed to responses to particular funding opportunity announcements. Among funded communication research, the top risk factor studied was tobacco, and across the cancer control continuum, cancer prevention was the most commonly investigated stage. NCI support of cancer communication research has been an important source of growth for health communication science over the last 12 years. These findings describe NCI's priorities in cancer communication science and suggest areas for future investment.
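The automated keyword-based coding step described above can be sketched as simple substring matching against category keyword lists. The categories and keywords below are illustrative assumptions, not the 103-keyword list or coding scheme actually used in the study.

```python
def code_grants(grants, keyword_map):
    """Assign each grant abstract every category whose keywords appear
    in its text -- a minimal sketch of automated keyword coding."""
    coded = {}
    for grant_id, text in grants.items():
        low = text.lower()
        coded[grant_id] = sorted(
            cat for cat, words in keyword_map.items()
            if any(w in low for w in words))
    return coded

# Hypothetical categories and keywords for illustration only.
keywords = {
    "tobacco": ["tobacco", "smoking", "cessation"],
    "prevention": ["prevention", "screening"],
}
result = code_grants(
    {"R01-001": "A smoking cessation intervention for prevention."},
    keywords)
```

A real pipeline would add stemming and phrase handling, but the core idea of mapping free text onto a controlled category list is the same.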
Fluid Structure Interaction in a Turbine Blade
NASA Technical Reports Server (NTRS)
Gorla, Rama S. R.
2004-01-01
An unsteady, three-dimensional Navier-Stokes solution in rotating frame formulation for turbomachinery applications is presented. Casting the governing equations in a rotating frame enabled the freezing of grid motion and resulted in substantial savings in computer time. The turbine blade was computationally simulated and probabilistically evaluated in view of several uncertainties in the aerodynamic, structural, material and thermal variables that govern it. The interconnection between the computational fluid dynamics code and the finite element structural analysis code was necessary to couple the thermal profiles with the structural design. The stresses and their variations were evaluated at critical points on the turbine blade. Cumulative distribution functions and sensitivity factors were computed for stress responses due to aerodynamic, geometric, mechanical and thermal random variables.
Numerical Simulation of the ``Fluid Mechanical Sewing Machine''
NASA Astrophysics Data System (ADS)
Brun, Pierre-Thomas; Audoly, Basile; Ribe, Neil
2011-11-01
A thin thread of viscous fluid falling onto a moving conveyor belt generates a wealth of complex ``stitch'' patterns depending on the belt speed and the fall height. To understand the rich nonlinear dynamics of this system, we have developed a new numerical code for simulating unsteady viscous threads, based on a discrete description of the geometry and a variational formulation for the viscous stresses. The code successfully reproduces all major features of the experimental state diagram of Morris et al. (Phys. Rev. E 2008). Fourier analysis of the motion of the thread's contact point with the belt suggests a new classification of the observed patterns, and reveals that the system behaves as a nonlinear oscillator coupling the pendulum modes of the thread.
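The Fourier classification of the contact point's motion can be illustrated with a toy sketch (not the authors' code): generate a synthetic contact-point trace and pick out the dominant frequencies of its x and y components with NumPy's FFT. The trace and frequencies below are invented for illustration.

```python
import numpy as np

def dominant_frequencies(x_trace, y_trace, dt, n_peaks=2):
    """Return the strongest positive frequencies in a 2-D contact-point trace.

    A toy stand-in for the paper's Fourier classification: a pattern can be
    characterized by which frequencies dominate the x and y motion of the
    thread's contact point with the belt."""
    freqs = np.fft.rfftfreq(len(x_trace), d=dt)
    out = {}
    for name, trace in (("x", x_trace), ("y", y_trace)):
        spectrum = np.abs(np.fft.rfft(trace - np.mean(trace)))
        spectrum[0] = 0.0                          # drop any residual DC term
        out[name] = freqs[np.argsort(spectrum)[-n_peaks:][::-1]]
    return out

# Synthetic figure-of-eight-like trace: y oscillates at twice x's frequency.
t = np.arange(0.0, 20.0, 0.01)
x = np.sin(2 * np.pi * 1.0 * t)
y = 0.5 * np.sin(2 * np.pi * 2.0 * t)
peaks = dominant_frequencies(x, y, dt=0.01)
```

For this synthetic trace the strongest x frequency comes out near 1 Hz and the strongest y frequency near 2 Hz, the sort of frequency-ratio signature such a classification keys on.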
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, D.L.; Simonen, F.A.; Strosnider, J. Jr.
The VISA (Vessel Integrity Simulation Analysis) code was developed as part of the NRC staff evaluation of pressurized thermal shock. VISA uses Monte Carlo simulation to evaluate the failure probability of a pressurized water reactor (PWR) pressure vessel subjected to a pressure and thermal transient specified by the user. Linear elastic fracture mechanics is used to model crack initiation and propagation. Parameters for initial crack size, copper content, initial RT_NDT, fluence, crack-initiation fracture toughness, and arrest fracture toughness are treated as random variables. This report documents the version of VISA used in the NRC staff report (Policy Issue from J.W. Dircks to NRC Commissioners, Enclosure A: NRC Staff Evaluation of Pressurized Thermal Shock, November 1982, SECY-82-465) and includes a user's guide for the code.
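The Monte Carlo structure described above can be sketched in miniature. The failure criterion and every distribution parameter below are invented for illustration; VISA's actual linear elastic fracture mechanics models of initiation and arrest are far more detailed.

```python
import random

def failure_probability(n_trials=20_000, stress_coeff=300.0, seed=42):
    """Minimal Monte Carlo sketch in the spirit of VISA's approach -- not
    its actual models.  Each trial samples a crack depth and a fracture
    toughness and applies a toy criterion: an applied stress-intensity
    surrogate K = C * sqrt(a) exceeding the sampled toughness counts as
    failure.  All numbers are hypothetical."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        crack_depth = rng.lognormvariate(mu=-3.0, sigma=0.5)  # hypothetical, m
        toughness = rng.normalvariate(mu=150.0, sigma=20.0)   # hypothetical, MPa*sqrt(m)
        applied_k = stress_coeff * crack_depth ** 0.5
        if applied_k > toughness:
            failures += 1
    return failures / n_trials

p = failure_probability()
```

Because the same seed reproduces the same sample sequence, raising the stress coefficient can only add failures, which is a convenient sanity check on the sampling loop.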
Development of high-fidelity multiphysics system for light water reactor analysis
NASA Astrophysics Data System (ADS)
Magedanz, Jeffrey W.
There has been a tendency in recent years toward greater heterogeneity in reactor cores, due to the use of mixed-oxide (MOX) fuel, burnable absorbers, and longer cycles with consequently higher fuel burnup. The resulting asymmetry of the neutron flux and energy spectrum between regions with different compositions causes a need to account for the directional dependence of the neutron flux, instead of the traditional diffusion approximation. Furthermore, the presence of both MOX and high-burnup fuel in the core increases the complexity of the heat conduction. The heat transfer properties of the fuel pellet change with irradiation, and the thermal and mechanical expansion of the pellet and cladding strongly affect the size of the gap between them, and its consequent thermal resistance. These operational tendencies require higher fidelity multi-physics modeling capabilities, and this need is addressed by the developments performed within this PhD research. The dissertation describes the development of a High-Fidelity Multi-Physics System for Light Water Reactor Analysis. It consists of three coupled codes -- CTF for Thermal Hydraulics, TORT-TD for Neutron Kinetics, and FRAPTRAN for Fuel Performance. It is meant to address these modeling challenges in three ways: (1) by resolving the state of the system at the level of each fuel pin, rather than homogenizing entire fuel assemblies, (2) by using the multi-group Discrete Ordinates method to account for the directional dependence of the neutron flux, and (3) by using a fuel-performance code, rather than a Thermal Hydraulics code's simplified fuel model, to account for the material behavior of the fuel and its feedback to the hydraulic and neutronic behavior of the system. While the first two are improvements, the third, the use of a fuel-performance code for feedback, constitutes an innovation in this PhD project. Also important to this work is the manner in which such coupling is written. 
While coupling involves combining codes into a single executable, they are usually still developed and maintained separately. It should thus be a design objective to minimize the changes to those codes, and keep the changes to each code free of dependence on the details of the other codes. This will ease the incorporation of new versions of the code into the coupling, as well as re-use of parts of the coupling to couple with different codes. In order to fulfill this objective, an interface for each code was created in the form of an object-oriented abstract data type. Object-oriented programming is an effective method for enforcing a separation between different parts of a program, and clarifying the communication between them. The interfaces enable the main program to control the codes in terms of high-level functionality. This differs from the established practice of a master/slave relationship, in which the slave code is incorporated into the master code as a set of subroutines. While this PhD research continues previous work with a coupling between CTF and TORT-TD, it makes two major original contributions: (1) using a fuel-performance code, instead of a thermal-hydraulics code's simplified built-in models, to model the feedback from the fuel rods, and (2) the design of an object-oriented interface as an innovative method to interact with a coupled code in a high-level, easily-understandable manner. The resulting code system will serve as a tool to study the question of under what conditions, and to what extent, these higher-fidelity methods will provide benefits to reactor core analysis. (Abstract shortened by UMI.)
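The object-oriented interface idea can be sketched as an abstract data type that hides each code behind a handful of high-level operations. All class and method names below are hypothetical; they are not taken from the actual CTF/TORT-TD/FRAPTRAN coupling.

```python
from abc import ABC, abstractmethod

class PhysicsCode(ABC):
    """Hypothetical high-level interface in the spirit of the coupling
    described above: the driver sees only these methods, never the
    internals of the thermal-hydraulics, neutronics, or fuel codes."""

    @abstractmethod
    def advance(self, dt: float) -> None:
        """Advance this code's state by one coupled time step."""

    @abstractmethod
    def get_field(self, name: str) -> list:
        """Expose a feedback field (e.g. fuel temperature) to the driver."""

    @abstractmethod
    def set_field(self, name: str, values: list) -> None:
        """Receive a feedback field computed by another code."""

class ToyFuelCode(PhysicsCode):
    """Stand-in 'fuel performance' code with a trivial relaxation model."""
    def __init__(self):
        self.fields = {"fuel_temperature": [900.0, 950.0]}
    def advance(self, dt):
        # Relax each temperature toward 1000 K (illustrative only).
        self.fields["fuel_temperature"] = [
            t + 0.1 * dt * (1000.0 - t) for t in self.fields["fuel_temperature"]]
    def get_field(self, name):
        return self.fields[name]
    def set_field(self, name, values):
        self.fields[name] = values

# The driver controls codes only through the abstract interface.
code = ToyFuelCode()
code.advance(1.0)
temps = code.get_field("fuel_temperature")
```

Swapping in a different fuel-performance code then means writing one new subclass, with no changes to the driver, which is the design payoff the dissertation describes.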
1991-01-01
...of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code [1980]. Their results are similar to those of Satoh and Toyoda... ASTM E813-89. American Society of Mechanical Engineers, Boiler and Pressure Vessel Code, Section III, Nuclear Power Plant Components, 1980.
ERIC Educational Resources Information Center
Sweeny, Timothy D.; Haroz, Steve; Whitney, David
2013-01-01
Many species, including humans, display group behavior. Thus, perceiving crowds may be important for social interaction and survival. Here, we provide the first evidence that humans use ensemble-coding mechanisms to perceive the behavior of a crowd of people with surprisingly high sensitivity. Observers estimated the headings of briefly presented…
Guo, Jun; Zhou, Yuan; Cheng, Yafen; Fang, Weiwei; Hu, Gang; Wei, Jie; Lin, Yajun; Man, Yong; Guo, Lixin; Sun, Mingxiao; Cui, Qinghua; Li, Jian
2018-01-01
Recent studies have suggested that changes in non-coding RNA play a key role in the progression of non-alcoholic fatty liver disease (NAFLD). Metformin is now recommended and effective for the treatment of NAFLD. We hope the current analyses of the non-coding transcriptome will provide a better picture of the potential roles of mRNAs and long non-coding RNAs (lncRNAs) that underlie NAFLD and metformin intervention. The present study mainly analysed changes in the coding transcriptome and non-coding RNAs after the application of a five-week metformin intervention. Liver samples from three groups of mice were harvested for transcriptome profiling, which covered mRNA, lncRNA, microRNA (miRNA) and circular RNA (circRNA), using a microarray technique. A systematic alleviation of high-fat diet (HFD)-induced transcriptome alterations by metformin was observed. The metformin treatment largely reversed the correlations with diabetes-related pathways. Our analysis also suggested interaction networks between differentially expressed lncRNAs and known hepatic disease genes, and interactions between circRNAs and their disease-related miRNA partners. Eight HFD-responsive lncRNAs and three metformin-responsive lncRNAs were noted due to their widespread associations with disease genes. Moreover, seven miRNAs that interacted with multiple differentially expressed circRNAs were highlighted because they were likely to be associated with metabolic or liver diseases. The present study identified novel changes in the coding transcriptome and non-coding RNAs in the livers of NAFLD mice after metformin treatment that might shed light on the underlying mechanism by which metformin impedes the progression of NAFLD. © 2018 The Author(s). Published by S. Karger AG, Basel.
A reaction-diffusion-based coding rate control mechanism for camera sensor networks.
Yamamoto, Hiroshi; Hyodo, Katsuya; Wakamiya, Naoki; Murata, Masayuki
2010-01-01
A wireless camera sensor network is useful for surveillance and monitoring owing to its visibility and ease of deployment. However, it suffers from the limited capacity of wireless communication, and a network is easily overwhelmed by the considerable volume of video traffic. In this paper, we propose an autonomous video coding rate control mechanism in which each camera sensor node can autonomously determine its coding rate in accordance with the location and velocity of target objects. For this purpose, we adopted a biological model, the reaction-diffusion model, inspired by the similarity between biological spatial patterns and the spatial distribution of video coding rates. Through simulation and practical experiments, we verify the effectiveness of our proposal.
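A minimal sketch of how a reaction-diffusion update can concentrate coding rate around observed activity, assuming a 1-D ring of camera nodes. The update rule, parameters, and rate mapping below are invented for illustration; the paper's actual model differs.

```python
import numpy as np

def reaction_diffusion_step(u, stimulus, D=0.2, decay=0.1, dt=1.0):
    """One explicit step of a toy reaction-diffusion update on a 1-D ring
    of camera nodes: an activator u diffuses to neighbours, decays, and is
    reinforced where a target is observed (stimulus)."""
    lap = np.roll(u, 1) + np.roll(u, -1) - 2 * u   # discrete Laplacian on a ring
    return u + dt * (D * lap - decay * u + stimulus)

nodes = 16
u = np.zeros(nodes)
stimulus = np.zeros(nodes)
stimulus[8] = 1.0                       # a target seen by camera node 8
for _ in range(50):
    u = reaction_diffusion_step(u, stimulus)

# Map the activator level to a coding rate, e.g. between 1 and 5 Mbps.
coding_rate = 1.0 + 4.0 * u / u.max()
```

The stationary pattern peaks at the observing node and falls off with distance, so nearby cameras get more bandwidth than distant ones, which is the qualitative behaviour the mechanism aims for.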
Spartan Release Engagement Mechanism (REM) stress and fracture analysis
NASA Technical Reports Server (NTRS)
Marlowe, D. S.; West, E. J.
1984-01-01
The revised stress and fracture analysis of the Spartan REM hardware for current load conditions and mass properties is presented. The stress analysis was performed using a NASTRAN math model of the Spartan REM adapter, base, and payload. Appendix A contains the material properties, loads, and stress analysis of the hardware. The computer output and model description are in Appendix B. Factors of safety used in the stress analysis were 1.4 on tested items and 2.0 on all other items. Fracture analysis of the items considered fracture critical was accomplished using the MSFC Crack Growth Analysis code. Loads and stresses were obtained from the stress analysis. The fracture analysis notes are located in Appendix A and the computer output in Appendix B. All items analyzed met design and fracture criteria.
Analysis of the new code stroke protocol in Asturias after one year. Experience at one hospital.
García-Cabo, C; Benavente, L; Martínez-Ramos, J; Pérez-Álvarez, Á; Trigo, A; Calleja, S
2018-03-01
Prehospital code stroke (CS) systems have been proved effective for improving access to specialised medical care in acute stroke cases. They also improve the prognosis of this disease, which is one of the leading causes of death and disability in our setting. The aim of this study is to analyse results one year after implementation of the new code stroke protocol at one hospital in Asturias. We prospectively included patients who were admitted to our tertiary care centre as per the code stroke protocol for the period of one year. We analysed 363 patients. Mean age was 69 years and 54% of the cases were men. During the same period in the previous year, there were 236 non-hospital CS activations. One hundred forty-seven recanalisation treatments were performed (66 fibrinolysis and 81 mechanical thrombectomies or combined treatments), representing a 25% increase with regard to the previous year. Recent advances in the management of acute stroke call for coordinated code stroke protocols that are adapted to the needs of each specific region. This may result in an increased number of patients receiving early care, as well as revascularisation treatments. Copyright © 2016 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.
Laser identification system based on acousto-optical barcode scanner principles
NASA Astrophysics Data System (ADS)
Khansuvarov, Ruslan A.; Korol, Georgy I.; Preslenev, Leonid N.; Bestugin, Aleksandr R.; Paraskun, Arthur S.
2016-09-01
The main purpose of a bar code in the modern world is the unique identification of a product, service, or any of their features, which is why personal and stationary barcode scanners are so widely used. Important parameters of bar code scanners include reliability, barcode recognition accuracy, response time, and performance. The most popular personal barcode scanners today contain a mechanical part, which severely impairs their reliability. A group of SUAI engineers has proposed a bar code scanner based on the acoustic deflection of a laser beam in crystals [RU patent No 156009, issued 4/16/2015]. Through the use of an acousto-optic deflector element, the barcode scanner described by the group of SUAI engineers can be implemented in both handheld and stationary form factors. Being a wave electronic device, the acousto-optic element in an acousto-optic barcode scanner establishes a clear mathematical link between the encoded bar code function and the intensity function received at the photodetector input, which allows a bar code to be identified with high probability. This paper provides a description of the issued patent, an explanation of the operating principles based on mathematical analysis, and a description of the layout of the implemented scanner.
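The link between an encoded bar code function and the photodetector intensity function can be shown with a toy scan model: bars absorb, spaces reflect, and recognition is a threshold decision per module. This is purely illustrative and is not the patented acousto-optic design.

```python
import numpy as np

def scan_barcode(pattern, samples_per_module=10, noise=0.05, seed=0):
    """Toy scan model: a bar (1) absorbs and a space (0) reflects, the
    beam sweep is modeled as uniform sampling of the pattern, and Gaussian
    noise stands in for measurement error.  Decoding averages each
    module's samples and thresholds at 0.5."""
    rng = np.random.default_rng(seed)
    # Reflected intensity: high over spaces, low over bars, plus noise.
    intensity = np.repeat(1.0 - np.array(pattern, dtype=float),
                          samples_per_module)
    intensity += rng.normal(0.0, noise, intensity.size)
    # Decode: average each module's samples and threshold.
    means = intensity.reshape(len(pattern), samples_per_module).mean(axis=1)
    return [int(m < 0.5) for m in means]

pattern = [1, 0, 1, 1, 0, 0, 1, 0]
decoded = scan_barcode(pattern)
```

Averaging several samples per module is what makes the threshold decision robust to noise, the same reason a deterministic (non-mechanical) sweep improves recognition probability.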
Methodology for fast detection of false sharing in threaded scientific codes
Chung, I-Hsin; Cong, Guojing; Murata, Hiroki; Negishi, Yasushi; Wen, Hui-Fang
2014-11-25
A profiling tool identifies a code region with a false sharing potential. A static analysis tool classifies variables and arrays in the identified code region. A mapping detection library correlates memory access instructions in the identified code region with variables and arrays in the identified code region while a processor is running the identified code region. The mapping detection library identifies one or more instructions at risk, in the identified code region, which are subject to an analysis by a false sharing detection library. A false sharing detection library performs a run-time analysis of the one or more instructions at risk while the processor is re-running the identified code region. The false sharing detection library determines, based on the performed run-time analysis, whether two different portions of the cache memory line are accessed by the generated binary code.
Generating code adapted for interlinking legacy scalar code and extended vector code
Gschwind, Michael K
2013-06-04
Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.
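The register-accommodation idea can be sketched as a thunk inserted at the call boundary. The register names and conventions below are invented, and real interlinking glue is emitted by the compiler at the machine-code level rather than written in a high-level language.

```python
# Toy model of ABI-bridging glue: the "extended ABI" here pretends
# registers v16-v31 exist and must survive calls; the "legacy ABI" knows
# nothing about them.  A thunk at the call boundary spills and restores
# the extra state.  All names are hypothetical.

EXTRA_REGS = [f"v{i}" for i in range(16, 32)]

def make_interlink_thunk(legacy_func, machine):
    """Wrap a legacy-ABI routine so extended-ABI register state survives."""
    def thunk(*args):
        saved = {r: machine[r] for r in EXTRA_REGS}  # spill extended state
        try:
            return legacy_func(*args)
        finally:
            machine.update(saved)                    # restore on return
    return thunk

# A simulated register file and a legacy routine that clobbers everything.
machine = {f"v{i}": 0.0 for i in range(32)}

def legacy_clobberer(x):
    for r in machine:
        machine[r] = -1.0
    return x * 2

machine["v20"] = 3.14
safe_call = make_interlink_thunk(legacy_clobberer, machine)
result = safe_call(21)
```

After the call, the legacy-visible registers are legitimately clobbered while the extended-only registers are restored, which is the behaviour the save/restore glue exists to guarantee.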
NASA Astrophysics Data System (ADS)
Moon, Hokyu; Kim, Kyung Min; Park, Jun Su; Kim, Beom Seok; Cho, Hyung Hee
2015-12-01
The after-shell section, which is part of the gas turbine combustion liner, is exposed to the hottest combustion gas. Various cooling schemes have been applied to protect against the severe thermal load. However, there is a significant discrepancy in thermal expansion with large temperature differences, resulting in thermo-mechanical crack formation. In this study, to reduce combustion liner damage, thermo-mechanical analysis was conducted on three after-shell section configurations: inline-discrete divider wall, staggered divider wall, and swirler wall arrays. These array components are well-known heat-transfer enhancement structures in ducts. In the numerical analyses, the heat transfer characteristics, temperature, and thermo-mechanical stress distribution were evaluated using commercial finite volume method and finite element method codes. The results demonstrated that the temperature and thermo-mechanical stress distributions depended strongly on the structural array, through its effects on cooling effectiveness and structural support in each modified cooling system. Compared with the reference model, the swirler wall array was most effective in diminishing the thermo-mechanical stress concentration, especially on the inner ring, which is vulnerable to crack formation.
RNA-based ovarian cancer research from 'a gene to systems biomedicine' perspective.
Gov, Esra; Kori, Medi; Arga, Kazim Yalcin
2017-08-01
Ovarian cancer remains the leading cause of death from a gynecologic malignancy, and treatment of this disease is harder than any other type of female reproductive cancer. Improvements in the diagnosis and development of novel and effective treatment strategies for complex pathophysiologies, such as ovarian cancer, require a better understanding of disease emergence and mechanisms of progression through systems medicine approaches. RNA-level analyses generate new information that can help in understanding the mechanisms behind disease pathogenesis, to identify new biomarkers and therapeutic targets and in new drug discovery. Whole RNA sequencing and coding and non-coding RNA expression array datasets have shed light on the mechanisms underlying disease progression and have identified mRNAs, miRNAs, and lncRNAs involved in ovarian cancer progression. In addition, the results from these analyses indicate that various signalling pathways and biological processes are associated with ovarian cancer. Here, we present a comprehensive literature review on RNA-based ovarian cancer research and highlight the benefits of integrative approaches within the systems biomedicine concept for future ovarian cancer research. We invite the ovarian cancer and systems biomedicine research fields to join forces to achieve the interdisciplinary caliber and rigor required to find real-life solutions to common, devastating, and complex diseases such as ovarian cancer. 
CAF: cancer-associated fibroblasts; COG: Cluster of Orthologous Groups; DEA: disease enrichment analysis; EOC: epithelial ovarian carcinoma; ESCC: oesophageal squamous cell carcinoma; GSI: gamma secretase inhibitor; GO: Gene Ontology; GSEA: gene set enrichment analysis; HAS: Hungarian Academy of Sciences; lncRNAs: long non-coding RNAs; MAPK/ERK: mitogen-activated protein kinase/extracellular signal-regulated kinases; NGS: next-generation sequencing; ncRNAs: non-coding RNAs; OvC: ovarian cancer; PI3K/Akt/mTOR: phosphatidylinositol-3-kinase/protein kinase B/mammalian target of rapamycin; RT-PCR: real-time polymerase chain reaction; SNP: single nucleotide polymorphism; TF: transcription factor; TGF-β: transforming growth factor-β.
Muon catalyzed fusion beam window mechanical strength testing and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ware, A.G.; Zabriskie, J.M.
A thin aluminum window (0.127 mm (0.005-inch) thick x 146 mm (5 3/4-inch) diameter) of 2024-T6 alloy was modeled and analyzed using the ABAQUS non-linear finite element analysis code. A group of windows was fabricated, heat-treated and subsequently tested. Testing included both ultimate burst pressure and fatigue. Fatigue testing cycles involved ''oil-canning'' behavior representing vacuum purge and reversal to pressure. Test results are compared to predictions and the mode of failure is discussed. Operational requirements, based on the above analysis and correlational testing, for the actual beam windows are discussed. 1 ref., 3 figs.
A 3D moisture-stress FEM analysis for time dependent problems in timber structures
NASA Astrophysics Data System (ADS)
Fortino, Stefania; Mirianon, Florian; Toratti, Tomi
2009-11-01
This paper presents a 3D moisture-stress numerical analysis for timber structures under variable humidity and load conditions. An orthotropic viscoelastic-mechanosorptive material model is specialized on the basis of previous models. Both the constitutive model and the equations needed to describe the moisture flow across the structure are implemented into user subroutines of the Abaqus finite element code and a coupled moisture-stress analysis is performed for several types of mechanical loads and moisture changes. The presented computational approach is validated by analyzing some wood tests described in the literature and comparing the computational results with the reported experimental data.
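The moisture-flow part of such a coupled analysis can be sketched in one dimension with an explicit finite-difference step. The paper solves the full 3-D problem in Abaqus user subroutines; the diffusivity, grid, and boundary values below are merely illustrative.

```python
import numpy as np

def moisture_step(w, D=1e-10, dx=0.005, dt=3600.0, w_surface=0.18):
    """One explicit finite-difference step of 1-D moisture diffusion across
    a timber cross-section (moisture content w, diffusivity D in m^2/s,
    node spacing dx in m, time step dt in s).  Ambient humidity is imposed
    as a fixed moisture content at both surfaces.  Toy numbers only."""
    w_new = w.copy()
    w_new[1:-1] = w[1:-1] + D * dt / dx**2 * (w[2:] - 2 * w[1:-1] + w[:-2])
    w_new[0] = w_new[-1] = w_surface
    return w_new

w = np.full(21, 0.12)                  # initial uniform moisture content (12%)
for _ in range(24 * 30):               # one month of hourly steps
    w = moisture_step(w)
```

The slow inward creep of the moisture front is what makes the coupled problem time dependent: the mechanosorptive strains follow the evolving moisture field, not just its steady state.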
Tam, Vivian; Edge, Jennifer S; Hoffman, Steven J
2016-10-12
Shortages of health workers in low-income countries are exacerbated by the international migration of health workers to more affluent countries. This problem is compounded by the active recruitment of health workers by destination countries, particularly Australia, Canada, the UK and the USA. The World Health Organization (WHO) adopted a voluntary Code of Practice in May 2010 to mitigate tensions between health workers' right to migrate and the shortage of health workers in source countries. The first empirical impact evaluation of this Code was conducted 11 months after its adoption and demonstrated a lack of impact on health workforce recruitment policy and practice in the short term. This second empirical impact evaluation was conducted 4 years post-adoption using the same methodology to determine whether there have been any changes in the perceived utility, applicability, and implementation of the Code in the medium term. Forty-four respondents representing government, civil society and the private sector from Australia, Canada, the UK and the USA completed an email-based survey evaluating their awareness of the Code, its perceived impact, changes to policy or recruitment practices resulting from the Code, and the effectiveness of non-binding codes generally. The same survey instrument from the original study was used to facilitate direct comparability of responses. Key lessons were identified through thematic analysis. The main findings of the initial impact evaluation and the current one are unchanged. Both sets of key informants reported no significant policy or regulatory changes to health worker recruitment in their countries as a direct result of the Code, owing to its lack of incentives, institutional mechanisms and interest mobilizers. Participants emphasized the existence of previous bilateral and regional codes, the WHO Code's non-binding nature, and the primacy of competing domestic healthcare priorities in explaining this perceived lack of impact.
The Code has probably still not produced the tangible improvements in health worker flows it aspired to achieve. Several actions, including a focus on developing bilateral codes, linking the Code to topical global priorities, and reframing the Code's purpose to emphasize health system sustainability, are proposed to improve the Code's uptake and impact.
MAC/GMC Code Enhanced for Coupled Electromagnetothermoelastic Analysis of Smart Composites
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.; Aboudi, Jacob
2002-01-01
Intelligent materials are those that exhibit coupling between their electromagnetic response and their thermomechanical response. This coupling allows smart materials to react mechanically (e.g., an induced displacement) to applied electrical or magnetic fields (for instance). These materials find many important applications in sensors, actuators, and transducers. Recently interest has arisen in the development of smart composites that are formed via the combination of two or more phases, one or more of which is a smart material. To design with and utilize smart composites, designers need theories that predict the coupled smart behavior of these materials from the electromagnetothermoelastic properties of the individual phases. The micromechanics model known as the generalized method of cells (GMC) has recently been extended to provide this important capability. This coupled electromagnetothermoelastic theory has recently been incorporated within NASA Glenn Research Center's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC). This software package is user friendly and has many additional features that render it useful as a design and analysis tool for composite materials in general, and with its new capabilities, for smart composites as well.
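The kind of electromechanical coupling such a micromechanics model must capture can be shown with the 1-D linear piezoelectric constitutive relations. The property values below are rough order-of-magnitude numbers for a PZT-like ceramic, not data from MAC/GMC.

```python
# 1-D linear piezoelectric constitutive relations:
#     strain               = s * stress + d * E_field
#     electric_displacement = d * stress + eps * E_field
# The same d couples both equations, so the material both actuates
# (field -> strain) and senses (stress -> charge).

s = 1.6e-11      # elastic compliance, 1/Pa (order of magnitude, PZT-like)
d = 3.0e-10      # piezoelectric coefficient, m/V (order of magnitude)
eps = 1.5e-8     # permittivity, F/m (order of magnitude)

def piezo_response(stress, e_field):
    strain = s * stress + d * e_field
    elec_disp = d * stress + eps * e_field
    return strain, elec_disp

# A field of 1 MV/m with no applied stress produces a free actuation strain.
strain, elec_disp = piezo_response(stress=0.0, e_field=1.0e6)
```

A micromechanics model such as GMC predicts effective values of s, d, and eps for the composite from the corresponding properties of its phases, which is exactly the design capability described above.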
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKeown, J.; Labrie, J.P.
1983-08-01
A general purpose finite element computer code called MARC is used to calculate the temperature distribution and dimensional changes in linear accelerator rf structures. Both steady state and transient behaviour are examined with the computer model. Combining results from MARC with the cavity evaluation computer code SUPERFISH, the static and dynamic behaviour of a structure under power is investigated. Structure cooling is studied to minimize loss in shunt impedance and frequency shifts during high power operation. Results are compared with an experimental test carried out on a cw 805 MHz on-axis coupled structure at an energy gradient of 1.8 MeV/m. The model has also been used to compare the performance of on-axis and coaxial structures and has guided the mechanical design of structures suitable for average gradients in excess of 2.0 MeV/m at 2.45 GHz.
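A back-of-envelope version of the thermal detuning that the MARC/SUPERFISH analysis quantifies in detail: a cavity's resonant frequency scales inversely with its linear size, so uniform heating by dT shifts the frequency by roughly df = -f * alpha * dT. The copper properties and the temperature rise below are generic assumptions, not results from the paper.

```python
# Thermal-detuning estimate for a copper rf cavity: frequency scales
# inversely with linear dimension, so uniform expansion alpha*dT gives a
# fractional shift df/f = -alpha*dT.  Generic numbers, for scale only.

alpha_cu = 1.7e-5        # linear thermal expansion of copper, 1/K
f0 = 805.0e6             # cavity frequency, Hz

def thermal_detuning(dT):
    return -f0 * alpha_cu * dT

df = thermal_detuning(20.0)   # 20 K uniform temperature rise
```

A 20 K rise detunes an 805 MHz cavity by a few hundred kHz, far more than a typical cavity bandwidth, which is why the cooling design studied above matters.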
Higher-Order Theory: Structural/MicroAnalysis Code (HOTSMAC) Developed
NASA Technical Reports Server (NTRS)
Arnold, Steven M.
2002-01-01
The full utilization of advanced materials (be they composite or functionally graded materials) in lightweight aerospace components requires the availability of accurate analysis, design, and life-prediction tools that enable the assessment of component and material performance and reliability. Recently, a new commercially available software product called HOTSMAC (Higher-Order Theory--Structural/MicroAnalysis Code) was jointly developed by Collier Research Corporation, Engineered Materials Concepts LLC, and the NASA Glenn Research Center under funding provided by Glenn's Commercial Technology Office. The analytical framework for HOTSMAC is based on almost a decade of research into the coupled micromacrostructural analysis of heterogeneous materials. Consequently, HOTSMAC offers a comprehensive approach for analyzing/designing the response of components with various microstructural details, including certain advantages not always available in standard displacement-based finite element analysis techniques. The capabilities of HOTSMAC include combined thermal and mechanical analysis, time-independent and time-dependent material behavior, and internal boundary cells (e.g., those that can be used to represent internal cooling passages, see the preceding figure) to name a few. In HOTSMAC problems, materials can be randomly distributed and/or functionally graded (as shown in the figure, wherein the inclusions are distributed linearly), or broken down by strata, such as in the case of thermal barrier coatings or composite laminates.
Pullout of a Rigid Insert Adhesively Bonded to an Elastic Half Plane.
1983-12-01
This document was prepared by the Department of Engineering Mechanics, USAF Academy (USAFA/DFEM). Project Engineer/Scientist: Thomas E. Kullgren, Lt Col, USAF, Professor and Acting Head, Department of Engineering Mechanics.
Jettison Engineering Trajectory Tool
NASA Technical Reports Server (NTRS)
Zaczek, Mariusz; Walter, Patrick; Pascucci, Joseph; Armstrong, Phyllis; Hallbick, Patricia; Morgan, Randal; Cooney, James
2013-01-01
The Jettison Engineering Trajectory Tool (JETT) performs the jettison analysis function for any orbiting asset. It provides a method to compute the relative trajectories between an orbiting asset and any jettisoned item (intentional or unintentional) or sublimating particles generated by fluid dumps, to assess whether an object is safe to jettison or whether there is a risk with an item that was inadvertently lost overboard. The main concern is the interaction and possible recontact of the jettisoned object with the asset. This supports the analysis that jettisoned items will safely clear the vehicle, ensuring no collisions. The software reduces the jettison analysis task from one that could take days to complete to one that can be completed in hours, with an analysis that is more comprehensive than the previous method. It provides the ability to define the jettison operation relative to International Space Station (ISS) structure, and provides 2D and 3D plotting capability to allow an analyst to perform a subjective clearance assessment with ISS structures. The developers followed the SMP to create the code and all supporting documentation. The code makes extensive use of the object-oriented features of Java; in addition, the Model-View-Controller architecture was used in the organization of the code, allowing each piece to be independent of updates to the other pieces. The model category maintains data entered by the user and generated by the analysis. The view category provides capabilities for data entry and for displaying all or a portion of the analysis data in tabular, 2D, and 3D representations. The controller category handles events that affect the model or view(s). JETT utilizes orbital mechanics with complex algorithms. Since JETT is written in Java, it is essentially platform-independent.
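The abstract does not spell out JETT's propagation equations; a standard first screening for relative motion between an orbiting asset and a jettisoned item is the closed-form Clohessy-Wiltshire solution, sketched here. Whether JETT uses exactly this formulation is an assumption; it is valid only for small separations about a circular reference orbit.

```python
import math

def clohessy_wiltshire(x0, y0, z0, vx0, vy0, vz0, n, t):
    """Closed-form Clohessy-Wiltshire solution for relative motion about a
    circular orbit: x radial (outward), y along-track, z cross-track, with
    mean motion n (rad/s).  Returns the relative position at time t."""
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = y0 + 6 * (s - n * t) * x0 + (2 / n) * (c - 1) * vx0 \
        + (4 * s / n - 3 * t) * vy0
    z = c * z0 + (s / n) * vz0
    return x, y, z

# A 0.1 m/s retrograde push from an ISS-like orbit (n ~ 0.00113 rad/s):
# the object drops below the station and drifts steadily ahead of it.
n_iss = 0.00113                        # approximate ISS mean motion, rad/s
one_orbit = 2 * math.pi / n_iss
x1, y1, z1 = clohessy_wiltshire(0.0, 0.0, 0.0, 0.0, -0.1, 0.0, n_iss, one_orbit)
```

After one orbit the radial offset closes again while the along-track separation grows secularly, which is why retrograde jettisons are the safe direction in such a screening.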
Myneni, Sahiti; Cobb, Nathan K; Cohen, Trevor
2013-01-01
Unhealthy behaviors increase individual health risks and are a socioeconomic burden. Harnessing social influence is perceived as fundamental for interventions to influence health-related behaviors. However, the mechanisms through which social influence occurs are poorly understood. Online social networks provide the opportunity to understand these mechanisms as they digitally archive communication between members. In this paper, we present a methodology for content-based social network analysis, combining qualitative coding, automated text analysis, and formal network analysis such that network structure is determined by the content of messages exchanged between members. We apply this approach to characterize the communication between members of QuitNet, an online social network for smoking cessation. Results indicate that the method identifies meaningful theme-based social sub-networks. Modeling social network data using this method can provide us with theme-specific insights such as the identities of opinion leaders and sub-community clusters. Implications for design of targeted social interventions are discussed.
Network analysis for the visualization and analysis of qualitative data.
Pokorny, Jennifer J; Norman, Alex; Zanesco, Anthony P; Bauer-Wu, Susan; Sahdra, Baljinder K; Saron, Clifford D
2018-03-01
We present a novel manner in which to visualize the coding of qualitative data that enables representation and analysis of connections between codes using graph theory and network analysis. Network graphs are created from codes applied to a transcript or audio file using the code names and their chronological location. The resulting network is a representation of the coding data that characterizes the interrelations of codes. This approach enables quantification of qualitative codes using network analysis and facilitates examination of associations of network indices with other quantitative variables using common statistical procedures. Here, as a proof of concept, we applied this method to a set of interview transcripts that had been coded in 2 different ways and the resultant network graphs were examined. The creation of network graphs allows researchers an opportunity to view and share their qualitative data in an innovative way that may provide new insights and enhance transparency of the analytical process by which they reach their conclusions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
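The graph construction can be sketched as follows, assuming (as one simple choice) that consecutive codes in the chronological sequence are linked; the paper's exact adjacency rule may differ, and the code names below are invented.

```python
from collections import Counter

def code_network(coded_transcript):
    """Build a weighted code network from a chronologically ordered list of
    qualitative codes: each pair of consecutive, distinct codes becomes an
    undirected edge weighted by how often the pair occurs.  Also returns a
    simple network index per code (weighted degree)."""
    edges = Counter()
    for a, b in zip(coded_transcript, coded_transcript[1:]):
        if a != b:                       # ignore self-loops from repeated codes
            edges[tuple(sorted((a, b)))] += 1
    degree = Counter()
    for (a, b), weight in edges.items():
        degree[a] += weight
        degree[b] += weight
    return dict(edges), dict(degree)

# Hypothetical coding of one interview, in chronological order.
codes = ["stress", "coping", "stress", "family", "coping", "stress"]
edges, degree = code_network(codes)
```

Indices like the weighted degree computed here are the quantities that can then be correlated with other quantitative variables using ordinary statistics, as the abstract describes.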
Effective Use of Weld Metal Yield Strength for HY-Steels
1983-01-01
The ASME Boiler and Pressure Vessel Code (B&PV Code) is divided... As noted earlier, the ASME Boiler and Pressure Vessel Code makes only one exception to its overall philosophy of matching weld-metal strength and... material where toughness is of primary importance.
Amodal processing in human prefrontal cortex.
Tamber-Rosenau, Benjamin J; Dux, Paul E; Tombu, Michael N; Asplund, Christopher L; Marois, René
2013-07-10
Information enters the cortex via modality-specific sensory regions, whereas actions are produced by modality-specific motor regions. Intervening central stages of information processing map sensation to behavior. Humans perform this central processing in a flexible, abstract manner such that sensory information in any modality can lead to a response via any motor system. Cognitive theories account for such flexible behavior by positing amodal central information processing (e.g., "central executive," Baddeley and Hitch, 1974; "supervisory attentional system," Norman and Shallice, 1986; "response selection bottleneck," Pashler, 1994). However, the extent to which brain regions embodying central mechanisms of information processing are amodal remains unclear. Here we apply multivariate pattern analysis to functional magnetic resonance imaging (fMRI) data to compare response selection, a cognitive process widely believed to recruit an amodal central resource, across sensory and motor modalities. We show that most frontal and parietal cortical areas known to activate across a wide variety of tasks code modality, casting doubt on the notion that these regions embody a central processor devoid of modality representation. Importantly, regions of anterior insula and dorsolateral prefrontal cortex consistently failed to code modality across four experiments. However, these areas code at least one other task dimension, process (instantiated as response selection vs response execution), ensuring that the failure to find coding of modality is not driven by insensitivity of multivariate pattern analysis in these regions. We conclude that abstract encoding of information modality is primarily a property of subregions of the prefrontal cortex.
Pseudouridine profiling reveals regulated mRNA pseudouridylation in yeast and human cells
Carlile, Thomas M.; Rojas-Duran, Maria F.; Zinshteyn, Boris; Shin, Hakyung; Bartoli, Kristen M.; Gilbert, Wendy V.
2014-01-01
Post-transcriptional modification of RNA nucleosides occurs in all living organisms. Pseudouridine, the most abundant modified nucleoside in non-coding RNAs1, enhances the function of transfer RNA and ribosomal RNA by stabilizing RNA structure2–8. mRNAs were not known to contain pseudouridine, but artificial pseudouridylation dramatically affects mRNA function: it changes the genetic code by facilitating non-canonical base pairing in the ribosome decoding center9,10. However, without evidence of naturally occurring mRNA pseudouridylation, its physiological significance was unclear. Here we present a comprehensive analysis of pseudouridylation in yeast and human RNAs using Pseudo-seq, a genome-wide, single-nucleotide-resolution method for pseudouridine identification. Pseudo-seq accurately identifies known modification sites as well as 100 novel sites in non-coding RNAs, and reveals hundreds of pseudouridylated sites in mRNAs. Genetic analysis allowed us to assign most of the new modification sites to one of seven conserved pseudouridine synthases, Pus1–4, 6, 7 and 9. Notably, the majority of pseudouridines in mRNA are regulated in response to environmental signals, such as nutrient deprivation in yeast and serum starvation in human cells. These results suggest a mechanism for the rapid and regulated rewiring of the genetic code through inducible mRNA modifications. Our findings reveal unanticipated roles for pseudouridylation and provide a resource for identifying the targets of pseudouridine synthases implicated in human disease11–13. PMID:25192136
Duellman, Tyler; Warren, Christopher; Yang, Jay
2014-01-01
Microribonucleic acids (miRNAs) work with exquisite specificity and are able to distinguish a target from a non-target based on a single nucleotide mismatch in the core nucleotide domain. We questioned whether miRNA regulation of gene expression could occur in a single nucleotide polymorphism (SNP)-specific manner, manifesting as post-transcriptional control of the expression of genetic polymorphisms. In our recent study of the functional consequences of matrix metalloproteinase (MMP)-9 SNPs, we discovered that expression of a coding exon SNP in the pro-domain of the protein resulted in a profound decrease in the secreted protein. This missense SNP results in the N38S amino acid change and the loss of an N-glycosylation site. A systematic study demonstrated that the loss of secreted protein was due not to the loss of the N-glycosylation site but rather to SNP-specific targeting by miR-671-3p and miR-657. Bioinformatics analysis identified 41 SNP-specific miRNAs targeting MMP-9 SNPs, mostly in the coding exons; an extension of the analysis to chromosome 20, where the MMP-9 gene is located, suggests that SNP-specific miRNAs targeting coding exons are prevalent. This selective post-transcriptional regulation of a target messenger RNA harboring genetic polymorphisms by miRNAs offers an SNP-dependent post-transcriptional regulatory mechanism, allowing for polymorphism-specific differential gene regulation. PMID:24627221
Almeida, Jonas S.; Iriabho, Egiebade E.; Gorrepati, Vijaya L.; Wilkinson, Sean R.; Grüneberg, Alexander; Robbins, David E.; Hackney, James R.
2012-01-01
Background: Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. Materials and Methods: ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Results: Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. Conclusions: The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. 
This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local “download and installation”. PMID:22934238
A New Capability for Nuclear Thermal Propulsion Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amiri, Benjamin W.; Nuclear and Radiological Engineering Department, University of Florida, Gainesville, FL 32611; Kapernick, Richard J.
2007-01-30
This paper describes a new capability for Nuclear Thermal Propulsion (NTP) design that has been developed, and presents the results of some analyses performed with this design tool. The purpose of the tool is to design to specified mission and material limits, while maximizing system thrust to weight. The head end of the design tool utilizes the ROCket Engine Transient Simulation (ROCETS) code to generate a system design and system design requirements as inputs to the core analysis. ROCETS is a modular system-level code which has been used extensively in the liquid rocket engine industry for many years. The core design tool performs high-fidelity reactor core nuclear and thermal-hydraulic design analysis. At the heart of this process are two codes, TMSS-NTP and NTPgen, which together greatly automate the analysis, providing the capability to rapidly produce designs that meet all specified requirements while minimizing mass. A PERL-based command script, called CORE DESIGNER, controls the execution of these two codes and checks for convergence throughout the process. TMSS-NTP is executed first, to produce a suite of core designs that meet the specified reactor core mechanical, thermal-hydraulic and structural requirements. The suite of designs consists of a set of core layouts and, for each core layout, specific designs that span a range of core fuel volumes. NTPgen generates MCNPX models for each of the core designs from TMSS-NTP. Iterative analyses are performed in NTPgen until a reactor design (fuel volume) is identified for each core layout that meets cold and hot operation reactivity requirements and that is zoned to meet a radial core power distribution requirement.
Chen, Wen; Zhang, Xuan; Li, Jing; Huang, Shulan; Xiang, Shuanglin; Hu, Xiang; Liu, Changning
2018-05-09
Zebrafish is a well-established model system for studying developmental processes and human disease. Recent deep-sequencing studies have discovered a large number of long non-coding RNAs (lncRNAs) in zebrafish. However, only a few of them have been functionally characterized. Therefore, how to take advantage of the mature zebrafish system to investigate the function and conservation of lncRNAs in depth is an intriguing question. We systematically collected and analyzed a series of zebrafish RNA-seq data, then combined them with resources from known databases and the literature. As a result, we obtained by far the most complete dataset of zebrafish lncRNAs, containing 13,604 lncRNA genes (21,128 transcripts) in total. Based on that, a co-expression network of zebrafish coding and lncRNA genes was constructed and analyzed, and used to predict the Gene Ontology (GO) and KEGG annotations of lncRNAs. Meanwhile, we performed a conservation analysis on zebrafish lncRNAs, identifying 1828 conserved zebrafish lncRNA genes (1890 transcripts) that have putative mammalian orthologs. We also found that zebrafish lncRNAs play important roles in regulating the development and function of the nervous system; these conserved lncRNAs show significant sequence and functional conservation with their mammalian counterparts. By integrative data analysis and construction of a coding-lncRNA gene co-expression network, we obtained the most comprehensive dataset of zebrafish lncRNAs to date, as well as their systematic annotations and comprehensive analyses of function and conservation. Our study provides a reliable zebrafish-based platform to deeply explore lncRNA function and mechanism, as well as the lncRNA commonality between zebrafish and human.
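A minimal sketch of the "guilt-by-association" annotation strategy behind such co-expression networks: correlate an unannotated lncRNA's expression profile with those of annotated coding genes, then transfer annotations from strongly correlated partners (the gene names, expression values, and correlation threshold below are all invented):

```python
import math

# Toy expression profiles across 6 conditions (values are illustrative).
expression = {
    "coding_neurod1": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],   # annotated coding gene
    "coding_actb":    [5.0, 5.1, 4.9, 5.0, 5.2, 4.8],   # housekeeping-like
    "lnc_candidate":  [1.1, 2.2, 2.9, 4.1, 5.0, 6.2],   # unannotated lncRNA
}

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def coexpression_partners(target, threshold=0.9):
    """Coding genes whose profiles correlate with the lncRNA; their GO/KEGG
    annotations would then be transferred by guilt-by-association."""
    return [g for g in expression
            if g != target and g.startswith("coding_")
            and abs(pearson(expression[g], expression[target])) >= threshold]

print(coexpression_partners("lnc_candidate"))
```

Real pipelines add multiple-testing control and use module detection rather than a single threshold, but the edge-building step is essentially this correlation test.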
Almeida, Jonas S; Iriabho, Egiebade E; Gorrepati, Vijaya L; Wilkinson, Sean R; Grüneberg, Alexander; Robbins, David E; Hackney, James R
2012-01-01
Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local "download and installation".
Crustal Fracturing Field and Presence of Fluid as Revealed by Seismic Anisotropy
NASA Astrophysics Data System (ADS)
Pastori, M.; Piccinini, D.; de Gori, P.; Margheriti, L.; Barchi, M. R.; di Bucci, D.
2010-12-01
In the last three years, we developed, tested and improved an automatic analysis code (Anisomat+) to calculate the shear wave splitting parameters: fast polarization direction (φ) and delay time (∂t). The code is a set of MatLab scripts able to retrieve crustal anisotropy parameters from three-component seismic recordings of local earthquakes using the horizontal-component cross-correlation method. The analysis procedure consists of choosing an appropriate frequency range, one that best highlights the signal containing the shear waves, and a time window on the seismogram centered on the S arrival (the window contains at least one cycle of the S wave). The code was compared with two other automatic analysis codes (SPY and SHEBA) and tested on three Italian areas (Val d’Agri, the Tiber Valley, and the surroundings of L’Aquila) along the Apennine mountains. For each region we used the anisotropic parameters resulting from the automatic computation as a tool to determine the fracture field geometries connected with the active stress field. We compared the temporal variations of the anisotropic parameters to the evolution of the vp/vs ratio for the same seismicity. The anisotropic fast directions are used to define the active stress field (EDA model), and we find general consistency between fast directions and the main stress indicators (focal mechanisms and borehole break-outs). The magnitude of the delay time is used to define the fracture field intensity, with higher values found in the volumes where micro-seismicity occurs. Furthermore, we studied temporal variations of the anisotropic parameters and the vp/vs ratio to assess whether fluids play an important role in the earthquake generation process. The close association between variations of the anisotropic and vp/vs parameters and changes in seismicity rate supports the hypothesis that the background seismicity is influenced by fluctuations of pore fluid pressure in the rocks.
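A minimal sketch of the horizontal-component cross-correlation method on synthetic data: grid-search trial fast directions and delays, keeping the pair that maximizes the normalized cross-correlation between the rotated fast and slow traces. The wavelet, the 30° fast direction, and the 4-sample delay are invented; Anisomat+ itself is a MatLab package and certainly differs in detail.

```python
import math

def gaussian_pulse(n, center, width):
    return [math.exp(-((i - center) / width) ** 2) for i in range(n)]

def delay(x, lag):
    """Delay a signal by `lag` samples, zero-padding the start."""
    return [x[i - lag] if i - lag >= 0 else 0.0 for i in range(len(x))]

def splitting_parameters(north, east, max_lag=10):
    """Grid search over trial fast directions (0-89 deg) and delays: the
    pair maximizing the normalized cross-correlation between the rotated
    fast and slow traces estimates (phi, delta-t)."""
    n, best = len(north), (-1.0, 0, 0)
    for deg in range(90):
        th = math.radians(deg)
        fast = [north[i] * math.cos(th) + east[i] * math.sin(th) for i in range(n)]
        slow = [-north[i] * math.sin(th) + east[i] * math.cos(th) for i in range(n)]
        norm = (math.sqrt(sum(v * v for v in fast))
                * math.sqrt(sum(v * v for v in slow)))
        if norm == 0.0:
            continue
        for lag in range(max_lag + 1):
            cc = abs(sum(fast[i] * slow[i + lag] for i in range(n - lag))) / norm
            if cc > best[0]:
                best = (cc, deg, lag)
    return best

# Synthetic split S-wave: fast direction 30 deg, slow wave delayed 4 samples.
phi, dt = math.radians(30), 4
f = gaussian_pulse(64, 20, 4.0)
g = delay(f, dt)
north = [f[i] * math.cos(phi) - g[i] * math.sin(phi) for i in range(64)]
east = [f[i] * math.sin(phi) + g[i] * math.cos(phi) for i in range(64)]
cc, fast_deg, delay_samples = splitting_parameters(north, east)
print(fast_deg, delay_samples)  # recovers the fast direction and delay
```

The 90° search range reflects the inherent 90° ambiguity of splitting measurements; production codes also vary the analysis window and estimate uncertainties.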
NASA Technical Reports Server (NTRS)
Pronchick, Stephen W.
1998-01-01
Materials that pyrolyze at elevated temperature have been commonly used as thermal protection materials in hypersonic flight, and advanced pyrolyzing materials for this purpose continue to be developed. Because of the large temperature gradients that can arise in thermal protection materials, significant thermal stresses can develop. Advanced applications of pyrolytic materials are calling for more complex heatshield configurations, making accurate thermal stress analysis more important, and more challenging. For non-pyrolyzing materials, many finite element codes are available and capable of performing coupled thermal-mechanical analyses. These codes do not, however, have a built-in capability to perform analyses that include pyrolysis effects. When a pyrolyzing material is heated, one or more components of the original virgin material pyrolyze and create a gas. This gas flows away from the pyrolysis zone to the surface, resulting in a reduction in surface heating. A porous residue, referred to as char, remains in place of the virgin material. While the processes involved can be complex, it has been found that a simple physical model, in which virgin material reacts to form char and pyrolysis gas, will yield satisfactory analytical results. Specifically, the effects that must be modeled include: (1) variation of thermal properties (density, specific heat, thermal conductivity) as the material composition changes; (2) energy released or absorbed by the pyrolysis reactions; (3) energy convected by the flow of pyrolysis gas from the interior to the surface; (4) the reduction in surface heating due to surface blowing; and (5) chemical and mass diffusion effects at the surface between the pyrolysis gas and the edge gas. Computational tools for the one-dimensional thermal analysis of these materials exist and have proven to be reliable design tools.
The objective of the present work is to extend the analysis capabilities of pyrolyzing materials to axisymmetric configurations, and to couple thermal and mechanical analyses so that thermal stresses may be efficiently and accurately calculated.
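The first effects of the simple physical model above can be sketched with a zero-dimensional, single-reaction decomposition law: virgin solid decomposes toward char at an Arrhenius rate, and thermal properties are mixed by the local virgin fraction. All material constants below are invented placeholders, not properties of any real thermal protection material.

```python
import math

# Illustrative material constants (not from any specific TPS material).
RHO_VIRGIN, RHO_CHAR = 1400.0, 900.0     # kg/m^3
K_VIRGIN, K_CHAR = 0.5, 1.2              # W/(m K)
A, E_OVER_R = 1.0e6, 1.2e4               # Arrhenius pre-factor (1/s), E/R (K)

def decomposition_rate(rho, temperature):
    """Single-reaction Arrhenius model: virgin solid -> char + pyrolysis gas."""
    return -A * math.exp(-E_OVER_R / temperature) * (rho - RHO_CHAR)

def virgin_fraction(rho):
    return (rho - RHO_CHAR) / (RHO_VIRGIN - RHO_CHAR)

def conductivity(rho):
    """Effect (1): properties vary with composition (simple mixing rule)."""
    x = virgin_fraction(rho)
    return x * K_VIRGIN + (1.0 - x) * K_CHAR

# Explicit-Euler march at a fixed 1500 K; the mass lost each step
# (-d_rho * dt) is the pyrolysis gas that convects energy toward the
# surface (effect 3) and reduces surface heating by blowing (effect 4).
rho, dt = RHO_VIRGIN, 1.0e-3
for _ in range(2000):
    rho += decomposition_rate(rho, 1500.0) * dt
print(round(virgin_fraction(rho), 3), round(conductivity(rho), 3))
```

A real charring-ablator code couples this reaction to the energy equation in at least one spatial dimension; the point here is only how composition-dependent properties and gas generation enter the model.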
Lewis Structures Technology, 1988. Volume 2: Structural Mechanics
NASA Technical Reports Server (NTRS)
1988-01-01
Lewis Structures Div. performs and disseminates results of research conducted in support of aerospace engine structures. These results have a wide range of applicability to practitioners of structural engineering mechanics beyond the aerospace arena. The engineering community was familiarized with the depth and range of research performed by the division and its academic and industrial partners. Sessions covered vibration control, fracture mechanics, ceramic component reliability, parallel computing, nondestructive evaluation, constitutive models and experimental capabilities, dynamic systems, fatigue and damage, wind turbines, hot section technology (HOST), aeroelasticity, structural mechanics codes, computational methods for dynamics, structural optimization, applications of structural dynamics, and structural mechanics computer codes.
Posttest analysis of the FFTF inherent safety tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Padilla, A. Jr.; Claybrook, S.W.
Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.
Mechanisms resulting in accreted ice roughness
NASA Technical Reports Server (NTRS)
Bilanin, Alan J.; Chua, Kiat
1992-01-01
Icing tests conducted on rotating cylinders in the BF Goodrich Icing Research Facility indicate that a regular, deterministic icing roughness pattern is typical. The roughness pattern is similar to kernels of corn on a cob for cylinders with a diameter typical of a cob. An analysis is undertaken to determine the mechanisms which produce this roughness and to ascertain the surface scale and amplitude of the roughness. Since roughness, and the resulting augmentation of the convective heat transfer coefficient, has been determined to most strongly control the accreted ice in ice prediction codes, the ability to predict a priori the location, amplitude, and surface scale of roughness would greatly augment the capabilities of current ice accretion models.
Nano/micro-electro mechanical systems: a patent view
NASA Astrophysics Data System (ADS)
Hu, Guangyuan; Liu, Weishu
2015-12-01
Combining both bibliometrics and citation network analysis, this research evaluates the global development of micro-electro mechanical systems (MEMS) research based on the Derwent Innovations Index database. We found that, worldwide, the growth trajectory of MEMS patents demonstrates an approximate S shape, with the United States, Japan, China, and Korea leading the global MEMS race. As evidenced by Derwent class codes, the technology structure of global MEMS patents remains steady over time. Yet there does exist a national competitiveness component among the top country players. The latecomer China has become the second most prolific country filing MEMS patents, but its patent quality still lags behind the global average.
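The "approximate S shape" of a patent growth trajectory is commonly modeled with a logistic curve; a rough sketch with invented cumulative counts, recovering the inflection (midpoint) year by a coarse grid search while holding the other parameters fixed:

```python
import math

# Hypothetical cumulative patent counts over 16 years (illustrative only):
# generated from a logistic curve with midpoint year 8, then rounded.
years = list(range(16))
counts = [round(1000.0 / (1.0 + math.exp(-0.6 * (t - 8)))) for t in years]

def logistic(t, saturation, rate, midpoint):
    """Standard S-curve: slow start, rapid growth, saturation."""
    return saturation / (1.0 + math.exp(-rate * (t - midpoint)))

def fit_midpoint(years, counts, saturation=1000.0, rate=0.6):
    """Least-squares grid search for the inflection year only; full fits
    would optimize all three parameters simultaneously."""
    def sse(m):
        return sum((logistic(t, saturation, rate, m) - c) ** 2
                   for t, c in zip(years, counts))
    return min(range(16), key=sse)

print(fit_midpoint(years, counts))  # inflection year of the growth trajectory
```

The inflection year marks the transition from accelerating to decelerating patenting activity, which is what an S-shaped trajectory claim amounts to quantitatively.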
Gallistel, C R
2017-07-01
Recent electrophysiological results imply that the duration of the stimulus onset asynchrony in eyeblink conditioning is encoded by a mechanism intrinsic to the cerebellar Purkinje cell. This raises the general question - how is quantitative information (durations, distances, rates, probabilities, amounts, etc.) transmitted by spike trains and encoded into engrams? The usual assumption is that information is transmitted by firing rates. However, rate codes are energetically inefficient and computationally awkward. A combinatorial code is more plausible. If the engram consists of altered synaptic conductances (the usual assumption), then we must ask how numbers may be written to synapses. It is much easier to formulate a coding hypothesis if the engram is realized by a cell-intrinsic molecular mechanism. Copyright © 2017 Elsevier Ltd. All rights reserved.
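The claimed inefficiency of rate codes relative to combinatorial codes can be made concrete with a back-of-the-envelope capacity comparison (the unit counts are arbitrary):

```python
import math

def rate_code_capacity(n_spikes):
    """A rate code over a fixed window distinguishes at most n_spikes + 1
    values (0 through n spikes), so capacity grows linearly."""
    return n_spikes + 1

def combinatorial_code_capacity(n_elements):
    """A combinatorial (binary) code over n elements distinguishes 2**n
    patterns, so capacity grows exponentially."""
    return 2 ** n_elements

# Same budget of 20 units: about 4.4 bits versus 20 bits.
print(math.log2(rate_code_capacity(20)),
      math.log2(combinatorial_code_capacity(20)))
```

This is the arithmetic behind the abstract's point: for the same metabolic budget, a combinatorial code carries exponentially more distinguishable quantitative values than a rate code.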
Structural architecture of the human long non-coding RNA, steroid receptor RNA activator
Novikova, Irina V.; Hennelly, Scott P.; Sanbonmatsu, Karissa Y.
2012-01-01
While the functional roles of several long non-coding RNAs (lncRNAs) have been determined, the molecular mechanisms are not well understood. Here, we report the first experimentally derived secondary structure of a human lncRNA, the steroid receptor RNA activator (SRA), 0.87 kb in size. The SRA RNA is a non-coding RNA that coactivates several human sex hormone receptors and is strongly associated with breast cancer. Coding isoforms of SRA are also expressed to produce proteins, making the SRA gene a unique bifunctional system. Our experimental findings (SHAPE, in-line, DMS and RNase V1 probing) reveal that this lncRNA has a complex structural organization, consisting of four domains with a variety of secondary structure elements. We examine the coevolution of the SRA gene at the RNA structure and protein structure levels using comparative sequence analysis across vertebrates. Rapid evolutionary stabilization of RNA structure, combined with frame-disrupting mutations in conserved regions, suggests that evolutionary pressure preserves the RNA structural core rather than its translational product. We perform similar experiments on alternatively spliced SRA isoforms to assess their structural features. PMID:22362738
Hou, Zhouhua; Xu, Xuwen; Zhou, Ledu; Fu, Xiaoyu; Tao, Shuhui; Zhou, Jiebin; Tan, Deming; Liu, Shuiping
2017-07-01
Increasing evidence supports the significance of long non-coding RNAs in cancer development. Several recent studies suggest the oncogenic activity of the long non-coding RNA metastasis-associated lung adenocarcinoma transcript 1 (MALAT1) in hepatocellular carcinoma. In this study, we explored the molecular mechanisms by which MALAT1 modulates hepatocellular carcinoma biological behaviors. We found that microRNA-204 was significantly downregulated in sh-MALAT1 HepG2 cells and in 15 hepatocellular carcinoma tissues by quantitative real-time polymerase chain reaction analysis. Through bioinformatic screening, luciferase reporter assay, RNA-binding protein immunoprecipitation, and RNA pull-down assay, we identified microRNA-204 as a potential interacting partner of MALAT1. Functionally, wound-healing and transwell assays revealed that microRNA-204 significantly inhibited the migration and invasion of hepatocellular carcinoma cells. Notably, sirtuin 1 was recognized as a direct downstream target of microRNA-204 in HepG2 cells. Moreover, si-SIRT1 significantly inhibited cell invasion and migration. These data indicate that, by sponging and competitively binding microRNA-204, MALAT1 releases the suppression of sirtuin 1, which in turn promotes hepatocellular carcinoma migration and invasion. This study reveals a novel mechanism by which MALAT1 stimulates hepatocellular carcinoma progression and justifies targeting metastasis-associated lung adenocarcinoma transcript 1 as a potential therapy for hepatocellular carcinoma.
Computational simulation of matrix micro-slip bands in SiC/Ti-15 composite
NASA Technical Reports Server (NTRS)
Mital, S. K.; Lee, H.-J.; Murthy, P. L. N.; Chamis, C. C.
1992-01-01
Computational simulation procedures are used to identify the key deformation mechanisms for [0]8 and [90]8 SiC/Ti-15 metal matrix composites. The computational simulation procedures employed consist of a three-dimensional finite-element analysis and a micromechanics-based computer code, METCAN. The interphase properties used in the analysis have been calibrated using the METCAN computer code with the [90]8 experimental stress-strain curve. Results of the simulation show that although shear stresses are sufficiently high to cause the formation of some slip bands in the matrix, concentrated mostly near the fibers, the nonlinearity in the composite stress-strain curve in the case of the [90]8 composite is dominated by interfacial damage, such as microcracks and debonding, rather than microplasticity. The stress-strain curve for the [0]8 composite is largely controlled by the fibers and shows only slight nonlinearity at higher strain levels that could be the result of matrix microplasticity.
Overview of ICE Project: Integration of Computational Fluid Dynamics and Experiments
NASA Technical Reports Server (NTRS)
Stegeman, James D.; Blech, Richard A.; Babrauckas, Theresa L.; Jones, William H.
2001-01-01
Researchers at the NASA Glenn Research Center have developed a prototype integrated environment for interactively exploring, analyzing, and validating information from computational fluid dynamics (CFD) computations and experiments. The Integrated CFD and Experiments (ICE) project is a first attempt at providing a researcher with a common user interface for control, manipulation, analysis, and data storage for both experiments and simulation. ICE can be used as a live, on-line system that displays and archives data as they are gathered; as a postprocessing system for dataset manipulation and analysis; and as a control interface or "steering mechanism" for simulation codes while visualizing the results. Although the full capabilities of ICE have not been completely demonstrated, this report documents the current system. Various applications of ICE are discussed: a low-speed compressor, a supersonic inlet, real-time data visualization, and a parallel-processing simulation code interface. A detailed data model for the compressor application is included in the appendix.
Reassigning stop codons via translation termination: How a few eukaryotes broke the dogma.
Alkalaeva, Elena; Mikhailova, Tatiana
2017-03-01
The genetic code determines how amino acids are encoded within mRNA. It is universal among the vast majority of organisms, although several exceptions are known. Variant genetic codes are found in ciliates, mitochondria, and numerous other organisms. All revealed genetic codes (standard and variant) have at least one codon encoding a translation stop signal. However, recently two new genetic codes with a reassignment of all three stop codons were revealed in studies examining protozoan transcriptomes. Here, we discuss this finding and the recent studies of variant genetic codes in eukaryotes. We consider the possible molecular mechanisms allowing the use of certain codons as sense and stop signals simultaneously. The results obtained by studying these amazing organisms represent a new and exciting insight into the mechanism of stop codon decoding in eukaryotes. © 2017 WILEY Periodicals, Inc.
A program for undergraduate research into the mechanisms of sensory coding and memory decay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calin-Jageman, R J
This is the final technical report for this DOE project, entitled "A program for undergraduate research into the mechanisms of sensory coding and memory decay". The report summarizes progress on the three research aims: 1) to identify physiological and genetic correlates of long-term habituation, 2) to understand mechanisms of olfactory coding, and 3) to foster a world-class undergraduate neuroscience program. Progress on the first aim has enabled comparison of learning-regulated transcripts across closely related learning paradigms and species, and results suggest that only a small core of transcripts serves truly general roles in long-term memory. Progress on the second aim has enabled testing of several mutant phenotypes for olfactory behaviors, and results show that responses are not fully consistent with the combinatorial coding hypothesis. Finally, 14 undergraduate students participated in this research, the neuroscience program attracted extramural funding, and we completed a successful summer program to ease the transition of community-college students into 4-year colleges to pursue STEM fields.
Pitchiaya, Sethuramasundaram; Krishnan, Vishalakshi; Custer, Thomas C.; Walter, Nils G.
2013-01-01
Non-coding RNAs (ncRNAs) recently were discovered to outnumber their protein-coding counterparts, yet their diverse functions are still poorly understood. Here we report on a method for the intracellular Single-molecule High Resolution Localization and Counting (iSHiRLoC) of microRNAs (miRNAs), a conserved, ubiquitous class of regulatory ncRNAs that controls the expression of over 60% of all mammalian protein coding genes post-transcriptionally, by a mechanism shrouded by seemingly contradictory observations. We present protocols to execute single particle tracking (SPT) and single-molecule counting of functional microinjected, fluorophore-labeled miRNAs and thereby extract diffusion coefficients and molecular stoichiometries of micro-ribonucleoprotein (miRNP) complexes from living and fixed cells, respectively. This probing of miRNAs at the single molecule level sheds new light on the intracellular assembly/disassembly of miRNPs, thus beginning to unravel the dynamic nature of this important gene regulatory pathway and facilitating the development of a parsimonious model for their obscured mechanism of action. PMID:23820309
Automated Simplification of Full Chemical Mechanisms
NASA Technical Reports Server (NTRS)
Norris, A. T.
1997-01-01
A code has been developed to automatically simplify full chemical mechanisms. The method employed is based on the Intrinsic Low Dimensional Manifold (ILDM) method of Maas and Pope. The ILDM method is a dynamical systems approach to the simplification of large chemical kinetic mechanisms. By identifying low-dimensional attracting manifolds, the method allows complex full mechanisms to be parameterized by just a few variables, in effect generating reduced chemical mechanisms by an automatic procedure. These resulting mechanisms, however, still retain all the species used in the full mechanism. Full and skeletal mechanisms for various fuels are simplified to a two-dimensional manifold, and the resulting mechanisms are found to compare well with the full mechanisms and show significant improvement over global one-step mechanisms, such as those by Westbrook and Dryer. In addition, by using an ILDM reaction mechanism in a CFD code, a considerable improvement in turn-around time can be achieved.
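The manifold idea behind ILDM can be caricatured on a toy linear system: fast eigendirections of the Jacobian decay quickly, leaving trajectories on a low-dimensional slow subspace that can parameterize the chemistry. This sketch uses an invented 2-species rate matrix; the actual ILDM construction works with the nonlinear, state-dependent Jacobian of a full mechanism.

```python
import numpy as np

# Toy linear kinetics dy/dt = J @ y with one slow and one fast mode.
J = np.array([[-1.0,    0.5],
              [ 0.0, -100.0]])          # hypothetical rate matrix; -100 is fast

w, V = np.linalg.eig(J)
order = np.argsort(np.abs(w))           # slow eigenvalue first
slow = V[:, order[0]] / np.linalg.norm(V[:, order[0]])

# Evolve an arbitrary initial state: after the fast transient decays,
# the trajectory collapses onto span(slow), the 1-D "manifold".
y0 = np.array([1.0, 1.0])
coeffs = np.linalg.solve(V, y0)         # expand y0 in the eigenbasis
t = 0.5                                 # long after the 1/100 fast time scale
y_t = (V * np.exp(w * t)) @ coeffs      # exact solution at time t
residual = y_t - (y_t @ slow) * slow    # distance from the slow subspace
print(np.linalg.norm(residual))
```

The residual is negligible: the state is fully described by one slow coordinate, which is exactly the kind of parameterization ILDM automates for large nonlinear mechanisms.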
Auer, Paul L; Nalls, Mike; Meschia, James F; Worrall, Bradford B; Longstreth, W T; Seshadri, Sudha; Kooperberg, Charles; Burger, Kathleen M; Carlson, Christopher S; Carty, Cara L; Chen, Wei-Min; Cupples, L Adrienne; DeStefano, Anita L; Fornage, Myriam; Hardy, John; Hsu, Li; Jackson, Rebecca D; Jarvik, Gail P; Kim, Daniel S; Lakshminarayan, Kamakshi; Lange, Leslie A; Manichaikul, Ani; Quinlan, Aaron R; Singleton, Andrew B; Thornton, Timothy A; Nickerson, Deborah A; Peters, Ulrike; Rich, Stephen S
2015-07-01
Stroke is the second leading cause of death and the third leading cause of years of life lost. Genetic factors contribute to stroke prevalence, and candidate gene and genome-wide association studies (GWAS) have identified variants associated with ischemic stroke risk. These variants often have small effects without obvious biological significance. Exome sequencing may discover predicted protein-altering variants with a potentially large effect on ischemic stroke risk. To investigate the contribution of rare and common genetic variants to ischemic stroke risk by targeting the protein-coding regions of the human genome. The National Heart, Lung, and Blood Institute (NHLBI) Exome Sequencing Project (ESP) analyzed approximately 6000 participants from numerous cohorts of European and African ancestry. For discovery, 365 cases of ischemic stroke (small-vessel and large-vessel subtypes) and 809 European ancestry controls were sequenced; for replication, 47 affected sibpairs concordant for stroke subtype and an African American case-control series were sequenced, with 1672 cases and 4509 European ancestry controls genotyped. The ESP's exome sequencing and genotyping started on January 1, 2010, and continued through June 30, 2012. Analyses were conducted on the full data set between July 12, 2012, and July 13, 2013. Discovery of new variants or genes contributing to ischemic stroke risk and subtype (primary analysis) and determination of support for protein-coding variants contributing to risk in previously published candidate genes (secondary analysis). 
We identified 2 novel genes associated with an increased risk of ischemic stroke: a protein-coding variant in PDE4DIP (rs1778155; odds ratio, 2.15; P = 2.63 × 10(-8)) with an intracellular signal transduction mechanism and in ACOT4 (rs35724886; odds ratio, 2.04; P = 1.24 × 10(-7)) with a fatty acid metabolism mechanism; confirmation of PDE4DIP was observed in affected sibpair families with large-vessel stroke subtype and in African Americans. Replication of protein-coding variants in candidate genes was observed for 2 previously reported GWAS associations: ZFHX3 (cardioembolic stroke) and ABCA1 (large-vessel stroke). Exome sequencing discovered 2 novel genes and mechanisms, PDE4DIP and ACOT4, associated with increased risk for ischemic stroke. In addition, ZFHX3 and ABCA1 were discovered to have protein-coding variants associated with ischemic stroke. These results suggest that genetic variation in novel pathways contributes to ischemic stroke risk and serves as a target for prediction, prevention, and therapy.
Mistranslation: from adaptations to applications.
Hoffman, Kyle S; O'Donoghue, Patrick; Brandl, Christopher J
2017-11-01
The conservation of the genetic code indicates that there was a single origin, but like all genetic material, the cell's interpretation of the code is subject to evolutionary pressure. Single nucleotide variations in tRNA sequences can modulate codon assignments by altering codon-anticodon pairing or tRNA charging. Either can increase translation errors and even change the code. The frozen accident hypothesis argued that changes to the code would destabilize the proteome and reduce fitness. In studies of model organisms, mistranslation often acts as an adaptive response. These studies reveal evolutionary conserved mechanisms to maintain proteostasis even during high rates of mistranslation. This review discusses the evolutionary basis of altered genetic codes, how mistranslation is identified, and how deviations to the genetic code are exploited. We revisit early discoveries of genetic code deviations and provide examples of adaptive mistranslation events in nature. Lastly, we highlight innovations in synthetic biology to expand the genetic code. The genetic code is still evolving. Mistranslation increases proteomic diversity that enables cells to survive stress conditions or suppress a deleterious allele. Genetic code variants have been identified by genome and metagenome sequence analyses, suppressor genetics, and biochemical characterization. Understanding the mechanisms of translation and genetic code deviations enables the design of new codes to produce novel proteins. Engineering the translation machinery and expanding the genetic code to incorporate non-canonical amino acids are valuable tools in synthetic biology that are impacting biomedical research. This article is part of a Special Issue entitled "Biochemistry of Synthetic Biology - Recent Developments" Guest Editor: Dr. Ilka Heinemann and Dr. Patrick O'Donoghue. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Cai, Huai-yu; Dong, Xiao-tong; Zhu, Meng; Huang, Zhan-hua
2018-01-01
Wavefront coding as an athermalization technique can effectively ensure stable imaging of an optical system over a large temperature range, with the added advantages of compact structure and low cost. Simulating properties such as the PSF and MTF of a wavefront coding athermal system under several typical temperature gradient distributions helps characterize its behavior in non-ideal temperature environments and supports verification of the system design targets. In this paper, we utilize the interoperability of data between SolidWorks and ZEMAX to simplify the traditional structural/thermal/optical integrated analysis process. We design and build the optical model and corresponding mechanical model of an infrared imaging wavefront coding athermal system. Axial and radial temperature gradients of different magnitudes are applied to the whole system in SolidWorks, yielding the changes in curvature, refractive index, and lens spacing. We then import the deformed model into ZEMAX for ray tracing and obtain the resulting changes in the PSF and MTF of the optical system. Finally, we discuss and evaluate the consistency of the PSF (MTF) of the wavefront coding athermal system and the restorability of its images, which provides a basis and reference for the optimal design of such systems. The results show that the single-material infrared wavefront coding athermal system tolerates axial temperature gradients up to a temperature fluctuation of 60°C, a much higher limit than for radial temperature gradients.
Cultural and Technological Issues and Solutions for Geodynamics Software Citation
NASA Astrophysics Data System (ADS)
Heien, E. M.; Hwang, L.; Fish, A. E.; Smith, M.; Dumit, J.; Kellogg, L. H.
2014-12-01
Computational software and custom-written codes play a key role in scientific research and teaching, providing tools to perform data analysis and forward modeling through numerical computation. However, development of these codes is often hampered by the fact that there is no well-defined way for the authors to receive credit or professional recognition for their work through the standard methods of scientific publication and subsequent citation of the work. This in turn may discourage researchers from publishing their codes or making them easier for other scientists to use. We investigate the issues involved in citing software in a scientific context, and introduce features that should be components of a citation infrastructure, particularly oriented towards the codes and scientific culture in the area of geodynamics research. The codes used in geodynamics are primarily specialized numerical modeling codes for continuum mechanics problems; they may be developed by individual researchers, teams of researchers, geophysicists in collaboration with computational scientists and applied mathematicians, or by coordinated community efforts such as the Computational Infrastructure for Geodynamics. Some but not all geodynamics codes are open-source. These characteristics are common to many areas of geophysical software development and use. We provide background on the problem of software citation and discuss some of the barriers preventing adoption of such citations, including social/cultural barriers, insufficient technological support infrastructure, and an overall lack of agreement about what a software citation should consist of. We suggest solutions in an initial effort to create a system to support citation of software and promotion of scientific software development.
NASA Technical Reports Server (NTRS)
Ni, Jianjun David
2011-01-01
This presentation briefly discusses a research effort on mitigation techniques for pulsed radio frequency interference (RFI) on a Low-Density Parity-Check (LDPC) code. This problem is of considerable interest in the context of providing reliable communications to a space vehicle which might suffer severe degradation due to pulsed RFI sources such as large radars. The LDPC code is one of the modern forward-error-correction (FEC) codes whose decoding performance approaches the Shannon Limit. The LDPC code studied here is the AR4JA (2048, 1024) code recommended by the Consultative Committee for Space Data Systems (CCSDS), and it has been chosen for some spacecraft designs. Even though this code is designed as a powerful FEC code for the additive white Gaussian noise channel, simulation data and test results show that the performance of this LDPC decoder is severely degraded when exposed to the pulsed RFI specified in the spacecraft's transponder specifications. An analysis (through modeling and simulation) has been conducted to evaluate the impact of the pulsed RFI, and a few implementation techniques have been investigated to mitigate the pulsed RFI impact by reshuffling the soft-decision data available at the input of the LDPC decoder. The simulation results show that the LDPC decoding performance in terms of codeword error rate (CWER) under pulsed RFI can be improved by up to four orders of magnitude through a simple soft-decision-data reshuffle scheme. This study reveals that an error floor of LDPC decoding performance appears around CWER = 1E-4 when the proposed technique is applied to mitigate the pulsed RFI impact. The mechanism causing this error floor remains unknown; further investigation is necessary.
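The general principle of discounting soft-decision data hit by a pulse can be illustrated with a toy example. This is a sketch using a simple repetition code, not the AR4JA LDPC code or the paper's reshuffle scheme, and all channel parameters are hypothetical: log-likelihood ratios (LLRs) of symbols known to be hit by RFI are zeroed (treated as erasures) before combining, so the corrupted samples cannot outvote the clean ones.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bits, reps, sigma = 2000, 5, 0.6
bits = rng.integers(0, 2, n_bits)
tx = np.repeat(1 - 2 * bits, reps).astype(float)   # BPSK: 0 -> +1, 1 -> -1

noise = rng.normal(0, sigma, tx.size)
pulsed = rng.random(tx.size) < 0.2                 # symbols hit by pulsed RFI
noise[pulsed] += rng.normal(0, 8.0, pulsed.sum())  # strong interference bursts
rx = tx + noise

llr = 2 * rx / sigma**2                            # AWGN soft-decision data
naive = (llr.reshape(n_bits, reps).sum(1) < 0).astype(int)
naive_err = (naive != bits).mean()

llr_erased = np.where(pulsed, 0.0, llr)            # erase RFI-hit soft data
mitigated = (llr_erased.reshape(n_bits, reps).sum(1) < 0).astype(int)
mitigated_err = (mitigated != bits).mean()
print(naive_err, mitigated_err)
```

Erasing the contaminated LLRs sharply reduces the error rate even though information is discarded, which mirrors why preprocessing the decoder's soft inputs can buy orders of magnitude in CWER.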
Phylogenetic Evidence for Lateral Gene Transfer in the Intestine of Marine Iguanas
Nelson, David M.; Cann, Isaac K. O.; Altermann, Eric; Mackie, Roderick I.
2010-01-01
Background Lateral gene transfer (LGT) appears to promote genotypic and phenotypic variation in microbial communities in a range of environments, including the mammalian intestine. However, the extent and mechanisms of LGT in intestinal microbial communities of non-mammalian hosts remain poorly understood. Methodology/Principal Findings We sequenced two fosmid inserts obtained from a genomic DNA library derived from an agar-degrading enrichment culture of marine iguana fecal material. The inserts harbored 16S rRNA genes that place the organism from which they originated within Clostridium cluster IV, a well-documented group that inhabits the mammalian intestinal tract. However, sequence analysis indicates that 52% of the protein-coding genes on the fosmids have top BLASTX hits to bacterial species that are not members of Clostridium cluster IV, and phylogenetic analysis suggests that at least 10 of 44 coding genes on the fosmids may have been transferred from Clostridium cluster XIVa to cluster IV. The fosmids carried four transposase-encoding genes and an integrase-encoding gene, suggesting their involvement in LGT. In addition, several coding genes likely involved in sugar transport were probably acquired through LGT. Conclusion Our phylogenetic evidence suggests that LGT may be common among phylogenetically distinct members of the phylum Firmicutes inhabiting the intestinal tract of marine iguanas. PMID:20520734
NASA Technical Reports Server (NTRS)
Wang, John T.; Pineda, Evan J.; Ranatunga, Vipul; Smeltzer, Stanley S.
2015-01-01
A simple continuum damage mechanics (CDM) based 3D progressive damage analysis (PDA) tool for laminated composites was developed and implemented as a user defined material subroutine to link with a commercially available explicit finite element code. This PDA tool uses linear lamina properties from standard tests, predicts damage initiation with an easy-to-implement Hashin-Rotem failure criteria, and in the damage evolution phase, evaluates the degradation of material properties based on the crack band theory and traction-separation cohesive laws. It follows Matzenmiller et al.'s formulation to incorporate the degrading material properties into the damaged stiffness matrix. Since nonlinear shear and matrix stress-strain relations are not implemented, correction factors are used for slowing the reduction of the damaged shear stiffness terms to reflect the effect of these nonlinearities on the laminate strength predictions. This CDM based PDA tool is implemented as a user defined material (VUMAT) to link with the Abaqus/Explicit code. Strength predictions obtained, using this VUMAT, are correlated with test data for a set of notched specimens under tension and compression loads.
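As an illustration of the damage-initiation step, a plane-stress Hashin-Rotem check might look like the following sketch. This is one representative form of the criteria, with placeholder strength values rather than the paper's data; the actual VUMAT additionally handles damage evolution via crack band theory and traction-separation laws.

```python
# Sketch of a Hashin-Rotem damage-initiation check for a lamina (plane stress).
# Strength values are illustrative placeholders, not measured lamina data.
Xt, Xc, Yt, Yc, S = 2000.0, 1200.0, 50.0, 200.0, 80.0   # MPa

def hashin_rotem(s11, s22, t12):
    """Return (fiber, matrix) failure indices; >= 1 signals damage initiation."""
    # Fiber mode: driven by the fiber-direction stress only.
    fiber = (s11 / Xt)**2 if s11 >= 0 else (s11 / Xc)**2
    # Matrix mode: interaction of transverse stress and in-plane shear.
    if s22 >= 0:
        matrix = (s22 / Yt)**2 + (t12 / S)**2
    else:
        matrix = (s22 / Yc)**2 + (t12 / S)**2
    return fiber, matrix

fiber, matrix = hashin_rotem(500.0, 30.0, 60.0)
print(fiber, matrix)   # fiber: (500/2000)^2 = 0.0625; matrix: 0.36 + 0.5625 = 0.9225
```

In a progressive damage analysis this check runs at every integration point each increment; once an index reaches 1, the corresponding stiffness terms begin to degrade.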
Constraining physical parameters of ultra-fast outflows in PDS 456 with Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Hagino, K.; Odaka, H.; Done, C.; Gandhi, P.; Takahashi, T.
2014-07-01
Deep absorption lines with an extremely high velocity of ˜0.3c observed in PDS 456 spectra strongly indicate the existence of ultra-fast outflows (UFOs). However, the launching and acceleration mechanisms of UFOs are still uncertain. One possible way to resolve this is to constrain physical parameters as a function of distance from the source. In order to study the spatial dependence of parameters, it is essential to adopt 3-dimensional Monte Carlo simulations that treat radiative transfer in arbitrary geometry. We have developed a new simulation code for X-ray radiation reprocessed in an AGN outflow. Our code implements radiative transfer in a 3-dimensional biconical disk wind geometry, based on the Monte Carlo simulation framework MONACO (Watanabe et al. 2006, Odaka et al. 2011). Our simulations reproduce the Fe XXV and Fe XXVI absorption features seen in the spectra. Broad Fe emission lines, which reflect the geometry and viewing angle, are also successfully reproduced. By comparing the simulated spectra with Suzaku data, we obtained constraints on the physical parameters. We discuss launching and acceleration mechanisms of UFOs in PDS 456 based on our analysis.
Use of multiscale zirconium alloy deformation models in nuclear fuel behavior analysis
NASA Astrophysics Data System (ADS)
Montgomery, Robert; Tomé, Carlos; Liu, Wenfeng; Alankar, Alankar; Subramanian, Gopinath; Stanek, Christopher
2017-01-01
Accurate prediction of cladding mechanical behavior is a key aspect of modeling nuclear fuel behavior, especially for conditions of pellet-cladding interaction (PCI), reactivity-initiated accidents (RIA), and loss of coolant accidents (LOCA). Current approaches to fuel performance modeling rely on empirical constitutive models for cladding creep, growth and plastic deformation, which are limited to the materials and conditions for which the models were developed. To improve upon this approach, a microstructurally-based zirconium alloy mechanical deformation analysis capability is being developed within the United States Department of Energy Consortium for Advanced Simulation of Light Water Reactors (CASL). Specifically, the viscoplastic self-consistent (VPSC) polycrystal plasticity modeling approach, developed by Lebensohn and Tomé [1], has been coupled with the BISON engineering scale fuel performance code to represent the mechanistic material processes controlling the deformation behavior of light water reactor (LWR) cladding. A critical component of VPSC is the representation of the crystallographic nature (defect and dislocation movement) and orientation of the grains within the matrix material and the ability to account for the role of texture on deformation. A future goal is for VPSC to obtain information on reaction rate kinetics from atomistic calculations to inform the defect and dislocation behavior models described in VPSC. The multiscale modeling of cladding deformation mechanisms allowed by VPSC far exceed the functionality of typical semi-empirical constitutive models employed in nuclear fuel behavior codes to model irradiation growth and creep, thermal creep, or plasticity. This paper describes the implementation of an interface between VPSC and BISON and provides initial results utilizing the coupled functionality.
Preparing Technical Requirements for Third Party Contracting of Army Facilities
1993-06-01
Boiler and Pressure Vessel Code Sec 9 Welding and Brazing Qualifications B 16.1 Cast Iron Pipe Flanges and Flanged...Control Terminology for Heating, Ventilating, Air Conditioning American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code B40.1...American National Standards Institute (ANSI) Boiler and Pressure Vessel Code (ASME) 125 Boilers and Pressure Vessels Code (ASTM) B31 Power
Program Helps To Determine Chemical-Reaction Mechanisms
NASA Technical Reports Server (NTRS)
Bittker, D. A.; Radhakrishnan, K.
1995-01-01
General Chemical Kinetics and Sensitivity Analysis (LSENS) computer code developed for use in solving complex, homogeneous, gas-phase, chemical-kinetics problems. Provides for efficient and accurate chemical-kinetics computations and for sensitivity analysis for a variety of problems, including problems involving nonisothermal conditions. Incorporates mathematical models for static system, steady one-dimensional inviscid flow, reaction behind incident shock wave (with boundary-layer correction), and perfectly stirred reactor. Computations of equilibrium properties performed for following assigned states: enthalpy and pressure, temperature and pressure, internal energy and volume, and temperature and volume. Written in FORTRAN 77 with exception of NAMELIST extensions used for input.
Periodic response of nonlinear systems
NASA Technical Reports Server (NTRS)
Nataraj, C.; Nelson, H. D.
1988-01-01
A procedure is developed to determine approximate periodic solutions of autonomous and non-autonomous systems. The trigonometric collocation method (TCM) is formalized to allow for the analysis of relatively small order systems directly in physical coordinates. The TCM is extended to large order systems by utilizing modal analysis in a component mode synthesis strategy. The procedure was coded and verified by several check cases. Numerical results for two small order mechanical systems and one large order rotor dynamic system are presented. The method allows for the possibility of approximating periodic responses for large order forced and self-excited nonlinear systems.
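The collocation idea can be sketched for a single forced linear oscillator, where the computed fundamental-harmonic amplitude can be checked against the closed-form steady-state response. The coefficients here are illustrative, and the paper's TCM targets nonlinear and large-order systems via modal reduction; the mechanics of the method are the same: expand the periodic response in a truncated trigonometric series and enforce the equation of motion at discrete collocation points over one period.

```python
import numpy as np

# Trigonometric collocation sketch: approximate the periodic response of
# x'' + c x' + k x = cos(w t) by a truncated Fourier series, enforcing the
# ODE at 2H+1 equally spaced collocation points over one period.
c, k, w = 0.2, 1.5, 1.0
H = 3                                   # harmonics retained
t = np.linspace(0, 2 * np.pi / w, 2 * H + 1, endpoint=False)

def basis(t):
    """Trig basis [1, cos(j w t), sin(j w t)] and its first two derivatives."""
    cols, d1, d2 = [np.ones_like(t)], [np.zeros_like(t)], [np.zeros_like(t)]
    for j in range(1, H + 1):
        cols += [np.cos(j*w*t), np.sin(j*w*t)]
        d1 += [-j*w*np.sin(j*w*t), j*w*np.cos(j*w*t)]
        d2 += [-(j*w)**2*np.cos(j*w*t), -(j*w)**2*np.sin(j*w*t)]
    return map(np.column_stack, (cols, d1, d2))

B, B1, B2 = basis(t)
A = B2 + c * B1 + k * B                 # ODE residual operator at the points
coef = np.linalg.solve(A, np.cos(w * t))

# Closed-form steady-state amplitude for comparison.
amp = 1 / np.hypot(k - w**2, c * w)
print(np.hypot(coef[1], coef[2]), amp)  # fundamental-harmonic amplitude
```

For this linear case the collocation solution is exact, so the higher-harmonic coefficients vanish; for a nonlinear system the same machinery yields algebraic equations in the Fourier coefficients that are solved iteratively.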
Baddeley, Michelle; Tobler, Philippe N.; Schultz, Wolfram
2016-01-01
Given that the range of rewarding and punishing outcomes of actions is large but neural coding capacity is limited, efficient processing of outcomes by the brain is necessary. One mechanism to increase efficiency is to rescale neural output to the range of outcomes expected in the current context, and process only experienced deviations from this expectation. However, this mechanism comes at the cost of not being able to discriminate between unexpectedly low losses when times are bad versus unexpectedly high gains when times are good. Thus, too much adaptation would result in disregarding information about the nature and absolute magnitude of outcomes, preventing learning about the longer-term value structure of the environment. Here we investigate the degree of adaptation in outcome coding brain regions in humans, for directly experienced outcomes and observed outcomes. We scanned participants while they performed a social learning task in gain and loss blocks. Multivariate pattern analysis showed that two distinct networks of brain regions adapt to the most likely outcomes within a block. Frontostriatal areas adapted to directly experienced outcomes, whereas lateral frontal and temporoparietal regions adapted to observed social outcomes. Critically, in both cases, adaptation was incomplete and information about whether the outcomes arose in a gain block or a loss block was retained. Univariate analysis confirmed incomplete adaptive coding in these regions but also detected nonadapting outcome signals. Thus, although neural areas rescale their responses to outcomes for efficient coding, they adapt incompletely and keep track of the longer-term incentives available in the environment. SIGNIFICANCE STATEMENT Optimal value-based choice requires that the brain precisely and efficiently represents positive and negative outcomes. One way to increase efficiency is to adapt responding to the most likely outcomes in a given context. 
However, too strong adaptation would result in loss of precise representation (e.g., when the avoidance of a loss in a loss-context is coded the same as receipt of a gain in a gain-context). We investigated an intermediate form of adaptation that is efficient while maintaining information about received gains and avoided losses. We found that frontostriatal areas adapted to directly experienced outcomes, whereas lateral frontal and temporoparietal regions adapted to observed social outcomes. Importantly, adaptation was intermediate, in line with influential models of reference dependence in behavioral economics. PMID:27683899
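The efficiency argument for range adaptation can be sketched with a toy quantizer: an encoder with a fixed number of output levels represents outcomes either on the full range of all possible outcomes or rescaled to the current context's range. All parameters below are illustrative inventions, not fitted to the fMRI data.

```python
import numpy as np

# Sketch: limited coding capacity (n_levels discrete outputs) encoding
# outcomes drawn from a narrow "good times" context within a wide full range.
rng = np.random.default_rng(1)
n_levels = 8
gains = rng.uniform(5, 10, 500)                      # context: gains of 5..10
full_edges = np.linspace(-10, 10, n_levels + 1)      # bins over the full range
ctx_edges = np.linspace(5, 10, n_levels + 1)         # bins rescaled to context

def encode_decode(x, edges):
    """Quantize to a bin, then read back the bin center (lossy code)."""
    code = np.clip(np.digitize(x, edges) - 1, 0, n_levels - 1)
    centers = (edges[:-1] + edges[1:]) / 2
    return centers[code]

err_full = np.abs(encode_decode(gains, full_edges) - gains).mean()
err_ctx = np.abs(encode_decode(gains, ctx_edges) - gains).mean()
print(err_full, err_ctx)
```

Rescaling to the context shrinks the coding error within the block, but the context-adapted code by itself no longer signals whether outcomes came from a gain or a loss range, which is exactly the absolute information the study finds is partially retained.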
Recent development in the design, testing and impact-damage tolerance of stiffened composite panels
NASA Technical Reports Server (NTRS)
Williams, J. G.; Anderson, M. S.; Rhodes, M. D.; Starnes, J. H., Jr.; Stroud, W. J.
1979-01-01
Structural technology of laminated filamentary-composite stiffened-panel structures under combined inplane and lateral loadings is discussed. Attention is focused on: (1) methods for analyzing the behavior of these structures under load and for determining appropriate structural proportions for weight-efficient configurations; and (2) effects of impact damage and geometric imperfections on structural performance. Recent improvements in buckling analysis involving combined inplane compression and shear loadings and transverse shear deformations are presented. A computer code is described for proportioning or sizing laminate layers and cross-sectional dimensions, and the code is used to develop structural efficiency data for a variety of configurations, loading conditions, and constraint conditions. Experimental data on buckling of panels under inplane compression is presented. Mechanisms of impact damage initiation and propagation are described.
The Role of Code-Switching in Bilingual Creativity
ERIC Educational Resources Information Center
Kharkhurin, Anatoliy V.; Wei, Li
2015-01-01
This study further explores the theme of bilingual creativity with the present focus on code-switching. Specifically, it investigates whether code-switching practice has an impact on creativity. In line with the previous research, selective attention was proposed as a potential cognitive mechanism, which on the one hand would benefit from…
Investigation on the Capability of a Non Linear CFD Code to Simulate Wave Propagation
2003-02-01
Linear CFD Code to Simulate Wave Propagation Pedro de la Calzada Pablo Quintana Manuel Antonio Burgos ITP, S.A. Parque Empresarial Fernando avenida...mechanisms above presented, simulation of unsteady aerodynamics with linear and nonlinear CFD codes is an ongoing activity within the turbomachinery industry
10 CFR 851.27 - Reference sources.
Code of Federal Regulations, 2013 CFR
2013-01-01
...) American Society of Mechanical Engineers (ASME), P.O. Box 2300 Fairfield, NJ 07007. Telephone: 800-843-2763... Electrical Code,” (2005). (5) NFPA 70E, “Standard for Electrical Safety in the Workplace,” (2004). (6... Engineers (ASME) Boilers and Pressure Vessel Code, sections I through XII including applicable Code Cases...
10 CFR 851.27 - Reference sources.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) American Society of Mechanical Engineers (ASME), P.O. Box 2300 Fairfield, NJ 07007. Telephone: 800-843-2763... Electrical Code,” (2005). (5) NFPA 70E, “Standard for Electrical Safety in the Workplace,” (2004). (6... Engineers (ASME) Boilers and Pressure Vessel Code, sections I through XII including applicable Code Cases...
Code of Federal Regulations, 2012 CFR
2012-10-01
... Institute “Code for Pressure Piping, Power Piping.” ASME Code means the American Society of Mechanical Engineers “Boiler and Pressure Vessel Code.” ASME PVHO-1 means the ANSI/ASME standard “Safety Standard for Pressure Vessels for Human Occupancy.” ATA means a measure of pressure expressed in terms of atmosphere...
Code of Federal Regulations, 2011 CFR
2011-10-01
... Institute “Code for Pressure Piping, Power Piping.” ASME Code means the American Society of Mechanical Engineers “Boiler and Pressure Vessel Code.” ASME PVHO-1 means the ANSI/ASME standard “Safety Standard for Pressure Vessels for Human Occupancy.” ATA means a measure of pressure expressed in terms of atmosphere...
Code of Federal Regulations, 2013 CFR
2013-10-01
... Institute “Code for Pressure Piping, Power Piping.” ASME Code means the American Society of Mechanical Engineers “Boiler and Pressure Vessel Code.” ASME PVHO-1 means the ANSI/ASME standard “Safety Standard for Pressure Vessels for Human Occupancy.” ATA means a measure of pressure expressed in terms of atmosphere...
Code of Federal Regulations, 2014 CFR
2014-10-01
... Institute “Code for Pressure Piping, Power Piping.” ASME Code means the American Society of Mechanical Engineers “Boiler and Pressure Vessel Code.” ASME PVHO-1 means the ANSI/ASME standard “Safety Standard for Pressure Vessels for Human Occupancy.” ATA means a measure of pressure expressed in terms of atmosphere...
Neural Population Coding of Multiple Stimuli
Ma, Wei Ji
2015-01-01
In natural scenes, objects generally appear together with other objects. Yet, theoretical studies of neural population coding typically focus on the encoding of single objects in isolation. Experimental studies suggest that neural responses to multiple objects are well described by linear or nonlinear combinations of the responses to constituent objects, a phenomenon we call stimulus mixing. Here, we present a theoretical analysis of the consequences of common forms of stimulus mixing observed in cortical responses. We show that some of these mixing rules can severely compromise the brain's ability to decode the individual objects. This cost is usually greater than the cost incurred by even large reductions in the gain or large increases in neural variability, explaining why the benefits of attention can be understood primarily in terms of a stimulus selection, or demixing, mechanism rather than purely as a gain increase or noise reduction mechanism. The cost of stimulus mixing becomes even higher when the number of encoded objects increases, suggesting a novel mechanism that might contribute to set size effects observed in myriad psychophysical tasks. We further show that a specific form of neural correlation and heterogeneity in stimulus mixing among the neurons can partially alleviate the harmful effects of stimulus mixing. Finally, we derive simple conditions that must be satisfied for unharmful mixing of stimuli. PMID:25740513
Brankatschk, Kerstin; Kamber, Tim; Pothier, Joël F; Duffy, Brion; Smits, Theo H M
2014-11-01
Sprouted seeds represent a great risk for infection by human enteric pathogens because of favourable growth conditions for pathogens during their germination. The aim of this study was to identify mechanisms of interaction of Salmonella enterica subsp. enterica Weltevreden with alfalfa sprouts. RNA-seq analysis of S. Weltevreden grown with sprouts in comparison with M9-glucose medium showed that, among a total of 4158 annotated coding sequences, 177 genes (4.3%) and 345 genes (8.3%) were transcribed at higher levels with sprouts and in minimal medium, respectively. Genes that were transcribed at higher levels with sprouts code for proteins involved in mechanisms known to be important for attachment, motility and biofilm formation. Besides gene expression required for phenotypic adaptation, genes involved in sulphate acquisition were also transcribed at higher levels, suggesting that the surface of alfalfa sprouts may be poor in sulphate. Genes encoding structural and effector proteins of Salmonella pathogenicity island 2, involved in survival within macrophages during infection of animal tissue, were transcribed at higher levels with sprouts, possibly as a response to environmental conditions. This study provides insight into additional mechanisms that may be important for pathogen interactions with sprouts. © 2013 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
Gap Analysis of Material Properties Data for Ferritic/Martensitic HT-9 Steel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Neil R.; Serrano De Caro, Magdalena; Rodriguez, Edward A.
2012-08-28
The US Department of Energy (DOE), Office of Nuclear Energy (NE), is supporting the development of an ASME Code Case for adoption of 12Cr-1Mo-VW ferritic/martensitic (F/M) steel, commonly known as HT-9, primarily for use in elevated temperature design of liquid-metal fast reactors (LMFR) and components. In 2011, Los Alamos National Laboratory (LANL) nuclear engineering staff began assisting in the development of a small modular reactor (SMR) design concept, previously known as the Hyperion Module, now called the Gen4 Module. LANL staff immediately proposed HT-9 for the reactor vessel and components, as well as fuel clad and ducting, due to its superior thermal qualities. Although the ASME material Code Case, for adoption of HT-9 as an approved elevated temperature material for LMFR service, is the ultimate goal of this project, there are several key deliverables that must first be successfully accomplished. The most important key deliverable is the research, accumulation, and documentation of specific material parameters (physical, mechanical, and environmental), which become the basis for an ASME Code Case. Time-independent tensile and ductility data and time-dependent creep and creep-rupture behavior are some of the material properties required for a successful ASME Code Case. Although this report provides a cursory review of the available data, a much more comprehensive study of open-source data would be necessary. This report serves three purposes: (a) it provides a list of already existing material data information that could ultimately be made available to the ASME Code, (b) it determines the HT-9 material properties data missing from available sources that would be required, and (c) it estimates the necessary material testing required to close the gap. Ultimately, the gap analysis demonstrates that certain material properties testing will be required to fulfill the necessary information package for an ASME Code Case.
GRABGAM Analysis of Ultra-Low-Level HPGe Gamma Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winn, W.G.
The GRABGAM code has been used successfully for ultra-low level HPGe gamma spectrometry analysis since its development in 1985 at Savannah River Technology Center (SRTC). Although numerous gamma analysis codes existed at that time, reviews of institutional and commercial codes indicated that none addressed all features that were desired by SRTC. Furthermore, it was recognized that development of an in-house code would better facilitate future evolution of the code to address SRTC needs based on experience with low-level spectra. GRABGAM derives its name from Gamma Ray Analysis BASIC Generated At MCA/PC.
Orthographic Coding: Brain Activation for Letters, Symbols, and Digits.
Carreiras, Manuel; Quiñones, Ileana; Hernández-Cabrera, Juan Andrés; Duñabeitia, Jon Andoni
2015-12-01
The present experiment investigates the input coding mechanisms of 3 common printed characters: letters, numbers, and symbols. Despite research in this area, it is yet unclear whether the identity of these 3 elements is processed through the same or different brain pathways. In addition, some computational models propose that the position-in-string coding of these elements responds to general flexible mechanisms of the visual system that are not character-specific, whereas others suggest that the position coding of letters responds to specific processes that are different from those that guide the position-in-string assignment of other types of visual objects. Here, in an fMRI study, we manipulated character position and character identity through the transposition or substitution of 2 internal elements within strings of 4 elements. Participants were presented with 2 consecutive visual strings and asked to decide whether they were the same or different. The results showed: 1) that some brain areas responded more to letters than to numbers and vice versa, suggesting that processing may follow different brain pathways; 2) that the left parietal cortex is involved in letter identity, and critically in letter position coding, specifically contributing to the early stages of the reading process; and that 3) a stimulus-specific mechanism for letter position coding is operating during orthographic processing. © The Author 2014. Published by Oxford University Press. All rights reserved.
Integrated Composite Analyzer (ICAN): Users and programmers manual
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Chamis, C. C.
1986-01-01
The use of and relevant equations programmed in a computer code designed to carry out a comprehensive linear analysis of multilayered fiber composites is described. The analysis contains the essential features required to effectively design structural components made from fiber composites. The inputs to the code are constituent material properties, factors reflecting the fabrication process, and composite geometry. The code performs micromechanics, macromechanics, and laminate analysis, including the hygrothermal response of fiber composites. The code outputs are the various ply and composite properties, composite structural response, and composite stress analysis results with details on failure. The code is in Fortran IV and can be used efficiently as a package in complex structural analysis programs. The input-output format is described extensively through the use of a sample problem. The program listing is also included. The code manual consists of two parts.
Li, Zhan; Wang, Ximin; Wang, Weizong; Du, Juanjuan; Wei, Jinqiu; Zhang, Yong; Wang, Jiangrong; Hou, Yinglong
2017-07-01
Electrical remodeling has been reported to play a major role in the initiation and maintenance of atrial fibrillation (AF). Long non-coding RNAs (lncRNAs) have been increasingly recognized as contributors to the pathology of heart diseases. However, the roles and mechanisms of lncRNAs in electrical remodeling during AF remain unknown. In this study, the lncRNA expression profiles of right atria were investigated in AF and non-AF rabbit models by using RNA sequencing technique and validated using quantitative real-time polymerase chain reaction (qRT-PCR). A total of 99,843 putative new lncRNAs were identified, in which 1220 differentially expressed transcripts exhibited >2-fold change. Bioinformatics analysis was conducted to predict the functions and interactions of the aberrantly expressed genes. On the basis of a series of filtering pipelines, one lncRNA, TCONS_00075467, was selected to explore its effects and mechanisms on electrical remodeling. The atrial effective refractory period was shortened in vivo and the L-type calcium current and action potential duration were decreased in vitro by silencing of TCONS_00075467 with lentiviruses. Besides, the expression of miRNA-328 was negatively correlated with TCONS_00075467. We further demonstrated that TCONS_00075467 could sponge miRNA-328 in vitro and in vivo to regulate the downstream protein coding gene CACNA1C. In addition, miRNA-328 could partly reverse the effects of TCONS_00075467 on electrical remodeling. In summary, dysregulated lncRNAs may play important roles in modulating electrical remodeling during AF. Our study may facilitate the mechanism studies of lncRNAs in AF pathogenesis and provide potential therapeutic targets for AF. Copyright © 2017 Elsevier Ltd. All rights reserved.
Sugarman, Dawn E; Wigderson, Sara B; Iles, Brittany R; Kaufman, Julia S; Fitzmaurice, Garrett M; Hilario, E Yvette; Robbins, Michael S; Greenfield, Shelly F
2016-10-01
A Stage II, two-site randomized clinical trial compared the manualized, single-gender Women's Recovery Group (WRG) to mixed-gender group therapy (Group Drug Counseling; GDC) and demonstrated efficacy. Enhanced affiliation and support in the WRG is a hypothesized mechanism of efficacy. This study sought to extend results of the previous small Stage I trial that showed the rate of supportive affiliative statements occurred more frequently in WRG than GDC. Participants (N = 158; 100 women, 58 men) were 18 years or older, substance dependent, and had used substances within the past 60 days. Women were randomized to WRG (n = 52) or GDC (n = 48). Group therapy videos were coded by two independent raters; Rater 1 coded 20% of videos (n = 74); Rater 2 coded 25% of videos coded by Rater 1 (n = 19). The number of affiliative statements made in WRG was 66% higher than in GDC. Three of eight affiliative statement categories occurred more frequently in WRG than GDC: supportive, shared experience, and strategy statements. This larger Stage II trial provided a greater number of group therapy tapes available for analysis. Results extended our previous findings, demonstrating both greater frequency of all affiliative statements, as well as specific categories of statements, made in single-gender WRG than mixed-gender GDC. Greater frequency of affiliative statements among group members may be one mechanism of enhanced support and efficacy in women-only WRG compared with standard mixed-gender group therapy for substance use disorders. (Am J Addict 2016;25:573-580). © 2016 American Academy of Addiction Psychiatry.
Johari, Karim; Behroozmand, Roozbeh
2017-05-01
The predictive coding model suggests that neural processing of sensory information is facilitated for temporally-predictable stimuli. This study investigated how temporal processing of visually-presented sensory cues modulates movement reaction time and neural activities in speech and hand motor systems. Event-related potentials (ERPs) were recorded in 13 subjects while they were visually-cued to prepare to produce a steady vocalization of a vowel sound or press a button in a randomized order, and to initiate the cued movement following the onset of a go signal on the screen. The experiment was conducted in two counterbalanced blocks in which the time interval between visual cue and go signal was temporally-predictable (fixed delay at 1000 ms) or unpredictable (variable between 1000 and 2000 ms). Results of the behavioral response analysis indicated that movement reaction time was significantly decreased for temporally-predictable stimuli in both speech and hand modalities. We identified premotor ERP activities with a left-lateralized parietal distribution for hand and a frontocentral distribution for speech that were significantly suppressed in response to temporally-predictable compared with unpredictable stimuli. The premotor ERPs were elicited approximately 100 ms before movement onset and were significantly correlated with speech and hand motor reaction times only in response to temporally-predictable stimuli. These findings suggest that the motor system establishes a predictive code to facilitate movement in response to temporally-predictable sensory stimuli. Our data suggest that the premotor ERP activities are robust neurophysiological biomarkers of such predictive coding mechanisms. These findings provide novel insights into the temporal processing mechanisms of speech and hand motor systems.
Factors associated with delay in trauma team activation and impact on patient outcomes.
Connolly, Rory; Woo, Michael Y; Lampron, Jacinthe; Perry, Jeffrey J
2017-09-05
Trauma code activation is initiated by emergency physicians using physiological and anatomical criteria, mechanism of injury, and patient demographic factors. Our objective was to identify factors associated with delayed trauma team activation. We assessed consecutive cases from a regional trauma database from January 2008 to March 2014. We defined a delay in trauma code activation as a time greater than 30 minutes from the time of arrival. We conducted univariate analysis for factors potentially influencing trauma team activation, and we subsequently used multiple logistic regression analysis models for delayed activation in relation to mortality, length of stay, and time to operative management. A total of 846 patients were included in our analysis; 4.1% (35/846) of trauma codes were activated after 30 minutes. Mean age was 40.8 years in the early group versus 49.2 in the delayed group (p=0.01). Patients were over age 70 years in 7.6% of the early activation group versus 17.1% of the delayed group (p=0.04). There was no significant difference in sex, type of injury, injury severity, or time from injury between the two groups. There was no significant difference in mortality, median length of stay, or median time to operative management. Delayed activation was associated with increasing age, with no clear link to increased mortality. Given the severe injuries in the delayed cohort that required activation of the trauma team, further emphasis should be placed on the older trauma patient, with interventions to recognize this vulnerable population.
Reliable absolute analog code retrieval approach for 3D measurement
NASA Astrophysics Data System (ADS)
Yu, Shuang; Zhang, Jing; Yu, Xiaoyang; Sun, Xiaoming; Wu, Haibin; Chen, Deyun
2017-11-01
The wrapped phase of phase-shifting approach can be unwrapped by using Gray code, but both the wrapped phase error and Gray code decoding error can result in period jump error, which will lead to gross measurement error. Therefore, this paper presents a reliable absolute analog code retrieval approach. The combination of unequal-period Gray code and phase shifting patterns at high frequencies are used to obtain high-frequency absolute analog code, and at low frequencies, the same unequal-period combination patterns are used to obtain the low-frequency absolute analog code. Next, the difference between the two absolute analog codes was employed to eliminate period jump errors, and a reliable unwrapped result can be obtained. Error analysis was used to determine the applicable conditions, and this approach was verified through theoretical analysis. The proposed approach was further verified experimentally. Theoretical analysis and experimental results demonstrate that the proposed approach can perform reliable analog code unwrapping.
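The unwrapping idea described in this abstract can be illustrated with a minimal sketch, assuming a standard four-step phase-shifting sequence and a binary-reflected Gray code (this is a generic illustration with hypothetical function names, not the paper's unequal-period algorithm): the phase-shifting images yield the wrapped phase, the decoded Gray-code word gives the fringe (period) order, and combining the two produces a continuous absolute code.

```python
import numpy as np

def wrapped_phase(I):
    # 4-step phase shifting: I_n = A + B*cos(phi + n*pi/2), n = 0..3,
    # giving phi wrapped to (-pi, pi].
    I0, I1, I2, I3 = I
    return np.arctan2(I3 - I1, I0 - I2)

def gray_to_binary(g):
    # Decode a binary-reflected Gray-coded integer to its fringe order.
    b = g
    while g:
        g >>= 1
        b ^= g
    return b

def absolute_code(I, gray_word):
    # Continuous absolute code in units of fringe periods:
    # integer part from the Gray code, fractional part from the wrapped phase.
    k = gray_to_binary(gray_word)
    phi = wrapped_phase(I)
    return k + (phi + np.pi) / (2 * np.pi)
```

In the paper's approach, two such absolute codes (from unequal-period patterns at high and low frequency) are differenced to detect and eliminate period jump errors; the sketch above only shows the basic code construction.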
BNL program in support of LWR degraded-core accident analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ginsberg, T.; Greene, G.A.
1982-01-01
Two major sources of loading on dry water reactor containments are steam generation from core debris-water thermal interactions and molten core-concrete interactions. Experiments are in progress at BNL in support of analytical model development related to aspects of the above containment loading mechanisms. The work supports development and evaluation of the CORCON (Muir, 1981) and MARCH (Wooton, 1980) computer codes. Progress in the two programs is described in this paper.
Zhu, Yan; Chen, Longxian; Zhang, Chengjun; Hao, Pei; Jing, Xinyun; Li, Xuan
2017-01-25
Selaginella moellendorffii, a lycophyte, is a model plant to study the early evolution and development of vascular plants. As the first and only sequenced lycophyte to date, the genome of S. moellendorffii revealed many conserved genes and pathways, as well as specialized genes different from flowering plants. Despite the progress made, little is known about long noncoding RNAs (lncRNA) and the alternative splicing (AS) of coding genes in S. moellendorffii. Its coding gene models have not been fully validated with transcriptome data. Furthermore, it remains important to understand whether the regulatory mechanisms similar to flowering plants are used, and how they operate in a non-seed primitive vascular plant. RNA-sequencing (RNA-seq) was performed for three S. moellendorffii tissues, root, stem, and leaf, by constructing strand-specific RNA-seq libraries from RNA purified using RiboMinus isolation protocol. A total of 176 million reads (44 Gbp) were obtained from three tissue types, and were mapped to S. moellendorffii genome. By comparing with 22,285 existing gene models of S. moellendorffii, we identified 7930 high-confidence novel coding genes (a 35.6% increase), and for the first time reported 4422 lncRNAs in a lycophyte. Further, we refined 2461 (11.0%) of existing gene models, and identified 11,030 AS events (for 5957 coding genes) revealed for the first time for lycophytes. Tissue-specific gene expression with functional implication was analyzed, and 1031, 554, and 269 coding genes, and 174, 39, and 17 lncRNAs were identified in root, stem, and leaf tissues, respectively. The expression of critical genes for vascular development stages, i.e. formation of provascular cells, xylem specification and differentiation, and phloem specification and differentiation, was compared in S. moellendorffii tissues, indicating a less complex regulatory mechanism in lycophytes than in flowering plants. 
The results were further strengthened by the evolutionary trend of seven transcription factor families related to vascular development, which was observed among four representative species of seed and non-seed vascular plants, and nonvascular land and aquatic plants. The deep RNA-seq study of S. moellendorffii discovered extensive new gene contents, including novel coding genes, lncRNAs, AS events, and refined gene models. Compared to flowering vascular plants, S. moellendorffii displayed a less complexity in both gene structure, alternative splicing, and regulatory elements of vascular development. The study offered important insight into the evolution of vascular plants, and the regulation mechanism of vascular development in a non-seed plant.
A Semantic Analysis Method for Scientific and Engineering Code
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.
1998-01-01
This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
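One static check of the kind the abstract describes, verifying dimensional consistency of declared primitive variables, can be sketched as follows (an illustration of the general idea, not the prototype's implementation; the declarations and names are hypothetical):

```python
# Semantic declarations: dimensions as (mass, length, time) exponents.
DECLS = {
    "rho": (1, -3, 0),   # density, kg/m^3
    "u":   (0, 1, -1),   # velocity, m/s
    "p":   (1, -1, -2),  # pressure, kg/(m*s^2)
}

def dim_mul(a, b):
    # Multiplying quantities adds their dimension exponents.
    return tuple(x + y for x, y in zip(a, b))

def check_sum(*terms):
    # A sum is semantically valid only if every term carries the same dimension;
    # a mismatch is the kind of semantic error a parser could flag statically.
    first = terms[0]
    if any(t != first for t in terms[1:]):
        raise ValueError("inconsistent dimensions in sum")
    return first

# Dynamic pressure rho*u*u has the dimension of pressure, so p + rho*u*u checks out.
q = dim_mul(DECLS["rho"], dim_mul(DECLS["u"], DECLS["u"]))
assert check_sum(DECLS["p"], q) == (1, -1, -2)
```

A full semantic parser would recognize whole formulae (e.g. Bernoulli's equation) rather than single operations, but the dimension bookkeeping is the same.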
Neutronic Calculation Analysis for CN HCCB TBM-Set
NASA Astrophysics Data System (ADS)
Cao, Qixiang; Zhao, Fengchao; Zhao, Zhou; Wu, Xinghua; Li, Zaixin; Wang, Xiaoyu; Feng, Kaiming
2015-07-01
Using the Monte Carlo transport code MCNP, neutronic calculation analysis for China helium cooled ceramic breeder test blanket module (CN HCCB TBM) and the associated shield block (together called TBM-set) has been carried out based on the latest design of HCCB TBM-set and C-lite model. Key nuclear responses of HCCB TBM-set, such as the neutron flux, tritium production rate, nuclear heating and radiation damage, have been obtained and discussed. These nuclear performance data can be used as the basic input data for other analyses of HCCB TBM-set, such as thermal-hydraulics, thermal-mechanics and safety analysis. supported by the Major State Basic Research Development Program of China (973 Program) (No. 2013GB108000)
FEAMAC/CARES Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Bhatt, Ramakrishna
2016-01-01
Reported here is a coupling of two NASA developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/ Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code and an example problem (taken from the open literature) of a laminated CMC in off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.
Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composite
NASA Technical Reports Server (NTRS)
Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu
2015-01-01
Reported here is a coupling of two NASA developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/ Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code and an example problem (taken from the open literature) of a laminated CMC in off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.
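The stochastic-strength-with-stiffness-reduction idea can be caricatured with an equal-load-sharing fiber bundle (a toy analogue for illustration only, not the FEAMAC/CARES formulation; the Weibull parameters are assumed):

```python
import numpy as np

def fiber_bundle_response(strengths, strains):
    # Equal-load-sharing bundle of unit-modulus elastic fibers: once a fiber's
    # strain exceeds its sampled strength, its stiffness is reduced to zero,
    # mimicking elastic stiffness reduction of failed elements.
    return np.array([eps * (strengths > eps).mean() for eps in strains])

# Element strengths would typically be sampled from a Weibull distribution:
rng = np.random.default_rng(0)
strengths = 1.0 * rng.weibull(5.0, size=1000)   # shape m=5, scale 1.0 (assumed)
stress = fiber_bundle_response(strengths, np.linspace(0.0, 2.0, 50))
```

The bundle's stress-strain curve rises, peaks, and softens as progressively more fibers fail, which is the qualitative behavior a stochastic-strength damage simulation reproduces element by element under multiaxial loading.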
Hrdlickova, Barbara; Kumar, Vinod; Kanduri, Kartiek; Zhernakova, Daria V; Tripathi, Subhash; Karjalainen, Juha; Lund, Riikka J; Li, Yang; Ullah, Ubaid; Modderman, Rutger; Abdulahad, Wayel; Lähdesmäki, Harri; Franke, Lude; Lahesmaa, Riitta; Wijmenga, Cisca; Withoff, Sebo
2014-01-01
Although genome-wide association studies (GWAS) have identified hundreds of variants associated with a risk for autoimmune and immune-related disorders (AID), our understanding of the disease mechanisms is still limited. In particular, more than 90% of the risk variants lie in non-coding regions, and almost 10% of these map to long non-coding RNA transcripts (lncRNAs). lncRNAs are known to show more cell-type specificity than protein-coding genes. We aimed to characterize lncRNAs and protein-coding genes located in loci associated with nine AIDs which have been well-defined by Immunochip analysis and by transcriptome analysis across seven populations of peripheral blood leukocytes (granulocytes, monocytes, natural killer (NK) cells, B cells, memory T cells, naive CD4(+) and naive CD8(+) T cells) and four populations of cord blood-derived T-helper cells (precursor, primary, and polarized (Th1, Th2) T-helper cells). We show that lncRNAs mapping to loci shared between AID are significantly enriched in immune cell types compared to lncRNAs from the whole genome (α <0.005). We were not able to prioritize single cell types relevant for specific diseases, but we observed five different cell types enriched (α <0.005) in five AID (NK cells for inflammatory bowel disease, juvenile idiopathic arthritis, primary biliary cirrhosis, and psoriasis; memory T and CD8(+) T cells in juvenile idiopathic arthritis, primary biliary cirrhosis, psoriasis, and rheumatoid arthritis; Th0 and Th2 cells for inflammatory bowel disease, juvenile idiopathic arthritis, primary biliary cirrhosis, psoriasis, and rheumatoid arthritis). Furthermore, we show that co-expression analyses of lncRNAs and protein-coding genes can predict the signaling pathways in which these AID-associated lncRNAs are involved. 
The observed enrichment of lncRNA transcripts in AID loci implies lncRNAs play an important role in AID etiology and suggests that lncRNA genes should be studied in more detail to interpret GWAS findings correctly. The co-expression results strongly support a model in which the lncRNA and protein-coding genes function together in the same pathways.
ERIC Educational Resources Information Center
Emmorey, Karen; Petrich, Jennifer A. F.; Gollan, Tamar H.
2012-01-01
Bilinguals who are fluent in American Sign Language (ASL) and English often produce "code-blends"--simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization…
78 FR 37848 - ASME Code Cases Not Approved for Use
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-24
...The U.S. Nuclear Regulatory Commission (NRC) is issuing for public comment draft regulatory guide (DG), DG-1233, ``ASME Code Cases not Approved for Use.'' This regulatory guide lists the American Society of Mechanical Engineers (ASME) Code Cases that the NRC has determined not to be acceptable for use on a generic basis.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-04
Revisions to the introductory text of several paragraphs of the ASME B&PV Code, Section III, approved for incorporation by reference, including clarifications and editorial corrections and additions.
Aerothermo-Structural Analysis of Low Cost Composite Nozzle/Inlet Components
NASA Technical Reports Server (NTRS)
Shivakumar, Kuwigai; Challa, Preeli; Sree, Dave; Reddy, D.
1999-01-01
This research is a cooperative effort among the Turbomachinery and Propulsion Division of NASA Glenn, CCMR of NC A&T State University, and Tuskegee University. NC A&T is the lead center and Tuskegee University is the participating institution. The objectives of the research were to develop an integrated aerodynamic, thermal, and structural analysis code for the design of aircraft engine components, such as nozzles and inlets, made of textile composites; to conduct design studies on typical inlets for hypersonic transportation vehicles; to set up standard test examples; and finally to manufacture a scaled-down composite inlet. These objectives are accomplished through the following seven tasks: (1) identify the relevant public domain codes for all three types of analysis; (2) evaluate the codes for the accuracy of results and computational efficiency; (3) develop aero-thermal and thermal-structural mapping algorithms; (4) integrate all the codes into one single code; (5) write a graphical user interface to improve the user friendliness of the code; (6) conduct test studies for a rocket-based combined-cycle engine inlet; and finally (7) fabricate a demonstration inlet model using textile preform composites. Tasks one, two, and six are being pursued. NPARC was selected and evaluated for flow-field analysis, CSTEM for in-depth thermal analysis of inlets and nozzles, and FRAC3D for stress analysis. These codes have been independently verified for accuracy and performance. In addition, a graphical user interface based on micromechanics analysis for laminated as well as textile composites was developed. A demonstration of this code will be made at the conference. A rocket-based combined-cycle engine was selected for test studies. Flow-field analyses of various inlet geometries were conducted. Integration of the codes is being continued. The codes developed are being applied to a candidate example of the trailblazer engine proposed for space transportation.
A successful development of the code will provide a simpler, faster and user-friendly tool for conducting design studies of aircraft and spacecraft engines, applicable in high speed civil transport and space missions.
AERODYNAMIC AND BLADING DESIGN OF MULTISTAGE AXIAL FLOW COMPRESSORS
NASA Technical Reports Server (NTRS)
Crouse, J. E.
1994-01-01
The axial-flow compressor is used for aircraft engines because it has distinct configuration and performance advantages over other compressor types. However, good potential performance is not easily obtained. The designer must be able to model the actual flows well enough to adequately predict aerodynamic performance. This computer program has been developed for computing the aerodynamic design of a multistage axial-flow compressor and, if desired, the associated blading geometry input for internal flow analysis. The aerodynamic solution gives velocity diagrams on selected streamlines of revolution at the blade row edges. The program yields aerodynamic and blading design results that can be directly used by flow and mechanical analysis codes. Two such codes are TSONIC, a blade-to-blade channel flow analysis code (COSMIC program LEW-10977), and MERIDL, a more detailed hub-to-shroud flow analysis code (COSMIC program LEW-12966). The aerodynamic and blading design program can reduce the time and effort required to obtain acceptable multistage axial-flow compressor configurations by generating good initial solutions and by being compatible with available analysis codes. The aerodynamic solution assumes steady, axisymmetric flow so that the problem is reduced to solving the two-dimensional flow field in the meridional plane. The streamline curvature method is used for the iterative aerodynamic solution at stations outside of the blade rows. If a blade design is desired, the blade elements are defined and stacked within the aerodynamic solution iteration. The blade element inlet and outlet angles are established by empirical incidence and deviation angles to the relative flow angles of the velocity diagrams. The blade element centerline is composed of two segments tangentially joined at a transition point. The local blade angle variation of each element can be specified as a fourth-degree polynomial function of path distance. 
Blade element thickness can also be specified with fourth-degree polynomial functions of path distance from the maximum thickness point. Input to the aerodynamic and blading design program includes the annulus profile, the overall compressor mass flow, the pressure ratio, and the rotative speed. A number of input parameters are also used to specify and control the blade row aerodynamics and geometry. The output from the aerodynamic solution has an overall blade row and compressor performance summary followed by blade element parameters for the individual blade rows. If desired, the blade coordinates in the streamwise direction for internal flow analysis codes and the coordinates on plane sections through blades for fabrication drawings may be stored and printed. The aerodynamic and blading design program for multistage axial-flow compressors is written in FORTRAN IV for batch execution and has been implemented on an IBM 360 series computer with a central memory requirement of approximately 470K of 8 bit bytes. This program was developed in 1981.
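The quartic blade-angle description above lends itself to a short sketch (illustrative only; coefficient values and function names are hypothetical, and the two-segment tangential join is omitted): the local blade angle is a fourth-degree polynomial of path distance, and integrating its direction along the path traces the element centerline.

```python
import numpy as np

def blade_angle(s, c):
    # Blade angle (radians) as a 4th-degree polynomial of path distance s;
    # c holds the five coefficients in ascending order c0..c4.
    return np.polyval(c[::-1], s)

def camber_line(c, s_max, n=200):
    # Trace the element centerline by integrating dx = cos(angle) ds,
    # dy = sin(angle) ds along the path.
    s = np.linspace(0.0, s_max, n)
    ang = blade_angle(s, c)
    x = np.concatenate([[0.0], np.cumsum(np.cos(ang[:-1]) * np.diff(s))])
    y = np.concatenate([[0.0], np.cumsum(np.sin(ang[:-1]) * np.diff(s))])
    return x, y
```

A constant-coefficient polynomial (c1..c4 = 0) degenerates to a straight element at the blade angle c0, which provides a simple sanity check on the integration.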
An emulator for minimizing computer resources for finite element analysis
NASA Technical Reports Server (NTRS)
Melosh, R.; Utku, S.; Islam, M.; Salama, M.
1984-01-01
A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
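The calibrate-then-predict approach can be sketched as a least-squares fit of measured CPU time to the dominant operation counts of the analysis code (all numbers and the cost model here are hypothetical; SCOPE's actual model and calibration data are not reproduced):

```python
import numpy as np

# Hypothetical calibration runs: (number of equations n, mean bandwidth b)
# and the CPU seconds measured for each on the target machine.
runs = np.array([[1000, 50], [2000, 60], [4000, 80], [8000, 90]], dtype=float)
cpu = np.array([2.9, 8.1, 29.0, 73.5])

# For a banded solver, factorization work scales roughly as n*b^2 and the
# forward/back solve as n*b; regress measured time on these two counts.
X = np.column_stack([runs[:, 0] * runs[:, 1] ** 2,
                     runs[:, 0] * runs[:, 1]])
coef, *_ = np.linalg.lstsq(X, cpu, rcond=None)

def predict_cpu(n, b):
    # Predict CPU seconds for a new problem from the calibrated coefficients.
    return coef[0] * n * b ** 2 + coef[1] * n * b
```

Once the coefficients are calibrated for a machine/code pair, predicting a new problem's cost is a single evaluation, which is why such an emulator runs for a small fraction of the cost of the actual analysis.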
Wake coupling to full potential rotor analysis code
NASA Technical Reports Server (NTRS)
Torres, Francisco J.; Chang, I-Chung; Oh, Byung K.
1990-01-01
The wake information from a helicopter forward flight code is coupled with two transonic potential rotor codes. The induced velocities for the near-, mid-, and far-wake geometries are extracted from a nonlinear rigid wake of a standard performance and analysis code. These, together with the corresponding inflow angles, computation points, and azimuth angles, are then incorporated into the transonic potential codes. The coupled codes can then provide an improved prediction of rotor blade loading at transonic speeds.
Comparative analysis of design codes for timber bridges in Canada, the United States, and Europe
James Wacker; James (Scott) Groenier
2010-01-01
The United States recently completed its transition from the allowable stress design code to the load and resistance factor design (LRFD) reliability-based code for the design of most highway bridges. For an international perspective on the LRFD-based bridge codes, a comparative analysis is presented: a study addressed national codes of the United States, Canada, and...
Three-Dimensional Color Code Thresholds via Statistical-Mechanical Mapping
NASA Astrophysics Data System (ADS)
Kubica, Aleksander; Beverland, Michael E.; Brandão, Fernando; Preskill, John; Svore, Krysta M.
2018-05-01
Three-dimensional (3D) color codes have advantages for fault-tolerant quantum computing, such as protected quantum gates with relatively low overhead and robustness against imperfect measurement of error syndromes. Here we investigate the storage threshold error rates for bit-flip and phase-flip noise in the 3D color code (3DCC) on the body-centered cubic lattice, assuming perfect syndrome measurements. In particular, by exploiting a connection between error correction and statistical mechanics, we estimate the thresholds for 1D stringlike and 2D sheetlike logical operators to be p_3DCC^(1) ≈ 1.9% and p_3DCC^(2) ≈ 27.6%, respectively. We obtain these results by using parallel tempering Monte Carlo simulations to study the disorder-temperature phase diagrams of two new 3D statistical-mechanical models: the four- and six-body random coupling Ising models.
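The statistical-mechanical mapping replaces decoding with a disordered spin model whose phase diagram encodes the threshold. A minimal relative, a 2D random-bond Ising model updated by Metropolis sweeps, is sketched below (the paper's models have four- and six-body couplings in 3D and are sampled with parallel tempering; this simplified two-body illustration only conveys the quenched-disorder setup):

```python
import numpy as np

def make_couplings(L, p, rng):
    # Quenched disorder: each bond is antiferromagnetic (-1) with probability p,
    # ferromagnetic (+1) otherwise, mirroring the physical error rate.
    def sign():
        return np.where(rng.random((L, L)) < p, -1.0, 1.0)
    return sign(), sign()  # horizontal bonds Jh[i,j]~(i,j)-(i,j+1), vertical Jv

def metropolis_sweep(spins, Jh, Jv, beta, rng):
    # One Metropolis sweep at inverse temperature beta, periodic boundaries.
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        h = (Jh[i, j] * spins[i, (j + 1) % L]
             + Jh[i, (j - 1) % L] * spins[i, (j - 1) % L]
             + Jv[i, j] * spins[(i + 1) % L, j]
             + Jv[(i - 1) % L, j] * spins[(i - 1) % L, j])
        dE = 2.0 * spins[i, j] * h
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins
```

In the mapping, the error rate plays the role of the disorder strength p, and the threshold corresponds to where the ordered phase terminates along the Nishimori line of the disorder-temperature phase diagram.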
Sierra/SolidMechanics 4.48 User's Guide: Addendum for Shock Capabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose
This is an addendum to the Sierra/SolidMechanics 4.48 User's Guide that documents additional capabilities available only in alternate versions of the Sierra/SolidMechanics (Sierra/SM) code. These alternate versions are enhanced to provide capabilities that are regulated under the U.S. Department of State's International Traffic in Arms Regulations (ITAR) export-control rules. The ITAR-regulated codes are only distributed to entities that comply with the ITAR export-control requirements. The ITAR enhancements to Sierra/SM include material models with an energy-dependent pressure response (appropriate for very large deformations and strain rates) and capabilities for blast modeling. Since this is an addendum to the standard Sierra/SM user's guide, please refer to that document first for general descriptions of code capability and use.
Self-Configuration and Localization in Ad Hoc Wireless Sensor Networks
2010-08-31
Goddard I. SUMMARY OF CONTRIBUTIONS: We explored the error mechanisms of iterative decoding of low-density parity-check (LDPC) codes. This work has resulted ... important problems in the area of channel coding, as their unpredictable behavior has impeded the deployment of LDPC codes in many real-world applications. We ... tree-based decoders of LDPC codes, including the extrinsic tree decoder, and an investigation into their performance and bounding capabilities [5], [6]
Multidisciplinary tailoring of hot composite structures
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.; Chamis, Christos C.
1993-01-01
A computational simulation procedure is described for multidisciplinary analysis and tailoring of layered multi-material hot composite engine structural components subjected to simultaneous multiple discipline-specific thermal, structural, vibration, and acoustic loads. The effect of aggressive environments is also simulated. The simulation is based on a three-dimensional finite element analysis technique in conjunction with structural mechanics codes, thermal/acoustic analysis methods, and tailoring procedures. The integrated multidisciplinary simulation procedure is general-purpose including the coupled effects of nonlinearities in structure geometry, material, loading, and environmental complexities. The composite material behavior is assessed at all composite scales, i.e., laminate/ply/constituents (fiber/matrix), via a nonlinear material characterization hygro-thermo-mechanical model. Sample tailoring cases exhibiting nonlinear material/loading/environmental behavior of aircraft engine fan blades, are presented. The various multidisciplinary loads lead to different tailored designs, even those competing with each other, as in the case of minimum material cost versus minimum structure weight and in the case of minimum vibration frequency versus minimum acoustic noise.
Creep-Rupture Behavior of Ni-Based Alloy Tube Bends for A-USC Boilers
NASA Astrophysics Data System (ADS)
Shingledecker, John
Advanced ultrasupercritical (A-USC) boiler designs will require the use of nickel-based alloys for superheaters and reheaters and thus tube bending will be required. The American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code Section II PG-19 limits the amount of cold-strain for boiler tube bends for austenitic materials. In this summary and analysis of research conducted to date, a number of candidate nickel-based A-USC alloys were evaluated. These alloys include alloy 230, alloy 617, and Inconel 740/740H. Uniaxial creep and novel structural tests and corresponding post-test analysis, which included physical measurements, simplified analytical analysis, and detailed microscopy, showed that different damage mechanisms may operate based on test conditions, alloy, and cold-strain levels. Overall, creep strength and ductility were reduced in all the alloys, but the degree of degradation varied substantially. The results support the current cold-strain limits now incorporated in ASME for these alloys for long-term A-USC boiler service.
Metallic glass coating on metals plate by adjusted explosive welding technique
NASA Astrophysics Data System (ADS)
Liu, W. D.; Liu, K. X.; Chen, Q. Y.; Wang, J. T.; Yan, H. H.; Li, X. J.
2009-09-01
Using an adjusted explosive welding technique, an aluminum plate has been coated with a Fe-based metallic glass foil in this work. Scanning electron micrographs reveal a defect-free metallurgical bond between the Fe-based metallic glass foil and the aluminum plate. Experimental evidence indicates that the Fe-based metallic glass foil largely retains its amorphous state and mechanical properties after the explosive welding process. Additionally, the detailed explosive welding process has been simulated by a self-developed hydro-code, and the bonding mechanism has been investigated by numerical analysis. The successful welding between the Fe-based metallic glass foil and the aluminum plate provides a new way to obtain amorphous coatings on general metal substrates.
METCAN: The metal matrix composite analyzer
NASA Technical Reports Server (NTRS)
Hopkins, Dale A.; Murthy, Pappu L. N.
1988-01-01
Metal matrix composites (MMC) are the subject of intensive study and are receiving serious consideration for critical structural applications in advanced aerospace systems. MMC structural analysis and design methodologies are studied. Predicting the mechanical and thermal behavior and the structural response of components fabricated from MMC requires the use of a variety of mathematical models. These models relate stresses to applied forces, stress intensities at the tips of cracks to nominal stresses, buckling resistance to applied force, or vibration response to excitation forces. The extensive research in computational mechanics methods for predicting the nonlinear behavior of MMC is described. This research has culminated in the development of the METCAN (METal Matrix Composite ANalyzer) computer code.
NASA Technical Reports Server (NTRS)
Cooke, C. H.
1978-01-01
The paper describes the split-Cholesky strategy for banded matrices arising from the large systems of equations in certain fluid mechanics problems. The basic idea is that for a banded matrix the computation can be carried out in pieces, with only a small portion of the matrix residing in core. Mesh considerations are discussed by demonstrating the manner in which the assembly of finite element equations proceeds for linear trial functions on a triangular mesh. The FORTRAN code which implements the out-of-core decomposition strategy for banded symmetric positive definite matrices (mass matrices) of a coupled initial value problem is given.
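The key property behind the split-Cholesky strategy described above, that factoring a banded matrix only ever touches a bandwidth-sized window of rows, can be illustrated with a compact in-core sketch. This is a minimal banded Cholesky in Python, not the paper's FORTRAN out-of-core code; the lower-band storage layout is an assumption.

```python
def banded_cholesky(band, w):
    """Cholesky factor of a symmetric positive definite banded matrix.

    band[i][d] holds A[i, i-d] for d = 0..w (lower band); the update for
    row i only reads rows i-w..i, which is why the factorization can be
    carried out in pieces with a small window of the matrix in core.
    """
    n = len(band)
    L = [[0.0] * (w + 1) for _ in range(n)]
    for i in range(n):
        for d in range(min(i, w), -1, -1):
            j = i - d                      # column index of L[i, j]
            s = band[i][d]
            # subtract contributions of already-computed entries in the band
            for k in range(max(0, i - w), j):
                s -= L[i][i - k] * L[j][j - k]
            if d == 0:
                L[i][0] = s ** 0.5         # diagonal entry
            else:
                L[i][d] = s / L[j][0]      # off-diagonal entry
    return L
```

Because fill-in stays inside the band, only w+1 rows of L need to be resident at any time, which is exactly the window the out-of-core scheme pages through.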
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Pengcheng; Mcclure, Mark; Shiozawa, Sogo
A series of experiments performed at the Fenton Hill hot dry rock site after stage 2 drilling of the Phase I reservoir provided intriguing field observations on the reservoir's responses to injection and venting under various conditions. Two teams participating in the US DOE Geothermal Technologies Office (GTO)'s Code Comparison Study (CCS) used different numerical codes to model these five experiments with the objective of inferring the hydraulic stimulation mechanism involved. The codes used by the two teams are based on different numerical principles, and the assumptions made were also different, due to intrinsic limitations in the codes and the modelers' personal interpretations of the field observations. Both sets of models were able to reproduce the most important field observations, and both found that it was the combination of the vertical gradient of the fracture opening pressure, injection volume, and the use/absence of proppant that yielded the different outcomes of the five experiments.
Multifunctional Fuel Additives for Reduced Jet Particulate Emissions
2006-06-01
Keywords: additives, turbine engine emissions, particulates, chemical kinetics, combustion, JP-8 chemistry. ... Figure 64. Comparison of calculated jet flame benzene mole fraction contours from the UNICORN CFD code using the full and skeletal versions of the Violi et al. JP-8 mechanism.
Activity-Dependent Human Brain Coding/Noncoding Gene Regulatory Networks
Lipovich, Leonard; Dachet, Fabien; Cai, Juan; Bagla, Shruti; Balan, Karina; Jia, Hui; Loeb, Jeffrey A.
2012-01-01
While most gene transcription yields RNA transcripts that code for proteins, a sizable proportion of the genome generates RNA transcripts that do not code for proteins, but may have important regulatory functions. The brain-derived neurotrophic factor (BDNF) gene, a key regulator of neuronal activity, is overlapped by a primate-specific, antisense long noncoding RNA (lncRNA) called BDNFOS. We demonstrate reciprocal patterns of BDNF and BDNFOS transcription in highly active regions of human neocortex removed as a treatment for intractable seizures. A genome-wide analysis of activity-dependent coding and noncoding human transcription using a custom lncRNA microarray identified 1288 differentially expressed lncRNAs, of which 26 had expression profiles that matched activity-dependent coding genes and an additional 8 were adjacent to or overlapping with differentially expressed protein-coding genes. The functions of most of these protein-coding partner genes, such as ARC, include long-term potentiation, synaptic activity, and memory. The nuclear lncRNAs NEAT1, MALAT1, and RPPH1, composing an RNAse P-dependent lncRNA-maturation pathway, were also upregulated. As a means to replicate human neuronal activity, repeated depolarization of SY5Y cells resulted in sustained CREB activation and produced an inverse pattern of BDNF-BDNFOS co-expression that was not achieved with a single depolarization. RNAi-mediated knockdown of BDNFOS in human SY5Y cells increased BDNF expression, suggesting that BDNFOS directly downregulates BDNF. Temporal expression patterns of other lncRNA-messenger RNA pairs validated the effect of chronic neuronal activity on the transcriptome and implied various lncRNA regulatory mechanisms. lncRNAs, some of which are unique to primates, thus appear to have potentially important regulatory roles in activity-dependent human brain plasticity. PMID:22960213
Jiao, Shuming; Jin, Zhi; Zhou, Changyuan; Zou, Wenbin; Li, Xia
2018-01-01
Quick response (QR) code has been employed as a data carrier for optical cryptosystems in many recent research works, and the error-correction coding mechanism allows the decrypted result to be noise free. However, in this paper, we point out for the first time that the Reed-Solomon coding algorithm in QR code is not a very suitable option for the nonlocally distributed speckle noise in optical cryptosystems from an information coding perspective. The average channel capacity is proposed to measure the data storage capacity and noise-resistant capability of different encoding schemes. We design an alternative 2D barcode scheme based on Bose-Chaudhuri-Hocquenghem (BCH) coding, which demonstrates substantially better average channel capacity than QR code in numerical simulated optical cryptosystems.
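The channel-capacity idea underlying the authors' comparison of QR and BCH-based barcodes can be illustrated for the simplest case. The sketch below computes the Shannon capacity of a binary symmetric channel, C = 1 - H(p); the paper's "average channel capacity" metric for nonlocally distributed speckle noise is more involved, so this is only the textbook building block, not their measure.

```python
import math

def binary_entropy(p):
    # H(p) in bits; H(0) = H(1) = 0 by convention
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def bsc_capacity(p):
    # Shannon capacity (bits per channel use) of a binary symmetric
    # channel with crossover probability p: C = 1 - H(p)
    return 1.0 - binary_entropy(p)
```

Comparing encoding schemes then amounts to asking how close each scheme's delivered payload rate comes to this ceiling at the noise level the cryptosystem actually produces.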
NASA Astrophysics Data System (ADS)
Taiwo, Ambali; Alnassar, Ghusoon; Bakar, M. H. Abu; Khir, M. F. Abdul; Mahdi, Mohd Adzir; Mokhtar, M.
2018-05-01
A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for an Optical Code Division Multiplexing (OCDMA) based QKD network. A unique address assigned to each individual user, coupled with the diminishing probability of predicting the source of a qubit transmitted in the channel, offers a strong security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design as well as ease of modifying the number of users are equally notable qualities of the code, in contrast to the Optical Orthogonal Code (OOC) earlier implemented for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.
Comparison of two computer codes for crack growth analysis: NASCRAC Versus NASA/FLAGRO
NASA Technical Reports Server (NTRS)
Stallworth, R.; Meyers, C. A.; Stinson, H. C.
1989-01-01
Results are presented from a comparison study of two computer codes for crack growth analysis: NASCRAC and NASA/FLAGRO. The two computer codes gave compatible, conservative results when the part-through-crack analysis solutions were analyzed against experimental test data. Results showed good correlation between the codes for the through-crack-at-a-lug solution, for which NASA/FLAGRO gave the more conservative results.
Neural coding of sound envelope in reverberant environments.
Slama, Michaël C C; Delgutte, Bertrand
2015-03-11
Speech reception depends critically on temporal modulations in the amplitude envelope of the speech signal. Reverberation encountered in everyday environments can substantially attenuate these modulations. To assess the effect of reverberation on the neural coding of amplitude envelope, we recorded from single units in the inferior colliculus (IC) of unanesthetized rabbit using sinusoidally amplitude modulated (AM) broadband noise stimuli presented in simulated anechoic and reverberant environments. Although reverberation degraded both rate and temporal coding of AM in IC neurons, in most neurons, the degradation in temporal coding was smaller than the AM attenuation in the stimulus. This compensation could largely be accounted for by the compressive shape of the modulation input-output function (MIOF), which describes the nonlinear transformation of modulation depth from acoustic stimuli into neural responses. Additionally, in a subset of neurons, the temporal coding of AM was better for reverberant stimuli than for anechoic stimuli having the same modulation depth at the ear. Using hybrid anechoic stimuli that selectively possess certain properties of reverberant sounds, we show that this reverberant advantage is not caused by envelope distortion, static interaural decorrelation, or spectral coloration. Overall, our results suggest that the auditory system may possess dual mechanisms that make the coding of amplitude envelope relatively robust in reverberation: one general mechanism operating for all stimuli with small modulation depths, and another mechanism dependent on very specific properties of reverberant stimuli, possibly the periodic fluctuations in interaural correlation at the modulation frequency. Copyright © 2015 the authors 0270-6474/15/354452-17$15.00/0.
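Temporal coding of amplitude modulation, as studied above, is commonly quantified with the vector strength (synchronization index) of spike times relative to the modulation cycle. The sketch below is a generic implementation of that standard metric, not a reproduction of the paper's specific analysis pipeline.

```python
import math

def vector_strength(spike_times, mod_freq):
    # phase locking of spikes to the AM cycle:
    # 1 = perfect locking, 0 = spikes uniformly spread over the cycle
    phases = [2.0 * math.pi * mod_freq * t for t in spike_times]
    n = len(phases)
    x = sum(math.cos(p) for p in phases) / n
    y = sum(math.sin(p) for p in phases) / n
    return math.hypot(x, y)
```

Comparing vector strength for anechoic versus reverberant stimuli at matched modulation depths is one simple way to expose the kind of compensation the MIOF analysis describes.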
Miyakawa, Hiroe; Miyamoto, Toshinobu; Koh, Eitetsu; Tsujimura, Akira; Miyagawa, Yasushi; Saijo, Yasuaki; Namiki, Mikio; Sengoku, Kazuo
2012-01-01
Genetic mechanisms have been implicated as a cause of some cases of male infertility. Recently, 10 novel genes involved in human spermatogenesis, including human SEPTIN12, were identified by expression microarray analysis of human testicular tissue. Septin12 is a member of the septin family of conserved cytoskeletal GTPases that form heteropolymeric filamentous structures in interphase cells. It is expressed specifically in the testis. Therefore, we hypothesized that mutations or polymorphisms of SEPTIN12 participate in male infertility, especially Sertoli cell-only syndrome (SCOS). To investigate whether SEPTIN12 gene defects are associated with azoospermia caused by SCOS, mutational analysis was performed in 100 Japanese patients by direct sequencing of coding regions. Statistical analysis was performed in patients with SCOS and in 140 healthy control men. No mutations were found in SEPTIN12; however, 8 coding single-nucleotide polymorphisms (SNP1-SNP8) could be detected in the patients with SCOS. The genotype and allele frequencies in SNP3, SNP4, and SNP6 were notably higher in the SCOS group than in the control group (P < .001). These results suggest that SEPTIN12 might play a critical role in human spermatogenesis.
An Integrated Model of Cognitive Control in Task Switching
ERIC Educational Resources Information Center
Altmann, Erik M.; Gray, Wayne D.
2008-01-01
A model of cognitive control in task switching is developed in which controlled performance depends on the system maintaining access to a code in episodic memory representing the most recently cued task. The main constraint on access to the current task code is proactive interference from old task codes. This interference and the mechanisms that…
Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frambati, S.; Frignani, M.
2012-07-01
We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open source file formats. These tools are aimed at bridging the gap between trusted, widely-used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective tool in computer-aided design for radiation transport code users of the nuclear world, and in particular in the fields of core design and radiation analysis. (authors)
Falconi, M; Oteri, F; Eliseo, T; Cicero, D O; Desideri, A
2008-08-01
The structural dynamics of the DNA binding domains of the human papillomavirus strain 16 and the bovine papillomavirus strain 1, complexed with their DNA targets, has been investigated by modeling, molecular dynamics simulations, and nuclear magnetic resonance analysis. The simulations underline different dynamical features of the protein scaffolds and a different mechanical interaction of the two proteins with DNA. The two protein structures, although very similar, show differences in the relative mobility of secondary structure elements. Protein structural analyses, principal component analysis, and geometrical and energetic DNA analyses indicate that the two transcription factors utilize a different strategy in DNA recognition and deformation. Results show that the protein indirect DNA readout is not only addressable to the DNA molecule flexibility but it is finely tuned by the mechanical and dynamical properties of the protein scaffold involved in the interaction.
Numerical analysis of natural convection in liquid droplets by phase change
NASA Astrophysics Data System (ADS)
Duh, J. C.; Yang, Wen-Jei
1989-09-01
A numerical analysis is performed on thermocapillary buoyancy convection induced by phase change in a liquid droplet. A finite-difference code is developed using an alternating-direction implicit (ADI) scheme. The intercoupling relation between thermocapillary force, buoyancy force, fluid property, heat transfer, and phase change, along with their effects on the induced flow patterns, are disclosed. The flow is classified into three types: thermocapillary, buoyancy, and combined convection. Among the three mechanisms, the combined convection simulates the experimental observations quite well, and the basic mechanism of the observed convection inside evaporating sessile drops is thus identified. It is disclosed that evaporation initiates unstable convection, while condensation always brings about a stable density distribution which eventually damps out all fluid disturbances. Another numerical model is presented to study the effect of boundary recession due to evaporation, and the 'peeling-off' effect (the removal of the surface layer of fluid by evaporation) is shown to be relevant.
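The alternating-direction implicit (ADI) scheme named above reduces a 2D implicit time step to two sequences of tridiagonal solves. A minimal Peaceman-Rachford sketch for the 2D heat equation with zero Dirichlet boundaries follows; it is a stand-in model problem, not the droplet convection code of the paper.

```python
def thomas(a, b, c, d):
    # Thomas algorithm for a tridiagonal system;
    # a: sub-diagonal, b: main diagonal, c: super-diagonal, d: right-hand side
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def adi_step(u, r):
    # one Peaceman-Rachford ADI step for u_t = u_xx + u_yy on a square
    # grid with zero Dirichlet boundaries; r = dt / (2 * dx**2)
    n = len(u)
    a, b, c = [-r] * n, [1.0 + 2.0 * r] * n, [-r] * n
    # first half step: implicit in x, explicit in y
    half = [[0.0] * n for _ in range(n)]
    for j in range(n):
        d = []
        for i in range(n):
            up = u[i][j + 1] if j + 1 < n else 0.0
            dn = u[i][j - 1] if j - 1 >= 0 else 0.0
            d.append(u[i][j] + r * (up - 2.0 * u[i][j] + dn))
        col = thomas(a, b, c, d)
        for i in range(n):
            half[i][j] = col[i]
    # second half step: implicit in y, explicit in x
    new = [[0.0] * n for _ in range(n)]
    for i in range(n):
        d = []
        for j in range(n):
            rt = half[i + 1][j] if i + 1 < n else 0.0
            lf = half[i - 1][j] if i - 1 >= 0 else 0.0
            d.append(half[i][j] + r * (rt - 2.0 * half[i][j] + lf))
        new[i] = thomas(a, b, c, d)
    return new
```

Each half step only requires one-dimensional tridiagonal solves, which is what makes ADI attractive over a fully implicit 2D solve.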
NASA Technical Reports Server (NTRS)
Darling, Douglas; Radhakrishnan, Krishnan; Oyediran, Ayo
1995-01-01
Premixed combustors, which are being considered for low NOx engines, are susceptible to instabilities due to feedback between pressure perturbations and combustion. This feedback can cause damaging mechanical vibrations of the system as well as degrade the emissions characteristics and combustion efficiency. In a lean combustor instabilities can also lead to blowout. A model was developed to perform linear combustion-acoustic stability analysis using detailed chemical kinetic mechanisms. The Lewis Kinetics and Sensitivity Analysis Code, LSENS, was used to calculate the sensitivities of the heat release rate to perturbations in density and temperature. In the present work, an assumption was made that the mean flow velocity was small relative to the speed of sound. Results of this model showed the regions of growth of perturbations to be most sensitive to the reflectivity of the boundary when reflectivities were close to unity.
Structural optimization of Beach-Cleaner snatch mechanism
NASA Astrophysics Data System (ADS)
Ouyang, Lian-ge; Wei, Qin-rui; Zhou, Shui-ting; Peng, Qian; Zhao, Yuan-jiang; Wang, Fang
2017-12-01
In the working process of a Beach-Cleaner snatch mechanism, the angular speed of the second knuckle arm was too high, so the pick-up device would crash into the basic arm during the folding process. A rational joint position that reduces the second knuckle arm's angular speed, and the force along the axis direction at the most dangerous point, were obtained from a kinematics simulation of the snatch mechanism in the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) code. The feasibility of the scheme was validated by analyzing the optimized model in the ANSYS software. The analysis results revealed that the open angle between the basic arm and the second knuckle arm improved from 125.0° to 135.24°, and the second knuckle arm's angular speed decreased from 990.74 rad/s to 58.53 rad/s, which not only improved the work efficiency of the snatch mechanism but also prolonged its operational smoothness.
NASA Astrophysics Data System (ADS)
Tarfaoui, M.; Nachtane, M.; Khadimallah, H.; Saifaoui, D.
2018-04-01
Issues such as energy generation/transmission and greenhouse gas emissions are the two energy problems we face today. In this context, renewable energy sources are a necessary part of the solution, especially wind power, which is among the energy sources most cost-competitive with new fossil energy facilities. This paper presents the simulation of the mechanical behavior and damage of a 48 m composite wind turbine blade under critical wind loads. The finite element analysis was performed using the ABAQUS code to predict the most critical damage behavior and to apprehend and obtain knowledge of the complex structural behavior of wind turbine blades. The approach developed is based on nonlinear FE analysis using mean values for the material properties and the Tsai-Hill failure criterion to predict failure modes in large structures and to identify the sensitive zones.
Alexandrou, Angelos; Papaevripidou, Ioannis; Tsangaras, Kyriakos; Alexandrou, Ioanna; Tryfonidis, Marios; Christophidou-Anastasiadou, Violetta; Zamba-Papanicolaou, Eleni; Koumbaris, George; Neocleous, Vassos; Phylactou, Leonidas A; Skordis, Nicos; Tanteles, George A; Sismani, Carolina
2016-12-01
Haploinsufficiency of the short stature homeobox-containing gene SHOX has been shown to result in a spectrum of phenotypes ranging from Leri-Weill dyschondrosteosis (LWD) at the more severe end to SHOX-related short stature at the milder end of the spectrum. Most alterations are whole-gene deletions, point mutations within the coding region, or microdeletions in its flanking sequences. Here, we present the clinical and molecular data as well as the potential molecular mechanism underlying a novel microdeletion causing a variable SHOX-related haploinsufficiency disorder in a three-generation family. The phenotype resembles that of LWD in females; in males, however, the phenotypic expression is milder. The 15523-bp SHOX intragenic deletion, encompassing exons 3-6, was initially detected by array-CGH, followed by MLPA analysis. Sequencing of the breakpoints indicated an Alu recombination-mediated deletion (ARMD) as the potential causative mechanism.
NASA Astrophysics Data System (ADS)
Leith, Alex P.; Ratan, Rabindra A.; Wohn, Donghee Yvette
2016-08-01
Given the diversity and complexity of education game mechanisms and topics, this article contributes to a theoretical understanding of how game mechanisms "map" to educational topics through inquiry-based learning. Namely, the article examines the presence of evolution through natural selection (ENS) in digital games. ENS is a fundamentally important and widely misunderstood theory. This analysis of ENS portrayal in digital games provides insight into the use of games in teaching ENS. Systematic database search results were coded for the three principles of ENS: phenotypic variation, differential fitness, and fitness heritability. Though thousands of games used the term evolution, few presented elements of evolution, and even fewer contained all principles of ENS. Games developed specifically to teach evolution were difficult to find through Web searches. These overall deficiencies in ENS games reflect the inherent incompatibility between game control elements and the automatic process of ENS.
Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory
Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J
2007-01-01
Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625
Role of Alternative Polyadenylation during Adipogenic Differentiation: An In Silico Approach
Spangenberg, Lucía; Correa, Alejandro; Dallagiovanna, Bruno; Naya, Hugo
2013-01-01
Post-transcriptional regulation of stem cell differentiation is far from being completely understood. Changes in protein levels are not fully correlated with corresponding changes in mRNAs; the observed differences might be partially explained by post-transcriptional regulation mechanisms, such as alternative polyadenylation. This would involve changes in protein binding, transcript usage, miRNAs and other non-coding RNAs. In the present work we analyzed the distribution of alternative transcripts during adipogenic differentiation and the potential role of miRNAs in post-transcriptional regulation. Our in silico analysis suggests a modest, consistent bias in 3′UTR lengths during differentiation enabling a fine-tuned transcript regulation via small non-coding RNAs. Including these effects in the analyses partially accounts for the observed discrepancies in relative abundance of protein and mRNA. PMID:24143171
Numerical study of shock-wave/boundary layer interactions in premixed hydrogen-air hypersonic flows
NASA Technical Reports Server (NTRS)
Yungster, Shaye
1991-01-01
A computational study of shock wave/boundary layer interactions involving premixed combustible gases, and the resulting combustion processes is presented. The analysis is carried out using a new fully implicit, total variation diminishing (TVD) code developed for solving the fully coupled Reynolds-averaged Navier-Stokes equations and species continuity equations in an efficient manner. To accelerate the convergence of the basic iterative procedure, this code is combined with vector extrapolation methods. The chemical nonequilibrium processes are simulated by means of a finite-rate chemistry model for hydrogen-air combustion. Several validation test cases are presented and the results compared with experimental data or with other computational results. The code is then applied to study shock wave/boundary layer interactions in a ram accelerator configuration. Results indicate a new combustion mechanism in which a shock wave induces combustion in the boundary layer, which then propagates outwards and downstream. At higher Mach numbers, spontaneous ignition in part of the boundary layer is observed, which eventually extends along the entire boundary layer at still higher values of the Mach number.
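Total variation diminishing (TVD) schemes of the kind named above rely on slope limiters that suppress spurious oscillations near shocks. A minimal sketch of the classic minmod limiter follows; it is a generic TVD ingredient, not necessarily the limiter used in this particular code.

```python
def minmod(a, b):
    # TVD slope limiter: zero at local extrema (where the one-sided
    # slopes disagree in sign), otherwise the smaller-magnitude slope
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def limited_slopes(u):
    # limited cell slopes for the interior points of a 1D grid
    return [minmod(u[i] - u[i - 1], u[i + 1] - u[i])
            for i in range(1, len(u) - 1)]
```

Zeroing the slope at extrema is what prevents a reconstruction from creating new maxima, which keeps the total variation of the solution from growing.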
Real-time chirp-coded imaging with a programmable ultrasound biomicroscope.
Bosisio, Mattéo R; Hasquenoph, Jean-Michel; Sandrin, Laurent; Laugier, Pascal; Bridal, S Lori; Yon, Sylvain
2010-03-01
Ultrasound biomicroscopy (UBM) of mice can provide a testing ground for new imaging strategies. The UBM system presented in this paper facilitates the development of imaging and measurement methods with programmable design, arbitrary waveform coding, broad bandwidth (2-80 MHz), digital filtering, programmable processing, RF data acquisition, multithread/multicore real-time display, and rapid mechanical scanning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simonen, F.A.; Johnson, K.I.; Liebetrau, A.M.
The VISA-II (Vessel Integrity Simulation Analysis) code was originally developed as part of the NRC staff evaluation of pressurized thermal shock. VISA-II uses Monte Carlo simulation to evaluate the failure probability of a pressurized water reactor (PWR) pressure vessel subjected to a pressure and thermal transient specified by the user. Linear elastic fracture mechanics methods are used to model crack initiation and propagation. Parameters for initial crack size and location, copper content, initial reference temperature of the nil-ductility transition, fluence, crack-initiation fracture toughness, and arrest fracture toughness are treated as random variables. This report documents an upgraded version of the original VISA code as described in NUREG/CR-3384. Improvements include a treatment of cladding effects, a more general simulation of flaw size, shape and location, a simulation of inservice inspection, an updated simulation of the reference temperature of the nil-ductility transition, and treatment of vessels with multiple welds and initial flaws. The code has been extensively tested and verified and is written in FORTRAN for ease of installation on different computers. 38 refs., 25 figs.
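The Monte Carlo scheme described here reduces, in miniature, to sampling the random variables and counting failures. The sketch below estimates P(failure) = P(K_applied > K_Ic) with illustrative log-normal stand-ins; the actual VISA-II models for copper content, fluence, transient loading, and toughness are not reproduced.

```python
# Minimal sketch of the Monte Carlo idea behind VISA-II: treat the
# applied stress intensity and the crack-initiation fracture toughness
# as random variables and estimate P(failure) = P(K_applied > K_Ic).
# The log-normal distributions below are illustrative stand-ins, not
# the code's actual simulated parameters.
import random

random.seed(42)

def trial():
    k_applied = random.lognormvariate(3.0, 0.25)  # MPa*sqrt(m), assumed
    k_ic      = random.lognormvariate(3.6, 0.20)  # MPa*sqrt(m), assumed
    return k_applied > k_ic

n = 100_000
failures = sum(trial() for _ in range(n))
p_fail = failures / n
print(f"estimated failure probability: {p_fail:.4f}")
```

With both variables log-normal the answer is also available in closed form, which is a useful check on the sampling; the real code's value comes from transients and flaw models that have no such closed form.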
NASA Astrophysics Data System (ADS)
Islam, M. S.; Nakashima, Y.; Hatayama, A.
2017-12-01
The linear divertor analysis with fluid model (LINDA) code has been developed in order to simulate plasma behavior in the end-cell of the linear fusion device GAMMA 10/PDX. This paper presents the basic structure and simulated results of the LINDA code. The atomic processes of hydrogen and impurities are included in the present model in order to investigate energy loss processes and the mechanism of plasma detachment. A comparison among Ar, Kr and Xe shows that Xe is the most effective gas at reducing the electron and ion temperatures: Xe injection strongly reduces both, and the energy loss terms for both the electron and the ion are enhanced significantly during Xe injection. It is shown that the major energy loss channels for the ion and the electron are charge-exchange loss and radiative power loss of the radiator gas, respectively. These outcomes indicate that Xe injection in the plasma edge region is effective for reducing plasma energy and generating detached plasma in the linear device GAMMA 10/PDX.
Qiu, Jia-jun; Ren, Zhao-rui; Yan, Jing-bin
2016-01-01
Epigenetics regulations have an important role in fertilization and proper embryonic development, and several human diseases are associated with epigenetic modification disorders, such as Rett syndrome, Beckwith-Wiedemann syndrome and Angelman syndrome. However, the dynamics and functions of long non-coding RNAs (lncRNAs), one type of epigenetic regulators, in human pre-implantation development have not yet been demonstrated. In this study, a comprehensive analysis of human and mouse early-stage embryonic lncRNAs was performed based on public single-cell RNA sequencing data. Expression profile analysis revealed that lncRNAs are expressed in a developmental stage–specific manner during human early-stage embryonic development, whereas a more temporal-specific expression pattern was identified in mouse embryos. Weighted gene co-expression network analysis suggested that lncRNAs involved in human early-stage embryonic development are associated with several important functions and processes, such as oocyte maturation, zygotic genome activation and mitochondrial functions. We also found that the network of lncRNAs involved in zygotic genome activation was highly preservative between human and mouse embryos, whereas in other stages no strong correlation between human and mouse embryo was observed. This study provides insight into the molecular mechanism underlying lncRNA involvement in human pre-implantation embryonic development. PMID:27542205
Mechanical Properties of Sprinting in Elite Rugby Union and Rugby League.
Cross, Matt R; Brughelli, Matt; Brown, Scott R; Samozino, Pierre; Gill, Nicholas D; Cronin, John B; Morin, Jean-Benoît
2015-09-01
To compare mechanical properties of overground sprint running in elite rugby union and rugby league athletes. Thirty elite rugby code (15 rugby union and 15 rugby league) athletes participated in this cross-sectional analysis. Radar was used to measure maximal overground sprint performance over 20 or 30 m (forwards and backs, respectively). In addition to time at 2, 5, 10, 20, and 30 m, velocity-time signals were analyzed to derive external horizontal force-velocity relationships with a recently validated method. From this relationship, the maximal theoretical velocity, external relative and absolute horizontal force, horizontal power, and optimal horizontal force for peak power production were determined. While differences in maximal velocity were unclear between codes, rugby union backs produced moderately faster split times, with the most substantial differences occurring at 2 and 5 m (ES 0.95 and 0.86, respectively). In addition, rugby union backs produced moderately larger relative horizontal force, optimal force, and peak power capabilities than rugby league backs (ES 0.73-0.77). Rugby union forwards had a higher absolute force (ES 0.77) despite having ~12% more body weight than rugby league forwards. In this elite sample, rugby union athletes typically displayed greater short-distance sprint performance, which may be linked to an ability to generate high levels of horizontal force and power. The acceleration characteristics presented in this study could be a result of the individual movement and positional demands of each code.
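The force-velocity method referenced above can be sketched as follows, assuming the common mono-exponential velocity model v(t) = vmax(1 − e^(−t/τ)) and taking F = m·dv/dt, i.e. neglecting air resistance for brevity (the validated method also accounts for aerodynamic drag). All numbers are illustrative, not the study's data.

```python
# Sketch of the sprint force-velocity method: model the radar velocity
# trace as a mono-exponential, differentiate to get horizontal force
# (air resistance neglected here), and recover the linear F-v relation
# F = F0*(1 - v/V0) with peak power Pmax = F0*V0/4.  The athlete mass,
# vmax, and tau below are assumed illustrative values.
import math

mass, vmax, tau = 90.0, 9.5, 1.2  # kg, m/s, s (assumed)

ts = [0.05 * i for i in range(1, 120)]
vs = [vmax * (1.0 - math.exp(-t / tau)) for t in ts]
fs = [mass * (vmax / tau) * math.exp(-t / tau) for t in ts]  # F = m*dv/dt

# Under this model F is exactly linear in v: F = F0 * (1 - v/V0)
F0 = mass * vmax / tau   # theoretical maximal horizontal force (N)
V0 = vmax                # theoretical maximal velocity (m/s)
Pmax = F0 * V0 / 4.0     # peak external horizontal power (W)

# verify the linearity numerically
resid = max(abs(f - F0 * (1.0 - v / V0)) for v, f in zip(vs, fs))
print(F0, V0, Pmax, resid)
```

This is why split times at 2 and 5 m discriminate between the codes: early acceleration is dominated by the F0 end of the relationship, while maximal velocity reflects the V0 end.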
On models of the genetic code generated by binary dichotomic algorithms.
Gumbel, Markus; Fimmel, Elena; Danielli, Alberto; Strüngmann, Lutz
2015-02-01
In this paper we introduce the concept of a BDA-generated model of the genetic code, based on binary dichotomic algorithms (BDAs). A BDA partitions the set of 64 codons into two disjoint classes of size 32 each and provides a generalization of known partitions such as the Rumer dichotomy. We investigate which partitions can be generated when a set of different BDAs is applied sequentially to the set of codons. The search revealed that these models are able to generate code tables with very different numbers of classes, ranging from 2 to 64. We have analyzed whether there are models that map the codons to their amino acids. A perfect matching is not possible. However, we present models that describe the standard genetic code with only a few errors. There are also models that map all 64 codons uniquely to 64 classes, showing that BDAs can be used to identify codons precisely. This could serve as a basis for further mathematical analysis using coding theory, for example. The hypothesis that BDAs might reflect a molecular mechanism taking place in the decoding center of the ribosome is discussed. The scan demonstrated that binary dichotomic partitions are able to model different aspects of the genetic code very well. The search was performed with our tool Beady-A. This software is freely available at http://mi.informatik.hs-mannheim.de/beady-a. It requires a JVM version 6 or higher. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
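The partition mechanism can be sketched directly: each binary dichotomic "question" below asks whether the base at a given codon position lies in a two-letter set, and applying a sequence of questions assigns every codon a bit vector, i.e. a class label. The specific questions are illustrative, not the paper's BDAs.

```python
# Sketch of a BDA-generated partition of the 64 codons: one dichotomic
# question splits the table into two classes of 32; applying several
# questions in sequence refines the partition.  The purine questions
# below are illustrative (a Rumer-style dichotomy splits the table the
# same way, though by a different criterion).
from itertools import product

CODONS = ["".join(c) for c in product("ACGU", repeat=3)]

def bda(position, letters):
    """A binary dichotomic question on one codon position."""
    return lambda codon: codon[position] in letters

def classify(codon, questions):
    return tuple(int(q(codon)) for q in questions)

# One question -> 2 disjoint classes of 32 codons each.
one = [bda(1, "AG")]  # purine at the 2nd position?
classes1 = {}
for c in CODONS:
    classes1.setdefault(classify(c, one), []).append(c)

# Three independent questions -> 8 classes of 8 codons each.
three = [bda(0, "AG"), bda(1, "AG"), bda(2, "AG")]
classes3 = {}
for c in CODONS:
    classes3.setdefault(classify(c, three), []).append(c)

print(len(classes1), len(classes3))  # 2 8
```

Six independent binary questions would separate all 64 codons uniquely, which is the "identify codons precisely" limit mentioned in the abstract.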
Divergent transcription is associated with promoters of transcriptional regulators
2013-01-01
Background Divergent transcription is a widespread phenomenon in mammals. For instance, short bidirectional transcripts are a hallmark of active promoters, while longer transcripts can be detected antisense from active genes in conditions where the RNA degradation machinery is inhibited. Moreover, many described long non-coding RNAs (lncRNAs) are transcribed antisense from coding gene promoters. However, the general significance of divergent lncRNA/mRNA gene pair transcription is still poorly understood. Here, we used strand-specific RNA-seq with high sequencing depth to thoroughly identify antisense transcripts from coding gene promoters in primary mouse tissues. Results We found that a substantial fraction of coding-gene promoters sustain divergent transcription of long non-coding RNA (lncRNA)/mRNA gene pairs. Strikingly, upstream antisense transcription is significantly associated with genes related to transcriptional regulation and development. Their promoters share several characteristics with those of transcriptional developmental genes, including very large CpG islands, high degree of conservation and epigenetic regulation in ES cells. In-depth analysis revealed a unique GC skew profile at these promoter regions, while the associated coding genes were found to have large first exons, two genomic features that might enforce bidirectional transcription. Finally, genes associated with antisense transcription harbor specific H3K79me2 epigenetic marking and RNA polymerase II enrichment profiles linked to an intensified rate of early transcriptional elongation. Conclusions We concluded that promoters of a class of transcription regulators are characterized by a specialized transcriptional control mechanism, which is directly coupled to relaxed bidirectional transcription. PMID:24365181
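The GC skew profile mentioned above is a simple sliding-window statistic, (G − C)/(G + C); a minimal sketch follows, with the sequence and window size chosen purely for illustration.

```python
# Sketch of a GC-skew profile: skew = (G - C) / (G + C) in a sliding
# window, the standard statistic for exposing strand asymmetries such
# as those reported around divergent promoters.  Sequence and window
# size are illustrative.
def gc_skew(seq, window=5, step=1):
    out = []
    for i in range(0, len(seq) - window + 1, step):
        w = seq[i:i + window]
        g, c = w.count("G"), w.count("C")
        out.append((g - c) / (g + c) if g + c else 0.0)
    return out

seq = "GGGGGCCCCC" * 3          # exaggerated skewed toy sequence
profile = gc_skew(seq, window=5)
print(profile[0], profile[5])   # 1.0 -1.0
```

A G-rich window gives a skew of +1, a C-rich window −1; real promoter profiles show much subtler transitions, but the computation is the same.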
Fracture Mechanics Analysis of Single and Double Rows of Fastener Holes Loaded in Bearing
1976-04-01
the following subprograms for execution: 1. ASRL FEABL-2 subroutines ASMLTV, ASMSUB, BCON, FACT, ORK, QBACK, SETUP, SIMULQ, STACON, and XTRACT. 2. IBM ... based on program code generated by IBM FORTRAN-G1 and FORTRAN-H compilers, with demonstration runs made on an IBM 370/168 computer. Programs SROW and DROW are supplied ready to execute on systems with IBM-standard FORTRAN unit numbers for the card reader (unit 5) and line printer (unit 6).
The Analysis of Visual Motion: From Computational Theory to Neuronal Mechanisms.
1986-12-01
Finite element modelling of non-linear magnetic circuits using Cosmic NASTRAN
NASA Technical Reports Server (NTRS)
Sheerer, T. J.
1986-01-01
The general purpose Finite Element Program COSMIC NASTRAN currently has the ability to model magnetic circuits with constant permeablilities. An approach was developed which, through small modifications to the program, allows modelling of non-linear magnetic devices including soft magnetic materials, permanent magnets and coils. Use of the NASTRAN code resulted in output which can be used for subsequent mechanical analysis using a variation of the same computer model. Test problems were found to produce theoretically verifiable results.
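In its simplest lumped form, the non-linear magnetic device problem reduces to a nonlinear equation from Ampère's law for a core plus air gap, which can be solved by Newton iteration as sketched below. The cubic H(B) material law and all dimensions are assumptions chosen for illustration, not NASTRAN's finite element formulation.

```python
# Sketch of the nonlinearity behind soft-magnetic-material modelling:
# with a field-dependent material law H(B), Ampere's law for a lumped
# core + air gap circuit, H(B)*l_core + B*l_gap/mu0 = N*I, becomes a
# nonlinear equation in the flux density B.  Newton iteration solves it.
# The material law and dimensions are illustrative assumptions.
MU0 = 4e-7 * 3.141592653589793   # vacuum permeability, H/m

def H(B):                         # assumed soft-material law, A/m
    return 200.0 * B + 50.0 * B ** 3

def dH(B):
    return 200.0 + 150.0 * B ** 2

def solve_B(NI, l_core=0.2, l_gap=0.001, tol=1e-10):
    """Newton iteration on f(B) = H(B)*l_core + B*l_gap/mu0 - NI."""
    B = 0.1
    for _ in range(50):
        f = H(B) * l_core + B * l_gap / MU0 - NI
        df = dH(B) * l_core + l_gap / MU0
        B_new = B - f / df
        if abs(B_new - B) < tol:
            return B_new
        B = B_new
    return B

B = solve_B(NI=1000.0)
residual = H(B) * 0.2 + B * 0.001 / MU0 - 1000.0
print(B, residual)
```

A finite element code does the same thing field-point by field-point over the mesh, updating element permeabilities between iterations rather than a single lumped reluctance.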
Regulation of the spoVM gene of Bacillus subtilis.
Le, Ai Thi Thuy; Schumann, Wolfgang
2008-11-01
The spoVM gene of Bacillus subtilis codes for a 26 amino-acid peptide that is essential for sporulation. Analysis of the expression of the spoVM gene revealed that wild-type cells started to synthesize a spoVM-specific transcript at t2, whereas the SpoVM peptide accumulated at t4. Both the transcript and the peptide were absent from an spoVM knockout strain. The 5' untranslated region of the spoVM transcript increased expression of SpoVM. Possible regulation mechanisms are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moss, Nicholas
The Kokkos Clang compiler is a version of the Clang C++ compiler that has been modified to perform targeted code generation for Kokkos constructs, with the goal of generating highly optimized code and providing semantic (domain) awareness of these constructs, such as parallel for and parallel reduce, throughout the compilation toolchain. This approach is taken to explore the possibilities of exposing the developer's intentions to the underlying compiler infrastructure (e.g. optimization and analysis passes within the middle stages of the compiler) instead of relying solely on the restricted capabilities of C++ template metaprogramming. To date our activities have focused on correct GPU code generation, and thus we have not yet focused on improving overall performance. The compiler is implemented by recognizing specific (syntactic) Kokkos constructs in order to bypass normal template expansion mechanisms and instead use the semantic knowledge of Kokkos to directly generate code in the compiler's intermediate representation (IR), which is then translated into an NVIDIA-centric GPU program and supporting runtime calls. In addition, capturing and maintaining the higher-level semantics of Kokkos directly within the lower levels of the compiler has the potential to significantly improve the ability of the compiler to communicate with the developer in the terms of their original programming model/semantics.
Code Help: Can This Unique State Regulatory Intervention Improve Emergency Department Crowding?
Michael, Sean S; Broach, John P; Kotkowski, Kevin A; Brush, D Eric; Volturo, Gregory A; Reznek, Martin A
2018-05-01
Emergency department (ED) crowding adversely affects multiple facets of high-quality care. The Commonwealth of Massachusetts mandates specific, hospital action plans to reduce ED boarding via a mechanism termed "Code Help." Because implementation appears inconsistent even when hospital conditions should have triggered its activation, we hypothesized that compliance with the Code Help policy would be associated with reduction in ED boarding time and total ED length of stay (LOS) for admitted patients, compared to patients seen when the Code Help policy was not followed. This was a retrospective analysis of data collected from electronic, patient-care, timestamp events and from a prospective Code Help registry for consecutive adult patients admitted from the ED at a single academic center during a 15-month period. For each patient, we determined whether the concurrent hospital status complied with the Code Help policy or violated it at the time of admission decision. We then compared ED boarding time and overall ED LOS for patients cared for during periods of Code Help policy compliance and during periods of Code Help policy violation, both with reference to patients cared for during normal operations. Of 89,587 adult patients who presented to the ED during the study period, 24,017 (26.8%) were admitted to an acute care or critical care bed. Boarding time ranged from zero to 67 hours 30 minutes (median 4 hours 31 minutes). Total ED LOS for admitted patients ranged from 11 minutes to 85 hours 25 minutes (median nine hours). Patients admitted during periods of Code Help policy violation experienced significantly longer boarding times (median 20 minutes longer) and total ED LOS (median 46 minutes longer), compared to patients admitted under normal operations. However, patients admitted during Code Help policy compliance did not experience a significant increase in either metric, compared to normal operations. In this single-center experience, implementation of the Massachusetts Code Help regulation was associated with reduced ED boarding time and ED LOS when the policy was consistently followed, but there were adverse effects on both metrics during violations of the policy.
A Monte Carlo Code for Relativistic Radiation Transport Around Kerr Black Holes
NASA Technical Reports Server (NTRS)
Schnittman, Jeremy David; Krolik, Julian H.
2013-01-01
We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.
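The core Monte Carlo transport loop such a code is built around can be shown in a stripped-down, flat-space form: sample a free path from the exponential optical-depth distribution, scatter, and repeat until the photon escapes. Everything relativistic (Kerr geodesics, the emission/absorption mechanisms, polarization transport) is omitted from this toy slab version.

```python
# Stripped-down Monte Carlo transport loop: a photon in a plane-parallel
# slab of optical depth tau takes exponentially distributed free paths
# and scatters isotropically until it escapes through either face.
# This flat-space toy illustrates only the sampling skeleton, not the
# relativistic machinery (geodesics, Kerr metric, polarization).
import math
import random

random.seed(1)

def escape_scatters(tau_slab):
    """Number of scatterings before escaping a slab of optical depth tau."""
    z, mu, n = 0.0, 1.0, 0        # optical depth, direction cosine, count
    while True:
        z += mu * (-math.log(random.random()))  # free path ~ Exp(1)
        if z <= 0.0 or z >= tau_slab:
            return n                             # escaped a face
        mu = 2.0 * random.random() - 1.0         # isotropic re-emission
        n += 1

trials = 20_000
mean_n = sum(escape_scatters(5.0) for _ in range(trials)) / trials
print(f"mean scatterings before escape (tau=5): {mean_n:.1f}")
```

In the optically thick limit the scattering count grows roughly as the square of the optical depth, which is why coupling thick disks to thin coronae is the numerically demanding regime the abstract highlights.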
Probabilistic evaluation of SSME structural components
NASA Astrophysics Data System (ADS)
Rajagopal, K. R.; Newell, J. F.; Ho, H.
1991-05-01
The application is described of Composite Load Spectra (CLS) and Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) family of computer codes to the probabilistic structural analysis of four Space Shuttle Main Engine (SSME) space propulsion system components. These components are subjected to environments that are influenced by many random variables. The applications consider a wide breadth of uncertainties encountered in practice, while simultaneously covering a wide area of structural mechanics. This has been done consistent with the primary design requirement for each component. The probabilistic application studies are discussed using finite element models that have been typically used in the past in deterministic analysis studies.
Combustion: Structural interaction in a viscoelastic material
NASA Technical Reports Server (NTRS)
Chang, T. Y.; Chang, J. P.; Kumar, M.; Kuo, K. K.
1980-01-01
The effect of interaction between combustion processes and structural deformation of solid propellant was considered. The combustion analysis was performed on the basis of deformed crack geometry, which was determined from the structural analysis. On the other hand, input data for the structural analysis, such as pressure distribution along the crack boundary and ablation velocity of the crack, were determined from the combustion analysis. The interaction analysis was conducted by combining two computer codes, a combustion analysis code and a general purpose finite element structural analysis code.
Yaish, Mahmoud W; El-Kereamy, Ashraf; Zhu, Tong; Beatty, Perrin H; Good, Allen G; Bi, Yong-Mei; Rothstein, Steven J
2010-09-09
The interaction between phytohormones is an important mechanism which controls growth and developmental processes in plants. Deciphering these interactions is a crucial step in helping to develop crops with enhanced yield and resistance to environmental stresses. Controlling the expression level of OsAP2-39 which includes an APETALA 2 (AP2) domain leads to phenotypic changes in rice. Overexpression of OsAP2-39 leads to a reduction in yield by decreasing the biomass and the number of seeds in the transgenic rice lines. Global transcriptome analysis of the OsAP2-39 overexpression transgenic rice revealed the upregulation of a key abscisic acid (ABA) biosynthetic gene OsNCED-I which codes for 9-cis-epoxycarotenoid dioxygenase and leads to an increase in the endogenous ABA level. In addition to OsNCED-1, the gene expression analysis revealed the upregulation of a gene that codes for the Elongation of Upper most Internode (EUI) protein, an enzyme that catalyzes 16α, 17-epoxidation of non-13-hydroxylated GAs, which has been shown to deactivate gibberellins (GAs) in rice. The exogenous application of GA restores the wild-type phenotype in the transgenic line and ABA application induces the expression of EUI and suppresses the expression of OsAP2-39 in the wild-type line. These observations clarify the antagonistic relationship between ABA and GA and illustrate a mechanism that leads to homeostasis of these hormones. In vivo and in vitro analysis showed that the expression of both OsNCED-1 and EUI are directly controlled by OsAP2-39. Together, these results reveal a novel mechanism for the control of the ABA/GA balance in rice which is regulated by OsAP2-39 that in turn regulates plant growth and seed production.
Use of multiscale zirconium alloy deformation models in nuclear fuel behavior analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montgomery, Robert, E-mail: robert.montgomery@pnnl.gov; Tomé, Carlos, E-mail: tome@lanl.gov; Liu, Wenfeng, E-mail: wenfeng.liu@anatech.com
Accurate prediction of cladding mechanical behavior is a key aspect of modeling nuclear fuel behavior, especially for conditions of pellet-cladding interaction (PCI), reactivity-initiated accidents (RIA), and loss of coolant accidents (LOCA). Current approaches to fuel performance modeling rely on empirical constitutive models for cladding creep, growth and plastic deformation, which are limited to the materials and conditions for which the models were developed. To improve upon this approach, a microstructurally-based zirconium alloy mechanical deformation analysis capability is being developed within the United States Department of Energy Consortium for Advanced Simulation of Light Water Reactors (CASL). Specifically, the viscoplastic self-consistent (VPSC) polycrystal plasticity modeling approach, developed by Lebensohn and Tomé [1], has been coupled with the BISON engineering scale fuel performance code to represent the mechanistic material processes controlling the deformation behavior of light water reactor (LWR) cladding. A critical component of VPSC is the representation of the crystallographic nature (defect and dislocation movement) and orientation of the grains within the matrix material and the ability to account for the role of texture on deformation. A future goal is for VPSC to obtain information on reaction rate kinetics from atomistic calculations to inform the defect and dislocation behavior models described in VPSC. The multiscale modeling of cladding deformation mechanisms allowed by VPSC far exceeds the functionality of typical semi-empirical constitutive models employed in nuclear fuel behavior codes to model irradiation growth and creep, thermal creep, or plasticity. This paper describes the implementation of an interface between VPSC and BISON and provides initial results utilizing the coupled functionality.
Xiao, Jingfa; Hao, Lirui; Crowley, David E.; Zhang, Zhewen; Yu, Jun; Huang, Ning; Huo, Mingxin; Wu, Jiayan
2015-01-01
Cupriavidus sp. are generally heavy metal tolerant bacteria with the ability to degrade a variety of aromatic hydrocarbon compounds, although the degradation pathways and substrate versatilities remain largely unknown. Here we studied the bacterium Cupriavidus gilardii strain CR3, which was isolated from a natural asphalt deposit, and which was shown to utilize naphthenic acids as a sole carbon source. Genome sequencing of C. gilardii CR3 was carried out to elucidate possible mechanisms for the naphthenic acid biodegradation. The genome of C. gilardii CR3 was composed of two circular chromosomes chr1 and chr2 of respectively 3,539,530 bp and 2,039,213 bp in size. The genome for strain CR3 encoded 4,502 putative protein-coding genes, 59 tRNA genes, and many other non-coding genes. Many genes were associated with xenobiotic biodegradation and metal resistance functions. Pathway prediction for degradation of cyclohexanecarboxylic acid, a representative naphthenic acid, suggested that naphthenic acid undergoes initial ring-cleavage, after which the ring fission products can be degraded via several plausible degradation pathways including a mechanism similar to that used for fatty acid oxidation. The final metabolic products of these pathways are unstable or volatile compounds that were not toxic to CR3. Strain CR3 was also shown to have tolerance to at least 10 heavy metals, which was mainly achieved by self-detoxification through ion efflux, metal-complexation and metal-reduction, and a powerful DNA self-repair mechanism. Our genomic analysis suggests that CR3 is well adapted to survive the harsh environment in natural asphalts containing naphthenic acids and high concentrations of heavy metals. PMID:26301592
Shekhar, Himanshu; Doyley, Marvin M.
2013-01-01
The current excitation strategy for harmonic and subharmonic imaging (HI and SHI) uses short sine-bursts. However, alternate pulsing strategies may be useful for enhancing nonlinear emissions from ultrasound contrast agents. The goal of this study was to corroborate the hypothesis that chirp-coded excitation can improve the performance of high-frequency HI and SHI. A secondary goal was to understand the mechanisms that govern the response of ultrasound contrast agents to chirp-coded and sine-burst excitation schemes. Numerical simulations and acoustic measurements were conducted to evaluate the response of a commercial contrast agent (Targestar-P®) to chirp-coded and sine-burst excitation (10 MHz frequency, peak pressures 290 kPa). The results of the acoustic measurements revealed an improvement in signal-to-noise ratio by 4 to 14 dB, and a two- to threefold reduction in the subharmonic threshold with chirp-coded excitation. Simulations conducted with the Marmottant model suggest that an increase in expansion-dominated radial excursion of microbubbles was the mechanism responsible for the stronger nonlinear response. Additionally, chirp-coded excitation detected the nonlinear response for a wider range of agent concentrations than sine-bursts. Therefore, chirp-coded excitation could be a viable approach for enhancing the performance of HI and SHI. PMID:23654417
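The pulse-compression principle behind chirp coding can be sketched with a matched filter: correlating a received trace against the transmitted linear chirp concentrates the spread-out pulse energy into a sharp, well-localized peak, which is the source of the signal-to-noise gain reported above. Parameters below are illustrative and far below the paper's 10-MHz regime so the example stays tiny.

```python
# Sketch of chirp-coded excitation + matched filtering (pulse
# compression).  A linear FM pulse is buried at a known delay in a
# longer trace; correlating against the transmitted chirp recovers a
# sharp peak at that delay.  All parameters are illustrative.
import math

fs = 1000.0                  # sample rate, Hz
T = 0.5                      # chirp duration, s
f0, f1 = 50.0, 150.0         # chirp start/end frequency, Hz
n = int(T * fs)

chirp = [math.sin(2 * math.pi * (f0 * t + 0.5 * (f1 - f0) / T * t * t))
         for t in (i / fs for i in range(n))]

def matched_filter(sig, ref):
    """Cross-correlation of sig with ref at every admissible lag."""
    return [sum(sig[lag + i] * ref[i] for i in range(len(ref)))
            for lag in range(len(sig) - len(ref) + 1)]

# Echo of the chirp buried at a delay of 200 samples
trace = [0.0] * 200 + chirp + [0.0] * 150
mf = matched_filter(trace, chirp)
delay = max(range(len(mf)), key=lambda k: abs(mf[k]))
print(delay)  # 200: the matched filter localizes the chirp echo
```

The peak value equals the full pulse energy, so a long coded pulse delivers the detectability of a high-amplitude short burst at a lower peak pressure, which matters for contrast agents with pressure-dependent subharmonic thresholds.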
Broadband transmission-type coding metamaterial for wavefront manipulation for airborne sound
NASA Astrophysics Data System (ADS)
Li, Kun; Liang, Bin; Yang, Jing; Yang, Jun; Cheng, Jian-chun
2018-07-01
The recent advent of coding metamaterials, as a new class of acoustic metamaterials, substantially reduces the complexity in the design and fabrication of acoustic functional devices capable of manipulating sound waves in exotic manners by arranging coding elements with discrete phase states in specific sequences. It is therefore intriguing, both physically and practically, to pursue a mechanism for realizing broadband acoustic coding metamaterials that control transmitted waves with a fine resolution of the phase profile. Here, we propose the design of a transmission-type acoustic coding device and demonstrate its metamaterial-based implementation. The mechanism is that, instead of relying on resonant coding elements that are necessarily narrow-band, we build weak-resonant coding elements with a helical-like metamaterial with a continuously varying pitch that effectively expands the working bandwidth while maintaining the sub-wavelength resolution of the phase profile that is vital for the production of complicated wave fields. The effectiveness of our proposed scheme is numerically verified via the demonstration of three distinctive examples of acoustic focusing, anomalous refraction, and vortex beam generation in the prescribed frequency band on the basis of 1- and 2-bit coding sequences. Simulation results agree well with theoretical predictions, showing that the designed coding devices with discrete phase profiles are efficient in engineering the wavefront of outcoming waves to form the desired spatial pattern. We anticipate the realization of coding metamaterials with broadband functionality and design flexibility to open up possibilities for novel acoustic functional devices for the special manipulation of transmitted waves and underpin diverse applications ranging from medical ultrasound imaging to acoustic detections.
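The coding-sequence idea can be sketched numerically: compute the ideal hyperbolic phase profile for focusing at a chosen focal distance, then quantize each element's required phase to 1-bit (0, π) or 2-bit (0, π/2, π, 3π/2) states. The geometry, frequency, and element pitch below are illustrative assumptions, not the paper's design values.

```python
# Sketch of deriving a coding sequence for a transmissive focusing
# device: the ideal phase profile for focusing at distance F is
# hyperbolic, phi(x) = k*(sqrt(x^2 + F^2) - F); each element's phase is
# then quantized to discrete 1-bit or 2-bit states.  All parameters
# here are illustrative assumptions.
import math

c, f = 343.0, 3430.0             # sound speed (m/s), frequency (Hz)
lam = c / f                      # wavelength = 0.1 m
k = 2 * math.pi / lam
pitch, n, F = 0.05, 16, 0.4      # element pitch (m), count, focal length (m)

def ideal_phase(x):
    """Phase delay needed at lateral offset x to focus at distance F."""
    return (k * (math.sqrt(x * x + F * F) - F)) % (2 * math.pi)

def quantize(phase, bits):
    """Map a phase to the nearest of 2**bits evenly spaced states."""
    levels = 2 ** bits
    step = 2 * math.pi / levels
    return int(round(phase / step)) % levels

xs = [(i - (n - 1) / 2) * pitch for i in range(n)]
code1 = [quantize(ideal_phase(x), 1) for x in xs]  # 1-bit sequence
code2 = [quantize(ideal_phase(x), 2) for x in xs]  # 2-bit sequence
print(code1)
print(code2)
```

The 2-bit sequence tracks the hyperbolic profile more closely, which is why finer phase resolution improves the fidelity of complicated wave fields such as vortex beams.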
Implementation of a Post-Code Pause: Extending Post-Event Debriefing to Include Silence.
Copeland, Darcy; Liska, Heather
2016-01-01
This project arose out of a need to address two issues at our hospital: we lacked a formal debriefing process for code/trauma events and the emergency department wanted to address the psychological and spiritual needs of code/trauma responders. We developed a debriefing process for code/trauma events that intentionally included mechanisms to facilitate recognition, acknowledgment, and, when needed, responses to the psychological and spiritual needs of responders. A post-code pause process was implemented in the emergency department with the aims of standardizing a debriefing process, encouraging a supportive team-based culture, improving transition back to "normal" activities after responding to code/trauma events, and providing responders an opportunity to express reverence for patients involved in code/trauma events. The post-code pause process incorporates a moment of silence and the addition of two simple questions to a traditional operational debrief. Implementation of post-code pauses was feasible despite the fast-paced nature of the department. At the end of the 1-year pilot period, staff members reported increases in feeling supported by peers and leaders, their ability to pay homage to patients, and having time to regroup prior to returning to their assignment. There was a decrease in the number of respondents reporting having thoughts or feelings associated with the event within 24 hr. The pauses create a mechanism for operational team debriefing, provide an opportunity for staff members to honor their work and their patients, and support an environment in which the psychological and spiritual effects of responding to code/trauma events can be acknowledged.
Valenzuela-Miranda, Diego; Gallardo-Escárate, Cristian
2016-12-01
Despite the high prevalence and impact on Chilean salmon aquaculture of the intracellular bacterium Piscirickettsia salmonis, the molecular underpinnings of host-pathogen interactions remain unclear. Herein, the interplay of coding and non-coding transcripts has been proposed as a key mechanism involved in immune response. Therefore, the aim of this study was to show how coding and non-coding transcripts are modulated during the infection process of Atlantic salmon with P. salmonis. For this, RNA-seq was conducted in brain, spleen, and head kidney samples, revealing different transcriptional profiles according to bacterial load. Additionally, while most of the regulated genes were annotated to diverse biological processes during infection, a common response associated with clathrin-mediated endocytosis and iron homeostasis was present in all tissues. Interestingly, while endocytosis-promoting factors and clathrin inductions were upregulated, endocytic receptors were mainly downregulated. Furthermore, the regulation of genes related to iron homeostasis suggested an intracellular accumulation of iron, a process in which heme biosynthesis/degradation pathways might play an important role. Regarding the non-coding response, 918 putative long non-coding RNAs were identified, where 425 were newly characterized for S. salar. Finally, co-localization and co-expression analyses revealed a strong correlation between the modulations of long non-coding RNAs and genes associated with endocytosis and iron homeostasis. These results represent the first comprehensive study of putative interplaying mechanisms of coding and non-coding RNAs during bacterial infection in salmonids. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.
1979-09-01
A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples.
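At its core, an evaluation of the kind FERRET performs is a generalized least-squares update: related measurements are combined through their full covariance matrix, so that correlations, not just individual variances, shape the result. A minimal sketch for the simplest case, two correlated measurements of one quantity (the numbers are illustrative and have nothing to do with FERRET's input formats):

```python
def gls_combine(y, cov):
    """Generalized least-squares estimate of a single quantity from two
    correlated measurements y = [y1, y2] with 2x2 covariance matrix cov:
    x_hat = (1^T C^-1 y) / (1^T C^-1 1), var = 1 / (1^T C^-1 1)."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]  # C^-1
    w = [inv[0][0] + inv[0][1], inv[1][0] + inv[1][1]]  # C^-1 @ ones
    norm = w[0] + w[1]                                  # ones^T C^-1 ones
    x_hat = (w[0] * y[0] + w[1] * y[1]) / norm
    return x_hat, 1.0 / norm

# two measurements of the same quantity with a positive correlation term
x, v = gls_combine([1.00, 1.10], [[0.04, 0.01], [0.01, 0.09]])
```

The combined variance comes out smaller than either input variance, which is the quantitative-uncertainty payoff the abstract emphasizes; with a negative correlation the weights shift accordingly.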
Current and anticipated uses of thermalhydraulic and neutronic codes at PSI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aksan, S.N.; Zimmermann, M.A.; Yadigaroglu, G.
1997-07-01
The thermalhydraulic and/or neutronic codes in use at PSI mainly provide the capability to perform deterministic safety analysis for Swiss NPPs and also serve as analysis tools for experimental facilities for LWR and ALWR simulations. In relation to these applications, physical model development and improvements, and assessment of the codes are also essential components of the activities. In this paper, a brief overview is provided on the thermalhydraulic and/or neutronic codes used for safety analysis of LWRs, at PSI, and also of some experiences and applications with these codes. Based on these experiences, additional assessment needs are indicated, together with some model improvement needs. The future needs that could be used to specify both the development of a new code and also improvement of available codes are summarized.
Internal fluid mechanics research on supercomputers for aerospace propulsion systems
NASA Technical Reports Server (NTRS)
Miller, Brent A.; Anderson, Bernhard H.; Szuch, John R.
1988-01-01
The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid mechanics (ICFM) to a state of practical application for aerospace propulsion systems. The strategies used to achieve this goal are to: (1) pursue an understanding of flow physics, surface heat transfer, and combustion via analysis and fundamental experiments, (2) incorporate improved understanding of these phenomena into verified 3-D CFD codes, and (3) utilize state-of-the-art computational technology to enhance experimental and CFD research. Presented is an overview of the ICFM program in high-speed propulsion, including work in inlets, turbomachinery, and chemical reacting flows. Ongoing efforts to integrate new computer technologies, such as parallel computing and artificial intelligence, into high-speed aeropropulsion research are described.
Neale, Dave; Clackson, Kaili; Georgieva, Stanimira; Dedetas, Hatice; Scarpate, Melissa; Wass, Sam; Leong, Victoria
2018-01-01
Play during early life is a ubiquitous activity, and an individual’s propensity for play is positively related to cognitive development and emotional well-being. Play behavior (which may be solitary or shared with a social partner) is diverse and multi-faceted. A challenge for current research is to converge on a common definition and measurement system for play – whether examined at a behavioral, cognitive or neurological level. Combining these different approaches in a multimodal analysis could yield significant advances in understanding the neurocognitive mechanisms of play, and provide the basis for developing biologically grounded play models. However, there is currently no integrated framework for conducting a multimodal analysis of play that spans brain, cognition and behavior. The proposed coding framework uses grounded and observable behaviors along three dimensions (sensorimotor, cognitive and socio-emotional), to compute inferences about playful behavior in a social context, and related social interactional states. Here, we illustrate the sensitivity and utility of the proposed coding framework using two contrasting dyadic corpora (N = 5) of mother-infant object-oriented interactions during experimental conditions that were either non-conducive (Condition 1) or conducive (Condition 2) to the emergence of playful behavior. We find that the framework accurately identifies the modal form of social interaction as being either non-playful (Condition 1) or playful (Condition 2), and further provides useful insights about differences in the quality of social interaction and temporal synchronicity within the dyad. It is intended that this fine-grained coding of play behavior will be easily assimilated with, and inform, future analysis of neural data that is also collected during adult–infant play. 
In conclusion, here, we present a novel framework for analyzing the continuous time-evolution of adult–infant play patterns, underpinned by biologically informed state coding along sensorimotor, cognitive and socio-emotional dimensions. We expect that the proposed framework will have wide utility amongst researchers wishing to employ an integrated, multimodal approach to the study of play, and lead toward a greater understanding of the neuroscientific basis of play. It may also yield insights into a new biologically grounded taxonomy of play interactions. PMID:29618994
Reed-Solomon error-correction as a software patch mechanism.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pendley, Kevin D.
This report explores how error-correction data generated by a Reed-Solomon code may be used as a mechanism to apply changes to an existing installed codebase. Using the Reed-Solomon code to generate error-correction data for a changed or updated codebase will allow the error-correction data to be applied to an existing codebase to both validate and introduce changes or updates from some upstream source to the existing installed codebase.
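The report's mechanism derives Reed-Solomon parity from the updated codebase, so that decoding (installed bytes + parity) both validates the installed copy and corrects it toward the update, provided the differences fit within the code's correction budget. A full Reed-Solomon implementation is beyond a short sketch, so the toy below substitutes a plain XOR difference plus a digest — explicitly not RS, and without RS's tolerance for a corrupted install — to illustrate only the patch-application shape (derive correction data from the new codebase, apply it to the old, validate the result). All names and byte strings are invented:

```python
import hashlib

def make_patch(old: bytes, new: bytes):
    """Toy patch: XOR difference plus a digest of the update target.
    (The report's actual mechanism derives Reed-Solomon error-correction
    data from the updated codebase instead.)"""
    assert len(old) == len(new)
    diff = bytes(a ^ b for a, b in zip(old, new))
    return diff, hashlib.sha256(new).hexdigest()

def apply_patch(installed: bytes, diff: bytes, digest: str) -> bytes:
    """Apply the correction data to the installed bytes, then validate."""
    updated = bytes(a ^ b for a, b in zip(installed, diff))
    if hashlib.sha256(updated).hexdigest() != digest:
        raise ValueError("installed codebase does not match patch baseline")
    return updated

old = b"jump short; nop; nop"
new = b"jump long ; nop; nop"
diff, digest = make_patch(old, new)
assert apply_patch(old, diff, digest) == new
```

The digest check plays the "validate" role from the abstract; with real RS parity, validation and correction fall out of the same decode step.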
NASA Technical Reports Server (NTRS)
Johnson, Theodore F.; Waters, W. Allen; Singer, Thomas N.; Haftka, Raphael T.
2004-01-01
A next generation reusable launch vehicle (RLV) will require thermally efficient and light-weight cryogenic propellant tank structures. Since these tanks will be weight-critical, analytical tools must be developed to aid in sizing the thickness of insulation layers and structural geometry for optimal performance. Finite element method (FEM) models of the tank and insulation layers were created to analyze the thermal performance of the cryogenic insulation layer and thermal protection system (TPS) of the tanks. The thermal conditions of ground-hold and re-entry/soak-through for a typical RLV mission were used in the thermal sizing study. A general-purpose nonlinear FEM analysis code, capable of using temperature and pressure dependent material properties, was used as the thermal analysis code. Mechanical loads from ground handling and proof-pressure testing were used to size the structural geometry of an aluminum cryogenic tank wall. Nonlinear deterministic optimization and reliability optimization techniques were the analytical tools used to size the geometry of the isogrid stiffeners and thickness of the skin. The results from the sizing study indicate that a commercial FEM code can be used for thermal analyses to size the insulation thicknesses where the temperature and pressure were varied. The results from the structural sizing study show that using combined deterministic and reliability optimization techniques can obtain alternate and lighter designs than the designs obtained from deterministic optimization methods alone.
Trypsteen, Wim; Mohammadi, Pejman; Van Hecke, Clarissa; Mestdagh, Pieter; Lefever, Steve; Saeys, Yvan; De Bleser, Pieter; Vandesompele, Jo; Ciuffi, Angela; Vandekerckhove, Linos; De Spiegelaere, Ward
2016-10-26
Studying the effects of HIV infection on the host transcriptome has typically focused on protein-coding genes. However, recent advances in the field of RNA sequencing revealed that long non-coding RNAs (lncRNAs) add an extensive additional layer to the cell's molecular network. Here, we performed transcriptome profiling throughout a primary HIV infection in vitro to investigate lncRNA expression at the different HIV replication cycle processes (reverse transcription, integration and particle production). Subsequently, guilt-by-association, transcription factor and co-expression analysis were performed to infer biological roles for the lncRNAs identified in the HIV-host interplay. Many lncRNAs were suggested to play a role in mechanisms relying on proteasomal and ubiquitination pathways, apoptosis, DNA damage responses and cell cycle regulation. Through transcription factor binding analysis, we found that lncRNAs display a distinct transcriptional regulation profile as compared to protein coding mRNAs, suggesting that mRNAs and lncRNAs are independently modulated. In addition, we identified five differentially expressed lncRNA-mRNA pairs with mRNA involvement in HIV pathogenesis with possible cis regulatory lncRNAs that control nearby mRNA expression and function. Altogether, the present study demonstrates that lncRNAs add a new dimension to the HIV-host interplay and should be further investigated as they may represent targets for controlling HIV replication.
Biochemical and genetic analysis of the role of the viral polymerase in enterovirus recombination.
Woodman, Andrew; Arnold, Jamie J; Cameron, Craig E; Evans, David J
2016-08-19
Genetic recombination in single-strand, positive-sense RNA viruses is a poorly understood mechanism responsible for generating extensive genetic change and novel phenotypes. By moving a critical cis-acting replication element (CRE) from the polyprotein coding region to the 3' non-coding region we have further developed a cell-based assay (the 3'CRE-REP assay) to yield recombinants throughout the non-structural coding region of poliovirus from dually transfected cells. We have additionally developed a defined biochemical assay in which the only protein present is the poliovirus RNA dependent RNA polymerase (RdRp), which recapitulates the strand transfer events of the recombination process. We have used both assays to investigate the role of the polymerase fidelity and nucleotide turnover rates in recombination. Our results, of both poliovirus intertypic and intratypic recombination in the CRE-REP assay and using a range of polymerase variants in the biochemical assay, demonstrate that RdRp fidelity is a fundamental determinant of recombination frequency. High fidelity polymerases exhibit reduced recombination and low fidelity polymerases exhibit increased recombination in both assays. These studies provide the basis for the analysis of poliovirus recombination throughout the non-structural region of the virus genome and provide a defined biochemical assay to further dissect this important evolutionary process. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2004-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Three-Dimensional Color Code Thresholds via Statistical-Mechanical Mapping.
Kubica, Aleksander; Beverland, Michael E; Brandão, Fernando; Preskill, John; Svore, Krysta M
2018-05-04
Three-dimensional (3D) color codes have advantages for fault-tolerant quantum computing, such as protected quantum gates with relatively low overhead and robustness against imperfect measurement of error syndromes. Here we investigate the storage threshold error rates for bit-flip and phase-flip noise in the 3D color code (3DCC) on the body-centered cubic lattice, assuming perfect syndrome measurements. In particular, by exploiting a connection between error correction and statistical mechanics, we estimate the threshold for 1D stringlike and 2D sheetlike logical operators to be p_{3DCC}^{(1)}≃1.9% and p_{3DCC}^{(2)}≃27.6%. We obtain these results by using parallel tempering Monte Carlo simulations to study the disorder-temperature phase diagrams of two new 3D statistical-mechanical models: the four- and six-body random coupling Ising models.
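The statistical-mechanical mapping turns a threshold estimate into a question about the disorder-temperature phase diagram of a spin model with randomly flipped couplings. The paper's models are 3D four- and six-body random coupling Ising models sampled with parallel tempering; as a much-reduced stand-in, the sketch below runs single-temperature Metropolis sweeps on a 2D nearest-neighbor random-bond Ising model, just to show the shape of such a simulation (lattice size, disorder probability, and temperature are arbitrary choices):

```python
import math
import random

def metropolis_rbim(L=8, p=0.05, beta=0.5, sweeps=200, seed=1):
    """Metropolis sampling of a 2D random-bond Ising model on an L x L
    torus: each nearest-neighbor coupling is antiferromagnetic with
    probability p. Returns the absolute magnetization per spin.
    (Toy stand-in for the paper's 3D four-/six-body models.)"""
    rng = random.Random(seed)
    spins = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
    # J[0][i][j]: bond (i, j)-(i, j+1);  J[1][i][j]: bond (i, j)-(i+1, j)
    J = [[[-1 if rng.random() < p else 1 for _ in range(L)] for _ in range(L)]
         for _ in range(2)]
    for _ in range(sweeps):
        for i in range(L):
            for j in range(L):
                # local field from the four neighbors, with their bond signs
                h = (J[0][i][j] * spins[i][(j + 1) % L]
                     + J[0][i][(j - 1) % L] * spins[i][(j - 1) % L]
                     + J[1][i][j] * spins[(i + 1) % L][j]
                     + J[1][(i - 1) % L][j] * spins[(i - 1) % L][j])
                dE = 2 * spins[i][j] * h
                if dE <= 0 or rng.random() < math.exp(-beta * dE):
                    spins[i][j] *= -1
    return abs(sum(sum(row) for row in spins)) / (L * L)

m = metropolis_rbim()
```

In the mapping, the disorder probability p plays the role of the physical error rate, and the threshold corresponds to where the ordered phase terminates along the Nishimori line — which is why the paper's estimates come from locating phase boundaries rather than simulating the decoder directly.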
Content Analysis Coding Schemes for Online Asynchronous Discussion
ERIC Educational Resources Information Center
Weltzer-Ward, Lisa
2011-01-01
Purpose: Researchers commonly utilize coding-based analysis of classroom asynchronous discussion contributions as part of studies of online learning and instruction. However, this analysis is inconsistent from study to study with over 50 coding schemes and procedures applied in the last eight years. The aim of this article is to provide a basis…
Mechanisms of Long Non-Coding RNAs in the Assembly and Plasticity of Neural Circuitry.
Wang, Andi; Wang, Junbao; Liu, Ying; Zhou, Yan
2017-01-01
The mechanisms underlying development processes and functional dynamics of neural circuits are far from understood. Long non-coding RNAs (lncRNAs) have emerged as essential players in defining identities of neural cells, and in modulating neural activities. In this review, we summarized latest advances concerning roles and mechanisms of lncRNAs in assembly, maintenance and plasticity of neural circuitry, as well as lncRNAs' implications in neurological disorders. We also discussed technical advances and challenges in studying functions and mechanisms of lncRNAs in neural circuitry. Finally, we proposed that lncRNA studies would advance our understanding on how neural circuits develop and function in physiology and disease conditions.
Checkpointing in speculative versioning caches
Eichenberger, Alexandre E; Gara, Alan; Gschwind, Michael K; Ohmacht, Martin
2013-08-27
Mechanisms for generating checkpoints in a speculative versioning cache of a data processing system are provided. The mechanisms execute code within the data processing system, wherein the code accesses cache lines in the speculative versioning cache. The mechanisms further determine whether a first condition occurs indicating a need to generate a checkpoint in the speculative versioning cache. The checkpoint is a speculative cache line which is made non-speculative in response to a second condition occurring that requires a roll-back of changes to a cache line corresponding to the speculative cache line. The mechanisms also generate the checkpoint in the speculative versioning cache in response to a determination that the first condition has occurred.
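The two conditions in the abstract — one that triggers checkpoint generation, one that triggers roll-back to it — can be sketched as a toy versioning store. Everything below (class name, map-based granularity, the `checkpoint`/`rollback` API) is invented for illustration and is far simpler than a real speculative versioning cache:

```python
class SpeculativeCache:
    """Toy model of checkpointing in a speculative versioning cache:
    writes create speculative versions of lines; a checkpoint promotes
    them to non-speculative; a rollback discards every version newer
    than the last checkpoint."""

    def __init__(self):
        self.committed = {}    # address -> non-speculative value
        self.speculative = {}  # address -> speculative value

    def write(self, addr, value):
        self.speculative[addr] = value

    def read(self, addr):
        # speculative version shadows the committed one
        return self.speculative.get(addr, self.committed.get(addr))

    def checkpoint(self):
        # first condition: make speculative lines non-speculative
        self.committed.update(self.speculative)
        self.speculative.clear()

    def rollback(self):
        # second condition: undo changes made since the checkpoint
        self.speculative.clear()

c = SpeculativeCache()
c.write(0x10, 7)
c.checkpoint()        # 7 becomes the recovery point
c.write(0x10, 9)
c.rollback()          # speculative 9 is discarded
assert c.read(0x10) == 7
```

The key property the patent describes survives even in this toy: after a checkpoint, roll-back restores the checkpointed value rather than the original one.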
Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code
NASA Technical Reports Server (NTRS)
Hendricks, Eric S.
2016-01-01
The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.
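The relevant physics is the standard compressible-flow result that once a turbine passage throat reaches Mach 1, the mass flow rate is fixed by upstream stagnation conditions alone — the condition a meanline code must detect and handle. The sketch below evaluates that generic gas-dynamics relation for a calorically perfect gas (it is textbook material, not OTAC's implementation; the throat area and stagnation values are made up):

```python
import math

def choked_mass_flow(A, p0, T0, gamma=1.4, R=287.0):
    """Choked (sonic-throat) mass flow for a calorically perfect gas:
    m_dot = A * p0 * sqrt(gamma / (R * T0))
                  * (2 / (gamma + 1)) ** ((gamma + 1) / (2 * (gamma - 1)))
    Once the throat is sonic, m_dot no longer responds to downstream
    pressure -- the regime the paper's OTAC changes address."""
    return (A * p0 / math.sqrt(T0) * math.sqrt(gamma / R)
            * (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))))

# 0.01 m^2 throat fed by 500 kPa, 1200 K stagnation conditions
m_dot = choked_mass_flow(0.01, 500e3, 1200.0)  # kg/s
```

Lowering the exit pressure further leaves `m_dot` unchanged, which is exactly why a naive meanline iteration that adjusts flow to match exit conditions breaks down at choke.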
Optimized scalar promotion with load and splat SIMD instructions
Eichenberger, Alexander E; Gschwind, Michael K; Gunnels, John A
2013-10-29
Mechanisms for optimizing scalar code executed on a single instruction multiple data (SIMD) engine are provided. Placement of vector operation-splat operations may be determined based on an identification of scalar and SIMD operations in an original code representation. The original code representation may be modified to insert the vector operation-splat operations based on the determined placement of vector operation-splat operations to generate a first modified code representation. Placement of separate splat operations may be determined based on identification of scalar and SIMD operations in the first modified code representation. The first modified code representation may be modified to insert or delete separate splat operations based on the determined placement of the separate splat operations to generate a second modified code representation. SIMD code may be output based on the second modified code representation for execution by the SIMD engine.
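The core idea — insert a splat (replicate a scalar across all SIMD lanes) before the scalar's first vector use, then let later vector uses reuse the splatted value — can be sketched as a tiny pass over a made-up list-of-tuples IR. This is an illustration of the general technique only, not the patented placement algorithm:

```python
def insert_splats(ops):
    """Toy scalar-promotion pass over ops of the form
    (opcode, dest, sources, source_kinds): when a scalar source feeds a
    vector opcode (prefix 'v'), insert a 'splat' before its first vector
    use and rewrite that use (and later ones) to the splatted name."""
    splatted = set()
    out = []
    for op, dst, srcs, kinds in ops:
        if op.startswith("v"):  # vector operation
            srcs = list(srcs)
            for n, (s, kind) in enumerate(zip(srcs, kinds)):
                if kind == "scalar":
                    if s not in splatted:
                        # replicate scalar s into vector register s_v
                        out.append(("splat", s + "_v", (s,), ("scalar",)))
                        splatted.add(s)
                    srcs[n] = s + "_v"
            srcs = tuple(srcs)
        out.append((op, dst, srcs, kinds))
    return out

ir = [("load", "a",  ("mem0",),     ("scalar",)),
      ("vadd", "v1", ("a", "v0"),   ("scalar", "vector")),
      ("vmul", "v2", ("a", "v1"),   ("scalar", "vector"))]
new_ir = insert_splats(ir)
```

Only one splat is emitted even though `a` feeds two vector operations — the reuse that motivates placing splats carefully rather than splatting at every use.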
ERIC Educational Resources Information Center
Mayhew, Matthew J.; Simonoff, Jeffrey S.
2015-01-01
The purpose of this paper is to describe effect coding as an alternative quantitative practice for analyzing and interpreting categorical, multi-raced independent variables in higher education research. Not only may effect coding enable researchers to get closer to respondents' original intentions, it allows for more accurate analyses of all race…
Drugs, Guns, and Disadvantaged Youths: Co-Occurring Behavior and the Code of the Street
ERIC Educational Resources Information Center
Allen, Andrea N.; Lo, Celia C.
2012-01-01
Guided by Anderson's theory of the code of the street, this study explored social mechanisms linking individual-level disadvantage factors with the adoption of beliefs grounded in the code of the street and with drug trafficking and gun carrying--the co-occurring behavior shaping violence among young men in urban areas. Secondary data were…
Coupled Physics Environment (CouPE) library - Design, Implementation, and Release
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahadevan, Vijay S.
Over several years, high fidelity, validated mono-physics solvers with proven scalability on peta-scale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a unified mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. In this report, we present details on the design decisions and developments on CouPE, an acronym that stands for Coupled Physics Environment that orchestrates a coupled physics solver through the interfaces exposed by MOAB array-based unstructured mesh, both of which are part of SIGMA (Scalable Interfaces for Geometry and Mesh-Based Applications) toolkit. The SIGMA toolkit contains libraries that enable scalable geometry and unstructured mesh creation and handling in a memory and computationally efficient implementation. The CouPE version being prepared for a full open-source release along with updated documentation will contain several useful examples that will enable users to start developing their applications natively using the native MOAB mesh and couple their models to existing physics applications to analyze and solve real world problems of interest. An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is also being investigated as part of the NEAMS RPL, to tightly couple neutron transport, thermal-hydraulics and structural mechanics physics under the SHARP framework. This report summarizes the efforts that have been invested in CouPE to bring together several existing physics applications namely PROTEUS (neutron transport code), Nek5000 (computational fluid-dynamics code) and Diablo (structural mechanics code).
The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The design of CouPE along with motivations that led to implementation choices are also discussed. The first release of the library will be different from the current version of the code that integrates the components in SHARP and explanation on the need for forking the source base will also be provided. Enhancements in the functionality and improved user guides will be available as part of the release. CouPE v0.1 is scheduled for an open-source release in December 2014 along with SIGMA v1.1 components that provide support for language-agnostic mesh loading, traversal and query interfaces along with scalable solution transfer of fields between different physics codes. The coupling methodology and software interfaces of the library are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the CouPE library.
Gadadhar, Sudarshan; Bodakuntla, Satish; Natarajan, Kathiresan; Janke, Carsten
2017-04-15
Microtubules are key cytoskeletal elements of all eukaryotic cells and are assembled of evolutionarily conserved α-tubulin-β-tubulin heterodimers. Despite their uniform structure, microtubules fulfill a large diversity of functions. A regulatory mechanism to control the specialization of the microtubule cytoskeleton is the 'tubulin code', which is generated by (i) expression of different α- and β-tubulin isotypes, and by (ii) post-translational modifications of tubulin. In this Cell Science at a Glance article and the accompanying poster, we provide a comprehensive overview of the molecular components of the tubulin code, and discuss the mechanisms by which these components contribute to the generation of functionally specialized microtubules. © 2017. Published by The Company of Biologists Ltd.
SIERRA Code Coupling Module: Arpeggio User Manual Version 4.44
DOE Office of Scientific and Technical Information (OSTI.GOV)
Subia, Samuel R.; Overfelt, James R.; Baur, David G.
2017-04-01
The SNL Sierra Mechanics code suite is designed to enable simulation of complex multiphysics scenarios. The code suite is composed of several specialized applications which can operate either in standalone mode or coupled with each other. Arpeggio is a supported utility that enables loose coupling of the various Sierra Mechanics applications by providing access to Framework services that facilitate the coupling. More importantly Arpeggio orchestrates the execution of applications that participate in the coupling. This document describes the various components of Arpeggio and their operability. The intent of the document is to provide a fast path for analysts interested in coupled applications via simple examples of its usage.
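Loose coupling of the kind Arpeggio orchestrates typically means alternating the participating applications, handing each the other's latest shared field, until the exchange reaches a fixed point. The driver loop below is a generic illustration of that pattern only — the solver lambdas and the scalar "field" are contrived, and Arpeggio's real services operate on full SIERRA applications with mesh-based field transfer:

```python
def loose_couple(solve_a, solve_b, x0, max_iters=100, tol=1e-10):
    """Minimal operator-split coupling loop: alternate two single-physics
    solves, passing each the other's latest result, until the exchanged
    value stops changing (a Picard / fixed-point iteration)."""
    x = x0
    for i in range(max_iters):
        y = solve_a(x)        # e.g. fluid solve given a wall temperature
        x_new = solve_b(y)    # e.g. thermal solve given the heat flux
        if abs(x_new - x) < tol:
            return x_new, i + 1
        x = x_new
    raise RuntimeError("coupling loop did not converge")

# contrived linear "physics" with fixed point x = 2
temp, iters = loose_couple(lambda x: 0.5 * x + 1.0, lambda y: y, 0.0)
```

Tight coupling (as in SHARP) replaces this outer alternation with a single nonlinear solve over all physics; the loose version is simpler to retrofit onto existing standalone codes, which is the trade-off the abstract's design reflects.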
DYNA3D Code Practices and Developments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, L.; Zywicz, E.; Raboin, P.
2000-04-21
DYNA3D is an explicit, finite element code developed to solve high rate dynamic simulations for problems of interest to the engineering mechanics community. The DYNA3D code has been under continuous development since 1976[1] by the Methods Development Group in the Mechanical Engineering Department of Lawrence Livermore National Laboratory. The pace of code development activities has substantially increased in the past five years, growing from one to between four and six code developers. This has necessitated the use of software tools such as CVS (Concurrent Versions System) to help manage multiple version updates. While on-line documentation with an Adobe PDF manual helps to communicate software developments, periodically a summary document describing recent changes and improvements in DYNA3D software is needed. The first part of this report describes issues surrounding software versions and source control. The remainder of this report details the major capability improvements since the last publicly released version of DYNA3D in 1996. Not included here are the many hundreds of bug corrections and minor enhancements, nor the development in DYNA3D between the manual release in 1993[2] and the public code release in 1996.
Towards a Consistent and Scientifically Accurate Drug Ontology.
Hogan, William R; Hanna, Josh; Joseph, Eric; Brochhausen, Mathias
2013-01-01
Our use case for comparative effectiveness research requires an ontology of drugs that enables querying National Drug Codes (NDCs) by active ingredient, mechanism of action, physiological effect, and therapeutic class of the drug products they represent. We conducted an ontological analysis of drugs from the realist perspective, and evaluated existing drug terminology, ontology, and database artifacts from (1) the technical perspective, (2) the perspective of pharmacology and medical science, (3) the perspective of description logic semantics (if they were available in Web Ontology Language or OWL), and (4) the perspective of our realism-based analysis of the domain. No existing resource was sufficient. Therefore, we built the Drug Ontology (DrOn) in OWL, which we populated with NDCs and other classes from RxNorm using only content created by the National Library of Medicine. We also built an application that uses DrOn to query for NDCs as outlined above, available at: http://ingarden.uams.edu/ingredients. The application uses an OWL-based description logic reasoner to execute end-user queries. DrOn is available at http://code.google.com/p/dr-on.
Delamination Modeling of Composites for Improved Crash Analysis
NASA Technical Reports Server (NTRS)
Fleming, David C.
1999-01-01
Finite element crash modeling of composite structures is limited by the inability of current commercial crash codes to accurately model delamination growth. Efforts are made to implement and assess delamination modeling techniques using a current finite element crash code, MSC/DYTRAN. Three methods are evaluated, including a straightforward method based on monitoring forces in elements or constraints representing an interface; a cohesive fracture model proposed in the literature; and the virtual crack closure technique commonly used in fracture mechanics. Results are compared with dynamic double cantilever beam test data from the literature. Examples show that it is possible to accurately model delamination propagation in this case. However, the computational demands required for accurate solution are great, and reliable property data may not be available to support general crash modeling efforts. Additional examples are modeled, including an impact-loaded beam, damage initiation in laminated crushing specimens, and a scaled aircraft subfloor structure in which composite sandwich structures are used as energy-absorbing elements. These examples illustrate some of the difficulties in modeling delamination as part of a finite element crash analysis.
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Srivastava, R.; Mehmed, Oral
2002-01-01
An aeroelastic analysis system for flutter and forced response analysis of turbomachines based on a two-dimensional linearized unsteady Euler solver has been developed. The ASTROP2 code, an aeroelastic stability analysis program for turbomachinery, was used as a basis for this development. The ASTROP2 code uses strip theory to couple a two-dimensional aerodynamic model with a three-dimensional structural model. The code was modified to include forced response capability. The formulation was also modified to include aeroelastic analysis with mistuning. A linearized unsteady Euler solver, LINFLX2D, is added to model the unsteady aerodynamics in ASTROP2. By calculating the unsteady aerodynamic loads using LINFLX2D, it is possible to include the effects of transonic flow on flutter and forced response in the analysis. The stability is inferred from an eigenvalue analysis. The revised code, named ASTROP2-LE for its use of linearized Euler aerodynamics, is validated by comparing the predictions with those obtained using linear unsteady aerodynamic solutions.
Direction-selective circuits shape noise to ensure a precise population code
Zylberberg, Joel; Cafaro, Jon; Turner, Maxwell H
2016-01-01
Neural responses are noisy, and circuit structure can correlate this noise across neurons. Theoretical studies show that noise correlations can have diverse effects on population coding, but these studies rarely explore stimulus dependence of noise correlations. Here, we show that noise correlations in responses of ON-OFF direction-selective retinal ganglion cells are strongly stimulus dependent, and we uncover the circuit mechanisms producing this stimulus dependence. A population model based on these mechanistic studies shows that stimulus-dependent noise correlations improve the encoding of motion direction two-fold compared to independent noise. This work demonstrates a mechanism by which a neural circuit effectively shapes its signal and noise in concert, minimizing corruption of signal by noise. Finally, we generalize our findings beyond direction coding in the retina and show that stimulus-dependent correlations will generally enhance information coding in populations of diversely tuned neurons. PMID:26796691
Semantic representations in the temporal pole predict false memories
Chadwick, Martin J.; Anjum, Raeesa S.; Kumaran, Dharshan; Schacter, Daniel L.; Spiers, Hugo J.; Hassabis, Demis
2016-01-01
Recent advances in neuroscience have given us unprecedented insight into the neural mechanisms of false memory, showing that artificial memories can be inserted into the memory cells of the hippocampus in a way that is indistinguishable from true memories. However, this alone is not enough to explain how false memories can arise naturally in the course of our daily lives. Cognitive psychology has demonstrated that many instances of false memory, both in the laboratory and the real world, can be attributed to semantic interference. Whereas previous studies have found that a diverse set of regions show some involvement in semantic false memory, none have revealed the nature of the semantic representations underpinning the phenomenon. Here we use fMRI with representational similarity analysis to search for a neural code consistent with semantic false memory. We find clear evidence that false memories emerge from a similarity-based neural code in the temporal pole, a region that has been called the “semantic hub” of the brain. We further show that each individual has a partially unique semantic code within the temporal pole, and this unique code can predict idiosyncratic patterns of memory errors. Finally, we show that the same neural code can also predict variation in true-memory performance, consistent with an adaptive perspective on false memory. Taken together, our findings reveal the underlying structure of neural representations of semantic knowledge, and how this semantic structure can both enhance and distort our memories. PMID:27551087
Spatial Learning and Action Planning in a Prefrontal Cortical Network Model
Martinet, Louis-Emmanuel; Sheynikhovich, Denis; Benchenane, Karim; Arleo, Angelo
2011-01-01
The interplay between hippocampus and prefrontal cortex (PFC) is fundamental to spatial cognition. Complementing hippocampal place coding, prefrontal representations provide more abstract and hierarchically organized memories suitable for decision making. We model a prefrontal network mediating distributed information processing for spatial learning and action planning. Specific connectivity and synaptic adaptation principles shape the recurrent dynamics of the network arranged in cortical minicolumns. We show how the PFC columnar organization is suitable for learning sparse topological-metrical representations from redundant hippocampal inputs. The recurrent nature of the network supports multilevel spatial processing, allowing structural features of the environment to be encoded. An activation diffusion mechanism spreads the neural activity through the column population leading to trajectory planning. The model provides a functional framework for interpreting the activity of PFC neurons recorded during navigation tasks. We illustrate the link from single unit activity to behavioral responses. The results suggest plausible neural mechanisms subserving the cognitive “insight” capability originally attributed to rodents by Tolman & Honzik. Our time course analysis of neural responses shows how the interaction between hippocampus and PFC can yield the encoding of manifold information pertinent to spatial planning, including prospective coding and distance-to-goal correlates. PMID:21625569
NASA Astrophysics Data System (ADS)
Boss, Alan P.
2009-03-01
The disk instability mechanism for giant planet formation is based on the formation of clumps in a marginally gravitationally unstable protoplanetary disk, which must lose thermal energy through a combination of convection and radiative cooling if they are to survive and contract to become giant protoplanets. While there is good observational support for forming at least some giant planets by disk instability, the mechanism has become theoretically contentious, with different three-dimensional radiative hydrodynamics codes often yielding different results. Rigorous code testing is required to make further progress. Here we present two new analytical solutions for radiative transfer in spherical coordinates, suitable for testing the code employed in all of the Boss disk instability calculations. The testing shows that the Boss code radiative transfer routines do an excellent job of relaxing to and maintaining the analytical results for the radial temperature and radiative flux profiles for a spherical cloud with high or moderate optical depths, including the transition from optically thick to optically thin regions. These radial test results are independent of whether the Eddington approximation, diffusion approximation, or flux-limited diffusion approximation routines are employed. The Boss code does an equally excellent job of relaxing to and maintaining the analytical results for the vertical (θ) temperature and radiative flux profiles for a disk with a height proportional to the radial distance. These tests strongly support the disk instability mechanism for forming giant planets.
Lawrence, David W; Hutchison, Michael G; Cusimano, Michael D; Singh, Tanveer; Li, Luke
2014-09-01
Interrater agreement evaluation of a tool to document and code the situational factors and mechanisms of knockouts (KOs) and technical knockouts (TKOs) in mixed martial arts (MMA). Retrospective case series. Professional MMA matches from the Ultimate Fighting Championship, 2006-2012. Two nonmedically trained independent raters. The MMA Knockout Tool (MMA-KT) consists of 20 factors and captures and codes information on match characteristics, situational context preceding KOs and TKOs, as well as describing competitor states during these outcomes. The MMA-KT also evaluates the mechanism of action and subsequent events surrounding a KO. The 2 raters coded 125 unique events for a total of 250 events. The 8 factors of Part A had an average κ of 0.87 (SD = 0.10; range = 0.65-0.98); 7 were considered "substantial" agreement and 1 "moderate." Part B consists of 12 factors with an average κ of 0.84 (SD = 0.16; range = 0.59-1.0); 7 classified as "substantial" agreement, 4 "moderate," and 1 "fair." The majority of the factors in the MMA-KT demonstrated substantial interrater agreement, with an average κ of 0.86 (SD = 0.13; range = 0.59-1.0). The MMA-KT is a reliable tool to extract and code relevant information to investigate the situational factors and mechanisms of KOs and TKOs in MMA competitions.
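The agreement statistic reported above, Cohen's κ, compares observed rater agreement against the agreement expected by chance from each rater's marginal code frequencies. A minimal sketch in Python, using hypothetical codes for a single MMA-KT factor (not the study's data):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical codes of the same events."""
    assert len(r1) == len(r2)
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    # chance agreement from the product of marginal code frequencies
    p_e = sum(c1[k] * c2.get(k, 0) for k in c1) / n**2
    return (p_o - p_e) / (1 - p_e)

# hypothetical outcome codes from two independent raters
rater1 = ["KO", "TKO", "KO", "KO", "TKO", "KO"]
rater2 = ["KO", "TKO", "KO", "TKO", "TKO", "KO"]
print(round(cohens_kappa(rater1, rater2), 3))  # → 0.667
```

A κ near 0.67 would fall in the "substantial" band under the interpretive scale commonly used for such studies.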
Development of Testing Methodologies for the Mechanical Properties of MEMS
NASA Technical Reports Server (NTRS)
Ekwaro-Osire, Stephen
2003-01-01
This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS) as well as investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength, as well as developing a strategy to handle stress singularities at sharp corners, filets, and material interfaces. This will be a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.
NASA Astrophysics Data System (ADS)
Chen, Tongsheng; Xing, Da; Gao, Xuejuan; Wang, Fang
2006-09-01
Bcl-2 family proteins (such as Bid and Bak/Bax) and 14-3-3 proteins play a key role in mitochondria-mediated cell apoptosis induced by cell death factors such as TNF-α and low-power laser irradiation (LPLI). In this report, fluorescence resonance energy transfer (FRET) has been used to study the molecular mechanism of apoptosis in living cells on a fluorescence scanning confocal microscope. Based on the genetic code technique and green fluorescent proteins (GFPs), single-cell dynamic analysis of caspase-3, caspase-8, and PKC activation is performed in real time during apoptosis induced by laser irradiation. To investigate the cellular effect and mechanism of laser irradiation, human lung adenocarcinoma cells (ASTC-a-1) transfected with the plasmid SCAT3 (pSCAT3)/CKAR FRET reporter were irradiated and monitored noninvasively with FRET imaging. Our results show that high-fluence low-power laser irradiation (HFLPLI) can induce an increase in caspase-3 activation and a decrease in PKC activation, and that LPLI induces ASTC-a-1 cell proliferation by specifically activating PKCs.
EAC: A program for the error analysis of STAGS results for plates
NASA Technical Reports Server (NTRS)
Sistla, Rajaram; Thurston, Gaylen A.; Bains, Nancy Jane C.
1989-01-01
A computer code is now available for estimating the error in results from the STAGS finite element code for a shell unit consisting of a rectangular orthotropic plate. This memorandum contains basic information about the computer code EAC (Error Analysis and Correction) and describes the connection between the input data for the STAGS shell units and the input data necessary to run the error analysis code. The STAGS code returns a set of nodal displacements and a discrete set of stress resultants; the EAC code returns a continuous solution for displacements and stress resultants. The continuous solution is defined by a set of generalized coordinates computed in EAC. The theory and the assumptions that determine the continuous solution are also outlined in this memorandum. An example of application of the code is presented, and instructions are provided for its usage on the Cyber and VAX machines.
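The idea of a continuous solution defined by generalized coordinates can be illustrated by least-squares fitting a smooth basis to discrete nodal values. The sketch below is illustrative only (a quadratic basis and made-up nodal data, not the EAC formulation):

```python
def fit_quadratic(xs, ws):
    """Least-squares generalized coordinates (a0, a1, a2) for
    w(x) ~ a0 + a1*x + a2*x**2, via the 3x3 normal equations."""
    # normal-equation matrix A^T A and right-hand side A^T w for basis [1, x, x^2]
    m = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    v = [sum(w * x ** i for x, w in zip(xs, ws)) for i in range(3)]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 3):
                m[r][c] -= f * m[col][c]
            v[r] -= f * v[col]
    # back substitution
    a = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        a[i] = (v[i] - sum(m[i][j] * a[j] for j in range(i + 1, 3))) / m[i][i]
    return a

# hypothetical nodal displacements sampled from w(x) = 2 + 0.5x - 0.1x^2
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ws = [2.0 + 0.5 * x - 0.1 * x ** 2 for x in xs]
print([round(c, 6) for c in fit_quadratic(xs, ws)])
```

With exact quadratic data the fit recovers the underlying coefficients; with noisy nodal data it returns the least-squares continuous approximation.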
NASA Astrophysics Data System (ADS)
Folley, Christopher; Bronowicki, Allen
2005-09-01
Prediction of optical performance for large, deployable telescopes under environmental conditions and mechanical disturbances is a crucial part of the design verification process of such instruments for all phases of design and operation: ground testing, commissioning, and on-orbit operation. A Structural-Thermal-Optical-Performance (STOP) analysis methodology is often created that integrates the output of one analysis with the input of another. The integration of thermal environment predictions with structural models is relatively well understood, while the integration of structural deformation results into optical analysis/design software is less straightforward. A Matlab toolbox has been created that effectively integrates the predictions of mechanical deformations on optical elements generated by, for example, finite element analysis, and computes optical path differences for the distorted prescription. The engine of the toolbox is the real ray-tracing algorithm, which allows the optical surfaces to be defined in a single, global coordinate system, thereby allowing automatic alignment of the mechanical coordinate system with the optical coordinate system. Therefore, the physical location of the optical surfaces is identical in the optical prescription and the finite element model. The application of rigid-body displacements to optical surfaces, however, extends beyond STOP analysis, for example to the analysis of misalignments during the commissioning process. Furthermore, all the functionality of Matlab is available for optimization and control. Since this is a new tool for use on flight programs, it has been verified against CODE V. The toolbox's functionality to date is described, verification results are presented, and, as an example of its utility, results of a thermal distortion analysis are presented using the James Webb Space Telescope (JWST) prescription.
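A rigid-body displacement applied to surface points defined in a single global frame amounts to a rotation plus a translation. The example below is a hypothetical illustration (a z-axis rotation only), not the toolbox's actual API:

```python
import math

def rigid_body_transform(points, rot_z_rad, translation):
    """Rotate points about the global z-axis, then translate.

    Because all surfaces share one global frame, the same transform
    applies identically to optical and structural coordinates.
    """
    c, s = math.cos(rot_z_rad), math.sin(rot_z_rad)
    tx, ty, tz = translation
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz)
            for x, y, z in points]

# hypothetical mirror vertex: 90-degree rotation plus a 1 mm axial shift
moved = rigid_body_transform([(1.0, 0.0, 0.0)], math.pi / 2, (0.0, 0.0, 0.001))
print(moved)
```

A full STOP-style pipeline would compose such transforms per optical element from finite element displacement output before ray tracing the perturbed prescription.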
Development of probabilistic multimedia multipathway computer codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; LePoire, D.; Gnanapragasam, E.
2002-01-01
The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
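The probabilistic workflow in steps (3)-(6) amounts to sampling input parameter distributions and propagating each sample through the deterministic model. A minimal Monte Carlo sketch with a toy dose model and assumed distributions (not RESRAD's actual parameters or equations):

```python
import random
import statistics

def deterministic_dose(soil_conc, ingestion_rate, dose_factor):
    """Toy stand-in for a deterministic dose model (not RESRAD itself)."""
    return soil_conc * ingestion_rate * dose_factor

def probabilistic_dose(n_samples, seed=1):
    """Propagate uncertain inputs through the deterministic model."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n_samples):
        soil_conc = rng.lognormvariate(0.0, 0.5)   # assumed lognormal (illustrative)
        ingestion_rate = rng.uniform(50.0, 150.0)  # assumed uniform (illustrative)
        dose_factor = 1e-4                         # fixed conversion, illustrative
        doses.append(deterministic_dose(soil_conc, ingestion_rate, dose_factor))
    # summarize the output distribution: mean and ~95th percentile
    return statistics.mean(doses), statistics.quantiles(doses, n=20)[-1]

mean_dose, p95_dose = probabilistic_dose(10_000)
print(mean_dose, p95_dose)
```

The spread between the mean and upper percentile is the kind of uncertainty information the probabilistic modules add on top of a single deterministic run.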
Initial Probabilistic Evaluation of Reactor Pressure Vessel Fracture with Grizzly and Raven
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Hoffman, William; Sen, Sonat
2015-10-01
The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. The first application of Grizzly has been to study fracture in embrittled reactor pressure vessels (RPVs). Grizzly can be used to model the thermal/mechanical response of an RPV under transient conditions that would be observed in a pressurized thermal shock (PTS) scenario. The global response of the vessel provides boundary conditions for local models of the material in the vicinity of a flaw. Fracture domain integrals are computed to obtain stress intensity factors, which can in turn be used to assess whether a fracture would initiate at a pre-existing flaw. These capabilities have been demonstrated previously. A typical RPV is likely to contain a large population of pre-existing flaws introduced during the manufacturing process. This flaw population is characterized statistically through probability density functions of the flaw distributions. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation during a transient event. This report documents initial work to perform probabilistic analysis of RPV fracture during a PTS event using a combination of the RAVEN risk analysis code and Grizzly. This work is limited in scope, considering only a single flaw with deterministic geometry, but with uncertainty introduced in the parameters that influence fracture toughness. These results are benchmarked against equivalent models run in the FAVOR code. When fully developed, the RAVEN/Grizzly methodology for modeling probabilistic fracture in RPVs will provide a general capability that can be used to consider a wider variety of vessel and flaw conditions that are difficult to consider with current tools. In addition, this will provide access to advanced probabilistic techniques provided by RAVEN, including adaptive sampling and parallelism, which can dramatically decrease run times.
Jiang, Hui; Qin, Xiu-Juan; Li, Wei-Ping; Ma, Rong; Wang, Ting; Li, Zhu-Qing
2016-11-15
Long non-coding RNAs (LncRNAs) are an important class of widespread molecules involved in diverse biological functions, which are exceptionally expressed in numerous types of diseases. To date, only limited studies of LncRNAs in rheumatoid arthritis (RA) are available. In this study, we aimed to identify specifically expressed LncRNAs relevant to adjuvant-induced arthritis (AA) in rats, and to explore the possible molecular mechanisms of RA pathogenesis. To identify LncRNAs specifically expressed in rheumatoid arthritis, the expression of LncRNAs in synoviums of rats from the model group (n=3) was compared with that in the control group (n=3) using an Arraystar Rat LncRNA/mRNA microarray and real-time polymerase chain reaction (RT-PCR). A total of 260 LncRNAs were found to be differentially expressed (≥1.5-fold change) in the synoviums between AA model and normal rats (170 up-regulated and 90 down-regulated LncRNAs in AA rats compared with normal rats). Coding-non-coding gene co-expression (CNC) networks were drawn based on the correlation analysis between the differentially expressed LncRNAs and mRNAs. Six LncRNAs, XR_008357, U75927, MRAK046251, XR_006457, DQ266363 and MRAK003448, were selected to analyze the relationship between LncRNAs and RA via the CNC network and GO analysis. Real-time PCR confirmed that the six LncRNAs were specifically expressed in the AA rats. These results revealed that clusters of LncRNAs were uniquely expressed in AA rats compared with controls, suggesting that these differentially expressed LncRNAs might play a vital role in RA development. Up-regulation or down-regulation of the six LncRNAs might contribute to the molecular mechanism underlying RA. In summary, our study provides potential targets for the treatment of RA and a deeper understanding of RA pathogenesis. Copyright © 2016. Published by Elsevier B.V.
QOS-aware error recovery in wireless body sensor networks using adaptive network coding.
Razzaque, Mohammad Abdur; Javadi, Saeideh S; Coulibaly, Yahaya; Hira, Muta Tah
2014-12-29
Wireless body sensor networks (WBSNs) for healthcare and medical applications are real-time and life-critical infrastructures, which require a strict guarantee of quality of service (QoS) in terms of latency, error rate and reliability. Considering the criticality of healthcare and medical applications, WBSNs need to fulfill users'/applications' and the corresponding network's QoS requirements. For instance, for a real-time application to support on-time data delivery, a WBSN needs to guarantee a constrained delay at the network level. A network coding-based error recovery mechanism is an emerging mechanism that can be used in these systems to support QoS at very low energy, memory and hardware cost. However, under dynamic network conditions and changing user requirements, the original non-adaptive version of network coding fails to support some of the network and user QoS requirements. This work explores the QoS requirements of WBSNs from both the network and user/application perspectives. Based on these requirements, this paper proposes an adaptive network coding-based, QoS-aware error recovery mechanism for WBSNs. It utilizes network-level and user-/application-level information to make it adaptive in both contexts. Thus, it provides improved QoS support adaptively in terms of reliability, energy efficiency and delay. Simulation results show the potential of the proposed mechanism in terms of adaptability, reliability, real-time data delivery and network lifetime compared to its counterparts.
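A non-adaptive baseline for network-coding-based error recovery can be sketched with a single XOR parity packet, which lets the receiver rebuild one lost packet without retransmission; an adaptive scheme such as the one proposed would instead vary the amount of redundancy with channel and application state. Packet contents below are hypothetical:

```python
from functools import reduce

def xor_parity(packets):
    """XOR equal-length packets byte-wise to form one redundancy packet."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*packets))

def recover(received, parity):
    """Reconstruct the single missing packet (None entry) from the parity."""
    missing = received.index(None)
    present = [p for p in received if p is not None]
    restored = xor_parity(present + [parity])
    return received[:missing] + [restored] + received[missing + 1:]

data = [b"ecg0", b"ecg1", b"ecg2"]        # hypothetical sensor payloads
parity = xor_parity(data)                 # sent alongside the data packets
got = [b"ecg0", None, b"ecg2"]            # one packet lost on the wireless link
print(recover(got, parity))               # → [b'ecg0', b'ecg1', b'ecg2']
```

This recovers any single loss per parity group; tolerating burstier losses requires more parity packets, which is the redundancy/energy trade-off an adaptive mechanism tunes.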
Miller-Delaney, Suzanne F.C.; Bryan, Kenneth; Das, Sudipto; McKiernan, Ross C.; Bray, Isabella M.; Reynolds, James P.; Gwinn, Ryder; Stallings, Raymond L.
2015-01-01
Temporal lobe epilepsy is associated with large-scale, wide-ranging changes in gene expression in the hippocampus. Epigenetic changes to DNA are attractive mechanisms to explain the sustained hyperexcitability of chronic epilepsy. Here, through methylation analysis of all annotated C-phosphate-G islands and promoter regions in the human genome, we report a pilot study of the methylation profiles of temporal lobe epilepsy with or without hippocampal sclerosis. Furthermore, by comparative analysis of expression and promoter methylation, we identify methylation-sensitive non-coding RNAs in human temporal lobe epilepsy. A total of 146 protein-coding genes exhibited altered DNA methylation in temporal lobe epilepsy hippocampus (n = 9) when compared to control (n = 5), with 81.5% of the promoters of these genes displaying hypermethylation. Unique methylation profiles were evident in temporal lobe epilepsy with or without hippocampal sclerosis, in addition to a common methylation profile regardless of pathology grade. Gene ontology terms associated with development, neuron remodelling and neuron maturation were over-represented in the methylation profile of Watson Grade 1 samples (mild hippocampal sclerosis). In addition to genes associated with neuronal, neurotransmitter/synaptic transmission and cell death functions, differential hypermethylation of genes associated with transcriptional regulation was evident in temporal lobe epilepsy, but overall few genes previously associated with epilepsy were among the differentially methylated. Finally, a panel of 13 methylation-sensitive microRNAs was identified in temporal lobe epilepsy, including MIR27A, miR-193a-5p (MIR193A) and miR-876-3p (MIR876), and the differential methylation of long non-coding RNA was documented for the first time. The present study therefore reports select, genome-wide DNA methylation changes in human temporal lobe epilepsy that may contribute to the molecular architecture of the epileptic brain.
PMID:25552301
Wang, Hongbo; Zhao, Yingchao; Chen, Mingyue; Cui, Jie
2017-01-01
Cervical cancer is the third most common cancer worldwide and the fourth leading cause of cancer-associated mortality in women. Accumulating evidence indicates that long non-coding RNAs (lncRNAs) and circular RNAs (circRNAs) may play key roles in the carcinogenesis of different cancers; however, little is known about the mechanisms of lncRNAs and circRNAs in the progression and metastasis of cervical cancer. In this study, we explored the expression profiles of lncRNAs, circRNAs, miRNAs, and mRNAs in HPV16 (human papillomavirus genotype 16) mediated cervical squamous cell carcinoma and matched adjacent non-tumor (ATN) tissues from three patients with high-throughput RNA sequencing (RNA-seq). In total, we identified 19 lncRNAs, 99 circRNAs, 28 miRNAs, and 304 mRNAs that were commonly differentially expressed (DE) in different patients. Among the non-coding RNAs, 3 lncRNAs and 44 circRNAs are novel to our knowledge. Functional enrichment analysis showed that DE lncRNAs, miRNAs, and mRNAs were enriched in pathways crucial to cancer as well as other gene ontology (GO) terms. Furthermore, the co-expression network and function prediction suggested that all 19 DE lncRNAs could play different roles in the carcinogenesis and development of cervical cancer. The competing endogenous RNA (ceRNA) network based on DE coding and non-coding RNAs showed that each miRNA targeted a number of lncRNAs and circRNAs. The link between part of the miRNAs in the network and cervical cancer has been validated in previous studies, and these miRNAs targeted the majority of the novel non-coding RNAs, thus suggesting that these novel non-coding RNAs may be involved in cervical cancer. Taken together, our study shows that DE non-coding RNAs could be further developed as diagnostic and therapeutic biomarkers of cervical cancer. The complex ceRNA network also lays the foundation for future research of the roles of coding and non-coding RNAs in cervical cancer. PMID:28970820
Utility of QR codes in biological collections
Diazgranados, Mauricio; Funk, Vicki A.
2013-01-01
The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers’ electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections. PMID:24198709
Ensemble coding of face identity is not independent of the coding of individual identity.
Neumann, Markus F; Ng, Ryan; Rhodes, Gillian; Palermo, Romina
2018-06-01
Information about a group of similar objects can be summarized into a compressed code, known as ensemble coding. Ensemble coding of simple stimuli (e.g., groups of circles) can occur in the absence of detailed exemplar coding, suggesting dissociable processes. Here, we investigate whether a dissociation would still be apparent when coding facial identity, where individual exemplar information is much more important. We examined whether ensemble coding can occur when exemplar coding is difficult, as a result of large sets or short viewing times, or whether the two types of coding are positively associated. We found a positive association, whereby both ensemble and exemplar coding were reduced for larger groups and shorter viewing times. There was no evidence for ensemble coding in the absence of exemplar coding. At longer presentation times, there was an unexpected dissociation, where exemplar coding increased yet ensemble coding decreased, suggesting that robust information about face identity might suppress ensemble coding. Thus, for face identity, we did not find the classic dissociation (access to ensemble information in the absence of detailed exemplar information) that has been used to support claims of distinct mechanisms for ensemble and exemplar coding.
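The ensemble/exemplar distinction the abstract tests can be illustrated with a toy numeric proxy. This is a sketch of the concept only, not the authors' face stimuli or experimental task.

```python
import statistics

# Toy proxy: each face identity is a point on a 1-D morph continuum (0-100).
# An ensemble code summarizes the whole set by a single statistic (the mean);
# an exemplar code retains each individual value.
faces = [30, 45, 50, 55, 70]

ensemble_code = statistics.mean(faces)   # compressed summary of the set
exemplar_codes = list(faces)             # detailed per-face representation

print(ensemble_code)  # 50
```

The paper's question, in these terms, is whether observers can still recover the mean when the individual values are too numerous or too briefly shown to encode; the reported result is that both representations degrade together.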
Energy Savings Analysis of the Proposed NYStretch-Energy Code 2018
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Bing; Zhang, Jian; Chen, Yan
This study was conducted by the Pacific Northwest National Laboratory (PNNL) in support of the stretch energy code development led by the New York State Energy Research and Development Authority (NYSERDA). In 2017 NYSERDA developed its 2016 Stretch Code Supplement to the 2016 New York State Energy Conservation Construction Code (hereinafter referred to as “NYStretch-Energy”). NYStretch-Energy is intended as a model energy code for statewide voluntary adoption that anticipates other code advancements culminating in the goal of a statewide Net Zero Energy Code by 2028. Since then, NYSERDA has continued to develop the NYStretch-Energy Code 2018 edition. To support this effort, PNNL conducted energy simulation analysis to quantify the energy savings of the proposed commercial provisions of the NYStretch-Energy Code (2018) in New York. The focus of this project is a 20% improvement over existing commercial model energy codes. A key requirement of the proposed stretch code is that it be ‘adoptable’ as an energy code, meaning that it must align with current code scope and limitations, and primarily impact building components that are currently regulated by local building departments. It is largely limited to prescriptive measures, which are what most building departments and design projects are most familiar with. This report describes a set of energy-efficiency measures (EEMs) that demonstrate 20% energy savings over ANSI/ASHRAE/IES Standard 90.1-2013 (ASHRAE 2013) across a broad range of commercial building types and all three climate zones in New York. In collaboration with the New Building Institute, the EEMs were developed from national model codes and standards, high-performance building codes and standards, regional energy codes, and measures being proposed as part of the ongoing code development process.
PNNL analyzed these measures using whole building energy models for selected prototype commercial buildings and multifamily buildings representing buildings in New York. Section 2 of this report describes the analysis methodology, including the building types and construction area weights update for this analysis, the baseline, and the method to conduct the energy saving analysis. Section 3 provides detailed specifications of the EEMs and bundles. Section 4 summarizes the results of individual EEMs and EEM bundles by building type, energy end-use and climate zone. Appendix A documents detailed descriptions of the selected prototype buildings. Appendix B provides energy end-use breakdown results by building type for both the baseline code and stretch code in all climate zones.
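The report's headline figure of 20% savings is a simple relative-reduction metric over the Standard 90.1-2013 baseline; a sketch with illustrative (not report) numbers:

```python
def percent_savings(baseline_eui, proposed_eui):
    """Site-energy savings of a proposed design relative to a baseline,
    expressed as a percentage of baseline energy use intensity (EUI)."""
    return 100.0 * (baseline_eui - proposed_eui) / baseline_eui

# Illustrative values only: a baseline EUI of 60 kBtu/ft2-yr and a
# stretch-code design at 48 kBtu/ft2-yr meet the 20% target.
print(percent_savings(60.0, 48.0))  # 20.0
```

In the study itself, the baseline and proposed EUIs come from whole-building simulations of the prototype models, aggregated across building types and climate zones with construction-area weights.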
Seed dormancy and germination—emerging mechanisms and new hypotheses
Nonogaki, Hiroyuki
2014-01-01
Seed dormancy has played a significant role in adaptation and evolution of seed plants. While its biological significance is clear, molecular mechanisms underlying seed dormancy induction, maintenance and alleviation still remain elusive. Intensive efforts have been made to investigate gibberellin and abscisic acid metabolism in seeds, which greatly contributed to the current understanding of seed dormancy mechanisms. Other mechanisms, which might be independent of hormones, or specific to the seed dormancy pathway, are also emerging from genetic analysis of “seed dormancy mutants.” These studies suggest that chromatin remodeling through histone ubiquitination, methylation and acetylation, which could lead to transcription elongation or gene silencing, may play a significant role in seed dormancy regulation. Small interfering RNA and/or long non-coding RNA might be a trigger of epigenetic changes at the seed dormancy or germination loci, such as DELAY OF GERMINATION1. While new mechanisms are emerging from genetic studies of seed dormancy, novel hypotheses are also generated from seed germination studies with high throughput gene expression analysis. Recent studies on tissue-specific gene expression in tomato and Arabidopsis seeds, which suggested possible “mechanosensing” in the regulatory mechanisms, advanced our understanding of embryo-endosperm interaction and have potential to re-draw the traditional hypotheses or integrate them into a comprehensive scheme. The progress in basic seed science will enable knowledge translation, another frontier of research to be expanded for food and fuel production. PMID:24904627
The small stellated dodecahedron code and friends.
Conrad, J; Chamberland, C; Breuckmann, N P; Terhal, B M
2018-07-13
We explore a distance-3 homological CSS quantum code, namely the small stellated dodecahedron code, for dense storage of quantum information and compare its performance with the distance-3 surface code. The data and ancilla qubits of the small stellated dodecahedron code can be located on the edges and vertices, respectively, of a small stellated dodecahedron, making this code suitable for three-dimensional connectivity. This code encodes eight logical qubits into 30 physical qubits (plus 22 ancilla qubits for parity check measurements), in contrast with one logical qubit into nine physical qubits (plus eight ancilla qubits) for the surface code. We develop fault-tolerant parity check circuits and a decoder for this code, allowing us to numerically assess the circuit-based pseudo-threshold. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Authors.
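The quoted parameters can be sanity-checked directly: for an [[n, k, d]] stabilizer code the number of independent parity checks is n - k, a generic relation rather than anything specific to this paper's construction.

```python
# Parameters quoted in the abstract: the small stellated dodecahedron code is
# a [[30, 8, 3]] CSS code; the distance-3 surface code is [[9, 1, 3]].
def stabilizer_checks(n, k):
    """An [[n, k, d]] stabilizer code has n - k independent stabilizer
    generators (parity checks), each measured via an ancilla qubit."""
    return n - k

print(stabilizer_checks(30, 8))  # 22, matching the 22 ancilla qubits
print(stabilizer_checks(9, 1))   # 8, matching the surface-code ancillas
```

The density claim is visible in the encoding rates: 8/30 ≈ 0.27 for the dodecahedron code versus 1/9 ≈ 0.11 for the surface code, at equal distance d = 3.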
Interactive QR code beautification with full background image embedding
NASA Astrophysics Data System (ADS)
Lin, Lijian; Wu, Song; Liu, Sijiang; Jiang, Bo
2017-06-01
QR (Quick Response) code is a kind of two-dimensional barcode first developed in the automotive industry. Nowadays, QR codes are widely used in commercial applications such as product promotion, mobile payment, and product information management. Traditional QR codes conforming to the international standard are reliable and fast to decode, but lack the aesthetic appeal needed to present visual information to customers. In this work, we present a novel interactive method to generate aesthetic QR codes. Given the information to be encoded and an image to be used as the full QR code background, our method accepts a user's interactive strokes as hints to remove undesired parts of QR code modules, relying on the QR code error correction mechanism and background color thresholds. Compared to previous approaches, our method follows the intention of the QR code designer and can thus achieve more pleasing results while keeping high machine readability.
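The removal budget such a beautifier works within comes from the standard QR error-correction levels. A rough sketch using the nominal recovery capacities from ISO/IEC 18004; treating the full capacity as freely spendable on aesthetics is an idealization, since real damage must still leave the symbol decodable with margin.

```python
# Nominal codeword-recovery capacity of the four QR error-correction levels
# (ISO/IEC 18004): the approximate fraction of codewords that may be damaged
# or deliberately overwritten while the symbol stays decodable.
EC_CAPACITY = {"L": 0.07, "M": 0.15, "Q": 0.25, "H": 0.30}

def removable_budget(total_codewords, level):
    """Rough upper bound on codewords a beautifier may sacrifice to the
    background image at a given error-correction level."""
    return int(total_codewords * EC_CAPACITY[level])

# A version-5 QR symbol has 134 codewords in total; at level H roughly 40
# of them could in principle be given over to the background image.
print(removable_budget(134, "H"))  # 40
```

This is why aesthetic-QR methods, including the one described here, typically encode at level H: the higher redundancy buys more modules that can blend into the background.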
Characterizing the Properties of a Woven SiC/SiC Composite Using W-CEMCAN Computer Code
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Mital, Subodh K.; DiCarlo, James A.
1999-01-01
A micromechanics-based computer code to predict the thermal and mechanical properties of woven ceramic matrix composites (CMC) is developed. This computer code, W-CEMCAN (Woven CEramic Matrix Composites ANalyzer), predicts the properties of two-dimensional woven CMC at any temperature and takes into account various constituent geometries and volume fractions. The code is used to predict the thermal and mechanical properties of an advanced CMC composed of 0/90 five-harness (5 HS) Sylramic fiber, chemically vapor infiltrated (CVI) with boron nitride (BN) and SiC interphase coatings and melt-infiltrated (MI) with SiC. The predictions, based on bulk constituent properties from the literature, are compared with measured experimental data. Based on the comparison, improved or calibrated properties for the constituent materials are then developed for use by material developers/designers. The computer code is then used to predict the properties of a composite with the same constituents but with different fiber volume fractions. The predictions are compared with measured data and good agreement is achieved.
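W-CEMCAN's micromechanics is considerably richer than this, but the simplest volume-fraction-weighted estimate (the Voigt rule of mixtures) conveys how constituent properties and volume fractions combine. The constituent moduli below are illustrative values, not the paper's calibrated data.

```python
# Voigt rule of mixtures: a generic textbook upper-bound estimate for the
# in-plane modulus of a composite from constituent moduli and volume
# fractions. W-CEMCAN additionally resolves weave geometry and the
# CVI interphase coatings, which this one-liner ignores.
def rule_of_mixtures(properties, volume_fractions):
    assert abs(sum(volume_fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return sum(p * v for p, v in zip(properties, volume_fractions))

# Illustrative moduli (GPa) for fiber, interphase coating, and matrix.
E = rule_of_mixtures([380.0, 30.0, 270.0], [0.35, 0.05, 0.60])
print(round(E, 1))  # 296.5
```

The sensitivity of the result to the volume fractions is the point of the paper's final exercise: once constituent properties are calibrated, the same inputs with a different fiber fraction predict a different composite.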
Verification and Validation of the BISON Fuel Performance Code for PCMI Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamble, Kyle Allan Lawrence; Novascone, Stephen Rhead; Gardner, Russell James
2016-06-01
BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described. Validation for application to light water reactor (LWR) PCMI problems is assessed by comparing predicted and measured rod diameter following base irradiation and power ramps. Results indicate a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and to more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. Initial rod diameter comparisons have led to consideration of additional separate-effects experiments to better understand and predict clad and fuel mechanical behavior. Results from this study are being used to define priorities for ongoing code development and validation activities.
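For context on the thermal side of such a fuel performance code, the classic analytic solution for steady conduction in a cylindrical pellet with uniform volumetric heating is a useful back-of-envelope check. The values below are illustrative; BISON itself solves the fully coupled thermo-mechanical finite element problem with burnup-dependent properties.

```python
# Textbook solution for steady-state conduction in a solid cylindrical fuel
# pellet with uniform volumetric heat rate q''' and constant conductivity k:
#   T(r) - T_surface = q''' * (R**2 - r**2) / (4 * k)
def pellet_temperature(r, radius, q_vol, k, t_surface):
    """Temperature (K) at radius r in a pellet of outer radius `radius` (m),
    volumetric heating q_vol (W/m^3), conductivity k (W/m-K)."""
    return t_surface + q_vol * (radius**2 - r**2) / (4.0 * k)

# Illustrative values: 4.1 mm pellet radius, 300 MW/m^3, k = 3 W/m-K.
dT_center = pellet_temperature(0.0, 0.0041, 3.0e8, 3.0, 600.0) - 600.0
print(round(dT_center, 1))  # centerline-to-surface temperature rise (K)
```

Centerline temperature drives fuel thermal expansion and swelling, which in turn drive the late-life clad diameter increase the validation study found BISON to overpredict.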