DOE Office of Scientific and Technical Information (OSTI.GOV)
Dearing, J F; Rose, S D; Nelson, W R
The computational predictions of two well-known subchannel analysis codes, COBRA-III-C and SABRE-I (wire wrap version), have been evaluated by comparison with steady state temperature data from the THORS Facility at ORNL. Both codes give good predictions of transverse and axial temperatures when compared with wire wrap thermocouple data. The crossflow velocity profiles predicted by these codes are similar, which is encouraging since the wire wrap models are based on different assumptions.
Resistance of the Opossum (Didelphis Virginiana) to Envenomation by Snakes of the Crotalidae Family.
1976-12-01
SUPPLEMENTARY NOTES: Animal studies in relation to chemical agents. KEYWORDS: Snakebite. [Garbled report-form and table residue; legible fragments list intravenous cobra-venom doses and times to death, including 1.07 mg/kg (died in 30 min) and 1.38 mg/kg (died in 1 hr), for Naja naja atra (Chinese cobra) and Naja nivea (Cape cobra).]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dearing, J F; Nelson, W R; Rose, S D
Computational thermal-hydraulic models of a 19-pin, electrically heated, wire-wrap liquid-metal fast breeder reactor test bundle were developed using two well-known subchannel analysis codes, COBRA III-C and SABRE-1 (wire-wrap version). These two codes use similar subchannel control volumes for the finite difference conservation equations but vary markedly in solution strategy and modeling capability. In particular, the empirical wire-wrap-forced diversion crossflow models are different. Surprisingly, however, crossflow velocity predictions of the two codes are very similar. Both codes show generally good agreement with experimental temperature data from a test in which a large radial temperature gradient was imposed. Differences between data and code results are probably caused by experimental pin bowing, which is presently the limiting factor in validating coded empirical models.
COBRApy: COnstraints-Based Reconstruction and Analysis for Python.
Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R
2013-08-08
COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
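As a concrete illustration of the object-oriented interface described above, the sketch below builds a three-reaction toy network and runs flux balance analysis with COBRApy. The metabolite and reaction names are invented for the example, and it assumes a reasonably recent cobra release with a bundled LP solver; it is a minimal sketch, not part of the package documentation.

```python
from cobra import Model, Reaction, Metabolite

# Hypothetical three-reaction toy network: take up A, convert A to B, secrete B.
model = Model("toy_network")
a = Metabolite("A_c", compartment="c")
b = Metabolite("B_c", compartment="c")

uptake = Reaction("EX_A")            # source of A, capped at 10 flux units
uptake.add_metabolites({a: 1.0})
uptake.bounds = (0.0, 10.0)

convert = Reaction("A_to_B")         # A -> B (default bounds: 0 to 1000)
convert.add_metabolites({a: -1.0, b: 1.0})

secrete = Reaction("EX_B")           # sink for B; used as the objective
secrete.add_metabolites({b: -1.0})

model.add_reactions([uptake, convert, secrete])
model.objective = "EX_B"

solution = model.optimize()          # flux balance analysis (linear programming)
print(solution.objective_value)      # expected: 10.0, limited by the uptake bound
```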
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khan, E.U.; George, T.L.; Rector, D.R.
The natural circulation tests of the Fast Flux Test Facility (FFTF) demonstrated a safe and stable transition from forced convection to natural convection and showed that natural convection may adequately remove decay heat from the reactor core. The COBRA-WC computer code was developed by the Pacific Northwest Laboratory (PNL) to account for buoyancy-induced coolant flow redistribution and interassembly heat transfer, effects that become important in mitigating temperature gradients and reducing reactor core temperatures when coolant flow rate in the core is low. This report presents work sponsored by the US Department of Energy (DOE) with the objective of checking the validity of COBRA-WC during the first 220 seconds (sec) of the FFTF natural-circulation (plant-startup) tests using recorded data from two instrumented Fuel Open Test Assemblies (FOTAs). Comparison of COBRA-WC predictions of the FOTA data is a part of the final confirmation of the COBRA-WC methodology for core natural-convection analysis.
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
Rajesh, Ramanna V; Layer, Paul G; Boopathy, Rathanam
2009-01-01
Investigation of the non-classical functions of cholinesterases (ChEs) has been a subject of interest for the past three decades. One such function is the aryl acylamidase (AAA) activity associated with ChEs, characterized in vitro as the enzymatic cleavage of the artificial substrate o-nitroacetanilide; its physiological role remains unknown. In the present study, we compared the AAA activity of AChE from different sources, including goat brain, electric eel organ, and the venoms of different snakes. Remarkably, cobra venom showed the highest AAA activity as well as a high AAA/AChE ratio. Both serotonergic and cholinergic inhibitors suppressed cobra venom AAA activity in a concentration-dependent manner, further supporting the association of AAA with the AChE of cobra venom. The study is notable because (i) cobra venom AChE exists in monomeric globular forms; (ii) in Alzheimer's disease, the most abundant cholinesterase forms are likewise monomeric globular forms, thought to be involved in disease pathogenesis; (iii) Alzheimer's disease drugs affected the AAA activity of cobra venom more strongly than its AChE activity; and (iv) huperzine and tacrine showed the most pronounced effect on AAA. Thus, this study suggests that the high AAA activity associated with cobra venom AChE may serve as a prominent readout for testing the pharmacological effects of AD drugs, since other sources were found to have lower activity.
MOFA Software for the COBRA Toolbox
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griesemer, Marc; Navid, Ali
MOFA-COBRA is a MATLAB software package that performs Multi-Objective Flux Analysis (MOFA), the solution of linear programming problems with multiple competing objectives. The leading software package for conducting analyses with constraint-based models is the COBRA Toolbox for MATLAB. MOFA-COBRA is an add-on tool for COBRA that solves multi-objective problems using a novel algorithm.
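The abstract does not describe the novel MOFA algorithm itself, so the sketch below only illustrates the general idea of multi-objective flux analysis using a plain weighted-sum scalarization over a hypothetical two-metabolite, four-reaction network (Python/SciPy rather than MATLAB). It is not the MOFA-COBRA implementation; the stoichiometry and bounds are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Columns: v0 (uptake of A), v1 (A -> B), v2 (B -> biomass), v3 (A -> product)
S = np.array([
    [1, -1,  0, -1],   # metabolite A balance
    [0,  1, -1,  0],   # metabolite B balance
])
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]

# Trace a trade-off between the two objectives (biomass vs. product) by
# sweeping the weight in a single scalarized linear program.
for w in np.linspace(0.0, 1.0, 5):
    c = np.zeros(4)
    c[2] = -w          # maximize biomass flux v2 ...
    c[3] = -(1 - w)    # ... against product flux v3 (linprog minimizes)
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print(f"w={w:.2f}  biomass={res.x[2]:.2f}  product={res.x[3]:.2f}")
```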
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, Maria N.; Salko, Robert K.
Coolant-Boiling in Rod Arrays–Two Fluids (COBRA-TF) is a thermal/hydraulic (T/H) simulation code designed for light water reactor (LWR) vessel analysis. It uses a two-fluid, three-field (i.e., fluid film, fluid drops, and vapor) modeling approach. Both sub-channel and 3D Cartesian forms of 9 conservation equations are available for LWR modeling. The code was originally developed by Pacific Northwest Laboratory in 1980 and has been used and modified by several institutions over the last few decades. COBRA-TF also found use at the Pennsylvania State University (PSU) by the Reactor Dynamics and Fuel Management Group (RDFMG) and has been improved, updated, and subsequently re-branded as CTF. As part of the improvement process, it was necessary to generate sufficient documentation for the open-source code, which had lacked such material upon being adopted by RDFMG. This document serves mainly as a theory manual for CTF, detailing the many two-phase heat transfer, drag, and important accident scenario models contained in the code, as well as the numerical solution process utilized. Coding of the models is also discussed, all with consideration for updates that have been made when transitioning from COBRA-TF to CTF. Further documentation, which focuses on code input deck generation and source code global variable and module listings, is also available at RDFMG.
Improvements and applications of COBRA-TF for stand-alone and coupled LWR safety analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, M.; Cuervo, D.; Ivanov, K.
2006-07-01
The advanced thermal-hydraulic subchannel code COBRA-TF has recently been improved and applied for stand-alone and coupled LWR core calculations at the Pennsylvania State Univ. in cooperation with AREVA NP GmbH (Germany) and the Technical Univ. of Madrid. To enable COBRA-TF for academic and industrial applications, including safety margin evaluations and LWR core design analyses, the code programming, numerics, and basic models were revised and substantially improved. The code has undergone an extensive validation, verification, and qualification program. (authors)
NASA Technical Reports Server (NTRS)
Fleming, David P.; Poplawski, J. V.
2002-01-01
Rolling-element bearing forces vary nonlinearly with bearing deflection. Thus an accurate rotordynamic transient analysis requires bearing forces to be determined at each step of the transient solution. Analyses have been carried out to show the effect of accurate bearing transient forces (accounting for nonlinear speed- and load-dependent bearing stiffness) as compared to conventional use of average rolling-element bearing stiffness. Bearing forces were calculated by COBRA-AHS (Computer Optimized Ball and Roller Bearing Analysis - Advanced High Speed) and supplied to the rotordynamics code ARDS (Analysis of Rotor Dynamic Systems) for accurate simulation of rotor transient behavior. COBRA-AHS is a fast-running 5 degree-of-freedom computer code able to calculate high speed rolling-element bearing load-displacement data for radial and angular contact ball bearings and also for cylindrical and tapered roller bearings. Results show that use of nonlinear bearing characteristics is essential for accurate prediction of rotordynamic behavior.
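To make the distinction concrete, the sketch below integrates a one-degree-of-freedom step-load transient with a Hertzian-type nonlinear contact force and, for comparison, a single linearized (tangent) stiffness. All parameter values are invented, and this is not the COBRA-AHS or ARDS formulation; it only illustrates why an average stiffness mispredicts transient peaks.

```python
import numpy as np

# One-DOF mass on a rolling-element contact: Hertzian point contact gives
# F = k_h * delta**1.5, so the tangent stiffness grows with load, whereas the
# conventional approach uses one average (linearized) stiffness k_lin.
m, k_h = 10.0, 5.0e10                     # kg, N/m^1.5 (illustrative values only)
F_step = 5.0e3                            # N, suddenly applied load

delta_static = (F_step / k_h) ** (2.0 / 3.0)      # static deflection under F_step
k_lin = 1.5 * k_h * np.sqrt(delta_static)         # tangent stiffness at that load

def contact_force(x):
    return k_h * max(x, 0.0) ** 1.5               # no tension across the contact

# Semi-implicit Euler integration of the step-load transient, both models.
dt, steps = 1.0e-6, 40000
x_nl = v_nl = x_lin = v_lin = 0.0
peak_nl = peak_lin = 0.0
for _ in range(steps):
    v_nl += (F_step - contact_force(x_nl)) / m * dt
    x_nl += v_nl * dt
    v_lin += (F_step - k_lin * x_lin) / m * dt
    x_lin += v_lin * dt
    peak_nl, peak_lin = max(peak_nl, x_nl), max(peak_lin, x_lin)

print(f"peak deflection, nonlinear contact:  {peak_nl * 1e6:.1f} um")
print(f"peak deflection, linearized spring:  {peak_lin * 1e6:.1f} um")
```

With these made-up numbers the two peak deflections differ by tens of percent, which is the kind of discrepancy that motivates recomputing the bearing force at every transient step.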
Michener, Thomas E.; Rector, David R.; Cuta, Judith M.
2017-09-01
COBRA-SFS, a thermal-hydraulics code developed for steady-state and transient analysis of multi-assembly spent-fuel storage and transportation systems, has been incorporated into the Used Nuclear Fuel-Storage, Transportation and Disposal Analysis Resource and Data System tool as a module devoted to spent fuel package thermal analysis. This paper summarizes the basic formulation of the equations and models used in the COBRA-SFS code, showing that COBRA-SFS fully captures the important physical behavior governing the thermal performance of spent fuel storage systems, with internal and external natural convection flow patterns, and heat transfer by convection, conduction, and thermal radiation. Of particular significance is the capability for detailed thermal radiation modeling within the fuel rod array.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michener, Thomas E.; Rector, David R.; Cuta, Judith M.
COBRA-SFS, a thermal-hydraulics code developed for steady-state and transient analysis of multi-assembly spent-fuel storage and transportation systems, has been incorporated into the Used Nuclear Fuel-Storage, Transportation and Disposal Analysis Resource and Data System tool as a module devoted to spent fuel package thermal analysis. This paper summarizes the basic formulation of the equations and models used in the COBRA-SFS code, showing that COBRA-SFS fully captures the important physical behavior governing the thermal performance of spent fuel storage systems, with internal and external natural convection flow patterns, and heat transfer by convection, conduction, and thermal radiation. Of particular significance is the capability for detailed thermal radiation modeling within the fuel rod array.
CASL VMA FY16 Milestone Report (L3:VMA.VUQ.P13.07) Westinghouse Mixing with COBRA-TF
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon, Natalie
2016-09-30
COBRA-TF (CTF) is a low-resolution code currently maintained as CASL's subchannel analysis tool. CTF operates as a two-phase, compressible code over a mesh composed of subchannels and discretized axial nodes. In part because CTF is a low-resolution code, simulation run time is not computationally expensive, only on the order of minutes. High-resolution codes such as STAR-CCM+ can be used to train lower-fidelity codes such as CTF. Unlike STAR-CCM+, CTF has no turbulence model, only a two-phase turbulent mixing coefficient, β. β can be set to a constant value or calculated in terms of Reynolds number using an empirical correlation. Results from STAR-CCM+ can be used to inform the appropriate value of β. Once β is calibrated, CTF runs can be an inexpensive alternative to costly STAR-CCM+ runs for scoping analyses. Based on the results of CTF runs, STAR-CCM+ can be run for specific parameters of interest. CASL areas of application are CIPS for single-phase analysis and DNB-CTF for two-phase analysis.
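A stripped-down version of that calibration idea is sketched below: synthetic "high-fidelity" data for the decay of a subchannel temperature difference are generated, and a mixing coefficient β in a stand-in low-fidelity model is recovered by least squares. The exponential-decay surrogate and all numbers are assumptions for illustration; this is neither the CTF mixing closure nor the CASL workflow.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
z = np.linspace(0.0, 3.0, 16)         # axial positions along the bundle [m]
dT0, beta_true = 40.0, 0.5            # inlet temperature split [K], "true" mixing strength

def low_fidelity(beta, z):
    # Stand-in low-fidelity model: turbulent mixing relaxes the
    # subchannel-to-subchannel temperature difference exponentially in z.
    return dT0 * np.exp(-beta * z)

# Pretend these points came from a high-fidelity CFD run (plus a little noise).
dT_hifi = low_fidelity(beta_true, z) + rng.normal(0.0, 0.5, z.size)

fit = least_squares(lambda p: low_fidelity(p[0], z) - dT_hifi, x0=[0.1])
print(f"calibrated beta = {fit.x[0]:.3f} (value used to generate the data: {beta_true})")
```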
Improvement of COBRA-TF for modeling of PWR cold- and hot-legs during reactor transients
NASA Astrophysics Data System (ADS)
Salko, Robert K.
COBRA-TF is a two-phase, three-field (liquid, vapor, droplets) thermal-hydraulic modeling tool that has been developed by the Pacific Northwest Laboratory under sponsorship of the NRC. The code was developed for Light Water Reactor analysis starting in the 1980s; however, its development has continued to the present. COBRA-TF still finds wide-spread use throughout the nuclear engineering field, including nuclear-power vendors, academia, and research institutions. It has been proposed that extension of the COBRA-TF code-modeling region from vessel-only components to Pressurized Water Reactor (PWR) coolant-line regions can lead to improved Loss-of-Coolant Accident (LOCA) analysis. Improved modeling is anticipated due to COBRA-TF's capability to independently model the entrained-droplet flow-field behavior, which has been observed to impact delivery to the core region [1]. Because COBRA-TF was originally developed for vertically-dominated, in-vessel, sub-channel flow, extension of the COBRA-TF modeling region to the horizontal-pipe geometries of the coolant-lines required several code modifications, including: • Inclusion of the stratified flow regime into the COBRA-TF flow regime map, along with associated interfacial drag, wall drag and interfacial heat transfer correlations, • Inclusion of a horizontal-stratification force between adjacent mesh cells having unequal levels of stratified flow, and • Generation of a new code-input interface for the modeling of coolant-lines. The sheer number of COBRA-TF modifications that were required to complete this work turned this project into a code-development project as much as it was a study of thermal-hydraulics in reactor coolant-lines. The means for achieving these tasks shifted along the way, ultimately leading to the development of a separate, nearly completely independent one-dimensional, two-phase-flow modeling code geared toward reactor coolant-line analysis. This developed code has been named CLAP, for Coolant-Line-Analysis Package. Versions were created that were both coupled to COBRA-TF and standalone, with the most recent version being a standalone code. This code performs a separate, simplified, 1-D solution of the conservation equations while making special considerations for coolant-line geometry and flow phenomena. The end of this project saw a functional code package that demonstrates a stable numerical solution and that has gone through a series of Validation and Verification tests using the Two-Phase Testing Facility (TPTF) experimental data [2]. The results indicate that CLAP underperforms RELAP5-MOD3 in predicting the experimental void of the TPTF facility in some cases. There is no apparent pattern, however, to point to a consistent type of case that the code fails to predict properly (e.g., low-flow, high-flow, discharging to full vessel, or discharging to empty vessel). Pressure-profile predictions are sometimes unrealistic, which indicates that there may be a problem with test-case boundary conditions or with the coupling of continuity and momentum equations in the solution algorithm. The code does predict the flow regime correctly for all cases with the stratification-force model off. Turning the stratification model on can cause the low-flow case void profiles to over-react to the force and the flow regime to transition out of stratified flow. The code would benefit from an increased amount of Validation and Verification testing.
The development of CLAP was significant, as it is a cleanly written, logical representation of the reactor coolant-line geometry. It is stable and capable of modeling basic flow physics in the reactor coolant-line. Code development and debugging required the temporary removal of the energy equation and mass-transfer terms in the governing equations. The reintroduction of these terms will allow future coupling to RELAP and re-coupling with COBRA-TF. Adding more applicable entrainment and de-entrainment models would allow the capture of more advanced physics in the coolant-line that can be expected during a Loss-of-Coolant Accident. One of the package's benefits is its ability to be used as a platform for future coolant-line model development and implementation, including capturing of the important de-entrainment behavior in reactor hot-legs (steam-binding effect) and flow convection in the upper-plenum region of the vessel.
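For orientation, the fragment below shows the kind of simplified one-dimensional finite-volume continuity update that such a coolant-line solver performs at each time step. The pipe discretization, constant density, uniform velocity field, and first-order upwind fluxes are all assumptions made for the sketch and do not represent CLAP's actual equations or numerics.

```python
import numpy as np

# 1-D explicit finite-volume update of the liquid continuity equation,
# d(alpha*rho)/dt + d(alpha*rho*u)/dx = 0, on a horizontal-pipe-like mesh.
nx, dx, dt = 50, 0.1, 1.0e-3                     # cells, cell size [m], time step [s]
rho = 750.0                                      # liquid density [kg/m^3], held constant
u = np.full(nx + 1, 2.0)                         # face velocities [m/s], all positive
alpha = np.where(np.arange(nx) < 10, 0.8, 0.3)   # initial liquid-fraction step profile

for _ in range(200):
    # First-order upwind face fluxes of alpha*rho*u (valid because u > 0 everywhere).
    flux = np.empty(nx + 1)
    flux[1:] = alpha * rho * u[1:]
    flux[0] = alpha[0] * rho * u[0]              # simple inflow boundary
    alpha = alpha - (dt / dx) * (flux[1:] - flux[:-1]) / rho

print("liquid fraction near the advected front:", np.round(alpha[8:20], 2))
```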
Akça, Ozan; Wadhwa, Anupama; Sengupta, Papiya; Durrani, Jaleel; Hanni, Keith; Wenke, Mary; Yücel, Yüksel; Lenhardt, Rainer; Doufas, Anthony G.; Sessler, Daniel I.
2006-01-01
The Laryngeal Mask Airway (LMA) is a frequently-used efficient airway device, yet it sometimes seals poorly, thus reducing the efficacy of positive-pressure ventilation. The Perilaryngeal Airway (CobraPLA) is a novel airway device with a larger pharyngeal cuff (when inflated). We tested the hypothesis that the CobraPLA was superior to LMA with regard to insertion time and airway sealing pressure and comparable to LMA in airway adequacy and recovery characteristics. After midazolam and fentanyl, 81 ASA I-II outpatients having elective surgery were randomized to receive an LMA or CobraPLA. Anesthesia was induced with propofol (2.5 mg/kg, IV), and the airway inserted. We measured 1) insertion time; 2) adequacy of the airway (no leak at 15-cm-H2O peak pressure or tidal volume of 5 ml/kg); 3) airway sealing pressure; 4) number of repositioning attempts; and 5) sealing quality (no leak at tidal volume of 8 ml/kg). At the end of surgery, gastric insufflation, postoperative sore throat, dysphonia, and dysphagia were evaluated. Data were compared with unpaired t-tests, chi-square tests, or Fisher’s Exact tests; P<0.05 was significant. Patient characteristics, insertion times, airway adequacy, number of repositioning attempts, and recovery were similar in each group. Airway sealing pressure was significantly greater with CobraPLA (23±6 cm H2O) than LMA (18±5 cm H2O, P<0.001). The CobraPLA has insertion characteristics similar to LMA, but better airway sealing capabilities. PMID:15281543
Software Developed for Analyzing High- Speed Rolling-Element Bearings
NASA Technical Reports Server (NTRS)
Fleming, David P.
2005-01-01
COBRA-AHS (Computer Optimized Ball & Roller Bearing Analysis--Advanced High Speed, J.V. Poplawski & Associates, Bethlehem, PA) is used for the design and analysis of rolling element bearings operating at high speeds under complex mechanical and thermal loading. The code estimates bearing fatigue life by calculating three-dimensional subsurface stress fields developed within the bearing raceways. It provides a state-of-the-art interactive design environment for bearing engineers within a single easy-to-use design-analysis package. The code analyzes flexible or rigid shaft systems containing up to five bearings acted upon by radial, thrust, and moment loads in 5 degrees of freedom. Bearing types include high-speed ball, cylindrical roller, and tapered roller bearings. COBRA-AHS is the first major upgrade in 30 years of such commercially available bearing software. The upgrade was developed under a Small Business Innovation Research contract from the NASA Glenn Research Center, and incorporates the results of 30 years of NASA and industry bearing research and technology.
1980-08-01
Figure 14. Current profile (units: knots).
Oukkache, Naoual; Jaoudi, Rachid El; Ghalim, Noreddine; Chgoury, Fatima; Bouhaouala, Balkiss; Mdaghri, Naima El; Sabatier, Jean-Marc
2014-01-01
Scorpion stings and snake bites are major health hazards that lead to suffering of victims and high mortality. Thousands of injuries associated with such stings and bites of venomous animals occur every year worldwide. In North Africa, more than 100,000 scorpion stings and snake bites are reported annually. An appropriate determination of the 50% lethal doses (LD50) of scorpion and snake venoms appears to be an important step to assess (and compare) venom toxic activity. Such LD50 values are also commonly used to evaluate the neutralizing capacity of specific anti-venom batches. In the present work, we determined experimentally the LD50 values of reference scorpion and snake venoms in Swiss mice, and evaluated the influence of two main venom injection routes (i.e., intraperitoneal (IP) versus intravenous (IV)). The analysis of experimental LD50 values obtained with three collected scorpion venoms indicates that Androctonus mauretanicus (Am) is intrinsically more toxic than Androctonus australis hector (Aah) species, whereas the latter is more toxic than Buthus occitanus (Bo). Similar analysis of three representative snake venoms of the Viperidae family shows that Cerastes cerastes (Cc) is more toxic than either Bitis arietans (Ba) or Macrovipera lebetina (Ml) species. Interestingly, the venom of Elapidae cobra snake Naja haje (Nh) is far more toxic than viper venoms Cc, Ml and Ba, in agreement with the known severity of cobra-related envenomation. Also, our data showed that viper venoms are about three-times less toxic when injected IP as compared to IV, distinct from cobra venom Nh which exhibited a similar toxicity when injected IP or IV. Overall, this study clearly highlights the usefulness of procedure standardization, especially regarding the administration route, for evaluating the relative toxicity of individual animal venoms. It also evidenced a marked difference in lethal activity between venoms of cobra and vipers, which, apart from the nature of toxins, might be attributed to the rich composition of high molecular weight enzymes in the case of viper venoms. PMID:24926799
Oukkache, Naoual; El Jaoudi, Rachid; Ghalim, Noreddine; Chgoury, Fatima; Bouhaouala, Balkiss; Mdaghri, Naima El; Sabatier, Jean-Marc
2014-06-12
Scorpion stings and snake bites are major health hazards that lead to suffering of victims and high mortality. Thousands of injuries associated with such stings and bites of venomous animals occur every year worldwide. In North Africa, more than 100,000 scorpion stings and snake bites are reported annually. An appropriate determination of the 50% lethal doses (LD₅₀) of scorpion and snake venoms appears to be an important step to assess (and compare) venom toxic activity. Such LD₅₀ values are also commonly used to evaluate the neutralizing capacity of specific anti-venom batches. In the present work, we determined experimentally the LD₅₀ values of reference scorpion and snake venoms in Swiss mice, and evaluated the influence of two main venom injection routes (i.e., intraperitoneal (IP) versus intravenous (IV)). The analysis of experimental LD₅₀ values obtained with three collected scorpion venoms indicates that Androctonus mauretanicus (Am) is intrinsically more toxic than Androctonus australis hector (Aah) species, whereas the latter is more toxic than Buthus occitanus (Bo). Similar analysis of three representative snake venoms of the Viperidae family shows that Cerastes cerastes (Cc) is more toxic than either Bitis arietans (Ba) or Macrovipera lebetina (Ml) species. Interestingly, the venom of Elapidae cobra snake Naja haje (Nh) is far more toxic than viper venoms Cc, Ml and Ba, in agreement with the known severity of cobra-related envenomation. Also, our data showed that viper venoms are about three-times less toxic when injected IP as compared to IV, distinct from cobra venom Nh which exhibited a similar toxicity when injected IP or IV. Overall, this study clearly highlights the usefulness of procedure standardization, especially regarding the administration route, for evaluating the relative toxicity of individual animal venoms. It also evidenced a marked difference in lethal activity between venoms of cobra and vipers, which, apart from the nature of toxins, might be attributed to the rich composition of high molecular weight enzymes in the case of viper venoms.
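For readers unfamiliar with how an LD50 is extracted from grouped dose-mortality data, the sketch below fits a logistic curve to an invented data set and reads off the dose at 50% mortality. The dose groups, group sizes, and death counts are hypothetical, are not taken from this study, and the unweighted fit is a simplification of a proper probit/binomial analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented dose-mortality data for illustration only.
dose = np.array([0.1, 0.2, 0.4, 0.8, 1.6])        # venom dose [mg/kg]
deaths = np.array([0, 2, 5, 8, 10])
mortality = deaths / 10.0                          # assumed 10 mice per group

def logistic(log_dose, log_ld50, slope):
    # Probability of death as a logistic function of log dose.
    return 1.0 / (1.0 + np.exp(-slope * (log_dose - log_ld50)))

params, _ = curve_fit(logistic, np.log(dose), mortality, p0=[np.log(0.4), 2.0])
print(f"estimated LD50 = {np.exp(params[0]):.2f} mg/kg")
```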
for the game. Subsequent duels, flown with single armed escorts, calculated reduction in losses and damage states. For the study, hybrid computer... (6) a duel between a ground weapon, armed escort, and formation of lift aircraft. (Author)
Analysis of internal flows relative to the space shuttle main engine
NASA Technical Reports Server (NTRS)
1987-01-01
Cooperative efforts between the Lockheed-Huntsville Computational Mechanics Group and the NASA-MSFC Computational Fluid Dynamics staff have resulted in improved capabilities for numerically simulating incompressible flows generic to the Space Shuttle Main Engine (SSME). A well established and documented CFD code was obtained, modified, and applied to laminar and turbulent flows of the type occurring in the SSME Hot Gas Manifold. The INS3D code was installed on the NASA-MSFC CRAY-XMP computer system and is currently being used by NASA engineers. Studies to perform a transient analysis of the FPB were conducted. The COBRA/TRAC code is recommended for simulating the transient flow of oxygen into the LOX manifold. Property data for modifying the code to represent LOX/GOX flow were collected. The ALFA code was developed and recommended for representing the transient combustion in the preburner. These two codes will couple through the transient boundary conditions to simulate the startup and/or shutdown of the fuel preburner. A study, NAS8-37461, is currently being conducted to implement this modeling effort.
Testing and COBRA-SFS analysis of the VSC-17 ventilated concrete, spent fuel storage cask
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKinnon, M.A.; Dodge, R.E.; Schmitt, R.C.
1992-04-01
A performance test of a Pacific Sierra Nuclear VSC-17 ventilated concrete storage cask loaded with 17 canisters of consolidated PWR spent fuel generating approximately 15 kW was conducted. The performance test included measuring the cask surface, concrete, air channel surface, and fuel temperatures, as well as cask surface gamma and neutron dose rates. Testing was performed using vacuum, nitrogen, and helium backfill environments. Pretest predictions of cask thermal performance were made using the COBRA-SFS computer code. Analysis results were within 15 °C of measured peak fuel temperature. Peak fuel temperature for normal operation was 321 °C. In general, the surface dose rates were less than 30 mrem/h on the side of the cask and 40 mrem/h on the top of the cask.
Advanced Software for Analysis of High-Speed Rolling-Element Bearings
NASA Technical Reports Server (NTRS)
Poplawski, J. V.; Rumbarger, J. H.; Peters, S. M.; Galatis, H.; Flower, R.
2003-01-01
COBRA-AHS is a package of advanced software for analysis of rigid or flexible shaft systems supported by rolling-element bearings operating at high speeds under complex mechanical and thermal loads. These loads can include centrifugal and thermal loads generated by motions of bearing components. COBRA-AHS offers several improvements over prior commercial bearing-analysis programs: It includes innovative probabilistic fatigue-life-estimating software that provides for computation of three-dimensional stress fields and incorporates stress-based (in contradistinction to prior load-based) mathematical models of fatigue life. It interacts automatically with the ANSYS finite-element code to generate finite-element models for estimating distributions of temperature and temperature-induced changes in dimensions in iterative thermal/dimensional analyses: thus, for example, it can be used to predict changes in clearances and thermal lockup. COBRA-AHS provides an improved graphical user interface that facilitates the iterative cycle of analysis and design by providing analysis results quickly in graphical form, enabling the user to control interactive runs without leaving the program environment, and facilitating transfer of plots and printed results for inclusion in design reports. Additional features include roller-edge stress prediction and influence of shaft and housing distortion on bearing performance.
COBRA ATD minefield detection model initial performance analysis
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.
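The kind of calculation such a statistical detection model performs can be illustrated with a toy example: given response distributions for mines and for mine-like clutter, sweeping a decision threshold trades detection probability against false alarms. The Gaussian distributions and parameters below are invented and bear no relation to the actual COBRA MFD model or data.

```python
import numpy as np
from scipy.stats import norm

# Toy detection-performance calculation from detector response distributions.
mine_response = norm(loc=3.0, scale=1.0)       # hypothetical mine response
clutter_response = norm(loc=0.0, scale=1.0)    # hypothetical clutter response

for threshold in np.linspace(-1.0, 5.0, 7):
    p_detect = mine_response.sf(threshold)     # P(mine response > threshold)
    p_false = clutter_response.sf(threshold)   # P(clutter response > threshold)
    print(f"threshold {threshold:+.1f}:  Pd = {p_detect:.3f}   Pfa = {p_false:.3f}")
```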
Sakthivel, G.; Dey, Amitabha; Nongalleima, Kh.; Chavali, Murthy; Rimal Isaac, R. S.; Singh, N. Surjit; Deb, Lokesh
2013-01-01
The present study evaluates the antivenom activity of Aristolochia bracteolata Lam., Tylophora indica (Burm.f.) Merrill, and Leucas aspera S. against venoms of Daboia russelli russelli (Russell's viper) and Naja naja (Indian cobra). The aqueous extracts of leaves and roots of the above-mentioned plants and their polyherbal (1 : 1 : 1) formulation at a dose of 200 mg/kg showed protection in envenomed mice challenged with LD50 doses of 0.44 mg/kg and 0.28 mg/kg of Russell's viper and cobra venom, respectively. In in vitro antioxidant assays, the sample extracts showed free radical scavenging effects in a dose-dependent manner. Computational drug design and docking studies were carried out to predict the neutralizing principles of type I phospholipase A2 (PLA2) from Indian common krait venom. This confirmed that aristolochic acid and leucasin can neutralize the type I PLA2 enzyme. Results suggest that these plants could serve as a source of natural antioxidants and a common antidote for snake bite. However, further studies are needed to identify the lead molecule responsible for the antidote activity. PMID:23533518
Collaborative Beamfocusing Radio (COBRA)
NASA Astrophysics Data System (ADS)
Rode, Jeremy P.; Hsu, Mark J.; Smith, David; Husain, Anis
2013-05-01
A Ziva team has recently demonstrated a novel technique called Collaborative Beamfocusing Radios (COBRA) which enables an ad-hoc collection of distributed commercial off-the-shelf software defined radios to coherently align and beamform to a remote radio. COBRA promises to operate even in high multipath and non-line-of-sight environments as well as mobile applications without resorting to computationally expensive closed loop techniques that are currently unable to operate with significant movement. COBRA exploits two key technologies to achieve coherent beamforming. The first is Time Reversal (TR) which compensates for multipath and automatically discovers the optimal spatio-temporal matched filter to enable peak signal gains (up to 20 dB) and diffraction-limited focusing at the intended receiver in NLOS and severe multipath environments. The second is time-aligned buffering which enables TR to synchronize distributed transmitters into a collaborative array. This time alignment algorithm avoids causality violations through the use of reciprocal buffering. Preserving spatio-temporal reciprocity through the TR capture and retransmission process achieves coherent alignment across multiple radios at ~GHz carriers using only standard quartz-oscillators. COBRA has been demonstrated in the lab, aligning two off-the-shelf software defined radios over-the-air to an accuracy of better than 2 degrees of carrier alignment at 450 MHz. The COBRA algorithms are lightweight, with computation in 5 ms on a smartphone class microprocessor. COBRA also has low start-up latency, achieving high accuracy from a cold-start in 30 ms. The COBRA technique opens up a large number of new capabilities in communications, and electronic warfare including selective spatial jamming, geolocation and anti-geolocation.
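The core of the time-reversal idea can be shown in a few lines: record the channel's response to a probe, flip it in time, and resend it, so that convolution with the same channel piles energy up at the receiver. The synthetic sparse multipath channel below is invented, and the example is baseband, real-valued, and single-radio; it is not the full COBRA distributed-array scheme with time-aligned buffering.

```python
import numpy as np

# Minimal time-reversal (TR) focusing sketch over a synthetic multipath channel.
rng = np.random.default_rng(1)
h = np.zeros(64)
taps = rng.choice(64, size=8, replace=False)
h[taps] = rng.normal(size=8)                  # sparse multipath impulse response

probe = np.zeros(64); probe[0] = 1.0          # receiver sends a probe pulse
recorded = np.convolve(probe, h)              # what the transmitter records

tr_waveform = recorded[::-1]                  # time reversal (real channel, so no conjugate)
at_receiver = np.convolve(tr_waveform, h)     # retransmit back through the same channel

# The result is the autocorrelation of h: a sharp focus at its center tap.
peak = np.max(np.abs(at_receiver))
background = np.median(np.abs(at_receiver))
print(f"focusing peak-to-median ratio: {peak / background:.1f}")
```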
Venom-gland transcriptome and venom proteome of the Malaysian king cobra (Ophiophagus hannah).
Tan, Choo Hock; Tan, Kae Yi; Fung, Shin Yee; Tan, Nget Hong
2015-09-10
The king cobra (Ophiophagus hannah) is widely distributed throughout many parts of Asia. This study aims to investigate the complexity of Malaysian Ophiophagus hannah (MOh) venom for a better understanding of king cobra venom variation and its envenoming pathophysiology. The venom gland transcriptome was investigated using the Illumina HiSeq™ platform, while the venom proteome was profiled by 1D-SDS-PAGE-nano-ESI-LCMS/MS. Transcriptomic results reveal high redundancy of toxin transcripts (3357.36 FPKM/transcript) despite small cluster numbers, implying gene duplication and diversification within restricted protein families. Among the 23 toxin families identified, three-finger toxins (3FTxs) and snake-venom metalloproteases (SVMPs) have the most diverse isoforms. These 2 toxin families are also the most abundantly transcribed, followed in descending order by phospholipases A2 (PLA2s), cysteine-rich secretory proteins (CRISPs), Kunitz-type inhibitors (KUNs), and L-amino acid oxidases (LAAOs). Seventeen toxin families exhibited low mRNA expression, including hyaluronidase, DPP-IV and 5'-nucleotidase that were not previously reported in the venom-gland transcriptome of a Balinese O. hannah. On the other hand, the MOh proteome includes 3FTxs, the most abundantly expressed proteins in the venom (43% toxin abundance). Within this toxin family, there are 6 long-chain, 5 short-chain and 2 non-conventional 3FTxs. Neurotoxins comprise the major 3FTxs in the MOh venom, consistent with rapid neuromuscular paralysis reported in systemic envenoming. The presence of toxic enzymes such as LAAOs, SVMPs and PLA2 would explain tissue inflammation and necrotising destruction in local envenoming. Dissimilarities in the subtypes and sequences between the neurotoxins of MOh and Naja kaouthia (monocled cobra) are in agreement with the poor cross-neutralization activity of N. kaouthia antivenom used against MOh venom. Besides, the presence of cobra venom factor, nerve growth factors, phosphodiesterase, 5'-nucleotidase, and DPP-IV in the venom proteome suggests its probable hypotensive action in subduing prey. This study reports the diversity and abundance of toxins in the venom of the Malaysian king cobra (MOh). The results correlate with the pathophysiological actions of MOh venom, and dispute the use of Naja cobra antivenoms to treat MOh envenomation. The findings also provide a deeper insight into venom variations due to geography, which is crucial for the development of a useful pan-regional antivenom.
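As a reminder of what the FPKM figure quoted above measures, the small helper below computes fragments per kilobase of transcript per million mapped fragments; the counts in the example are invented and do not come from this study.

```python
# FPKM normalizes a transcript's fragment count by transcript length (in kb)
# and sequencing depth (in millions of mapped fragments).
def fpkm(fragments: int, transcript_len_bp: int, total_mapped_fragments: int) -> float:
    return fragments * 1.0e9 / (transcript_len_bp * total_mapped_fragments)

# Invented example: 12,000 fragments on a 900 bp toxin transcript in a
# library of 4 million mapped fragments.
print(round(fpkm(12_000, 900, 4_000_000), 1))   # ~3333.3 FPKM
```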
Quantitative prediction of cellular metabolism with constraint-based models: the COBRA Toolbox v2.0
Schellenberger, Jan; Que, Richard; Fleming, Ronan M. T.; Thiele, Ines; Orth, Jeffrey D.; Feist, Adam M.; Zielinski, Daniel C.; Bordbar, Aarash; Lewis, Nathan E.; Rahmanian, Sorena; Kang, Joseph; Hyduke, Daniel R.; Palsson, Bernhard Ø.
2012-01-01
Over the past decade, a growing community of researchers has emerged around the use of COnstraint-Based Reconstruction and Analysis (COBRA) methods to simulate, analyze and predict a variety of metabolic phenotypes using genome-scale models. The COBRA Toolbox, a MATLAB package for implementing COBRA methods, was presented earlier. Here we present a significant update of this in silico Toolbox. Version 2.0 of the COBRA Toolbox expands the scope of computations by including in silico analysis methods developed since its original release. New functions include: (1) network gap filling, (2) 13C analysis, (3) metabolic engineering, (4) omics-guided analysis, and (5) visualization. As with the first version, the COBRA Toolbox reads and writes Systems Biology Markup Language formatted models. In version 2.0, we improved performance, usability, and the level of documentation. A suite of test scripts can now be used to learn the core functionality of the Toolbox and validate results. This Toolbox lowers the barrier of entry to use powerful COBRA methods. PMID:21886097
Logistics Support Analysis Techniques Guide
1985-03-15
LANGUAGE (DATA RECORDS): FORTRAN, CDC 6600. [Garbled form residue; legible fragments: one program consists of approximately 4000 lines of coding (Safeguard, AN/FSC...); a FORTRAN IV model consists of approximately 367 lines of coding (SINCGARS, PERSHING II); LSA task interface; a system supported by Computer Systems Command whose current version is coded entirely in FORTRAN for a virtual-memory operating system.]
COBRA ATD multispectral camera response model
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A new multispectral camera response model has been developed in support of the US Marine Corps (USMC) Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) Program. This analytical model accurately estimates the response of five Xybion intensified IMC 201 multispectral cameras used for COBRA ATD airborne minefield detection. The camera model design is based on a series of camera response curves which were generated through optical laboratory tests performed by the Naval Surface Warfare Center, Dahlgren Division, Coastal Systems Station (CSS). Data fitting techniques were applied to these measured response curves to obtain nonlinear expressions which estimate digitized camera output as a function of irradiance, intensifier gain, and exposure. This COBRA Camera Response Model was proven to be very accurate, stable over a wide range of parameters, analytically invertible, and relatively simple. This practical camera model was subsequently incorporated into the COBRA sensor performance evaluation and computational tools for research analysis modeling toolbox in order to enhance COBRA modeling and simulation capabilities. Details of the camera model design and comparisons of modeled response to measured experimental data are presented.
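The data-fitting step described above can be mimicked with an ordinary nonlinear least-squares fit. In the sketch below, a saturating surrogate response (digital number versus irradiance, intensifier gain, and exposure) is fitted with SciPy; the functional form, coefficients, and synthetic data are assumptions and are not the actual COBRA camera model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical camera-response surrogate: digitized output as a saturating
# function of (irradiance x exposure) scaled by intensifier gain, plus a bias.
def response(X, dn_max, k, bias):
    irradiance, gain, exposure = X
    return dn_max * (1.0 - np.exp(-k * gain * irradiance * exposure)) + bias

# Generate synthetic "laboratory" measurements with noise.
rng = np.random.default_rng(2)
irr = rng.uniform(0.1, 5.0, 200)
gain = rng.uniform(1.0, 10.0, 200)
expo = rng.uniform(1e-3, 1e-2, 200)
dn = response((irr, gain, expo), 4095.0, 8.0, 30.0) + rng.normal(0.0, 5.0, 200)

popt, _ = curve_fit(response, (irr, gain, expo), dn, p0=[4000.0, 5.0, 0.0])
print("fitted [dn_max, k, bias]:", np.round(popt, 2))
```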
Carter, Donald M.; Darby, Christopher A.; Lefoley, Bradford C.; Crevar, Corey J.; Alefantis, Timothy; Oomen, Raymond; Anderson, Stephen F.; Strugnell, Tod; Cortés-Garcia, Guadalupe; Vogel, Thorsten U.; Parrington, Mark; Kleanthous, Harold
2016-01-01
ABSTRACT One of the challenges of developing influenza A vaccines is the diversity of antigenically distinct isolates. Previously, a novel hemagglutinin (HA) for H5N1 influenza was derived from a methodology termed computationally optimized broadly reactive antigen (COBRA). This COBRA HA elicited a broad antibody response against H5N1 isolates from different clades. We now report the development and characterization of a COBRA-based vaccine for both seasonal and pandemic H1N1 influenza virus isolates. Nine prototype H1N1 COBRA HA proteins were developed and tested in mice using a virus-like particle (VLP) format for the elicitation of broadly reactive, functional antibody responses and protection against viral challenge. These candidates were designed to recognize H1N1 viruses isolated within the last 30 years. In addition, several COBRA candidates were designed based on sequences of H1N1 viruses spanning the past 100 years, including modern pandemic H1N1 isolates. Four of the 9 H1N1 COBRA HA proteins (X1, X3, X6, and P1) had the broadest hemagglutination inhibition (HAI) activity against a panel of 17 H1N1 viruses. These vaccines were used in cocktails or prime-boost combinations. The most effective regimens that both elicited the broadest HAI response and protected mice against a pandemic H1N1 challenge were vaccines that contained the P1 COBRA VLP and either the X3 or X6 COBRA VLP vaccine. These mice had little or no detectable viral replication, comparable to that observed with a matched licensed vaccine. This is the first report describing a COBRA-based HA vaccine strategy that elicits a universal, broadly reactive, protective response against seasonal and pandemic H1N1 isolates. IMPORTANCE Universal influenza vaccine approaches have the potential to be paradigm shifting for the influenza vaccine field, with the goal of replacing the current standard of care with broadly cross-protective vaccines. We have used COBRA technology to develop an HA head-based strategy that elicits antibodies against many H1 strains that have undergone genetic drift and has potential as a “subtype universal” vaccine. Nine HA COBRA candidates were developed, and these vaccines were used alone, in cocktails or in prime-boost combinations. The most effective regimens elicited the broadest hemagglutination inhibition (HAI) response against a panel of H1N1 viruses isolated over the past 100 years. This is the first report describing a COBRA-based HA vaccine strategy that elicits a broadly reactive response against seasonal and pandemic H1N1 isolates. PMID:26912624
Implementing the UCSD PASCAL system on the MODCOMP computer. [deep space network
NASA Technical Reports Server (NTRS)
Wolfe, T.
1980-01-01
The implementation of an interactive software development system (UCSD PASCAL) on the MODCOMP computer is discussed. The development of an interpreter for the MODCOMP II and the MODCOMP IV computers, written in MODCOMP II assembly language, is described. The complete Pascal programming system was run successfully on a MODCOMP II and MODCOMP IV under both the MAX II/III and MAX IV operating systems. The source code for an 8080 microcomputer version of the interpreter was used as the design for the MODCOMP interpreter. A mapping of the functions within the 8080 interpreter into MODCOMP II assembly language was the method used to code the interpreter.
1988-07-01
[Garbled FORTRAN listing residue; legible fragments: a solution step that computes the amplitude (the reflected direct beam), followed by FORMAT statements referencing SIGMA(V, COS(PSI)).]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watanabe, Kousuke; Emoto, Noriko; Sunohara, Mitsuhiro
2010-08-27
Research highlights: → Incubating PCR products at a high temperature causes smears in gel electrophoresis. → Smears interfere with the interpretation of methylation analysis using COBRA. → Treatment with exonuclease I and heat-labile alkaline phosphatase eliminates smears. → The elimination of smears improves the visibility of COBRA. -- Abstract: DNA methylation plays a vital role in the regulation of gene expression. Abnormal promoter hypermethylation is an important mechanism of inactivating tumor suppressor genes in human cancers. Combined bisulfite restriction analysis (COBRA) is a widely used method for identifying the DNA methylation of specific CpG sites. Here, we report that exonuclease I and heat-labile alkaline phosphatase can be used for PCR purification for COBRA, improving the visibility of gel electrophoresis after restriction digestion. This improvement is observed when restriction digestion is performed at a high temperature, such as 60 °C or 65 °C, with BstUI and TaqI, respectively. This simple method can be applied instead of DNA purification using spin columns or phenol/chloroform extraction. It can also be applied to other situations when PCR products are digested by thermophile-derived restriction enzymes, such as PCR restriction fragment length polymorphism (RFLP) analysis.
COBRA ATD minefield detection results for the Joint Countermine ACTD Demonstrations
NASA Astrophysics Data System (ADS)
Stetson, Suzanne P.; Witherspoon, Ned H.; Holloway, John H., Jr.; Suiter, Harold R.; Crosby, Frank J.; Hilton, Russell J.; McCarley, Karen A.
2000-08-01
The Coastal Battlefield Reconnaissance and Analysis (COBRA) system described here was a Marine Corps Advanced Technology Demonstration (ATD) development consisting of an unmanned aerial vehicle (UAV) airborne multispectral video sensor system and ground station which processes the multispectral video data to automatically detect minefields along the flight path. After successful completion of the ATD, the residual COBRA ATD system participated in the Joint Countermine (JCM) Advanced Concept Technology Demonstration (ACTD) Demo I held at Camp Lejeune, North Carolina in conjunction with JTFX97 and Demo II held in Stephenville, Newfoundland in conjunction with MARCOT98. These exercises demonstrated the COBRA ATD system in an operational environment, detecting minefields that included several different mine types in widely varying backgrounds. The COBRA system performed superbly during these demonstrations, detecting mines under water, in the surf zone, on the beach, and inland, and has transitioned to an acquisition program. This paper describes the COBRA operation and performance results for these demonstrations, which represent the first demonstrated capability for remote tactical minefield detection from a UAV. The successful COBRA technologies and techniques demonstrated for tactical UAV minefield detection in the Joint Countermine Advanced Concept Technology Demonstrations have formed the technical foundation for future developments in Marine Corps, Navy, and Army tactical remote airborne mine detection systems.
Wong, Terianne M.; Allen, James D.; Bebin-Blackwell, Anne-Gaelle; Carter, Donald M.; Alefantis, Timothy; DiNapoli, Joshua; Kleanthous, Harold
2017-01-01
ABSTRACT Each influenza season, a set of wild-type viruses, representing one H1N1, one H3N2, and one to two influenza B isolates, are selected for inclusion in the annual seasonal influenza vaccine. In order to develop broadly reactive subtype-specific influenza vaccines, a methodology called computationally optimized broadly reactive antigens (COBRA) was used to design novel hemagglutinin (HA) vaccine immunogens. COBRA technology was effectively used to design HA immunogens that elicited antibodies that neutralized H5N1 and H1N1 isolates. In this report, the development and characterization of 17 prototype H3N2 COBRA HA proteins were screened in mice and ferrets for the elicitation of antibodies with HA inhibition (HAI) activity against human seasonal H3N2 viruses that were isolated over the last 48 years. The most effective COBRA HA vaccine regimens elicited antibodies with broader HAI activity against a panel of H3N2 viruses than wild-type H3 HA vaccines. The top leading COBRA HA candidates were tested against cocirculating variants. These variants were not efficiently detected by antibodies elicited by the wild-type HA from viruses selected as the vaccine candidates. The T-11 COBRA HA vaccine elicited antibodies with HAI and neutralization activity against all cocirculating variants from 2004 to 2007. This is the first report demonstrating broader breadth of vaccine-induced antibodies against cocirculating H3N2 strains compared to the wild-type HA antigens that were represented in commercial influenza vaccines. IMPORTANCE There is a need for an improved influenza vaccine that elicits immune responses that recognize a broader number of influenza virus strains to prevent infection and transmission. Using the COBRA approach, a set of vaccines against influenza viruses in the H3N2 subtype was tested for the ability to elicit antibodies that neutralize virus infection against not only historical vaccine strains of H3N2 but also a set of cocirculating variants that circulated between 2004 and 2007. Three of the H3N2 COBRA vaccines recognized all of the cocirculating strains during this era, but the chosen wild-type vaccine strains were not able to elicit antibodies with HAI activity against these cocirculating strains. Therefore, the COBRA vaccines have the ability to elicit protective antibodies against not only the dominant vaccine strains but also minor circulating strains that can evolve into the dominant vaccine strains in the future. PMID:28978710
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilkey, Lindsay
This milestone presents a demonstration of the High-to-Low (Hi2Lo) process in the VVI focus area. Validation and additional calculations with the commercial computational fluid dynamics code, STAR-CCM+, were performed using a 5x5 fuel assembly with non-mixing geometry and spacer grids. This geometry was based on the benchmark experiment provided by Westinghouse. Results from the simulations were compared to existing experimental data and to the subchannel thermal-hydraulics code COBRA-TF (CTF). An uncertainty quantification (UQ) process was developed for the STAR-CCM+ model and results of the STAR UQ were communicated to CTF. Results from STAR-CCM+ simulations were used as experimental design points in CTF to calibrate the mixing parameter β and compared to results obtained using experimental data points. This demonstrated that CTF's β parameter can be calibrated to match existing experimental data more closely. The Hi2Lo process for the STAR-CCM+/CTF code coupling was documented in this milestone and in the closely linked L3:VVI.H2LP15.01 milestone report.
Fooshee, David R.; Nguyen, Tran B.; Nizkorodov, Sergey A.; Laskin, Julia; Laskin, Alexander; Baldi, Pierre
2012-01-01
Atmospheric organic aerosols (OA) represent a significant fraction of airborne particulate matter and can impact climate, visibility, and human health. These mixtures are difficult to characterize experimentally due to their complex and dynamic chemical composition. We introduce a novel Computational Brewing Application (COBRA) and apply it to modeling oligomerization chemistry stemming from condensation and addition reactions in OA formed by photooxidation of isoprene. COBRA uses two lists as input: a list of chemical structures comprising the molecular starting pool, and a list of rules defining potential reactions between molecules. Reactions are performed iteratively, with products of all previous iterations serving as reactants for the next. The simulation generated thousands of structures in the mass range of 120–500 Da, and correctly predicted ~70% of the individual OA constituents observed by high-resolution mass spectrometry. Select predicted structures were confirmed with tandem mass spectrometry. Esterification was shown to play the most significant role in oligomer formation, with hemiacetal formation less important, and aldol condensation insignificant. COBRA is not limited to atmospheric aerosol chemistry; it should be applicable to the prediction of reaction products in other complex mixtures for which reasonable reaction mechanisms and seed molecules can be supplied by experimental or theoretical methods. PMID:22568707
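The brewing loop itself is simple to sketch: start from a pool of monomer masses, apply a condensation rule pairwise, and feed the products back in as reactants. The three seed masses and the single water-loss rule below are placeholders for illustration, not the actual COBRA rule set or the isoprene photooxidation constituents.

```python
# Toy "computational brewing" loop: molecules are reduced to nominal masses (Da)
# and the only rule is condensation with loss of one water molecule.
H2O = 18.011

def condense(m1: float, m2: float) -> float:
    return round(m1 + m2 - H2O, 3)

pool = {122.037, 104.047, 118.063}              # hypothetical monomer masses
for iteration in range(3):
    products = {condense(a, b) for a in pool for b in pool}
    pool |= {m for m in products if m <= 500.0}  # keep the observed mass window
    print(f"iteration {iteration + 1}: pool size = {len(pool)}")
```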
DOT National Transportation Integrated Search
1980-06-01
A specific goal of safety is to reduce the number of injuries that may result from the collision of two trains. In Volume IV, a computer code for the simulated crash of two railcar consists is described. The code is capable of simulating the mechanic...
COBRA-Seq: Sensitive and Quantitative Methylome Profiling
Varinli, Hilal; Statham, Aaron L.; Clark, Susan J.; Molloy, Peter L.; Ross, Jason P.
2015-01-01
Combined Bisulfite Restriction Analysis (COBRA) quantifies DNA methylation at a specific locus. It does so via digestion of PCR amplicons produced from bisulfite-treated DNA, using a restriction enzyme that contains a cytosine within its recognition sequence, such as TaqI. Here, we introduce COBRA-seq, a genome wide reduced methylome method that requires minimal DNA input (0.1–1.0 μg) and can either use PCR or linear amplification to amplify the sequencing library. Variants of COBRA-seq can be used to explore CpG-depleted as well as CpG-rich regions in vertebrate DNA. The choice of enzyme influences enrichment for specific genomic features, such as CpG-rich promoters and CpG islands, or enrichment for less CpG dense regions such as enhancers. COBRA-seq coupled with linear amplification has the additional advantage of reduced PCR bias by producing full length fragments at high abundance. Unlike other reduced representative methylome methods, COBRA-seq has great flexibility in the choice of enzyme and can be multiplexed and tuned, to reduce sequencing costs and to interrogate different numbers of sites. Moreover, COBRA-seq is applicable to non-model organisms without the reference genome and compatible with the investigation of non-CpG methylation by using restriction enzymes containing CpA, CpT, and CpC in their recognition site. PMID:26512698
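The logic COBRA-seq relies on, namely that a CpG-containing recognition site such as TaqI's TCGA survives bisulfite conversion only when its cytosine was methylated, can be illustrated with a short function. The 12-base sequence and methylation calls below are invented, and non-CpG methylation is ignored for simplicity.

```python
# Simplified bisulfite conversion: unmethylated C reads as T after PCR,
# methylated CpG cytosines are protected and remain C.
def bisulfite_convert(seq, methylated_cpg_positions):
    out = []
    for i, base in enumerate(seq):
        if base == "C":
            is_cpg = i + 1 < len(seq) and seq[i + 1] == "G"
            keep = is_cpg and i in methylated_cpg_positions
            out.append("C" if keep else "T")
        else:
            out.append(base)
    return "".join(out)

seq = "AATCGATTCGAA"                 # two TaqI (TCGA) sites, CpGs at positions 3 and 8
methylated = bisulfite_convert(seq, methylated_cpg_positions={3, 8})
unmethylated = bisulfite_convert(seq, methylated_cpg_positions=set())
print(methylated, "-", methylated.count("TCGA"), "TaqI sites still cuttable")
print(unmethylated, "-", unmethylated.count("TCGA"), "TaqI sites still cuttable")
```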
Venomous snakebite in Thailand. I: Medically important snakes.
Chanhome, L; Cox, M J; Wilde, H; Jintakoon, P; Chaiyabutr, N; Sitprija, V
1998-05-01
Thailand has an abundance of venomous snakes. Among the neurotoxic family Elapidae, there are three species of the genus Naja (cobras), three of the genus Bungarus (kraits), and the king cobra of the genus Ophiophagus. Other Elapidae snakes in Thailand include sea snakes and Asian coral snakes of the genus Calliophis. They have potent venoms but rarely bite humans. Tissue and hemotoxic snakes are represented by family Viperidae, subfamilies Viperinae and Crotalinae. They remain an occupational hazard for farmers and rubber tappers, causing serious morbidity but only rare deaths, since competent treatment is now widely available throughout Thailand. Purified equine antivenin is manufactured locally for the monocled and Siamese spitting cobras (Naja kaouthia and N. siamensis), king cobra (Ophiophagus hannah), banded krait (Bungarus fasciatus), most green pit vipers (Trimeresurus sp.), Malayan pit viper (Calloselasma rhodostoma), and the Siamese Russell's viper (Daboia russelli siamensis).
Rahmy, T R; Hemmaid, K Z
2001-05-01
The present study aimed to examine the prophylactic action of oral administration of two doses of garlic on the histological and histochemical patterns of the gastric and hepatic tissues in rats envenomed with cobra snake venom. The study included the following groups: Group I contained control rats orally administered distilled water for ten days. Group II included rats orally administered daily for ten days with the equivalent therapeutic dose of garlic to rat (18 mg/kg body weight). Group III included rats orally administered daily for ten days with double the equivalent therapeutic dose of garlic to rat (36 mg/kg body weight). Group IV contained rats intramuscularly (i.m.) injected with 1/2 LD50 of cobra venom (0.0125 microg venom/gm body weight) and dissected 6 hr after injection. Groups V and VI contained rats daily administered with the previous two doses of garlic for ten days, respectively, followed by a single i.m. injection of the above dose of cobra venom 24 hr after the last garlic application. Rats of these two groups were dissected 6 hr after venom injection. Administration of the therapeutic dose of garlic induced slight cytoplasmic granulation in some hepatic cells. However, administration of double the therapeutic dose caused swelling, necrosis, and damage of the gastric glandular epithelia together with signs of erosion, exfoliation, and necrosis of the surface mucosal cells. It also induced swelling and coalescence of the hepatic cells, loss of the normal arrangement of the hepatic cords, and hypertrophy of Kupffer cells. Injection with cobra venom caused loss of the normal characteristic appearance of the gastric glands and the epithelial lining cells of the gastric folds and the appearance of numerous inflammatory cells in the lamina propria. It also induced the occurrence of highly swollen hepatic cells, hepatic cellular necrosis and damage, as well as activated Kupffer cells. Nevertheless, pretreatment with the therapeutic dose of garlic for ten days induced a prophylactic activity against the pathogenic effects of the venom in both tissues, which appeared more or less normal except for very minor abnormalities. However, application of double the therapeutic dose of garlic for the same duration did not induce any prophylactic activity. Histochemically, slight alterations were noticed in the polysaccharide, protein, and nucleic acid contents of the gastric mucosa and the hepatic tissues due to administration of the therapeutic doses of garlic. However, severe depletions of these components were recorded in both tissues due to administration of double the therapeutic doses of garlic, injection of cobra venom, or the application of both together. On the contrary, minor changes were noticed in the histochemical patterns of both tissues in rats pretreated with the therapeutic doses of garlic prior to venom application. It could be concluded that oral administration of the therapeutic dose of garlic for ten days has no serious side effects on gastric and hepatic tissues and could be used as a prophylactic tool against cobra snake envenomation.
Luna, M G; Martins, M M; Newton, S M; Costa, S O; Almeida, D F; Ferreira, L C
1997-01-01
Oligonucleotides coding for linear epitopes of the fimbrial colonization factor antigen I (CFA/I) of enterotoxigenic Escherichia coli (ETEC) were cloned and expressed in a deleted form of the Salmonella muenchen flagellin fliC (H1-d) gene. Four synthetic oligonucleotide pairs coding for regions corresponding to amino acids 1 to 15 (region I), amino acids 11 to 25 (region II), amino acids 32 to 45 (region III) and amino acids 88 to 102 (region IV) were synthesized and cloned in the Salmonella flagellin-coding gene. All four hybrid flagellins were exported to the bacterial surface where they produced flagella, but only three constructs were fully motile. Sera recovered from mice immunized with intraperitoneal injections of purified flagella containing region II (FlaII) or region IV (FlaIV) showed high titres against dissociated solid-phase-bound CFA/I subunits. Hybrid flagellins containing region I (FlaI) or region III (FlaIII) elicited a weak immune response as measured in enzyme-linked immunosorbent assay (ELISA) with dissociated CFA/I subunits. None of the sera prepared with purified hybrid flagella were able to agglutinate or inhibit haemagglutination promoted by CFA/I-positive strains. Moreover, inhibition ELISA tests indicated that antisera directed against region I, II, III or IV cloned in flagellin were not able to recognize surface-exposed regions on the intact CFA/I fimbriae.
Carter, Donald M; Darby, Christopher A; Johnson, Scott K; Carlock, Michael A; Kirchenbaum, Greg A; Allen, James D; Vogel, Thorsten U; Delagrave, Simon; DiNapoli, Joshua; Kleanthous, Harold; Ross, Ted M
2017-12-15
Most preclinical animal studies test influenza vaccines in immunologically naive animal models, even though the results of vaccination may not accurately reflect the effectiveness of vaccine candidates in humans who have preexisting immunity to influenza. In this study, novel, broadly reactive influenza vaccine candidates were assessed in preimmune ferrets. These animals were infected with different H1N1 isolates before being vaccinated or infected with another influenza virus. Previously, our group has described the design and characterization of computationally optimized broadly reactive hemagglutinin (HA) antigens (COBRA) for H1N1 isolates. Vaccinating ferrets with virus-like particle (VLP) vaccines expressing COBRA HA proteins elicited antibodies with hemagglutination inhibition (HAI) activity against more H1N1 viruses in the panel than VLP vaccines expressing wild-type HA proteins. Specifically, ferrets infected with the 1986 virus and vaccinated with a single dose of the COBRA HA VLP vaccines elicited antibodies with HAI activity against 11 to 14 of the 15 H1N1 viruses isolated between 1934 and 2013. A subset of ferrets was infected with influenza viruses expressing the COBRA HA antigens. These COBRA preimmune ferrets had greater breadth of HAI activity after vaccination with COBRA HA VLP vaccines than did COBRA preimmune ferrets vaccinated with VLP vaccines expressing wild-type HA proteins. Overall, whether COBRA HA-based viruses were used to prime naive ferrets or COBRA HA-based vaccines were used to boost preexisting antibodies induced by wild-type H1N1 viruses, the COBRA HA antigens elicited sera with the broadest HAI reactivity against multiple antigenic H1N1 viral variants. This is the first report demonstrating the effectiveness of a broadly reactive or universal influenza vaccine in a preimmune ferret model. IMPORTANCE Currently, many groups are testing influenza vaccine candidates to meet the challenge of developing a vaccine that elicits broadly reactive and long-lasting protective immune responses. The goal of these vaccines is to stimulate immune responses that react against most, if not all, circulating influenza strains, over a long period of time in all populations of people. Commonly, these experimental vaccines are tested in naive animal models that do not have anti-influenza immune responses; however, humans have preexisting immunity to influenza viral antigens, particularly antibodies to the HA and NA glycoproteins. Therefore, this study investigated how preexisting antibodies to historical influenza viruses influenced HAI-specific antibodies and protective efficacy using a broadly protective vaccine candidate. Copyright © 2017 American Society for Microbiology.
The I-V Measurement System for Solar Cells Based on MCU
NASA Astrophysics Data System (ADS)
Fengxiang, Chen; Yu, Ai; Jiafu, Wang; Lisheng, Wang
2011-02-01
In this paper, an I-V measurement system for solar cells based on a single-chip microcomputer (MCU) is presented. Following the test principles of solar cells, the measurement system comprises two parts: data collection, and data processing and display. The MCU is mainly used to acquire data; the collected results are then sent to a computer over a serial port. The I-V measurement results of the test system are shown in a human-computer interaction interface built on the hardware circuit. Comparing the results of this I-V tester with those of a commercial I-V tester, the errors for most parameters are less than 5%, which indicates that the I-V test results are reliable. Because the MCU can be applied in many fields, this I-V measurement system offers a simple prototype for a portable I-V tester for solar cells.
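As a rough illustration of the PC-side data path described above (the MCU streaming measurements to the computer over a serial link), the sketch below reads comma-separated voltage/current pairs from a serial port and extracts a few basic parameters. The port name, line format, and parameter estimates are assumptions made for illustration, not the paper's actual protocol.

    import serial  # pyserial

    def read_iv_curve(port="COM3", baud=9600, n_points=100):
        """Collect (voltage, current) pairs streamed as ASCII 'V,I' lines."""
        points = []
        with serial.Serial(port, baud, timeout=2) as ser:
            while len(points) < n_points:
                line = ser.readline().decode(errors="ignore").strip()
                try:
                    v, i = (float(x) for x in line.split(","))
                except ValueError:
                    continue  # skip malformed or empty lines
                points.append((v, i))
        return points

    def summarize(points):
        isc = max(i for v, i in points)              # short-circuit current (approx.)
        voc = max(v for v, i in points if i >= 0.0)  # open-circuit voltage (approx.)
        pmax = max(v * i for v, i in points)         # maximum power point
        return isc, voc, pmax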
User interfaces for computational science: A domain specific language for OOMMF embedded in Python
NASA Astrophysics Data System (ADS)
Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans
2017-05-01
Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
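To make the idea of an embedded domain specific language concrete, the sketch below shows a hypothetical Python specification of a micromagnetic problem (with geometry and material values roughly following standard problem 4) that a backend could translate into solver input. All class and argument names here are invented for illustration; they are not the interface of the package described in the paper.

    # All names below are invented for illustration; this is not the package's API.
    class Mesh:
        def __init__(self, p1, p2, cell):
            self.p1, self.p2, self.cell = p1, p2, cell

    class Exchange:
        def __init__(self, A): self.A = A          # exchange constant (J/m)

    class Demag:
        pass

    class Zeeman:
        def __init__(self, B): self.B = B          # applied field as mu0*H (T)

    class System:
        def __init__(self, name, mesh, Ms, energy):
            self.name, self.mesh, self.Ms, self.energy = name, mesh, Ms, energy

        def to_backend_input(self):
            # A real backend would emit solver input (e.g. an OOMMF MIF file) here.
            terms = ", ".join(type(t).__name__ for t in self.energy)
            return f"# system {self.name}: Ms={self.Ms} A/m, energies=[{terms}]"

    system = System(
        name="stdprob4",
        mesh=Mesh(p1=(0, 0, 0), p2=(500e-9, 125e-9, 3e-9), cell=(5e-9, 5e-9, 3e-9)),
        Ms=8e5,
        energy=[Exchange(A=1.3e-11), Demag(), Zeeman(B=(-24.6e-3, 4.3e-3, 0))],
    )
    print(system.to_backend_input())

The point of such a layer is that the problem specification stays ordinary, scriptable Python, while the backend handles translation to the solver's input format and execution.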
e-MERLIN and the COBRaS legacy Project
NASA Astrophysics Data System (ADS)
Peck, Luke
2011-07-01
As one of the 12 legacy programmes given ~300 hrs of observing time on the newly enhanced e-MERLIN, the Cygnus OB2 Radio Survey (COBRaS) (homepage: http://www.homepages.ucl.ac.uk/~ucapdwi/cobras/) is an intensive deep-field mapping of the Cyg OB2 association in the Cygnus region of our Galaxy. It will provide the most detailed census of the most massive OB association in the northern hemisphere. A range of astrophysical problems and themes will be investigated, including mass loss and evolution of massive stars; the formation, dynamics and content of massive OB associations; the frequency of massive binaries; and the incidence of non-thermal radiation. As part of the initial ground work for this project, extensive meta-data catalogues were amalgamated from various catalogues in the Virtual Observatory database. In this talk I will discuss investigations into JHK photometric techniques which can help identify possible OB candidates and other spectral classes, and theoretical mass-loss models as described by Vink et al. 2001, combined with stellar parameters from Martins et al. 2005, Searle et al. 2008 and Prinja et al. 1990, which pave the way to calculate theoretical mass-loss rates for the smooth winds of O stars and B supergiants and the predicted 6 cm fluxes resulting from the thermal free-free radiation in their winds. This will be essential for the study of clumped winds, which is an early goal for this project. In the months following the writing of this abstract, the first e-MERLIN pointings are expected. In the event of obtaining data, these will also be included in the presentation as part of the 'early science' from e-MERLIN and COBRaS.
New progress in snake mitochondrial gene rearrangement.
Chen, Nian; Zhao, Shujin
2009-08-01
To further understand the evolution of snake mitochondrial genomes, the complete mitochondrial DNA (mtDNA) sequences were determined for representative species from two snake families: the Many-banded krait, the Banded krait, the Chinese cobra, the King cobra, the Hundred-pace viper, the Short-tailed mamushi, and the Chain viper. Thirteen protein-coding genes, 22-23 tRNA genes, 2 rRNA genes, and 2 control regions were identified in these mtDNAs. Duplication of the control region and translocation of the tRNAPro gene were two notable features of the snake mtDNAs. These results from the gene rearrangement comparisons confirm the correctness of traditional classification schemes and validate the utility of comparing complete mtDNA sequences for snake phylogeny reconstruction.
MIADS2 ... an alphanumeric map information assembly and display system for a large computer
Elliot L. Amidon
1966-01-01
A major improvement and extension of the Map Information Assembly and Display System (MIADS) developed in 1964 is described. Basic principles remain unchanged, but the computer programs have been expanded and rewritten for a large computer, in Fortran IV and MAP languages. The code system is extended from 99 integers to about 2,200 alphanumeric 2-character codes. Hand-...
Spitting cobras: fluid jets in nature as models for technical applications
NASA Astrophysics Data System (ADS)
Balmert, Alexander; Hess, David; Brücker, Christoph; Bleckmann, Horst; Westhoff, Guido
2011-04-01
Spitting cobras defend themselves by ejecting rapid jets of venom through their fangs towards the face of an offender. To generate these jets, the venom delivery system of spitting cobras has some unique adaptations, such as prominent ridges on the surface of the venom channel. We examined the fluid acceleration mechanisms in three spitting cobra species of the genus Naja. To investigate the liquid flow through the venom channel, we built a three-dimensional 60:1 scale model. First, we determined the three-dimensional structure of the channel using microcomputer tomography. With the help of the tomographic data we then created a negative form out of wax. Finally, silicone was cast around the wax form and the wax was removed, resulting in a completely transparent model of the cobra's venom channel. The physical-chemical properties of the cobra venom were measured by micro rheometry and tensiometry. Thereafter, an artificial fluid with similar properties was generated. Particle image velocimetry (PIV) was performed to visualize the flow of the artificial liquid in the three-dimensional model. Our experiments show how the surface structure of the venom channel determines the liquid flow through the channel and ultimately the form of the liquid jet. Understanding the biological mechanisms of venom ejection helps to enhance industrial processes such as water jet cutting and cleaning as well as injection methods in technical and medical sectors, e.g. liquid microjet dissection in microsurgery.
Radiation damage of gallium arsenide production cells
NASA Technical Reports Server (NTRS)
Mardesich, N.; Joslin, D.; Garlick, J.; Lillington, D.; Gillanders, M.; Cavicchi, B.; Scott-Monck, J.; Kachare, R.; Anspaugh, B.
1987-01-01
High-efficiency liquid phase epitaxy (LPE) gallium arsenide cells were irradiated with 1 MeV electrons up to fluences of 1 x 10^16 cm^-2. Measurements of spectral response and of dark and illuminated I-V data were made at each fluence; then, using computer codes, the experimental data were fitted to gallium arsenide cell models. In this way it was possible to determine the extent of the damage, and hence the damage coefficients, in both the emitter and base of the cell.
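The fitting step mentioned here (matching measured I-V curves to a cell model so that parameters, and from their fluence dependence the damage coefficients, can be extracted) can be illustrated with a generic single-diode fit. The sketch below is not the codes used in the study; the simplified model, starting values, and synthetic data are illustrative assumptions.

    import numpy as np
    from scipy.optimize import curve_fit

    q, k, T = 1.602e-19, 1.381e-23, 300.0  # charge, Boltzmann constant, temperature

    def diode_model(v, iph, i0, n, rsh):
        # Illuminated single-diode model with series resistance neglected:
        # I = Iph - I0*(exp(qV/nkT) - 1) - V/Rsh
        return iph - i0 * (np.exp(q * v / (n * k * T)) - 1.0) - v / rsh

    # v_data, i_data would be the measured illuminated I-V curve at one fluence;
    # synthetic values are used here so the sketch runs on its own.
    v_data = np.linspace(0.0, 0.95, 40)
    i_data = diode_model(v_data, 0.028, 1e-10, 1.9, 5e3)

    popt, _ = curve_fit(diode_model, v_data, i_data,
                        p0=[0.03, 1e-9, 2.0, 1e4], maxfev=20000)
    iph, i0, n, rsh = popt
    print(f"Iph={iph:.4f} A, I0={i0:.2e} A, n={n:.2f}, Rsh={rsh:.0f} ohm")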
NASA Astrophysics Data System (ADS)
Lidar, Daniel A.; Brun, Todd A.
2013-09-01
Prologue; Preface; Part I. Background: 1. Introduction to decoherence and noise in open quantum systems Daniel Lidar and Todd Brun; 2. Introduction to quantum error correction Dave Bacon; 3. Introduction to decoherence-free subspaces and noiseless subsystems Daniel Lidar; 4. Introduction to quantum dynamical decoupling Lorenza Viola; 5. Introduction to quantum fault tolerance Panos Aliferis; Part II. Generalized Approaches to Quantum Error Correction: 6. Operator quantum error correction David Kribs and David Poulin; 7. Entanglement-assisted quantum error-correcting codes Todd Brun and Min-Hsiu Hsieh; 8. Continuous-time quantum error correction Ognyan Oreshkov; Part III. Advanced Quantum Codes: 9. Quantum convolutional codes Mark Wilde; 10. Non-additive quantum codes Markus Grassl and Martin Rötteler; 11. Iterative quantum coding systems David Poulin; 12. Algebraic quantum coding theory Andreas Klappenecker; 13. Optimization-based quantum error correction Andrew Fletcher; Part IV. Advanced Dynamical Decoupling: 14. High order dynamical decoupling Zhen-Yu Wang and Ren-Bao Liu; 15. Combinatorial approaches to dynamical decoupling Martin Rötteler and Pawel Wocjan; Part V. Alternative Quantum Computation Approaches: 16. Holonomic quantum computation Paolo Zanardi; 17. Fault tolerance for holonomic quantum computation Ognyan Oreshkov, Todd Brun and Daniel Lidar; 18. Fault tolerant measurement-based quantum computing Debbie Leung; Part VI. Topological Methods: 19. Topological codes Héctor Bombín; 20. Fault tolerant topological cluster state quantum computing Austin Fowler and Kovid Goyal; Part VII. Applications and Implementations: 21. Experimental quantum error correction Dave Bacon; 22. Experimental dynamical decoupling Lorenza Viola; 23. Architectures Jacob Taylor; 24. Error correction in quantum communication Mark Wilde; Part VIII. Critical Evaluation of Fault Tolerance: 25. Hamiltonian methods in QEC and fault tolerance Eduardo Novais, Eduardo Mucciolo and Harold Baranger; 26. Critique of fault-tolerant quantum information processing Robert Alicki; References; Index.
Karim, Md Robiul; Yu, Fuchang; Li, Jian; Li, Junqiang; Zhang, Longxian; Wang, Rongjun; Rume, Farzana Islam; Jian, Fuchun; Zhang, Sumei; Ning, Changshen
2014-08-01
Enteric protozoa are frequently found in snakes. Nevertheless, few studies regarding genetic characterization of these parasites have been carried out. We describe here the first molecular survey of protozoan pathogens from snakes in China and the first report on Enterocytozoon bieneusi genotyping in snakes in the world. Here, 240 fecal specimens were collected from two species of captive snakes, Naja naja (Indian cobra) and Ptyas mucosus (Oriental rat snake), in Guangxi Province, China, and examined by PCR amplification of the small subunit-ribosomal RNA of enteric protozoa and the internal transcribed spacer of ribosomal RNA of E. bieneusi. Cryptosporidium serpentis was identified in three specimens (2.1%) of Oriental rat snakes. Caryospora was found in 5.4% specimens, including eight from cobras (8.1%) and five from rat snakes (3.6%), and represented six new species-Caryospora sp. SKC-2014a to Caryospora sp. SKC-2014 f. Three new Eimeria species, Eimeria sp. SKE-2014a to Eimeria sp. SKE-2014c, were detected in three specimens (2.1%) from rat snakes. Additionally, Sarcocystis sp. SKS-2014 was detected in one specimen from a cobra. The infection rates of E. bieneusi were 3.0% in cobras and 5.7% in rat snakes. Sequence analysis of 11 PCR products revealed the presence of six E. bieneusi genotypes-two known genotypes (type IV and Henan V) and four new genotypes (CRep-1 to CRep-4). All six E. bieneusi genotypes belonged to the zoonotic group (group 1). This result raised the possibility that E. bieneusi could be present in animals consumed by snakes. This should be taken into consideration to better understand the diversity of the parasite, its transmission through the predator-prey relationship, and public health implications.
Integrating technology to improve medication administration.
Prusch, Amanda E; Suess, Tina M; Paoletti, Richard D; Olin, Stephen T; Watts, Starann D
2011-05-01
The development, implementation, and evaluation of an i.v. interoperability program to advance medication safety at the bedside are described. I.V. interoperability integrates intelligent infusion devices (IIDs), the bar-code-assisted medication administration system, and the electronic medication administration record system into a bar-code-driven workflow that populates provider-ordered, pharmacist-validated infusion parameters on IIDs. The purpose of this project was to improve medication safety through the integration of these technologies and decrease the potential for error during i.v. medication administration. Four key phases were essential to developing and implementing i.v. interoperability: (a) preparation, (b) i.v. interoperability pilot, (c) preliminary validation, and (d) expansion. The establishment of pharmacy involvement in i.v. interoperability resulted in two additional safety checks: pharmacist infusion rate oversight and nurse independent validation of the autoprogrammed rate. After instituting i.v. interoperability, monthly compliance to the telemetry drug library increased to a mean ± S.D. of 72.1% ± 2.1% from 56.5% ± 1.5%, and the medical-surgical nursing unit's drug library monthly compliance rate increased to 58.6% ± 2.9% from 34.1% ± 2.6% (p < 0.001 for both comparisons). The number of manual pump edits decreased with both telemetry and medical-surgical drug libraries, demonstrating a reduction from 56.9 ± 12.8 to 14.2 ± 3.9 and from 61.2 ± 15.4 to 14.7 ± 3.8, respectively (p < 0.001 for both comparisons). Through the integration and incorporation of pharmacist oversight for rate changes, the telemetry and medical-surgical patient care areas demonstrated a 32% reduction in reported monthly errors involving i.v. administration of heparin. By integrating two stand-alone technologies, i.v. interoperability was implemented to improve medication administration. Medication errors were reduced, nursing workflow was simplified, and pharmacists became involved in checking infusion rates of i.v. medications.
Ter Wee, Marieke M; Coupé, Veerle Mh; den Uyl, Debby; Blomjous, Birgit S; Kooijmans, Esmee; Kerstens, Pit Jsm; Nurmohamed, Mike T; van Schaardenburg, Dirkjan; Voskuyl, Alexandre E; Boers, Maarten; Lems, Willem F
2017-01-01
To evaluate if COmbinatie therapie Bij Reumatoïde Artritis (COBRA)-light therapy is cost-effective in treating patients with early rheumatoid arthritis (RA) compared with COBRA therapy. This economic evaluation was performed next to the open-label, randomised non-inferiority COBRA-light trial in 164 patients with early RA. Non-responders to COBRA or COBRA-light received etanercept (50 mg/week) for 3-6 months. The societal perspective analysis took medical direct, non-medical direct and indirect costs into account. Costs were measured with patient cost diaries for the follow-up period of 52 weeks. Bootstrapping techniques estimated uncertainty around the cost-effectiveness ratios, presented in cost-effectiveness planes. 164 patients were randomised to either the COBRA or the COBRA-light strategy. At week 52, COBRA-light proved to be non-inferior to COBRA therapy on all clinical outcome measures. The results of the base-case cost-utility analysis (intention-to-treat analyses) revealed that the COBRA-light strategy is more expensive (k€9.3 (SD 0.9)) compared with COBRA (k€7.2 (SD 0.8)), but the difference in costs was not significant (k€2.0; 95% CI -0.3 to 4.4). Also, both strategies produced similar quality-adjusted life-years (QALYs). The sensitivity analyses showed robustness of these results. In a per-protocol sensitivity analysis, in which costs of etanercept were assumed to be provided as prescribed according to protocol, both arms had much higher costs: COBRA-light, k€11.5 (8.3), compared with k€8.5 (6.8) for COBRA, and the difference in costs was significant (k€2.9; 0.6 to 5.3). In the base-case cost-utility analysis, the two strategies produced similar QALYs for similar costs. But it is anticipated that if the protocol had been followed correctly, the COBRA-light strategy would have been more costly due to additional etanercept costs, for a limited health gain. Given the limited added benefit and high costs of starting etanercept in the presence of low disease activity in our trial, such a strategy needs better justification than is available now. Trial registration number 55552928, Results.
On Flowfield Periodicity in the NASA Transonic Flutter Cascade. Part 2; Numerical Study
NASA Technical Reports Server (NTRS)
Chima, Rodrick V.; McFarland, Eric R.; Wood, Jerry R.; Lepicovsky, Jan
2000-01-01
The transonic flutter cascade facility at NASA Glenn Research Center was redesigned based on a combined program of experimental measurements and numerical analyses. The objectives of the redesign were to improve the periodicity of the cascade in steady operation, and to better quantify the inlet and exit flow conditions needed for CFD predictions. Part I of this paper describes the experimental measurements, which included static pressure measurements on the blade and endwalls made using both static taps and pressure sensitive paints, cobra probe measurements of the endwall boundary layers and blade wakes, and shadowgraphs of the wave structure. Part II of this paper describes three CFD codes used to analyze the facility, including a multibody panel code, a quasi-three-dimensional viscous code, and a fully three-dimensional viscous code. The measurements and analyses both showed that the operation of the cascade was heavily dependent on the configuration of the sidewalls. Four configurations of the sidewalls were studied and the results are described. For the final configuration, the quasi-three-dimensional viscous code was used to predict the location of mid-passage streamlines for a perfectly periodic cascade. By arranging the tunnel sidewalls to approximate these streamlines, sidewall interference was minimized and excellent periodicity was obtained.
COBRA compliance: how employers can successfully meet today's complexities.
Trimble, Jim
2003-03-01
Although the architects of COBRA had sound and compassionate motivations, administration of and compliance with this law are far from easy. COBRA assists employees who lose their jobs by allowing them to purchase insurance benefits from their former employer. Outsourcing COBRA administration can be the best way for some employers to cope with COBRA regulations, contingencies and paperwork and to avoid legal fees and penalties. But look for COBRA providers with a sound track record.
CASL VMA Milestone Report FY16 (L3:VMA.VUQ.P13.08): Westinghouse Mixing with STAR-CCM+
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilkey, Lindsay Noelle
2016-09-30
STAR-CCM+ (STAR) is a high-resolution computational fluid dynamics (CFD) code developed by CD-adapco. STAR includes validated physics models and a full suite of turbulence models, including ones from the k-ε and k-ω families. STAR is currently being extended to handle two-phase flows, but the current focus of the software is single-phase flow. STAR can use imported meshes or its built-in meshing software to create computational domains for CFD. Since the solvers generally require a fine mesh for good computational results, the meshes used with STAR tend to number in the millions of cells, with that number growing with simulation and geometry complexity. The time required to model the flow through a full 5x5 Mixing Vane Grid Assembly (5x5MVG) in the current STAR configuration is on the order of hours and can be very computationally expensive. COBRA-TF (CTF) is a low-resolution subchannel code that can be trained using high-fidelity data from STAR. CTF does not have turbulence models and instead uses a turbulent mixing coefficient β. With a properly calibrated β, CTF can be used as a low-computational-cost alternative to expensive full CFD calculations performed with STAR. During the Hi2Lo work with CTF and STAR, STAR-CCM+ will be used to calibrate β and to provide high-resolution results that can be used in place of, and in addition to, experimental results to reduce the uncertainty in the CTF results.
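For context on what β controls: subchannel codes commonly close single-phase turbulent mixing with a relation of the form w'_ij = β s_ij G_bar, where s_ij is the gap width between adjacent subchannels and G_bar is the average mass flux, and the resulting mixing rate drives an enthalpy-exchange term. The sketch below illustrates that textbook form only; CTF's actual implementation may differ in detail, and the numbers are illustrative.

    def turbulent_mixing_rate(beta, gap_width, g_avg):
        """Turbulent mixing mass-exchange rate per unit axial length, kg/(m*s)."""
        return beta * gap_width * g_avg

    def enthalpy_exchange(beta, gap_width, g_avg, h_i, h_j):
        """Enthalpy transferred per unit axial length from subchannel j to i, W/m."""
        w_prime = turbulent_mixing_rate(beta, gap_width, g_avg)
        return w_prime * (h_j - h_i)

    # Example: two subchannels with a 3 mm gap and an average mass flux of 3000 kg/(m^2*s).
    q_mix = enthalpy_exchange(beta=0.02, gap_width=0.003, g_avg=3000.0,
                              h_i=1.30e6, h_j=1.35e6)
    print(f"net enthalpy exchange: {q_mix:.1f} W/m")

Because β multiplies the whole exchange term, calibrating it against high-resolution CFD data directly tunes how quickly temperature differences between neighboring subchannels are smoothed out.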
Atmospheric Transmittance from 0.25 to 28.5 Microns: Computer Code LOWTRAN 3
1975-05-07
The appropriate meteorological parameters for user-supplied atmospheric data are Z, P, T, DP, RH, WH, WO, and AHAZE, entered in the card format specified on CARD 1.
The Arabidopsis COBRA Protein Facilitates Cellulose Crystallization at the Plasma Membrane*
Sorek, Nadav; Sorek, Hagit; Kijac, Aleksandra; Szemenyei, Heidi J.; Bauer, Stefan; Hématy, Kian; Wemmer, David E.; Somerville, Chris R.
2014-01-01
Mutations in the Arabidopsis COBRA gene lead to defects in cellulose synthesis, but the function of COBRA is unknown. Here we present evidence that COBRA localizes to discrete particles in the plasma membrane and is sensitive to inhibitors of cellulose synthesis, suggesting that COBRA and the cellulose synthase complex reside in close proximity on the plasma membrane. Live-cell imaging of cellulose synthesis indicated that, once initiated, cellulose synthesis appeared to proceed normally in the cobra mutant. Using isothermal calorimetry, COBRA was found to bind individual β1–4-linked glucan chains with a KD of 3.2 μM. Competition assays suggest that COBRA binds individual β1–4-linked glucan chains with higher affinity than crystalline cellulose. Solid-state nuclear magnetic resonance studies of the cell wall of the cobra mutant also indicated that, in addition to decreases in cellulose amount, the properties of the cellulose fibrils and other cell wall polymers differed from wild type by being less crystalline and having an increased number of reducing ends. We interpret the available evidence as suggesting that COBRA facilitates cellulose crystallization from the emerging β1–4-glucan chains by acting as a “polysaccharide chaperone.” PMID:25331944
Numerical Modeling of Three-Dimensional Confined Flows
NASA Technical Reports Server (NTRS)
Greywall, M. S.
1981-01-01
A three-dimensional confined flow model is presented. The flow field is computed by calculating velocity and enthalpy along a set of streamlines. The finite difference equations are obtained by applying conservation principles to streamtubes constructed around the chosen streamlines. With appropriate substitutions for the body-force terms, the approach computes three-dimensional magnetohydrodynamic channel flows. A listing of a computer code based on this approach is presented in the FORTRAN IV language. The code computes three-dimensional compressible viscous flow through a rectangular duct, with the duct cross section specified along the axis.
Concentrating Solar Power Projects - Casablanca | Concentrating Solar Power
Location: Badajoz, Spain. Owner and developer: ACS - COBRA group (100%). Technology: parabolic trough. Turbine capacity (net): 50.0 MW. EPC contractor: Cobra.
Chen, Nian; Lai, Xiao-Ping
2010-07-01
We obtained the complete mitochondrial genome of the king cobra (GenBank accession number: EU_921899) by Ex Taq-PCR, TA-cloning and primer-walking methods. The genome is very similar to those of other vertebrates; it is 17,267 bp in length and encodes 38 genes (13 protein-coding, 2 ribosomal RNA and 23 transfer RNA genes) and two long non-coding regions. The duplication of the tRNA-Ile gene forms a new mitochondrial gene rearrangement model. Eight tRNA genes and one protein-coding gene are transcribed from the L strand, and the other genes are transcribed from the H strand. Genes on the H strand show fairly similar contents of adenine and thymine, whereas those on the L strand have a higher proportion of A than T. Combined rDNA sequence data (12S+16S rRNA) were used to reconstruct the phylogeny of 21 snake species for which complete mitochondrial genome sequences were available in public databases. This large data set and an appropriate range of outgroup taxa demonstrated that Elapidae is more closely related to Colubridae than to Viperidae, which supports the traditional classification.
Global computing for bioinformatics.
Loewe, Laurence
2002-12-01
Global computing, the collaboration of idle PCs via the Internet in a SETI@home style, emerges as a new way of massive parallel multiprocessing with potentially enormous CPU power. Its relations to the broader, fast-moving field of Grid computing are discussed without attempting a review of the latter. This review (i) includes a short table of milestones in global computing history, (ii) lists opportunities global computing offers for bioinformatics, (iii) describes the structure of problems well suited for such an approach, (iv) analyses the anatomy of successful projects and (v) points to existing software frameworks. Finally, an evaluation of the various costs shows that global computing indeed has merit, if the problem to be solved is already coded appropriately and a suitable global computing framework can be found. Then, either significant amounts of computing power can be recruited from the general public, or--if employed in an enterprise-wide Intranet for security reasons--idle desktop PCs can substitute for an expensive dedicated cluster.
Mongoose: Creation of a Rad-Hard MIPS R3000
NASA Technical Reports Server (NTRS)
Lincoln, Dan; Smith, Brian
1993-01-01
This paper describes the development of a 32-bit, full MIPS R3000 code-compatible Rad-Hard CPU, code named Mongoose. Mongoose progressed from contract award, through the design cycle, to operational silicon in 12 months to meet a space mission for NASA. The goal was the creation of a fully static device capable of operating at the maximum Mil-883 derated speed under worst-case post-radiation exposure with full operational integrity. This included consideration of features for functional enhancements relating to mission compatibility and removal of commercial practices not supported by Rad-Hard technology. 'Mongoose' developed from an evolution of LSI Logic's MIPS-I embedded processor, LR33000, code named Cobra, to its Rad-Hard 'equivalent', Mongoose. The term 'equivalent' is used to indicate that the core of the processor is functionally identical, allowing the same use and optimizations of the MIPS-I Instruction Set software tool suite for compilation, software program trace, etc. This activity was started in September of 1991 under a contract from NASA-Goddard Space Flight Center (GSFC)-Flight Data Systems. The approach effected a teaming of NASA-GSFC for program development, LSI Logic for system and ASIC design coupled with the Rad-Hard process technology, and Harris (GASD) for Rad-Hard microprocessor design expertise. The program culminated in the generation of Rad-Hard Mongoose prototypes one year later.
Independent Validation and Verification of automated information systems in the Department of Energy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunteman, W.J.; Caldwell, R.
1994-07-01
The Department of Energy (DOE) has established an Independent Validation and Verification (IV&V) program for all classified automated information systems (AIS) operating in compartmented or multi-level modes. The IV&V program was established in DOE Order 5639.6A and described in the manual associated with the Order. This paper describes the DOE IV&V program, the IV&V process and activities, the expected benefits from an IV&V, and the criteria and methodologies used during an IV&V. The first IV&V under this program was conducted on the Integrated Computing Network (ICN) at Los Alamos National Laboratory, and several lessons learned are presented. The DOE IV&V program is based on the following definitions. An IV&V is defined as the use of expertise from outside an AIS organization to conduct validation and verification studies on a classified AIS. Validation is defined as the process of applying the specialized security test and evaluation procedures, tools, and equipment needed to establish acceptance for joint usage of an AIS by one or more departments or agencies and their contractors. Verification is the process of comparing two levels of an AIS specification for proper correspondence (e.g., security policy model with top-level specifications, top-level specifications with source code, or source code with object code).
Defense.gov - Special Report: Marine Assault: Operation Cobra's Anger
Marines cleared a Taliban stronghold during Operation Cobra's Anger, a large-scale operation in northern Helmand province.
Concentrating Solar Power Projects - Extresol-2 | Concentrating Solar Power
Location: Sesmero (Badajoz), Spain. Owner and developer: ACS/Cobra Group (100%). Technology: parabolic trough. Expected/planned annual generation: 158,000 MWh/yr. Project type: commercial. Contacts: Manuel Cortes; Ana Salazar (ACS/Cobra Group).
Concentrating Solar Power Projects - Extresol-3 | Concentrating Solar Power
Location: Sesmero (Badajoz), Spain. Owner and developer: ACS/Cobra Group (100%). Technology: parabolic trough. Expected/planned annual generation: 158,000 MWh/yr. Project type: commercial. Contacts: Manuel Cortes; Ana Salazar (ACS/Cobra Group).
Concentrating Solar Power Projects - Manchasol-2 | Concentrating Solar
Location: Ciudad Real, Spain. Owner and developer: ACS/Cobra Group (100%). Technology: parabolic trough. Break ground: May 2009. Start production: April 2011. Construction job-years: 600. Project type: commercial.
Concentrating Solar Power Projects - Manchasol-1 | Concentrating Solar
Location: Ciudad Real, Spain. Owner and developer: ACS/Cobra Group (100%). Technology: parabolic trough. Break ground: October 2008. Start production: January 2011. Construction job-years: 600. Project type: commercial.
Deformations of thick two-material cylinder under axially varying radial pressure
NASA Technical Reports Server (NTRS)
Patel, Y. A.
1976-01-01
Stresses and deformations in a thick, short, two-material composite cylinder subjected to axially varying radial pressure are studied. The effect of slippage at the interface is examined. In the NASTRAN finite element model, the multipoint constraint feature is utilized. Results are compared with a theoretical analysis and with the SAP-IV computer code. Results from the NASTRAN computer code are in good agreement with the analytical solutions and suggest a considerable influence of interfacial slippage on the axial bending stresses in the cylinder.
Computer program for fast Karhunen Loeve transform algorithm
NASA Technical Reports Server (NTRS)
Jain, A. K.
1976-01-01
The fast KL (Karhunen-Loève) transform algorithm was applied to data compression of a set of four ERTS multispectral images, and its performance was compared with other techniques previously studied on the same image data. The performance criteria used are mean square error and signal-to-noise ratio. The results show superior performance of the fast KL transform coding algorithm on this data set with respect to the stated performance criteria. A summary of the results is given in Chapter I, and details of the comparisons and a discussion of the conclusions are given in Chapter IV.
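For readers unfamiliar with KL-transform coding, the sketch below shows the generic (non-fast) version of the idea: estimate the covariance of image blocks, project onto the leading eigenvectors, reconstruct, and report the mean square error and signal-to-noise ratio used as criteria above. It is an illustration on synthetic data, not the fast algorithm or the ERTS imagery from the report.

    import numpy as np

    rng = np.random.default_rng(0)
    image = rng.random((64, 64))            # stand-in for one band of an image

    # Collect 8x8 blocks as 64-element vectors.
    blocks = np.array([image[r:r + 8, c:c + 8].ravel()
                       for r in range(0, 64, 8) for c in range(0, 64, 8)])
    mean = blocks.mean(axis=0)
    cov = np.cov(blocks - mean, rowvar=False)

    # KL basis = eigenvectors of the block covariance, ordered by eigenvalue.
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    basis = eigvecs[:, order[:16]]          # keep 16 of 64 coefficients

    coeffs = (blocks - mean) @ basis
    recon = coeffs @ basis.T + mean

    mse = np.mean((blocks - recon) ** 2)
    snr_db = 10 * np.log10(np.mean(blocks ** 2) / mse)
    print(f"MSE = {mse:.5f}, SNR = {snr_db:.1f} dB")

The "fast" variant in the report avoids the cost of the full eigen-decomposition; the compression and error metrics are the same.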
ACDOS2: an improved neutron-induced dose rate code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lagache, J.C.
1981-06-01
To calculate the expected dose rate from fusion reactors as a function of geometry, composition, and time after shutdown a computer code, ACDOS2, was written, which utilizes up-to-date libraries of cross-sections and radioisotope decay data. ACDOS2 is in ANSI FORTRAN IV, in order to make it readily adaptable elsewhere.
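As background to what such a code must evaluate, the activity of each induced nuclide follows the standard activation-decay relation A = N·σ·φ·(1 - e^(-λ t_irr))·e^(-λ t_cool), after which the dose rate follows from the photon emissions per decay and the geometry. The sketch below shows only this textbook ingredient with illustrative numbers; it is not ACDOS2's algorithm or its library data.

    import math

    def activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s, t_cool_s):
        """Induced activity (Bq) for one activation product."""
        lam = math.log(2.0) / half_life_s
        saturation = 1.0 - math.exp(-lam * t_irr_s)        # buildup during irradiation
        return n_atoms * sigma_cm2 * flux * saturation * math.exp(-lam * t_cool_s)

    # Example: 1 mol of target atoms, 1 barn cross section, 1e14 n/cm^2/s flux,
    # a 10-hour half-life, one week of irradiation, one day after shutdown.
    a_bq = activity(6.022e23, 1e-24, 1e14, 10 * 3600, 7 * 86400, 86400)
    print(f"activity ~ {a_bq:.3e} Bq")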
46 CFR 164.019-3 - Definitions.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Guard-approved PFDs. Commandant means the Chief of the Lifesaving and Fire Safety Division, Office of... Performance type code and PFD type acceptable for use: 1 - Types I, II, and III; 2 - Types II and III; 3 - Type III; 4B - Type IV (all ring buoys); 4BC - Type IV (buoyant cushions); 4RB - Type IV (recreational ring buoys only); 5 - wearable Type V (intended use must be...
User's manual: Subsonic/supersonic advanced panel pilot code
NASA Technical Reports Server (NTRS)
Moran, J.; Tinoco, E. N.; Johnson, F. T.
1978-01-01
Sufficient instructions for running the subsonic/supersonic advanced panel pilot code were developed. This software was developed as a vehicle for numerical experimentation and should not be construed as a finished production program. The pilot code is based on a higher-order panel method using linearly varying source and quadratically varying doublet distributions for computing both linearized supersonic and subsonic flow over arbitrary wings and bodies. This user's manual contains complete input and output descriptions. A brief description of the method is given, as well as practical instructions for proper configuration modeling. Computed results are also included to demonstrate some of the capabilities of the pilot code. The computer program is written in FORTRAN IV for the SCOPE 3.4.4 operating system of the Ames CDC 7600 computer. The program uses an overlay structure and thirteen disk files, and it requires approximately 132000 (octal) central memory words.
Premendran, S. Jhon; Salwe, Kartik J.; Pathak, Swanand; Brahmane, Ranjana; Manimekalai, K.
2011-01-01
Background: To investigate the anti-cobra venom effect of an alcoholic extract of Andrographis paniculata. Materials and Methods: After determining the LD99 of the snake venom, the venom-neutralizing ability of the plant extract at doses of 1 g/kg and 2 g/kg was assessed using in vitro and in vivo methods. The prolongation of the mean survival time of the animals after challenge with the LD99 of snake venom was used to infer the antivenom property of the drug. Results: The ethanolic extract of A. paniculata significantly increased mean survival time and the protection fold, but could not protect animals from death when used alone. The higher dose (2 g/kg) was found to be better than the lower dose. ASV was found to be more effective than the plant extract, and when ASV was given along with the plant extract, the extract potentiated its effect. Conclusion: The observations demonstrate an anti-cobra venom activity of the ethanolic extract of A. paniculata that is comparable with ASV. PMID:22346236
Mirza, Zeenat; Pillai, Vikram Gopalakrishna; Zhong, Wei-Zhu
2014-03-10
Alzheimer's disease (AD) is one of the most significant social and health burdens of the present century. Plaques formed by extracellular deposits of amyloid β (Aβ) are the prime player of AD's neuropathology. Studies have implicated the varied role of phospholipase A2 (PLA2) in brain where it contributes to neuronal growth and inflammatory response. Overall contour and chemical nature of the substrate-binding channel in the low molecular weight PLA2s are similar. This study involves the reductionist fragment-based approach to understand the structure adopted by N-terminal fragment of Alzheimer's Aβ peptide in its complex with PLA2. In the current communication, we report the structure determined by X-ray crystallography of N-terminal sequence Asp-Ala-Glu-Phe-Arg-His-Asp-Ser (DAEFRHDS) of Aβ-peptide with a Group I PLA2 purified from venom of Andaman Cobra sub-species Naja naja sagittifera at 2.0 Å resolution (Protein Data Bank (PDB) Code: 3JQ5). This is probably the first attempt to structurally establish interaction between amyloid-β peptide fragment and hydrophobic substrate binding site of PLA2 involving H bond and van der Waals interactions. We speculate that higher affinity between Aβ and PLA2 has the therapeutic potential of decreasing the Aβ-Aβ interaction, thereby reducing the amyloid aggregation and plaque formation in AD.
van Tuyl, Lilian H D; Plass, Anne Marie C; Lems, Willem F; Voskuyl, Alexandre E; Dijkmans, Ben A C; Boers, Maarten
2007-01-01
Background The Combinatietherapie Bij Reumatoide Artritis (COBRA) trial has proved that combination therapy with prednisolone, methotrexate and sulphasalazine is superior to sulphasalazine monotherapy in suppressing disease activity and radiological progression of early rheumatoid arthritis (RA). In addition, 5 years of follow‐up proved that COBRA therapy results in sustained reduction of the rate of radiological progression. Despite this evidence, Dutch rheumatologists seem reluctant to prescribe COBRA therapy. Objective To explore the reasons for the reluctance in Dutch rheumatologists to prescribe COBRA therapy. Methods A short structured questionnaire based on social–psychological theories of behaviour was sent to all Dutch rheumatologists (n = 230). Results The response rate was 50%. COBRA therapy was perceived as both effective and safe, but complex to administer. Furthermore, rheumatologists expressed their concern about the large number of pills that had to be taken, the side effects of high‐dose prednisolone and the low dose of methotrexate. Although the average attitude towards the COBRA therapy was slightly positive (above the neutral point), the majority of responding rheumatologists had a negative intention (below the neutral point) to prescribe COBRA therapy in the near future. Conclusion The reluctance of Dutch rheumatologists to prescribe effective COBRA therapy may be due to perceptions of complexity of the treatment schedule and negative patient‐related consequences of the therapy. PMID:17392349
2012-01-01
Background Extensive studies have demonstrated that the COBRA gene is critical for biosynthesis of cell wall constituents comprising structural tissues of roots, stalks, leaves and other vegetative organs, however, its role in fruit development and ripening remains largely unknown. Results We identified a tomato gene (SlCOBRA-like) homologous to Arabidopsis COBRA, and determined its role in fleshy fruit biology. The SlCOBRA-like gene is highly expressed in vegetative organs and in early fruit development, but its expression in fruit declines dramatically during ripening stages, implying a primary role in early fruit development. Fruit-specific suppression of SlCOBRA-like resulted in impaired cell wall integrity and up-regulation of genes encoding proteins involved in cell wall degradation during early fruit development. In contrast, fruit-specific overexpression of SlCOBRA-like resulted in increased wall thickness of fruit epidermal cells, more collenchymatous cells beneath the epidermis, elevated levels of cellulose and reduced pectin solubilization in the pericarp cells of red ripe fruits. Moreover, transgenic tomato fruits overexpressing SlCOBRA-like exhibited desirable early development phenotypes including enhanced firmness and a prolonged shelf life. Conclusions Our results suggest that SlCOBRA-like plays an important role in fruit cell wall architecture and provides a potential genetic tool for extending the shelf life of tomato and potentially additional fruits. PMID:23140186
NASA Technical Reports Server (NTRS)
Lilley, D. G.; Rhode, D. L.
1982-01-01
A primitive pressure-velocity variable finite difference computer code was developed to predict swirling recirculating inert turbulent flows in axisymmetric combustors in general, and for application to a specific idealized combustion chamber with sudden or gradual expansion. The technique involves a staggered grid system for axial and radial velocities, a line relaxation procedure for efficient solution of the equations, a two-equation k-epsilon turbulence model, a stairstep boundary representation of the expansion flow, and realistic accommodation of swirl effects. A user's manual, dealing with the computational problem, showing how the mathematical basis and computational scheme may be translated into a computer program is presented. A flow chart, FORTRAN IV listing, notes about various subroutines and a user's guide are supplied as an aid to prospective users of the code.
2015-06-01
... events was ad hoc and problematic due to time constraints and changing requirements. Determining errors in context and heuristics required expertise... Data reduction for analysis of Command, Control, Communications, and Computer (C4) network tests...
26 CFR 1.6042-3 - Dividends subject to reporting.
Code of Federal Regulations, 2012 CFR
2012-04-01
... documentation of foreign status and definition of U.S. payor and non-U.S. payor) shall apply. The provisions of... the Internal Revenue Code (Code). (iv) Distributions or payments from sources outside the United States (as determined under the provisions of part I, subchapter N, chapter 1 of the Code and the...
Swertz, Morris A; De Brock, E O; Van Hijum, Sacha A F T; De Jong, Anne; Buist, Girbe; Baerends, Richard J S; Kok, Jan; Kuipers, Oscar P; Jansen, Ritsert C
2004-09-01
Genomic research laboratories need adequate infrastructure to support management of their data production and research workflow. But what makes infrastructure adequate? A lack of appropriate criteria makes any decision on buying or developing a system difficult. Here, we report on the decision process for the case of a molecular genetics group establishing a microarray laboratory. Five typical requirements for experimental genomics database systems were identified: (i) evolution ability to keep up with the fast developing genomics field; (ii) a suitable data model to deal with local diversity; (iii) suitable storage of data files in the system; (iv) easy exchange with other software; and (v) low maintenance costs. The computer scientists and the researchers of the local microarray laboratory considered alternative solutions for these five requirements and chose the following options: (i) use of automatic code generation; (ii) a customized data model based on standards; (iii) storage of datasets as black boxes instead of decomposing them in database tables; (iv) loosely linking to other programs for improved flexibility; and (v) a low-maintenance web-based user interface. Our team evaluated existing microarray databases and then decided to build a new system, Molecular Genetics Information System (MOLGENIS), implemented using code generation in a period of three months. This case can provide valuable insights and lessons to both software developers and a user community embarking on large-scale genomic projects. http://www.molgenis.nl
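The "automatic code generation" choice in point (i) can be pictured with a toy generator: a data model is declared once and the repetitive data-access code is produced from it. The sketch below illustrates the concept only; the entity names are invented, and this is not the MOLGENIS generator or its model format.

    # Toy model: entity name -> list of fields (invented for illustration).
    model = {
        "Hybridization": ["name", "array_type", "sample_id", "date"],
        "MeasurementSet": ["name", "hybridization", "data_file"],
    }

    def generate_class(entity, fields):
        """Emit Python source for a simple value-object class."""
        lines = [f"class {entity}:",
                 f"    def __init__(self, {', '.join(fields)}):"]
        lines += [f"        self.{f} = {f}" for f in fields]
        return "\n".join(lines)

    generated = "\n\n".join(generate_class(e, f) for e, f in model.items())
    print(generated)        # a real generator would write this to a module/database layer
    exec(generated)         # make the generated classes available in this session
    h = Hybridization("hyb-001", "two-color", "S42", "2004-09-01")

The payoff of this approach is that when the data model changes, the boilerplate is regenerated rather than hand-edited, which is what keeps maintenance costs low in a fast-moving field.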
Proceduracy: Computer Code Writing in the Continuum of Literacy
ERIC Educational Resources Information Center
Vee, Annette
2010-01-01
This dissertation looks at computer programming through the lens of literacy studies, building from the concept of code as a written text with expressive and rhetorical power. I focus on the intersecting technological and social factors of computer code writing as a literacy--a practice I call "proceduracy". Like literacy, proceduracy is a human…
Chanhome, Lawan; Khow, Orawan; Omori-Satoh, Tamotsu; Sitprija, Visith
2003-06-01
King cobra (Ophiophagus hannah) serum was found to possess antihemorrhagic activity against king cobra hemorrhagin. The activity was stronger than that in commercial king cobra antivenom. An antihemorrhagin has been purified by ion exchange chromatography, affinity chromatography and gel filtration with a 22-fold purification and an overall yield of 12% of the total antihemorrhagic activity contained in crude serum. The purified antihemorrhagin was homogeneous in disc-PAGE and SDS-PAGE. Its apparent molecular weight determined by SDS-PAGE was 120 kDa. The antihemorrhagin was also active against other hemorrhagic snake venoms obtained in Thailand and Japan such as Calloselasma rhodostoma, Trimeresurus albolabris, Trimeresurus macrops and Trimeresurus flavoviridis (Japanese Habu). It inhibited the proteolytic activity of king cobra venom. It is an acid- and thermolabile protein and does not form precipitin lines against king cobra venom.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salko, Robert K; Sung, Yixing; Kucukboyaci, Vefa
The Virtual Environment for Reactor Applications core simulator (VERA-CS) being developed by the Consortium for the Advanced Simulation of Light Water Reactors (CASL) includes coupled neutronics, thermal-hydraulics, and fuel temperature components with an isotopic depletion capability. The neutronics capability employed is based on MPACT, a three-dimensional (3-D) whole-core transport code. The thermal-hydraulics and fuel temperature models are provided by the COBRA-TF (CTF) subchannel code. As part of the CASL development program, the VERA-CS (MPACT/CTF) code system was applied to model and simulate reactor core response with respect to departure from nucleate boiling ratio (DNBR) at the limiting time step of a postulated pressurized water reactor (PWR) main steamline break (MSLB) event initiated at hot zero power (HZP), either with offsite power available and the reactor coolant pumps in operation (high-flow case) or without offsite power, where the reactor core is cooled through natural circulation (low-flow case). The VERA-CS simulation was based on core boundary conditions from the RETRAN-02 system transient calculations and STAR-CCM+ computational fluid dynamics (CFD) core inlet distribution calculations. The evaluation indicated that the VERA-CS code system is capable of modeling and simulating quasi-steady-state reactor core response under the steamline break (SLB) accident condition, that the results are insensitive to uncertainties in the inlet flow distributions from the CFD simulations, and that the high-flow case is more DNB limiting than the low-flow case.
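For orientation, DNBR is the ratio of the predicted critical heat flux to the local heat flux, and the limiting value is its minimum over the core. The sketch below only illustrates that ratio with invented flux profiles; it is not the CTF or VERA-CS calculation.

```python
import numpy as np

# Illustrative axial profiles (W/m^2); values are made up, not VERA-CS output.
z = np.linspace(0.0, 3.66, 50)                      # axial position [m]
q_local = 6.0e5 * np.sin(np.pi * z / 3.66) + 1.0e5  # local heat flux
q_chf = 1.2e6 - 1.5e5 * (z / 3.66)                  # assumed critical heat flux

dnbr = q_chf / q_local
print(f"Minimum DNBR = {dnbr.min():.2f} at z = {z[dnbr.argmin()]:.2f} m")
```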
Toyoshima, Kuniyoshi; Fujii, Yutaka; Mitsui, Nobuyuki; Kako, Yuki; Asakura, Satoshi; Martinez-Aran, Anabel; Vieta, Eduard; Kusumi, Ichiro
2017-08-01
In Japan, there are currently no reliable rating scales for the evaluation of subjective cognitive impairment in patients with bipolar disorder. We studied the relationship between the Japanese version of the Cognitive Complaints in Bipolar Disorder Rating Assessment (COBRA) and objective cognitive assessments in patients with bipolar disorder. We further assessed the reliability and validity of the COBRA. Forty-one patients, aged 16-64, in a remission period of bipolar disorder were recruited from Hokkaido University Hospital in Sapporo, Japan. The COBRA (Japanese version) and Frankfurt Complaint Questionnaire (FCQ), the gold standard in subjective cognitive assessment, were administered. A battery of neuropsychological tests was employed to measure objective cognitive impairment. Correlations among the COBRA, FCQ, and neuropsychological tests were determined using Spearman's correlation coefficient. The Japanese version of the COBRA had high internal consistency, good retest reliability, and concurrent validity-as indicated by a strong correlation with the FCQ. A significant correlation was also observed between the COBRA and objective cognitive measurements of processing speed. These findings are the first to demonstrate that the Japanese version of the COBRA may be clinically useful as a subjective cognitive impairment rating scale in Japanese patients with bipolar disorder. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
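The correlation analysis described above can be reproduced in outline with SciPy's Spearman rank correlation; the scores below are hypothetical and are not the study data.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical scores for illustration only (not patient data):
cobra_scores = np.array([10, 25, 14, 30, 8, 22, 18, 27])          # subjective complaints
processing_speed = np.array([55, 38, 50, 33, 60, 41, 46, 36])     # objective measure

rho, p_value = spearmanr(cobra_scores, processing_speed)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```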
Suntrarachun, S; Pakmanee, N; Tirawatnapong, T; Chanhome, L; Sitprija, V
2001-07-01
A PCR technique was used in this study to identify monocellate cobra snake bites and distinguish them from bites by other common Thai snakes, using snake venoms and swab specimens from snake bite-sites in mice. The sequences of the nucleotide primers were selected from the cobrotoxin-encoding gene of the Chinese cobra (Naja atra), since the sequences of monocellate cobra (Naja kaouthia) venom are still unknown. However, the 113-bp fragment of cDNA of the cobrotoxin-encoding gene was detected in the monocellate cobra venom using RT-PCR. This gene was not found in the venoms of Ophiophagus hannah (king cobra), Bungarus fasciatus (banded krait), Daboia russelii siamensis (Siamese Russell's viper), or Calloselasma rhodostoma (Malayan pit viper). Moreover, direct PCR could detect a 665-bp fragment of the cobrotoxin-encoding gene in the monocellate cobra venom but not in the other snake venoms. Likewise, this gene was only observed in swab specimens from cobra snake bite-sites in mice. This is the first report demonstrating the ability of PCR to detect the cobrotoxin-encoding gene from snake venoms and swab specimens. Further studies are required for identification of this and other snakes from the bite-sites on human skin.
26 CFR 54.4980B-5 - COBRA continuation coverage.
Code of Federal Regulations, 2010 CFR
2010-04-01
... (for example, because of a divorce), the family deductible may be computed separately for each... the year. The plan provides that upon the divorce of a covered employee, coverage will end immediately... family had accumulated $420 of covered expenses before the divorce, as follows: $70 by each parent, $200...
Concentrating Solar Power Projects - Andasol-2 | Concentrating Solar Power
Project data (excerpt): ACS/Cobra Group; Owner(s) (%): ACS/Cobra Group (100%); EPC Contractor: UTE CT Andasol-2: Cobra (80%) and ...; (Model): UTE CT Andasol-2 (SKAL-ET); Mirror Manufacturer (Model): Flabeg (RP3); # of Heat Collector ...
ter Wee, Marieke M; den Uyl, Debby; Boers, Maarten; Kerstens, Pit; Nurmohamed, Mike; van Schaardenburg, Dirkjan; Voskuyl, Alexandre E; Lems, Willem F
2015-06-01
Recently, we documented the likely non-inferiority of Combinatietherapie Bij Reumatoïde Artritis (COBRA)-light therapy (methotrexate increased to 25 mg/week with initial prednisolone 30 mg/day) compared with the original COBRA therapy (methotrexate 7.5 mg/week, sulfasalazine 2 g/day, with initial prednisolone 60 mg/day) after 26 weeks in patients with early active rheumatoid arthritis (RA). To assess the non-inferiority of COBRA-light versus COBRA after 1 year in terms of disease activity (DAS44), functional outcome (Health Assessment Questionnaire (HAQ)) and radiographic progression (Sharp/van der Heijde score (SHS)), and to assess the effect of adding etanercept. An open-label, randomised controlled, non-inferiority trial of 162 patients with active early RA, following a treat-to-target protocol incorporating the addition of etanercept if DAS44 ≥1.6 at weeks 26 or 39. Both groups showed major improvements in DAS44 after 52 weeks: mean (SD) -2.41 (1.2) in the COBRA and -2.02 (1.0) in the COBRA-light group (p=ns). In both groups, functional ability improved and radiological progression of joints was minimal. At least one adverse event was reported in 96% of the patients in both groups. In total, 25 serious adverse events occurred: 9 vs 16 in COBRA and COBRA-light, respectively. Treatment actually instituted was often less intensive than required by the protocol: of the total population, 108 patients (67%) required etanercept (more in the COBRA-light group), but only 67 of these (62%) actually received it. Intensive COBRA or COBRA-light therapy has major, comparably favourable effects on disease activity, functional ability and radiological outcome after 1 year in patients with early RA. Protocolised addition of etanercept was often not implemented by treating rheumatologists, and patients receiving it appeared to have limited added benefit, probably because of low disease activity levels at its initiation. ISRCTN55552928. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
An emulator for minimizing computer resources for finite element analysis
NASA Technical Reports Server (NTRS)
Melosh, R.; Utku, S.; Islam, M.; Salama, M.
1984-01-01
A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
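The calibration idea behind SCOPE, fitting measured resource usage once and then predicting new runs, can be sketched as a simple least-squares fit. The power-law form, the data points, and the "degrees of freedom" predictor below are assumptions for illustration, not SCOPE's actual model.

```python
import numpy as np

# Hypothetical calibration runs: (problem size in degrees of freedom, CPU seconds).
dof = np.array([500, 1000, 2000, 4000, 8000], dtype=float)
cpu = np.array([1.1, 2.3, 5.0, 11.2, 24.8])

# Fit cpu ~ a * dof**b by linear least squares in log space.
b, log_a = np.polyfit(np.log(dof), np.log(cpu), 1)
predict = lambda n: np.exp(log_a) * n ** b
print(f"Predicted CPU for a 16000-DOF model: {predict(16000.0):.1f} s")
```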
Computationally Guided Photothermal Tumor Therapy Using Long-Circulating Gold Nanorod Antennas
Maltzahn, Geoffrey von; Park, Ji-Ho; Agrawal, Amit; Bandaru, Nanda Kishor; Das, Sarit K.; Sailor, Michael J.; Bhatia, Sangeeta N.
2009-01-01
Plasmonic nanomaterials have the opportunity to considerably improve the specificity of cancer ablation by i.v. homing to tumors and acting as antennas for accepting externally applied energy. Here, we describe an integrated approach to improved plasmonic therapy composed of multimodal nanomaterial optimization and computational irradiation protocol development. We synthesized polyethylene glycol (PEG)-protected gold nanorods (NR) that exhibit superior spectral bandwidth, photothermal heat generation per gram of gold, and circulation half-life in vivo (t1/2, ~17 hours) compared with the prototypical tunable plasmonic particles, gold nanoshells, as well as ~2-fold higher X-ray absorption than a clinical iodine contrast agent. After intratumoral or i.v. administration, we fuse PEG-NR biodistribution data derived via noninvasive X-ray computed tomography or ex vivo spectrometry, respectively, with four-dimensional computational heat transport modeling to predict photothermal heating during irradiation. In computationally driven pilot therapeutic studies, we show that a single i.v. injection of PEG-NRs enabled destruction of all irradiated human xenograft tumors in mice. These studies highlight the potential of integrating computational therapy design with nanotherapeutic development for ultraselective tumor ablation. PMID:19366797
The king cobra genome reveals dynamic gene evolution and adaptation in the snake venom system.
Vonk, Freek J; Casewell, Nicholas R; Henkel, Christiaan V; Heimberg, Alysha M; Jansen, Hans J; McCleary, Ryan J R; Kerkkamp, Harald M E; Vos, Rutger A; Guerreiro, Isabel; Calvete, Juan J; Wüster, Wolfgang; Woods, Anthony E; Logan, Jessica M; Harrison, Robert A; Castoe, Todd A; de Koning, A P Jason; Pollock, David D; Yandell, Mark; Calderon, Diego; Renjifo, Camila; Currier, Rachel B; Salgado, David; Pla, Davinia; Sanz, Libia; Hyder, Asad S; Ribeiro, José M C; Arntzen, Jan W; van den Thillart, Guido E E J M; Boetzer, Marten; Pirovano, Walter; Dirks, Ron P; Spaink, Herman P; Duboule, Denis; McGlinn, Edwina; Kini, R Manjunatha; Richardson, Michael K
2013-12-17
Snakes are limbless predators, and many species use venom to help overpower relatively large, agile prey. Snake venoms are complex protein mixtures encoded by several multilocus gene families that function synergistically to cause incapacitation. To examine venom evolution, we sequenced and interrogated the genome of a venomous snake, the king cobra (Ophiophagus hannah), and compared it, together with our unique transcriptome, microRNA, and proteome datasets from this species, with data from other vertebrates. In contrast to the platypus, the only other venomous vertebrate with a sequenced genome, we find that snake toxin genes evolve through several distinct co-option mechanisms and exhibit surprisingly variable levels of gene duplication and directional selection that correlate with their functional importance in prey capture. The enigmatic accessory venom gland shows a very different pattern of toxin gene expression from the main venom gland and seems to have recruited toxin-like lectin genes repeatedly for new nontoxic functions. In addition, tissue-specific microRNA analyses suggested the co-option of core genetic regulatory components of the venom secretory system from a pancreatic origin. Although the king cobra is limbless, we recovered coding sequences for all Hox genes involved in amniote limb development, with the exception of Hoxd12. Our results provide a unique view of the origin and evolution of snake venom and reveal multiple genome-level adaptive responses to natural selection in this complex biological weapon system. More generally, they provide insight into mechanisms of protein evolution under strong selection.
Cobra Probes Containing Replaceable Thermocouples
NASA Technical Reports Server (NTRS)
Jones, John; Redding, Adam
2007-01-01
A modification of the basic design of cobra probes provides for relatively easy replacement of broken thermocouples. Cobra probes are standard tube-type pressure probes that may also contain thermocouples and that are routinely used in wind tunnels and aeronautical hardware. They are so named because in side views, they resemble a cobra poised to attack. Heretofore, there has been no easy way to replace a broken thermocouple in a cobra probe: instead, it has been necessary to break the probe apart and then rebuild it, typically at a cost between $2,000 and $4,000 (2004 prices). The modified design makes it possible to replace the thermocouple, in minimal time and at relatively low cost, by inserting new thermocouple wire in a tube.
Sankarasubramanian, Jagadesan; Vishnu, Udayakumar S; Dinakaran, Vasudevan; Sridhar, Jayavel; Gunasekaran, Paramasamy; Rajendhran, Jeyaprakash
2016-01-01
Brucella spp. are facultative intracellular pathogens that cause brucellosis in various mammals including humans. Brucella survive inside the host cells by forming vacuoles and subverting host defence systems. This study was aimed to predict the secretion systems and the secretomes of Brucella spp. from 39 complete genome sequences available in the databases. Furthermore, an attempt was made to identify the type IV secretion effectors and their interactions with host proteins. We predicted the secretion systems of Brucella by the KEGG pathway and SecReT4. Brucella secretomes and type IV effectors (T4SEs) were predicted through genome-wide screening using JVirGel and S4TE, respectively. Protein-protein interactions of Brucella T4SEs with their hosts were analyzed by HPIDB 2.0. Genes coding for Sec and Tat pathways of secretion and type I (T1SS), type IV (T4SS) and type V (T5SS) secretion systems were identified and they are conserved in all the species of Brucella. In addition to the well-known VirB operon coding for the type IV secretion system (T4SS), we have identified the presence of additional genes showing homology with T4SS of other organisms. On the whole, 10.26 to 14.94% of total proteomes were found to be either secreted (secretome) or membrane associated (membrane proteome). Approximately, 1.7 to 3.0% of total proteomes were identified as type IV secretion effectors (T4SEs). Prediction of protein-protein interactions showed 29 and 36 host-pathogen specific interactions between Bos taurus (cattle)-B. abortus and Ovis aries (sheep)-B. melitensis, respectively. Functional characterization of the predicted T4SEs and their interactions with their respective hosts may reveal the secrets of host specificity of Brucella.
Quantum error-correcting code for ternary logic
NASA Astrophysics Data System (ADS)
Majumdar, Ritajit; Basu, Saikat; Ghosh, Shibashis; Sur-Kolay, Susmita
2018-05-01
Ternary quantum systems are being studied because they provide more computational state space per unit of information, known as a qutrit. A qutrit has three basis states, thus a qubit may be considered as a special case of a qutrit where the coefficient of one of the basis states is zero. Hence both (2×2)-dimensional and (3×3)-dimensional Pauli errors can occur on qutrits. In this paper, we (i) explore the possible (2×2)-dimensional as well as (3×3)-dimensional Pauli errors in qutrits and show that any pairwise bit swap error can be expressed as a linear combination of shift errors and phase errors, (ii) propose a special type of error called a quantum superposition error and show its equivalence to arbitrary rotation, (iii) formulate a nine-qutrit code which can correct a single error in a qutrit, and (iv) provide its stabilizer and circuit realization.
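To make the shift-error and phase-error terminology concrete, the standard generalized Pauli operators for a single qutrit can be written down directly. The snippet below only verifies their basic algebra; it is not the nine-qutrit code construction from the paper.

```python
import numpy as np

# Generalized Pauli (shift and phase) operators for a qutrit (d = 3).
d = 3
omega = np.exp(2j * np.pi / d)

X = np.roll(np.eye(d), 1, axis=0)            # shift: |j> -> |j+1 mod 3>
Z = np.diag([omega ** j for j in range(d)])  # phase: |j> -> omega**j |j>

assert np.allclose(np.linalg.matrix_power(X, d), np.eye(d))
assert np.allclose(np.linalg.matrix_power(Z, d), np.eye(d))
# Weyl commutation relation: Z X = omega * X Z
assert np.allclose(Z @ X, omega * (X @ Z))
```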
SIVEH: numerical computing simulation of wireless energy-harvesting sensor nodes.
Sanchez, Antonio; Blanc, Sara; Climent, Salvador; Yuste, Pedro; Ors, Rafael
2013-09-04
The paper presents a numerical energy harvesting model for sensor nodes, SIVEH (Simulator I-V for EH), based on I-V hardware tracking. I-V tracking is demonstrated to be more accurate than traditional energy modeling techniques when some of the components present different power dissipation at either different operating voltages or drawn currents. SIVEH numerical computing allows fast simulation of long periods of time (days, weeks, months or years) using real solar radiation curves. Moreover, SIVEH modeling has been enhanced with dynamic adjustment of the sleep time rate, while seeking energy-neutral operation. This paper presents the model description, a functional verification and a critical comparison with the classic energy approach.
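A minimal sketch of the I-V tracking idea, interpolating the harvester current at the node's operating voltage and integrating power over a day, is shown below. The I-V points, irradiance profile, and operating voltage are invented and do not reproduce SIVEH's model.

```python
import numpy as np

# Assumed I-V curve of a small solar panel (illustrative values).
v_pts = np.array([0.0, 1.0, 2.0, 2.5, 2.8, 3.0])               # voltage [V]
i_pts = np.array([50.0, 49.0, 46.0, 40.0, 25.0, 0.0]) * 1e-3   # current [A]

def harvested_power(v_op, irradiance_frac):
    """Power delivered at operating voltage v_op, with current scaled by the
    instantaneous irradiance fraction (0..1); a crude stand-in for tracking."""
    i_op = np.interp(v_op, v_pts, i_pts) * irradiance_frac
    return v_op * i_op

t = np.linspace(0, 24 * 3600, 24 * 60)  # one day at 1-minute resolution
irradiance = np.clip(np.sin(np.pi * (t - 6 * 3600) / (12 * 3600)), 0, None)
p = np.array([harvested_power(2.5, g) for g in irradiance])
print(f"Harvested energy over the day: {np.trapz(p, t):.1f} J")
```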
Simulation of electron transport across charged grain boundaries
NASA Astrophysics Data System (ADS)
Srikant, V.; Clarke, D. R.; Evans, P. V.
1996-09-01
The I-V (current density-electric field) characteristics of low-angle grain boundaries consisting of periodic arrays of charged dislocations are computed using a quasiclassical molecular dynamics approach. Below a critical value of the grain boundary misorientation, the computed I-V characteristics are linear whereas above they are nonlinear. The degree of nonlinearity and the voltage onset of nonlinearity are found to be dependent on the grain boundary misorientation.
Computer Description of the M561 Utility Truck
1984-10-01
GIFT computer code; sustainability predictions for Army Spare Components Requirements for Combat (SPARC) ... used as input to the GIFT computer code to generate target vulnerability data ... The analysis requires input from the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom ...
26 CFR 301.6331-2 - Procedures and restrictions on levies.
Code of Federal Regulations, 2010 CFR
2010-04-01
... certified mail to the taxpayer's last known address. For further guidance regarding the definition of last...— (i) The Internal Revenue Code provisions and the procedures relating to levy and sale of property... (including the use of an installment agreement under section 6159); and (iv) The Internal Revenue Code...
26 CFR 301.6331-2 - Procedures and restrictions on levies.
Code of Federal Regulations, 2011 CFR
2011-04-01
... certified mail to the taxpayer's last known address. For further guidance regarding the definition of last...— (i) The Internal Revenue Code provisions and the procedures relating to levy and sale of property... (including the use of an installment agreement under section 6159); and (iv) The Internal Revenue Code...
26 CFR 301.6331-2 - Procedures and restrictions on levies.
Code of Federal Regulations, 2012 CFR
2012-04-01
... certified mail to the taxpayer's last known address. For further guidance regarding the definition of last...— (i) The Internal Revenue Code provisions and the procedures relating to levy and sale of property... (including the use of an installment agreement under section 6159); and (iv) The Internal Revenue Code...
26 CFR 301.6331-2 - Procedures and restrictions on levies.
Code of Federal Regulations, 2014 CFR
2014-04-01
... certified mail to the taxpayer's last known address. For further guidance regarding the definition of last...— (i) The Internal Revenue Code provisions and the procedures relating to levy and sale of property... (including the use of an installment agreement under section 6159); and (iv) The Internal Revenue Code...
40 CFR 147.200 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the program administered by the Arkansas Department of Pollution Control and Ecology approved by EPA... Code, Department of Pollution Control and Ecology, promulgated January 22, 1982; (4) General Rule and... Management Code, Department of Pollution Control and Ecology, promulgated August 21, 1981. (b) The Memorandum...
40 CFR 147.200 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the program administered by the Arkansas Department of Pollution Control and Ecology approved by EPA... Code, Department of Pollution Control and Ecology, promulgated January 22, 1982; (4) General Rule and... Management Code, Department of Pollution Control and Ecology, promulgated August 21, 1981. (b) The Memorandum...
40 CFR 147.200 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2014 CFR
2014-07-01
... the program administered by the Arkansas Department of Pollution Control and Ecology approved by EPA... Code, Department of Pollution Control and Ecology, promulgated January 22, 1982; (4) General Rule and... Management Code, Department of Pollution Control and Ecology, promulgated August 21, 1981. (b) The Memorandum...
40 CFR 147.200 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2012 CFR
2012-07-01
... the program administered by the Arkansas Department of Pollution Control and Ecology approved by EPA... Code, Department of Pollution Control and Ecology, promulgated January 22, 1982; (4) General Rule and... Management Code, Department of Pollution Control and Ecology, promulgated August 21, 1981. (b) The Memorandum...
40 CFR 147.200 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2013 CFR
2013-07-01
... the program administered by the Arkansas Department of Pollution Control and Ecology approved by EPA... Code, Department of Pollution Control and Ecology, promulgated January 22, 1982; (4) General Rule and... Management Code, Department of Pollution Control and Ecology, promulgated August 21, 1981. (b) The Memorandum...
76 FR 51963 - Cobra Pipeline Ltd.; Notice of Baseline Filings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-19
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-122-000] Cobra Pipeline Ltd.; Notice of Baseline Filings Take notice that on August 12, 2011, Cobra Pipeline Ltd. submitted a revised baseline filing of their Statement of Operating Conditions for services provided under Section 311...
Simulation Of Combat With An Expert System
NASA Technical Reports Server (NTRS)
Provenzano, J. P.
1989-01-01
Proposed expert system predicts outcomes of combat situations. Called COBRA (Combat Outcome Based on Rules for Attrition), the system selects rules for mathematical modeling of losses and discrete events in combat according to previous experience. It is used with another software module known as the "Game". The Game/COBRA software system, consisting of the Game and COBRA modules, provides for both quantitative and qualitative aspects in simulations of battles. Although COBRA is intended for simulation of large-scale military exercises, the concepts embodied in it have much broader applicability. In industrial research, the knowledge-based system enables qualitative as well as quantitative simulations.
Patient dumping, COBRA, and the public psychiatric hospital.
Elliott, R L
1993-02-01
Serious clinical and risk management problems arise when indigent patients with acute medical conditions are transferred from general medical hospitals or emergency departments to public psychiatric hospitals that are ill equipped to provide medical care. To combat such practices, referred to as dumping, Congress included measures in the Consolidated Omnibus Budget Reconciliation Act of 1985 (COBRA) prohibiting such transfers. Because physicians and administrators in public psychiatric hospitals are generally not aware of the potential usefulness of COBRA in reducing dumping, this paper describes its important provisions. The key to preventing dumping is to educate referral sources to limitations on the medical care available at the receiving hospital and to discourage negligent patient transfers by enforcing COBRA. Public hospital staff and legal counsel who become familiar with COBRA's provisions can develop an antidumping strategy.
Next-generation genome-scale models for metabolic engineering.
King, Zachary A; Lloyd, Colton J; Feist, Adam M; Palsson, Bernhard O
2015-12-01
Constraint-based reconstruction and analysis (COBRA) methods have become widely used tools for metabolic engineering in both academic and industrial laboratories. By employing a genome-scale in silico representation of the metabolic network of a host organism, COBRA methods can be used to predict optimal genetic modifications that improve the rate and yield of chemical production. A new generation of COBRA models and methods is now being developed, encompassing many biological processes and simulation strategies, and these next-generation models enable new types of predictions. Here, three key examples of applying COBRA methods to strain optimization are presented and discussed. Then, an outlook is provided on the next generation of COBRA models and the new types of predictions they will enable for systems metabolic engineering. Copyright © 2014 Elsevier Ltd. All rights reserved.
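A typical COBRA-style prediction of the kind described above can be sketched with the COBRApy package; the SBML file name and the reaction chosen for knockout are placeholders, not a specific published strain design.

```python
import cobra

# Load a genome-scale model from SBML (the path is a placeholder).
model = cobra.io.read_sbml_model("e_coli_core.xml")

# Baseline flux balance analysis: maximize the model's growth objective.
wild_type = model.optimize()
print("Wild-type growth rate:", wild_type.objective_value)

# Predict the effect of a single reaction knockout (reaction ID is illustrative).
with model:  # changes made inside the block are reverted on exit
    model.reactions.get_by_id("PGI").knock_out()
    mutant = model.optimize()
    print("Knockout growth rate:", mutant.objective_value)
```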
The Development of New Atmospheric Models for K and M DwarfStars with Exoplanets
NASA Astrophysics Data System (ADS)
Linsky, Jeffrey L.
2018-01-01
The ultraviolet and X-ray emissions of host stars play critical roles in the survival and chemical composition of the atmospheres of their exoplanets. The need to measure and understand this radiative output, in particular for K and M dwarfs, is the main rationale for computing a new generation of stellar models that includes magnetically heated chromospheres and coronae in addition to their photospheres. We describe our method for computing semi-empirical models that includes solutions of the statistical equilibrium equations for 52 atoms and ions and of the non-LTE radiative transfer equations for all important spectral lines. The code is an offspring of the Solar Radiation Physical Modelling system (SRPM) developed by Fontenla et al. (2007-2015) to compute one-dimensional models in hydrostatic equilibrium to fit high-resolution stellar X-ray to IR spectra. Also included are 20 diatomic molecules and their more than 2 million spectral lines. Our proof-of-concept model is for the M1.5 V star GJ 832 (Fontenla et al. ApJ 830, 154 (2016)). We will fit the line fluxes and profiles of X-ray lines and continua observed by Chandra and XMM-Newton, UV lines observed by the COS and STIS instruments on HST (N V, C IV, Si IV, Si III, Mg II, C II, and O I), optical lines (including Hα, Ca II, Na I), and continua. These models will allow us to compute extreme-UV spectra, which are unobservable but required to predict the hydrodynamic mass-loss rate from exoplanet atmospheres, and to predict panchromatic spectra of new exoplanet host stars discovered after the end of the HST mission. This work is supported by grant HST-GO-15038 from the Space Telescope Science Institute to the Univ. of Colorado.
Ben-Abraham, Ron; Flaishon, Ron; Sotman, Alexander; Ekstein, Perla; Ezri, Tiberiu; Ogorek, Daniel; Weinbroum, Avi A
2008-07-01
The threat of a mass casualty unconventional attack has challenged the medical community to devise means for providing rapid and reliable emergent airway control under chaotic conditions by inexperienced medical personnel dressed in self protective gear. Since endotracheal intubation may not be feasible under those conditions, other extraglottic devices should be considered. We assessed the performance of anesthesia and non-anesthesia residents in inserting the CobraPLA, a supraglottic airway device, on consecutive anesthetized patients, to assess its potential use under simulated conditions. Anesthesia and non-anesthesia residents wearing either surgical scrubs or complete anti-chemical gear inserted the CobraPLA in anesthetized patients. If post-trial positive pressure ventilation via the CobraPLA was unsuccessful, an LMA or endotracheal tube was inserted in its stead. It took anesthesia residents 57+/-23 sec and 43+/-13 sec (P<0.05) to place the CobraPLA while wearing anti-chemical gear and surgical scrubs, respectively. Non-anesthesia residents wearing anti-chemical gear performed worse than anesthetists in their first insertion (73+/-9 sec, P<0.05), but after the brief training period they performed as well as their colleagues anesthetists (58+/-10 sec, P=NS). Post-trial, twenty-one CobraPLA (42%) leaked, preventing adequate positive-pressure ventilation: 13 devices (26% of the total) required replacements. Anti-chemical protective gear slowed the insertion of the CobraPLA by anesthetists, and more so by other residents inexperienced in airway management. In 26% of the cases CobraPLA was inadequate for positive pressure ventilation.
Fibrinogen Recovery in Two Methods of Cryoprecipitate Preparation
1989-08-01
... I would like to extend sincerest appreciation to Dr. Lloyd Lippert, my research advisor. Without his continued guidance ...
SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)
NASA Technical Reports Server (NTRS)
Manteufel, R.
1994-01-01
The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
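The weighted statement-count idea behind SAP can be sketched in a few lines; the keyword set, weights, and crude statement classification below are invented for illustration and are far simpler than SAP's external keyword and weight files.

```python
import re
from collections import Counter

# Illustrative per-statement weights (SAP reads these from a weight file;
# the numbers here are invented).
WEIGHTS = {"IF": 2.0, "DO": 2.0, "GOTO": 3.0, "CALL": 1.5, "ASSIGNMENT": 1.0}

def classify(line):
    """Very crude fixed-form FORTRAN statement classifier."""
    if not line.strip():
        return None
    if line[:1] in ("C", "c", "*", "!"):
        return None                       # full-line comment
    stmt = line.strip().upper()
    for kw in ("IF", "DO", "GOTO", "CALL"):
        if re.match(rf"(\d+\s+)?{kw}\b", stmt):   # optional statement label
            return kw
    return "ASSIGNMENT"                   # everything else lumped together

def complexity(source_lines):
    counts = Counter(filter(None, map(classify, source_lines)))
    figure = sum(WEIGHTS.get(k, 1.0) * n for k, n in counts.items())
    return counts, figure

fortran = [
    "      DO 10 I = 1, N",
    "      IF (A(I) .GT. AMAX) AMAX = A(I)",
    "   10 CONTINUE",
]
print(complexity(fortran))
```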
SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)
NASA Technical Reports Server (NTRS)
Merwarth, P. D.
1994-01-01
The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
A Case Study of IV&V Cost Effectiveness
NASA Technical Reports Server (NTRS)
Neal, Ralph D.; McCaugherty, Dan; Joshi, Tulasi; Callahan, John
1997-01-01
This paper looks at the Independent Verification and Validation (IV&V) of NASA's Space Shuttle Day of Launch I-Load Update (DoLILU) project. IV&V is defined. The system's development life cycle is explained. Data collection and analysis are described. DoLILU Issue Tracking Reports (DITRs) authored by IV&V personnel are analyzed to determine the effectiveness of IV&V in finding errors before the code, testing, and integration phase of the software development life cycle. The study's findings are reported along with the limitations of the study and planned future research.
Recognition of similar epitopes on varicella-zoster virus gpI and gpIV by monoclonal antibodies.
Vafai, A; Wroblewska, Z; Mahalingam, R; Cabirac, G; Wellish, M; Cisco, M; Gilden, D
1988-01-01
Two monoclonal antibodies, MAb43.2 and MAb79.0, prepared against varicella-zoster virus (VZV) proteins were selected to analyze VZV gpIV and gpI, respectively. MAb43.2 reacted only with cytoplasmic antigens, whereas MAb79.0 recognized both cytoplasmic and membrane antigens in VZV-infected cells. Immunoprecipitation of in vitro translation products with MAb43.2 revealed only proteins encoded by the gpIV gene, whereas MAb79.0 precipitated proteins encoded by the gpIV and gpI genes. Pulse-chase analysis followed by immunoprecipitation of VZV-infected cells indicated reactivity of MAb43.2 with three phosphorylated precursor species of gpIV and reactivity of MAb79.0 with the precursor and mature forms of gpI and gpIV. These results indicated that (i) MAb43.2 and MAb79.0 recognize different epitopes on VZV gpIV, (ii) glycosylation of gpIV ablates recognition by MAb43.2, and (iii) gpIV is phosphorylated. To map the binding site of MAb79.0 on gpI, the pGEM transcription vector, containing the coding region of the gpI gene, was linearized, and three truncated gpI DNA fragments were generated. RNA was transcribed from each truncated fragment by using SP6 RNA polymerase, translated in vitro in a rabbit reticulocyte lysate, and immunoprecipitated with MAb79.0 and human sera. The results revealed the existence of an antibody-binding site within 14 amino acid residues located between residues 109 to 123 on the predicted amino acid sequences of gpI. From the predicted amino acid sequences, 14 residues on gpI (residues 107 to 121) displayed a degree of similarity (36%) to two regions (residues 55 to 69 and 245 to 259) of gpIV. Such similarities may account for the binding of MAb79.0 to both VZV gpI and gpIV. PMID:2455814
Ocean Current Effects on Marine Seismic Systems and Deployments.
1982-01-01
ABSTRACT: Upper level and near bottom current measurements were made ... indicated a variable yet generally slow current regime which posed minimal threat of cable entanglement. Current measurements made 5 m off bottom during ... 1. Introduction: Two types of physical oceanographic measurements were supplied by NORDA Code 331 in support of the March-April ...
Duan, Qiangde; Lee, Kuo Hao; Nandre, Rahul M; Garcia, Carolina; Chen, Jianhan; Zhang, Weiping
2017-01-01
Vaccine development often encounters the challenge of virulence heterogeneity. Enterotoxigenic Escherichia coli (ETEC) bacteria producing immunologically heterogeneous virulence factors are a leading cause of children's diarrhea and travelers' diarrhea. Currently, we do not have licensed vaccines against ETEC bacteria. While conventional methods continue to make progress but encounter challenges, new computational and structure-based approaches are being explored to accelerate ETEC vaccine development. In this study, we applied a structural vaccinology concept to construct a structure-based multiepitope fusion antigen (MEFA) to carry representative epitopes of the seven most important ETEC adhesins [CFA/I, CFA/II (CS1–CS3), CFA/IV (CS4–CS6)], simulated the antigenic structure of the CFA/I/II/IV MEFA with computational atomistic modeling and simulation, characterized its immunogenicity in mouse immunization, and examined the potential of structure-informed vaccine design for ETEC vaccine development. A tag-less recombinant MEFA protein (CFA/I/II/IV MEFA) was effectively expressed and extracted. Molecular dynamics simulations indicated that this MEFA immunogen maintained a stable secondary structure and presented epitopes on the protein surface. Empirical data showed that mice immunized with the tag-less CFA/I/II/IV MEFA developed strong antigen-specific antibody responses, and mouse serum antibodies significantly inhibited in vitro adherence of bacteria expressing these seven adhesins. These results revealed congruence of antigen immunogenicity between computational simulation and empirical mouse immunization and indicated that this tag-less CFA/I/II/IV MEFA is potentially an antigen for a broadly protective ETEC vaccine, suggesting a potential application of MEFA-based structural vaccinology for vaccine design against ETEC and likely other pathogens. PMID:28944092
The efficiency of geophysical adjoint codes generated by automatic differentiation tools
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Köhl, A.; Stammer, D.
2016-02-01
The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
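Operator-overloading AD, one of the two tool families compared above, can be illustrated with a tiny forward-mode example in which each value carries a derivative. Production adjoint (reverse-mode) tools are far more involved, so this is only a sketch of the mechanism, not a model of any of the tools named.

```python
import math

class Dual:
    """Minimal forward-mode AD via operator overloading: each value carries
    its derivative with respect to one chosen input."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # Chain rule applied to the elementary operation sin().
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# d/dx of f(x, a) = a * x * sin(x) at x = 1.2 with parameter a = 3.0
x = Dual(1.2, 1.0)   # seed: derivative with respect to x is 1
a = Dual(3.0, 0.0)
f = a * x * sin(x)
print(f.val, f.dot)  # function value and df/dx
```

Source-transformation tools instead generate new derivative code at compile time, which is what gives them the performance advantage discussed above.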
Interactive Graphics Analysis for Aircraft Design
NASA Technical Reports Server (NTRS)
Townsend, J. C.
1983-01-01
Program uses higher-order far field drag minimization. Computer program WDES WDEM preliminary aerodynamic design tool for one or two interacting, subsonic lifting surfaces. Subcritical wing design code employs higher-order far-field drag minimization technique. Linearized aerodynamic theory used. Program written in FORTRAN IV.
26 CFR 54.4980B-5 - COBRA continuation coverage.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 26 Internal Revenue 17 2013-04-01 2013-04-01 false COBRA continuation coverage. 54.4980B-5 Section 54.4980B-5 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) MISCELLANEOUS EXCISE TAXES (CONTINUED) PENSION EXCISE TAXES § 54.4980B-5 COBRA continuation coverage. The following questions-and-answers address the...
26 CFR 54.4980B-5 - COBRA continuation coverage.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 26 Internal Revenue 17 2012-04-01 2012-04-01 false COBRA continuation coverage. 54.4980B-5 Section 54.4980B-5 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) MISCELLANEOUS EXCISE TAXES (CONTINUED) PENSION EXCISE TAXES § 54.4980B-5 COBRA continuation coverage. The following questions-and-answers address the...
26 CFR 54.4980B-5 - COBRA continuation coverage.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 26 Internal Revenue 17 2011-04-01 2011-04-01 false COBRA continuation coverage. 54.4980B-5 Section 54.4980B-5 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) MISCELLANEOUS EXCISE TAXES (CONTINUED) PENSION EXCISE TAXES § 54.4980B-5 COBRA continuation coverage. The following questions-and-answers address the...
26 CFR 54.4980B-5 - COBRA continuation coverage.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 26 Internal Revenue 17 2014-04-01 2014-04-01 false COBRA continuation coverage. 54.4980B-5 Section 54.4980B-5 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) MISCELLANEOUS EXCISE TAXES (CONTINUED) PENSION EXCISE TAXES § 54.4980B-5 COBRA continuation coverage. The following questions-and-answers address the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cuta, Judith M.; Adkins, Harold E.
2013-08-30
As part of the Used Fuel Disposition Campaign of the U.S. Department of Energy, Office of Nuclear Energy (DOE-NE) Fuel Cycle Research and Development, a consortium of national laboratories and industry is performing visual inspections and temperature measurements of selected storage modules at various locations around the United States. This report documents thermal analyses in support of the inspections at the Hope Creek Nuclear Generating Station ISFSI. This site utilizes the HI-STORM 100 vertical storage system developed by Holtec International. This is a vertical storage module design, and the thermal models are being developed using COBRA-SFS (Michener, et al., 1987), a code developed by PNNL for thermal-hydraulic analyses of multi-assembly spent fuel storage and transportation systems. This report describes the COBRA-SFS model in detail, and presents pre-inspection predictions of component temperatures and temperature distributions. The final report will include evaluation of inspection results and, if required, additional post-test calculations, with appropriate discussion of results.
The king cobra genome reveals dynamic gene evolution and adaptation in the snake venom system
Vonk, Freek J.; Casewell, Nicholas R.; Henkel, Christiaan V.; Heimberg, Alysha M.; Jansen, Hans J.; McCleary, Ryan J. R.; Kerkkamp, Harald M. E.; Vos, Rutger A.; Guerreiro, Isabel; Calvete, Juan J.; Wüster, Wolfgang; Woods, Anthony E.; Logan, Jessica M.; Harrison, Robert A.; Castoe, Todd A.; de Koning, A. P. Jason; Pollock, David D.; Yandell, Mark; Calderon, Diego; Renjifo, Camila; Currier, Rachel B.; Salgado, David; Pla, Davinia; Sanz, Libia; Hyder, Asad S.; Ribeiro, José M. C.; Arntzen, Jan W.; van den Thillart, Guido E. E. J. M.; Boetzer, Marten; Pirovano, Walter; Dirks, Ron P.; Spaink, Herman P.; Duboule, Denis; McGlinn, Edwina; Kini, R. Manjunatha; Richardson, Michael K.
2013-01-01
Snakes are limbless predators, and many species use venom to help overpower relatively large, agile prey. Snake venoms are complex protein mixtures encoded by several multilocus gene families that function synergistically to cause incapacitation. To examine venom evolution, we sequenced and interrogated the genome of a venomous snake, the king cobra (Ophiophagus hannah), and compared it, together with our unique transcriptome, microRNA, and proteome datasets from this species, with data from other vertebrates. In contrast to the platypus, the only other venomous vertebrate with a sequenced genome, we find that snake toxin genes evolve through several distinct co-option mechanisms and exhibit surprisingly variable levels of gene duplication and directional selection that correlate with their functional importance in prey capture. The enigmatic accessory venom gland shows a very different pattern of toxin gene expression from the main venom gland and seems to have recruited toxin-like lectin genes repeatedly for new nontoxic functions. In addition, tissue-specific microRNA analyses suggested the co-option of core genetic regulatory components of the venom secretory system from a pancreatic origin. Although the king cobra is limbless, we recovered coding sequences for all Hox genes involved in amniote limb development, with the exception of Hoxd12. Our results provide a unique view of the origin and evolution of snake venom and reveal multiple genome-level adaptive responses to natural selection in this complex biological weapon system. More generally, they provide insight into mechanisms of protein evolution under strong selection. PMID:24297900
Extended MHD Effects in High Energy Density Experiments
NASA Astrophysics Data System (ADS)
Seyler, Charles
2016-10-01
The MHD model is the workhorse for computational modeling of HEDP experiments. Plasma models are inherently limited in scope, but MHD is expected to be a very good model for studying plasmas at the high densities attained in HEDP experiments. There are, however, important ways in which MHD fails to adequately describe the results, most notably due to the omission of the Hall term in Ohm's law (a form of extended MHD or XMHD). This talk will discuss these failings by directly comparing simulations of MHD and XMHD for particularly relevant cases. The methodology is to simulate HEDP experiments using a Hall-MHD (HMHD) code based on a highly accurate and robust Discontinuous Galerkin method, and by comparison of HMHD to MHD draw conclusions about the impact of the Hall term. We focus on simulating two experimental pulsed power machines under various scenarios. We examine the MagLIF experiment on the Z-machine at Sandia National Laboratories and liner experiments on the COBRA machine at Cornell. For the MagLIF experiment we find that power flow in the feed leads to low density plasma ablation into the region surrounding the liner. The inflow of this plasma compresses axial magnetic flux onto the liner. In MHD this axial flux tends to resistively decay, whereas in HMHD a force-free current layer sustains the axial flux on the liner leading to a larger ratio of axial to azimuthal flux. During the liner compression the magneto-Rayleigh-Taylor instability leads to helical perturbations due to minimization of field line bending. Simulations of a cylindrical liner using the COBRA machine parameters can under certain conditions exhibit amplification of an axial field due to a force-free low-density current layer separated by some distance from the liner. This results in a configuration in which there is predominately axial field on the liner inside the current layer and azimuthal field outside the layer. We are currently attempting to experimentally verify the simulation results. Collaborator: Nathaniel D. Hamlin, School of Electrical and Computer Engineering, Cornell University, Ithaca, New York.
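The Hall term referred to above is the J x B contribution in the generalized Ohm's law; a common resistive Hall-MHD form (neglecting electron inertia) is:

```latex
\mathbf{E} + \mathbf{v}\times\mathbf{B}
  = \eta\,\mathbf{J}
  + \frac{1}{n_e e}\left(\mathbf{J}\times\mathbf{B} - \nabla p_e\right)
```

Dropping the right-hand bracketed term recovers standard resistive MHD, which is the comparison the abstract describes.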
Computer Code For Turbocompounded Adiabatic Diesel Engine
NASA Technical Reports Server (NTRS)
Assanis, D. N.; Heywood, J. B.
1988-01-01
Computer simulation developed to study advantages of increased exhaust enthalpy in adiabatic turbocompounded diesel engine. Subsystems of conceptual engine include compressor, reciprocator, turbocharger turbine, compounded turbine, ducting, and heat exchangers. Focus of simulation of total system is to define transfers of mass and energy, including release and transfer of heat and transfer of work in each subsystem, and relationship among subsystems. Written in FORTRAN IV.
Log-normal spray drop distribution...analyzed by two new computer programs
Gerald S. Walton
1968-01-01
Results of U.S. Forest Service research on chemical insecticides suggest that large drops are not as effective as small drops in carrying insecticides to target insects. Two new computer programs have been written to analyze size distribution properties of drops from spray nozzles. Coded in Fortran IV, the programs have been tested on both the CDC 6400 and the IBM 7094...
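Fitting a log-normal drop-size distribution reduces to taking statistics of the log-transformed diameters; the sketch below uses invented drop diameters and is not either of the two programs described.

```python
import numpy as np

# Invented drop diameters (micrometers) standing in for digitized spray data.
diameters_um = np.array([45, 60, 52, 80, 110, 95, 70, 130, 65, 88], float)

log_d = np.log(diameters_um)
d_g = np.exp(log_d.mean())           # geometric mean diameter
sigma_g = np.exp(log_d.std(ddof=1))  # geometric standard deviation

print(f"Geometric mean = {d_g:.1f} um, geometric std dev = {sigma_g:.2f}")
```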
40 CFR 63.2435 - Am I subject to the requirements in this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... MCPU includes equipment necessary to operate a miscellaneous organic chemical manufacturing process, as...)(1)(i), (ii), (iii), (iv), or (v) of this section. (i) An organic chemical(s) classified using the...)(5) of this section. (ii) An organic chemical(s) classified using the 1997 version of NAICS code 325...
40 CFR 63.2435 - Am I subject to the requirements in this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... MCPU includes equipment necessary to operate a miscellaneous organic chemical manufacturing process, as...)(1)(i), (ii), (iii), (iv), or (v) of this section. (i) An organic chemical(s) classified using the...)(5) of this section. (ii) An organic chemical(s) classified using the 1997 version of NAICS code 325...
40 CFR 63.2435 - Am I subject to the requirements in this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... MCPU includes equipment necessary to operate a miscellaneous organic chemical manufacturing process, as...)(1)(i), (ii), (iii), (iv), or (v) of this section. (i) An organic chemical(s) classified using the...)(5) of this section. (ii) An organic chemical(s) classified using the 1997 version of NAICS code 325...
Code Verification of the HIGRAD Computational Fluid Dynamics Solver
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Buren, Kendra L.; Canfield, Jesse M.; Hemez, Francois M.
2012-05-04
The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory, and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step to establish the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems, and somewhat limited verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
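The convergence-rate estimation mentioned above is commonly done by comparing error norms on systematically refined meshes; the sketch below shows the standard observed-order formula with illustrative numbers, not HIGRAD results.

```python
import numpy as np

# Error norms versus the analytical solution on three meshes refined by a
# constant ratio r (values are illustrative).
h = np.array([0.04, 0.02, 0.01])          # mesh spacings, r = 2
err = np.array([3.2e-3, 8.3e-4, 2.1e-4])  # corresponding error norms

r = h[0] / h[1]
p = np.log(err[:-1] / err[1:]) / np.log(r)
print("Observed convergence rates:", p)   # values near 2 indicate second order
```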
NASA Astrophysics Data System (ADS)
Ji, Kun; Ren, Yefei; Wen, Ruizhi
2017-10-01
Reliable site classification of the stations of the China National Strong Motion Observation Network System (NSMONS) has not yet been assigned because of a lack of borehole data. This study used an empirical horizontal-to-vertical (H/V) spectral ratio (hereafter, HVSR) site classification method to overcome this problem. First, according to their borehole data, stations selected from KiK-net in Japan were individually assigned a site class (CL-I, CL-II, or CL-III), which is defined in the Chinese seismic code. Then, the mean HVSR curve for each site class was computed using strong motion recordings captured during the period 1996-2012. These curves were compared with those proposed by Zhao et al. (2006a) for four types of site classes (SC-I, SC-II, SC-III, and SC-IV) defined in the Japanese seismic code (JRA, 1980). It was found that an approximate range of the predominant period Tg could be identified by the predominant peak of the HVSR curve for the CL-I and SC-I sites, CL-II and SC-II sites, and CL-III and SC-III + SC-IV sites. Second, an empirical site classification method was proposed based on comprehensive consideration of the peak period, amplitude, and shape of the HVSR curve. The selected stations from KiK-net were classified using the proposed method. The results showed that the success rates of the proposed method in identifying CL-I, CL-II, and CL-III sites were 63%, 64%, and 58%, respectively. Finally, the HVSRs of 178 NSMONS stations were computed based on recordings from 2007 to 2015 and the sites classified using the proposed method. The mean HVSR curves were re-calculated for the three site classes and compared with those from KiK-net data. It was found that both the peak period and the amplitude were similar for the mean HVSR curves derived from the NSMONS classification results and the KiK-net borehole data, implying the effectiveness of the proposed method in identifying different site classes. The classification results are in good agreement with site classes based on borehole data for 81 stations in China, which indicates that our site classification results are acceptable and that the proposed method is practicable.
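The HVSR itself is just the ratio of horizontal to vertical Fourier amplitude spectra. The sketch below computes a single-window ratio on synthetic noise and omits the smoothing and window averaging that real processing (and the classification criteria above) rely on.

```python
import numpy as np

def hvsr(ns, ew, ud, dt):
    """Single-window horizontal-to-vertical spectral ratio.
    ns, ew, ud: equally sampled NS, EW and vertical traces; dt: sample interval."""
    freqs = np.fft.rfftfreq(len(ud), dt)
    amp = lambda x: np.abs(np.fft.rfft(np.asarray(x) * np.hanning(len(x))))
    h = np.sqrt(amp(ns) * amp(ew))   # geometric mean of the horizontal spectra
    v = amp(ud)
    return freqs[1:], h[1:] / v[1:]  # drop the zero-frequency bin

# Synthetic example: random noise stands in for a strong-motion recording.
rng = np.random.default_rng(0)
dt, n = 0.01, 4096
f, ratio = hvsr(rng.normal(size=n), rng.normal(size=n), rng.normal(size=n), dt)
print("Peak H/V at about %.2f Hz" % f[ratio.argmax()])
```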
Modified Laser and Thermos cell calculations on microcomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shapiro, A.; Huria, H.C.
1987-01-01
In the course of designing and operating nuclear reactors, many fuel pin cell calculations are required to obtain homogenized cell cross sections as a function of burnup. In the interest of convenience and cost, it would be very desirable to be able to make such calculations on microcomputers. In addition, such a microcomputer code would be very helpful for educational course work in reactor computations. To establish the feasibility of making detailed cell calculations on a microcomputer, a mainframe cell code was compiled and run on a microcomputer. The computer code Laser, originally written in Fortran IV for the IBM-7090 class of mainframe computers, is a cylindrical, one-dimensional, multigroup lattice cell program that includes burnup. It is based on the MUFT code for epithermal and fast group calculations, and Thermos for the thermal calculations. There are 50 fast and epithermal groups and 35 thermal groups. Resonances are calculated assuming a homogeneous system and then corrected for self-shielding, Dancoff, and Doppler by self-shielding factors. The Laser code was converted to run on a microcomputer. In addition, the Thermos portion of Laser was extracted and compiled separately to have available a stand alone thermal code.
Synchronization and an application of a novel fractional order King Cobra chaotic system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muthukumar, P., E-mail: muthukumardgl@gmail.com; Balasubramaniam, P., E-mail: balugru@gmail.com; Ratnavelu, K., E-mail: kuru052001@gmail.com
2014-09-01
In this paper, we design a new three-dimensional King Cobra face-shaped fractional order chaotic system. The multi-scale synchronization scheme of two fractional order chaotic systems is described. The necessary conditions for the multi-scale synchronization of two identical fractional order King Cobra chaotic systems are derived through feedback control. A new cryptosystem is proposed for image encryption and decryption by using synchronized fractional order King Cobra chaotic systems with the support of multiple cryptographic assumptions. The security of the proposed cryptosystem is analyzed against well-known algebraic attacks. Numerical simulations are given to show the effectiveness of the proposed theoretical results.
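The abstract does not reproduce the King Cobra equations, so the sketch below only illustrates how fractional order chaotic systems of this kind are commonly integrated numerically, using the Grünwald-Letnikov discretization with a Lorenz-type system standing in for the actual model; the system, its parameters, and the step sizes are all placeholders.

```python
import numpy as np

def gl_fractional_ode(f, x0, q, h, n_steps):
    """Integrate d^q x / dt^q = f(x) with the Grünwald-Letnikov discretization.

    f  : right-hand side of a 3-D system (placeholder; the King Cobra
         equations are not given in the abstract)
    q  : fractional order, 0 < q <= 1
    h  : time step
    """
    c = np.zeros(n_steps + 1)                 # coefficients (-1)^j * binomial(q, j)
    c[0] = 1.0
    for j in range(1, n_steps + 1):
        c[j] = c[j - 1] * (1.0 - (1.0 + q) / j)
    x = np.zeros((n_steps + 1, len(x0)))
    x[0] = x0
    for k in range(1, n_steps + 1):
        memory = c[1:k + 1][:, None] * x[k - 1::-1]   # sum of c_j * x_{k-j}
        x[k] = f(x[k - 1]) * h**q - memory.sum(axis=0)
    return x

# Example stand-in: a fractional-order Lorenz-type system (parameters assumed).
lorenz = lambda s: np.array([10.0 * (s[1] - s[0]),
                             28.0 * s[0] - s[1] - s[0] * s[2],
                             s[0] * s[1] - 8.0 / 3.0 * s[2]])
traj = gl_fractional_ode(lorenz, np.array([1.0, 1.0, 1.0]), q=0.95, h=0.005, n_steps=2000)
```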
26 CFR 1.460-0 - Outline of regulations under section 460.
Code of Federal Regulations, 2013 CFR
2013-04-01
... total allocable contract costs. (iv) Pre-contracting-year costs. (v) Post-completion-year costs. (6) 10... improvements. (iv) Mixed use costs. (3) $10,000,000 gross receipts test. (i) In general. (ii) Single employer...) Computations. (3) Post-completion-year income. (4) Total contract price. (i) In general. (A) Definition. (B...
Calculation of transonic aileron buzz
NASA Technical Reports Server (NTRS)
Steger, J. L.; Bailey, H. E.
1979-01-01
An implicit finite-difference computer code that uses a two-layer algebraic eddy viscosity model and exact geometric specification of the airfoil has been used to simulate transonic aileron buzz. The calculations, which were performed on both the ILLIAC IV parallel processor and the Control Data 7600 computer, are in essential agreement with the original expository wind-tunnel data taken in the Ames 16-Foot Wind Tunnel just after World War II. These results and a description of the pertinent numerical techniques are included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, J.; Cao, L.; Ohkawa, K.
2012-07-01
A model of condensation suppression by non-condensable gases is important for a realistic LOCA safety analysis code. A condensation suppression model for direct contact condensation was previously developed by Westinghouse using first principles. The model is believed to be an accurate description of the direct contact condensation process in the presence of non-condensable gases. The Westinghouse condensation suppression model is further revised by applying a more physical model. The revised condensation suppression model is then implemented into the WCOBRA/TRAC-TF2 LOCA safety evaluation code for both the 3-D module (COBRA-TF) and the 1-D module (TRAC-PF1). A parametric study using the revised Westinghouse condensation suppression model is conducted. Additionally, the performance of the non-condensable gas condensation suppression model is examined against the ACHILLES (ISP-25) separate effects test and the LOFT L2-5 (ISP-13) integral effects test. (authors)
Index of Ship Structure Committee Publications.
1977-12-01
Doty, Michelle; Rustgi, Sheila D; Schoen, Cathy; Collins, Sara R
2009-01-01
As the U.S. economic downturn continues and job losses mount, more working Americans are likely to lose access to affordable health benefits subsidized by their employers. Analysis of the 2007 Commonwealth Fund Biennial Health Insurance Survey finds that two of three working adults would be eligible to extend job-based coverage under the 1985 Consolidated Omnibus Budget Reconciliation Act (COBRA) if they became unemployed. Under COBRA, however, unemployed workers would have to pay four to six times their current contribution at a time of sharply reduced income. In fact, the latest national figures indicate that, because of high premiums, only 9 percent of unemployed workers have COBRA coverage. Substantial financial assistance covering 75 percent to 85 percent of premiums could help laid-off workers maintain coverage. In addition, expansion of Medicaid and the State Children's Health Insurance Program would benefit low-income, laid-off workers and their families who are ineligible for COBRA.
Performance Evaluation of the COBRA GEM for the Application of the TPC
NASA Astrophysics Data System (ADS)
Terasaki, Kohei; Hamagaki, Hideki; Gunji, Taku; Yamaguchi, Yorito
2014-09-01
Suppression of back-drifting ions from the avalanche region to the drift space (ion backflow, IBF) is key for a Time Projection Chamber (TPC), since IBF easily distorts the drift field. To suppress IBF, a gating grid system is widely used, but this limits the data-taking rate. The Gas Electron Multiplier (GEM) has advantages in both IBF reduction and high-rate capability. By adopting GEMs, it is possible to run a TPC continuously under high-rate and high-multiplicity conditions. Motivated by the study of IBF reduction for RICH detectors with the Thick COBRA, developed by F. A. Amero et al., we developed COBRA GEMs for application to a TPC. With a stack configuration, IBF reaches about 0.1-0.5%, which is 5-10 times lower than that of standard GEMs. However, the measured energy resolution with COBRA is 20% (σ), much worse than the resolution obtained with standard GEMs. Measurement of the long-term gain stability indicates that the gain of COBRA varies significantly due to a charging-up effect. Simulation studies based on Garfield++ were performed to understand quantitatively the reasons for the worse energy resolution and the gain instability. In this presentation, we will report the simulation studies together with the measured performance of the COBRA GEM.
Computer Description of the Field Artillery Ammunition Supply Vehicle
1983-04-01
A Combinatorial Geometry (COM-GEOM) description of the vehicle was prepared as input to the "Geometric Information for Targets" (GIFT) computer code to generate target vulnerability data; the GIFT computer code accepts the COM-GEOM description. Keywords: Combinatorial Geometry (COM-GEOM), GIFT computer code, computer target description.
NASA TechPort Entry for Coiled Brine Recovery Assembly (CoBRA) CL IR&D Project
NASA Technical Reports Server (NTRS)
Pensinger, Stuart
2014-01-01
The Coiled Brine Recovery Assembly (CoBRA) project will result in a proof-of-concept demonstration for a lightweight, compact, affordable, regenerable and disposable solution to brine water recovery. The heart of CoBRA is an evaporator that produces water vapor from brine. This evaporator leverages a novel design that enables passive transport of brine from place to place within the system. While it will be necessary to build or modify a system for testing the CoBRA concept, the emphasis of this project will be on developing the evaporator itself. This project will utilize a “test early, test often” approach, building at least one trial evaporator to guide the design of the final product.
Day, J G; Benson, E E; Harding, K; Knowles, B; Idowu, M; Bremner, D; Santos, L; Santos, F; Friedl, T; Lorenz, M; Lukesova, A; Elster, J; Lukavsky, J; Herdman, M; Rippka, R; Hall, T
2005-01-01
Microalgae are one of the most biologically important elements of worldwide ecology and could be the source of diverse new products and medicines. COBRA (The COnservation of a vital european scientific and Biotechnological Resource: microAlgae and cyanobacteria) is the acronym for a European Union, RTD Infrastructures project (Contract No. QLRI-CT-2001-01645). This project is in the process of developing a European Biological Resource Centre based on existing algal culture collections. The COBRA project's central aim is to apply cryopreservation methodologies to microalgae and cyanobacteria, organisms that, to date, have proved difficult to conserve using cryogenic methods. In addition, molecular and biochemical stability tests have been developed to ensure that the equivalent strains of microorganisms supplied by the culture collections give high quality and consistent performance. Fundamental and applied knowledge of stress physiology form an essential component of the project and this is being employed to assist the optimisation of methods for preserving a wide range of algal diversity. COBRA's "Resource Centre" utilises Information Technologies (IT) and Knowledge Management practices to assist project coordination, management and information dissemination and facilitate the generation of new knowledge pertaining to algal conservation. This review of the COBRA project will give a summary of current methodologies for cryopreservation of microalgae and procedures adopted within the COBRA project to enhance preservation techniques for this diverse group of organisms.
1975-05-01
SINGER: A Computer Code for General Analysis of Two-Dimensional Concrete Structures, Volume I. AFWL-TR-74-228, Vol. I, Defense Nuclear Agency, May 1975. Distributed by the National Technical Information Service, U.S. Department of Commerce. Related reference: Conference on Earthquake Engineering, Santiago de Chile, 13-18 January 1969, Vol. I, Session B2, Chilean Association on Seismology and Earthquake Engineering.
HELIOS: A new open-source radiative transfer code
NASA Astrophysics Data System (ADS)
Malik, Matej; Grosheintz, Luc; Lukas Grimm, Simon; Mendonça, João; Kitzmann, Daniel; Heng, Kevin
2015-12-01
I present the new open-source code HELIOS, developed to accurately describe radiative transfer in a wide variety of irradiated atmospheres. We employ a one-dimensional multi-wavelength two-stream approach with scattering. Written in CUDA C++, HELIOS uses the GPU's potential for massive parallelization and is able to compute the TP-profile of an atmosphere in radiative equilibrium and the subsequent emission spectrum in a few minutes on a single computer (for 60 layers and 1000 wavelength bins). The required molecular opacities are obtained with the recently published code HELIOS-K [1], which calculates the line shapes from an input line list and resamples the numerous line-by-line data into a manageable k-distribution format. Based on simple equilibrium chemistry theory [2] we combine the k-distribution functions of the molecules H2O, CO2, CO & CH4 to generate a k-table, which we then employ in HELIOS. I present our results of the following: (i) various numerical tests, e.g. isothermal vs. non-isothermal treatment of layers; (ii) comparison of iteratively determined TP-profiles with their analytical parametric prescriptions [3] and of the corresponding spectra; (iii) benchmarks of TP-profiles & spectra for various elemental abundances; (iv) benchmarks of averaged TP-profiles & spectra for the exoplanets GJ1214b, HD189733b & HD209458b; (v) comparison with secondary eclipse data for HD189733b, XO-1b & CoRoT-2b. HELIOS is being developed, together with the dynamical core THOR and the chemistry solver VULCAN, in the group of Kevin Heng at the University of Bern as part of the Exoclimes Simulation Platform (ESP) [4], an open-source project aimed at providing community tools to model exoplanetary atmospheres. [1] Grimm & Heng 2015, arXiv:1503.03806; [2] Heng, Lyons & Tsai, arXiv:1506.05501; Heng & Lyons, arXiv:1507.01944; [3] e.g. Heng, Mendonca & Lee, 2014, ApJS, 215, 4H; [4] exoclime.net
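The k-table step described above starts from line-by-line opacities that are resampled into a k-distribution within each wavelength bin. The sketch below shows the generic sorting-and-quadrature recipe behind that idea; it is offered only as an assumption about the general technique and is not HELIOS-K's implementation.

```python
import numpy as np

def k_distribution(kappa, n_gauss=20):
    """Resample line-by-line opacities within one wavelength bin into a
    k-distribution evaluated at Gauss-Legendre points in g-space.

    kappa : 1-D array of monochromatic opacities sampled across the bin
    """
    k_sorted = np.sort(kappa)                             # cumulative distribution k(g)
    g = (np.arange(len(k_sorted)) + 0.5) / len(k_sorted)
    nodes, weights = np.polynomial.legendre.leggauss(n_gauss)
    g_nodes = 0.5 * (nodes + 1.0)                         # map [-1, 1] onto [0, 1]
    k_nodes = np.interp(g_nodes, g, k_sorted)
    return g_nodes, 0.5 * weights, k_nodes                # quadrature over g in [0, 1]
```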
Cloning and purification of alpha-neurotoxins from king cobra (Ophiophagus hannah).
He, Ying-Ying; Lee, Wei-Hui; Zhang, Yun
2004-09-01
Thirteen complete and three partial cDNA sequences were cloned from a venom gland cDNA library constructed from the king cobra (Ophiophagus hannah). Phylogenetic analysis of the nucleotide sequences against those from other snake venoms revealed that the obtained cDNAs are highly homologous to snake venom alpha-neurotoxins. Alignment of the deduced mature peptide sequences of the obtained clones with those of other reported alpha-neurotoxins from king cobra venom indicates that the 16 clones comprise long-chain neurotoxins (seven), short-chain neurotoxins (seven), a weak toxin (one) and a variant (one). Two of the 16 newly cloned king cobra alpha-neurotoxins have amino acid sequences identical to those of CM-11 and Oh-6A/6B, which have previously been characterized from the same venom. Furthermore, five long-chain alpha-neurotoxins and two short-chain alpha-neurotoxins were purified from the crude venom and their N-terminal amino acid sequences were determined. The cDNAs encoding the putative precursors of the purified native peptides were also identified based on the N-terminal amino acid sequencing. The purified alpha-neurotoxins showed different lethal activities in mice.
Growth in the Number of SSN Tracked Orbital Objects
NASA Technical Reports Server (NTRS)
Stansbery, Eugene G.
2004-01-01
The number of objects in earth orbit tracked by the US Space Surveillance Network (SSN) has experienced unprecedented growth since March, 2003. Approximately 2000 orbiting objects have been added to the "Analyst list" of tracked objects. This growth is primarily due to the resumption of full power/full time operation of the AN/FPS-108 Cobra Dane radar located on Shemya Island, AK. Cobra Dane is an L-band (23-cm wavelength) phased array radar which first became operational in 1977. Cobra Dane was a "Collateral Sensor" in the SSN until 1994 when its communication link with the Space Control Center (SCC) was closed. NASA and the Air Force conducted tests in 1999 using Cobra Dane to detect and track small debris. These tests confirmed that the radar was capable of detecting and maintaining orbits on objects as small as 5-cm diameter. Subsequently, Cobra Dane was reconnected to the SSN and resumed full power/full time space surveillance operations on March 4, 2003. This paper will examine the new data and its implications to the understanding of the orbital debris environment and orbital safety.
Coded DS-CDMA Systems with Iterative Channel Estimation and no Pilot Symbols
2010-08-01
arXiv:1008.3196v1 [cs.IT] 19 Aug 2010. Coded DS-CDMA Systems with Iterative Channel Estimation and no Pilot Symbols (Don...). Direct-sequence code-division multiple-access (DS-CDMA) systems with quadriphase-shift keying in which channel estimation, coherent demodulation, and decoding... An estimator of the amplitude, phase, and the interference power spectral density (PSD) due to the combined interference and thermal noise is proposed for DS-CDMA systems.
Level Energies, Oscillator Strengths and Lifetimes for Transitions in Pb IV
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colon, C.; Alonso-Medina, A.; Zanon, A.
2008-10-22
Oscillator strengths for several lines of astrophysical interest arising from selected configurations of Pb IV, and radiative lifetimes of some levels, have been calculated. These values were obtained in intermediate coupling (IC) using ab initio relativistic Hartree-Fock calculations. For the IC calculations we used the standard method of least-squares fitting of experimental energy levels by means of the computer codes of Cowan. The transition probabilities and oscillator strengths obtained, although in general agreement with the scarce experimental data, do present some noticeable discrepancies, which are discussed in the text.
2,445 Hours of Code: What I Learned from Facilitating Hour of Code Events in High School Libraries
ERIC Educational Resources Information Center
Colby, Jennifer
2015-01-01
This article describes a school librarian's experience with initiating an Hour of Code event for her school's student body. Hadi Partovi of Code.org conceived the Hour of Code "to get ten million students to try one hour of computer science" (Partovi, 2013a), which is implemented during Computer Science Education Week with a goal of…
GSTARS computer models and their applications, part I: theoretical development
Yang, C.T.; Simoes, F.J.M.
2008-01-01
GSTARS is a series of computer models developed by the U.S. Bureau of Reclamation for alluvial river and reservoir sedimentation studies while the authors were employed by that agency. The first version of GSTARS was released in 1986 using Fortran IV for mainframe computers. GSTARS 2.0 was released in 1998 for personal computer application with most of the code in the original GSTARS revised, improved, and expanded using Fortran IV/77. GSTARS 2.1 is an improved and revised GSTARS 2.0 with graphical user interface. The unique features of all GSTARS models are the conjunctive use of the stream tube concept and of the minimum stream power theory. The application of minimum stream power theory allows the determination of optimum channel geometry with variable channel width and cross-sectional shape. The use of the stream tube concept enables the simulation of river hydraulics using one-dimensional numerical solutions to obtain a semi-two-dimensional presentation of the hydraulic conditions along and across an alluvial channel. According to the stream tube concept, no water or sediment particles can cross the walls of stream tubes, which is valid for many natural rivers. At and near sharp bends, however, sediment particles may cross the boundaries of stream tubes. GSTARS3, based on FORTRAN 90/95, addresses this phenomenon and further expands the capabilities of GSTARS 2.1 for cohesive and non-cohesive sediment transport in rivers and reservoirs. This paper presents the concepts, methods, and techniques used to develop the GSTARS series of computer models, especially GSTARS3. © 2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.
Melani, Rafael D; Skinner, Owen S; Fornelli, Luca; Domont, Gilberto B; Compton, Philip D; Kelleher, Neil L
2016-07-01
Characterizing whole proteins by top-down proteomics avoids a step of inference encountered in the dominant bottom-up methodology when peptides are assembled computationally into proteins for identification. The direct interrogation of whole proteins and protein complexes from the venom of Ophiophagus hannah (king cobra) provides a sharply clarified view of toxin sequence variation, transit peptide cleavage sites and post-translational modifications (PTMs) likely critical for venom lethality. A tube-gel format for electrophoresis (called GELFrEE) and solution isoelectric focusing were used for protein fractionation prior to LC-MS/MS analysis resulting in 131 protein identifications (18 more than bottom-up) and a total of 184 proteoforms characterized from 14 protein toxin families. Operating both GELFrEE and mass spectrometry to preserve non-covalent interactions generated detailed information about two of the largest venom glycoprotein complexes: the homodimeric l-amino acid oxidase (∼130 kDa) and the multichain toxin cobra venom factor (∼147 kDa). The l-amino acid oxidase complex exhibited two clusters of multiproteoform complexes corresponding to the presence of 5 or 6 N-glycans moieties, each consistent with a distribution of N-acetyl hexosamines. Employing top-down proteomics in both native and denaturing modes provides unprecedented characterization of venom proteoforms and their complexes. A precise molecular inventory of venom proteins will propel the study of snake toxin variation and the targeted development of new antivenoms or other biotherapeutics. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
1979-04-17
Topics covered include: readings off the dosimeter, coordinate scales, callsigns and suffixes, three-letter codes, examples, calculations, markings, radio communications, range cards, notes, and messages.
49 CFR 178.502 - Identification codes for packagings.
Code of Federal Regulations, 2010 CFR
2010-10-01
... construction, as follows: (i) “A” means steel (all types and surface treatments). (ii) “B” means aluminum. (iii) “C” means natural wood. (iv) “D” means plywood. (v) “F” means reconstituted wood. (vi) “G” means...
Computation of airfoil buffet boundaries
NASA Technical Reports Server (NTRS)
Levy, L. L., Jr.; Bailey, H. E.
1981-01-01
The ILLIAC IV computer has been programmed with an implicit, finite-difference code for solving the thin-layer compressible Navier-Stokes equations. Results presented for the buffet boundaries of a conventional and a supercritical airfoil section at high Reynolds numbers are found to be in agreement with experimentally determined buffet boundaries, especially at the higher freestream Mach numbers and lower lift coefficients where the onset of unsteady flows is associated with shock wave-induced boundary layer separation.
Explains the benefits of using COBRA model to convert emission reductions into changes in air quality and estimates the number of cases of illness and death avoided as well as the economic value of those benefits.
Zhi, Hui; Li, Xin; Wang, Peng; Gao, Yue; Gao, Baoqing; Zhou, Dianshuang; Zhang, Yan; Guo, Maoni; Yue, Ming; Shen, Weitao
2018-01-01
Lnc2Meth (http://www.bio-bigdata.com/Lnc2Meth/), an interactive resource to identify regulatory relationships between human long non-coding RNAs (lncRNAs) and DNA methylation, is not only a manually curated collection and annotation of experimentally supported lncRNA-DNA methylation associations but also a platform that effectively integrates tools for calculating and identifying the differentially methylated lncRNAs and protein-coding genes (PCGs) in diverse human diseases. The resource provides: (i) advanced search possibilities, e.g. retrieval of the database by searching the lncRNA symbol of interest, DNA methylation patterns, regulatory mechanisms and disease types; (ii) abundant computationally calculated DNA methylation array profiles for the lncRNAs and PCGs; (iii) the prognostic values for each hit transcript calculated from the patients' clinical data; (iv) a genome browser to display the DNA methylation landscape of the lncRNA transcripts for a specific type of disease; (v) tools to re-annotate probes to lncRNA loci and identify the differential methylation patterns for lncRNAs and PCGs with user-supplied external datasets; (vi) an R package (LncDM) to complete the differentially methylated lncRNAs identification and visualization with local computers. Lnc2Meth provides a timely and valuable resource that can be applied to significantly expand our understanding of the regulatory relationships between lncRNAs and DNA methylation in various human diseases. PMID:29069510
NASA Astrophysics Data System (ADS)
Arling, J.-H.; Gerhardt, M.; Gößling, C.; Gehre, D.; Klingenberg, R.; Kröninger, K.; Nitsch, C.; Quante, T.; Rohatsch, K.; Tebrügge, J.; Temminghoff, R.; Theinert, R.; Zatschler, S.; Zuber, K.
2017-11-01
The COBRA collaboration searches for neutrinoless double beta-decay (0νββ-decay) using CdZnTe semiconductor detectors with a coplanar-grid readout and a surrounding guard-ring structure. The operation of the COBRA demonstrator at the Gran Sasso underground laboratory (LNGS) indicates that alpha-induced lateral surface events are the dominant source of background events. By instrumenting the guard-ring electrode it is possible to suppress this type of background. In laboratory measurements this method achieved a suppression factor for alpha-induced lateral surface events of 5300 (+2660/-1380), while retaining (85.3 ± 0.1)% of gamma events occurring in the entire detector volume. This suppression is superior to the pulse-shape analysis methods used so far in COBRA by three orders of magnitude.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Behafarid, F.; Shaver, D. R.; Bolotnov, I. A.
The required technological and safety standards for future Gen IV reactors can only be achieved if advanced simulation capabilities become available, which combine high performance computing with the necessary level of modeling detail and high accuracy of predictions. The purpose of this paper is to present new results of multi-scale three-dimensional (3D) simulations of the inter-related phenomena which occur as a result of fuel element heat-up and cladding failure, including the injection of a jet of gaseous fission products into a partially blocked Sodium Fast Reactor (SFR) coolant channel, and gas/molten sodium transport along the coolant channels. The computational approach to the analysis of the overall accident scenario is based on using two different inter-communicating computational multiphase fluid dynamics (CMFD) codes: a CFD code, PHASTA, and a RANS code, NPHASE-CMFD. Using the geometry and time history of cladding failure and the gas injection rate, direct numerical simulations (DNS) of two-phase turbulent flow, combined with the Level Set method, have been performed with the PHASTA code. The model allows one to track the evolution of gas/liquid interfaces at a centimeter scale. The simulated phenomena include the formation and breakup of the jet of fission products injected into the liquid sodium coolant. The PHASTA outflow has been averaged over time to obtain mean phasic velocities and volumetric concentrations, as well as the liquid turbulent kinetic energy and turbulence dissipation rate, all of which have served as the input to the core-scale simulations using the NPHASE-CMFD code. A sliding window time averaging has been used to capture mean flow parameters for transient cases. The results presented in the paper include testing and validation of the proposed models, as well as the predictions of fission-gas/liquid-sodium transport along a multi-rod fuel assembly of an SFR during a partial loss-of-flow accident. (authors)
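The abstract mentions that a sliding-window time average was used to extract mean flow parameters from transient PHASTA output before passing them to the core-scale code. The fragment below is a generic sketch of that kind of averaging, not the codes' actual coupling interface; the uniform sampling, the box kernel, and the window length are assumptions.

```python
import numpy as np

def sliding_window_mean(t, signal, window):
    """Sliding-window time average of a transient DNS signal.

    t      : 1-D array of sample times (assumed uniformly spaced)
    signal : samples of a flow quantity (e.g. velocity or void fraction)
    window : averaging-window length in the same time units as t
    """
    dt = t[1] - t[0]
    n = max(1, int(round(window / dt)))
    kernel = np.ones(n) / n                   # box kernel; other weightings are possible
    return np.convolve(signal, kernel, mode="same")
```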
Integrated Composite Analyzer (ICAN): Users and programmers manual
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Chamis, C. C.
1986-01-01
The use of, and the relevant equations programmed in, a computer code designed to carry out a comprehensive linear analysis of multilayered fiber composites are described. The analysis contains the essential features required to effectively design structural components made from fiber composites. The inputs to the code are constituent material properties, factors reflecting the fabrication process, and composite geometry. The code performs micromechanics, macromechanics, and laminate analysis, including the hygrothermal response of fiber composites. The code outputs are the various ply and composite properties, the composite structural response, and composite stress analysis results with details on failure. The code is in Fortran IV and can be used efficiently as a package in complex structural analysis programs. The input-output format is described extensively through the use of a sample problem. The program listing is also included. The code manual consists of two parts.
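The micromechanics step described above maps constituent (fiber and matrix) properties to ply-level properties. The sketch below shows only the simplest rule-of-mixtures version of that mapping as an illustration; it is not ICAN's actual equation set, and the constituent values in the example are illustrative.

```python
def ply_moduli(Ef, Em, Gf, Gm, vf):
    """Simple rule-of-mixtures estimates for a unidirectional ply.

    Ef, Em : fiber and matrix Young's moduli
    Gf, Gm : fiber and matrix shear moduli
    vf     : fiber volume fraction
    """
    vm = 1.0 - vf
    E11 = vf * Ef + vm * Em                   # longitudinal modulus (Voigt bound)
    E22 = 1.0 / (vf / Ef + vm / Em)           # transverse modulus (Reuss bound)
    G12 = 1.0 / (vf / Gf + vm / Gm)           # in-plane shear modulus (Reuss bound)
    return E11, E22, G12

# Example: glass/epoxy-like constituents (illustrative values only).
print(ply_moduli(Ef=72e9, Em=3.5e9, Gf=30e9, Gm=1.3e9, vf=0.6))
```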
Gomes, A; Saha, Archita; Chatterjee, Ipshita; Chakravarty, A K
2007-09-01
We reported previously that the methanolic root extract of the Indian medicinal plant Pluchea indica Less. (Asteraceae) could neutralize viper venom-induced action [Alam, M.I., Auddy, B., Gomes, A., 1996. Viper venom neutralization by Indian medicinal plant (Hemidesmus indicus and P. indica) root extracts. Phytother. Res. 10, 58-61]. The present study reports the neutralization of viper and cobra venom by beta-sitosterol and stigmasterol isolated from the root extract of P. indica Less. (Asteraceae). The active fraction (containing the major compound beta-sitosterol and the minor compound stigmasterol) was isolated and purified by silica gel column chromatography and the structure was determined using spectroscopic analysis (EIMS, (1)H NMR, (13)C NMR). Anti-snake venom activity was studied in experimental animals. The active fraction was found to significantly neutralize viper venom-induced lethal, hemorrhagic, defibrinogenation, edema and PLA(2) activity. Cobra venom-induced lethality, cardiotoxicity, neurotoxicity, respiratory changes and PLA(2) activity were also antagonized by the active component. It potentiated commercial snake venom antiserum action against venom-induced lethality in male albino mice. The active fraction could antagonize venom-induced changes in lipid peroxidation and superoxide dismutase activity. This study suggests that beta-sitosterol and stigmasterol may play an important role, along with antiserum, in neutralizing snake venom-induced actions.
2004-06-17
This 3-D image, taken by the left and right eyes of the panoramic camera on NASA's Mars Exploration Rover Spirit, shows the odd rock formation dubbed "Cobra Hoods," center. 3-D glasses are necessary to view this image.
Analysis of a Light Cross Country Combat Vehicle - The Cobra
1951-06-01
Regular tracks, based on the conventional design trend defined by the close spacing of track links... (Annex 2). The complexity of conventional steering mechanisms is high: planetary gears, hydraulic controls, brakes, and other accessories... In straight running, the joint is completely closed and is held in that position by the two hydraulic cylinders.
7 CFR 400.767 - Requester obligations.
Code of Federal Regulations, 2011 CFR
2011-01-01
....gov; or (iv) By overnight delivery to the Associate Administrator, Risk Management Agency, United... subpart must: (1) Be submitted: (i) In writing by certified mail, to the Associate Administrator, Risk Management Agency, United States Department of Agriculture, Stop Code 0801, 1400 Independence Avenue, SW...
7 CFR 400.767 - Requester obligations.
Code of Federal Regulations, 2010 CFR
2010-01-01
....gov; or (iv) By overnight delivery to the Associate Administrator, Risk Management Agency, United... subpart must: (1) Be submitted: (i) In writing by certified mail, to the Associate Administrator, Risk Management Agency, United States Department of Agriculture, Stop Code 0801, 1400 Independence Avenue, SW...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.
1995-12-31
In conformity with the protocol of the Workshop under Contract "Assessment of RBMK Reactor Safety Using Modern Western Codes," VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and to assess whether VNIIEF codes are suitable for safety assessment computations of RBMK-type reactors. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), including cell, polycell, and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computations with the NESTLE code (USA); these computations were performed in the geometry and with the neutron constants provided by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. The kinetics computations were performed in two formulations, both developed in collaboration with NIKIET. The formulation of the first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of control and protection system (CPS) control movement in the core.
Muliya, Sanath Krishna; Bhat, Mudraje Narayana
2016-08-01
To study the hematology and serum biochemistry parameters of Indian spectacled cobra (Naja naja) and Indian rat snake (Ptyas mucosa) and to evaluate the differences in the same between captive and wild populations. Animals were categorized into four groups, viz., wild Indian spectacled cobra (n=10), wild Indian rat snakes (n=10), captive Indian spectacled cobra (n=10), and captive Indian rat snake (n=10). The snakes were restrained with restraint tubes, and 2 ml of blood was collected from either heart or ventral coccygeal vein. Hematological examinations were performed manually and serum biochemistry assays were performed on semi-automated clinical chemistry analyzer. The values of total erythrocyte count, packed cell volume, and hemoglobin were slightly low in captive spectacled cobras and captive rat snakes compared to wild ones, whereas total leukocyte count was found to be slightly high in wild spectacled cobras compared to captive ones. All the recorded values of biochemical and electrolyte analytes were found to be well within expected range for snakes except for total protein and chloride levels in both the species which was slightly above the expected range. The hematology and serum biochemistry intervals of the two most common Indian snakes are presented here. The data will be useful in routine health evaluations and aiding in better medical management of the species studied. Since this study is the first to report complete hematologic and blood biochemical ranges for the study species, observations made here can also be used as referral intervals for future use.
Brunda, G; Sashidhar, R B; Sarin, R K
2006-08-01
An immunoglobulin Y (IgY) based indirect double antibody sandwich enzyme-linked immunosorbent assay (ELISA) was developed for the detection of Indian cobra (Naja naja naja) venom in biological samples of forensic origin. Polyclonal antibodies were raised and purified from chick egg yolk and rabbit serum. The cobra venom was sandwiched between immobilized affinity-purified IgY and the rabbit IgG. The detection range of the assay was 0.1 to 300 ng of cobra venom. The calibration plot was based on linear regression analysis (y = 0.2581x + 0.4375, r² = 0.9886). The limit of detection of the assay was found to be 0.1 ng. The coefficient of variation (CV) at different concentrations within the working range was less than 10% in both inter-assay (n = 6) and intra-assay (n = 6) determinations. The recovery of venom was in the range of 80-99% when different amounts (0.002, 0.1, 0.2, 1, and 2 µg) of cobra venom were spiked into pooled normal human serum (per ml). No cross-reactivity was observed with krait and viper venom in the immunoassay system over the concentration range of 0.1-1000 ng. The method was initially validated by analyzing specimens (autopsy) from experimental rats injected with cobra venom (1.2 mg kg⁻¹ body mass). Further, human specimens (autopsy and biopsy) from snakebite victims of forensic origin were also analyzed. The methodology developed may find diagnostic application in forensic laboratories.
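For a feel of how the reported calibration line is used, the sketch below inverts y = 0.2581x + 0.4375 to recover a venom amount from a measured ELISA response. The abstract does not state whether x is the venom amount itself or its logarithm, so treating the relation as directly invertible in x is an assumption made only for illustration.

```python
def venom_from_response(y, slope=0.2581, intercept=0.4375):
    """Invert the reported ELISA calibration line y = slope * x + intercept
    to recover the venom amount x from a measured response y."""
    return (y - intercept) / slope

# Example: a response of 0.70 maps back to roughly 1.0 unit on the calibration axis.
print(round(venom_from_response(0.70), 2))
```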
29 CFR 2590.701-2 - Definitions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Regulations Relating to Labor (Continued) EMPLOYEE BENEFITS SECURITY ADMINISTRATION, DEPARTMENT OF LABOR GROUP HEALTH PLANS RULES AND REGULATIONS FOR GROUP HEALTH PLANS Health Coverage Portability, Nondiscrimination... coverage, under a group health plan, that satisfies an applicable COBRA continuation provision. (3) COBRA...
Concentrating Solar Power Projects - Andasol-1 | Concentrating Solar Power
: UTE CT Andasol-1: Cobra (80%) and Sener (20%); Operator(s): Cobra O&M; Generation Offtaker(s): ...; # of Modules per SCA: 12; SCA Manufacturer (Model): UTE CT Andasol-1 (SKAL-ET); Mirror Manufacturer: ...
CO-Benefits Risk Assessment (COBRA) Health Impacts Screening and Mapping Tool
The COBRA (Co-Benefits Risk Assessment) screening tool can be used by state and local governments to estimate the health and economic benefits of clean energy policies. Find information about how to use the tool here.
ERIC Educational Resources Information Center
Pratt, Carl R.
1994-01-01
Describes an experiment that uses the cobra lily (Darlingtonia californica) and fruit flies (Drosophila virilis) to investigate predator-prey relationships in a classroom laboratory. Suggestions for classroom extension of this experimental system are provided. (ZWH)
Selective recruitment of the lower fibers of the trapezius muscle.
Arlotta, Melissa; Lovasco, Gina; McLean, Linda
2011-06-01
We aimed to determine the effectiveness of five isometric exercises at maximally activating the lower trapezius muscle in healthy subjects. Surface electromyography data were recorded from the upper, middle, and lower fibers of the trapezius muscle bilaterally while 18 healthy subjects performed five different exercises: Latissimus Pull-down, Prone Row, Prone V-Raise, Posterior Fly and Modified Prone Cobra. The peak activation was determined from the rectified and smoothed data to determine which exercise generated the highest amount of lower trapezius activity, and to determine which exercise best resulted in activation of the lower fibers of trapezius while minimizing activation of the upper and middle fibers of trapezius. Males and females demonstrated different patterns of lower trapezius recruitment and therefore the data were analyzed separately for each sex. For the males, the Prone Row exercise (2.84 ± 1.67 mV), the Posterior Fly (2.23 ± 1.00 mV) and the Modified Prone Cobra (2.26 ± 1.19 mV) exercises generated the highest EMG activity in the lower trapezius muscle. For the females, the Modified Prone Cobra (2.40 ± 1.32 mV) and the Prone Row (2.37 ± 1.14 mV) exercises generated higher activation than the Latissimus Pull Down (1.04 ± 0.56 mV), the Posterior Fly (1.62 ± 1.044 mV) and the Prone V-Raise (1.32 ± 1.07 mV). In both sexes, the Modified Prone Cobra, the Prone Row and the Latissimus Pull Down outperformed the other exercises in terms of maximizing lower trapezius activation while minimizing activation of the upper and middle fibers of trapezius. The Modified Prone Cobra showed lower relative activation of the upper trapezius muscle than did the Prone Row exercise. The Modified Prone Cobra and Prone Row exercises are the most effective exercises for targeted strengthening of the lower trapezius muscle in both sexes. The Modified Prone Cobra is somewhat better than the Prone Row due to the low activation of the upper trapezius muscle during this exercise. The Modified Prone Cobra exercise should therefore be considered as a manual muscle test position and as a strengthening exercise for the lower trapezius muscle fibers. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Knauber, R. N.
1982-01-01
This report describes a FORTRAN IV coded computer program for post-flight evaluation of a launch vehicle upper-stage on-off reaction control system. Aerodynamic and thrust misalignment disturbances are computed, as well as the total disturbing moments in pitch, yaw, and roll. Effective thrust misalignment angle time histories of the rocket booster motor are calculated. Disturbing moments are integrated and used to estimate the required control system total impulse. Effective control system specific impulse is computed for the boost and coast phases using measured control fuel usage. This method has been used for more than fifteen years for analyzing the NASA Scout launch vehicle second- and third-stage reaction control system performance. The computer program is set up in FORTRAN IV for a CDC CYBER 175 system. With slight modification it can be used on other machines having a FORTRAN compiler. The program has optional CALCOMP plotting output. With this option the program requires 19K words of memory and has 786 cards. Running time on a CDC CYBER 175 system is less than three (3) seconds for a typical problem.
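The impulse bookkeeping described above can be illustrated with a short sketch: integrate the disturbing-moment history, convert it to the control impulse needed to counter it through an assumed thruster moment arm, and divide total impulse by the measured fuel usage to obtain an effective specific impulse. The function and variable names are assumptions; this is not the Scout program's code.

```python
import numpy as np

G0 = 9.80665  # standard gravity, m/s^2

def required_total_impulse(t, moment, moment_arm):
    """Integrate a disturbing-moment time history (N*m) and convert it to the
    control total impulse (N*s) needed to balance it, given the thruster moment arm (m)."""
    return np.trapz(np.abs(moment), t) / moment_arm

def effective_specific_impulse(total_impulse, fuel_mass):
    """Effective specific impulse (s) from total impulse (N*s) and fuel used (kg)."""
    return total_impulse / (fuel_mass * G0)
```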
Defense Depot Mechanicsburg Total Quality Management Implementation Plan
1989-06-01
Triep, Michael; Hess, David; Chaves, Humberto; Brücker, Christoph; Balmert, Alexander; Westhoff, Guido; Bleckmann, Horst
2013-01-01
The spitting cobra Naja pallida can eject its venom towards an offender from a distance of up to two meters. The aim of this study was to understand the mechanisms responsible for the relatively large distance covered by the venom jet although the venom channel is only of micro-scale. Therefore, we analysed factors that influence secondary flow and pressure drop in the venom channel, which include the physical-chemical properties of venom liquid and the morphology of the venom channel. The cobra venom showed shear-reducing properties and the venom channel had paired ridges that span from the last third of the channel to its distal end, terminating laterally and in close proximity to the discharge orifice. To analyze the functional significance of these ridges we generated a numerical and an experimental model of the venom channel. Computational fluid dynamics (CFD) and Particle-Image Velocimetry (PIV) revealed that the paired interior ridges shape the flow structure upstream of the sharp 90° bend at the distal end. The occurrence of secondary flow structures resembling Dean-type vortical structures in the venom channel can be observed, which induce additional pressure loss. Comparing a venom channel featuring ridges with an identical channel featuring no ridges, one can observe a reduction of pressure loss of about 30%. Therefore it is concluded that the function of the ridges is similar to guide vanes used by engineers to reduce pressure loss in curved flow channels. PMID:23671569
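The pressure-loss argument above rests on Dean-type secondary flow in the curved distal end of the channel. As a rough illustration of the governing dimensionless group, the sketch below evaluates the Dean number for a micro-scale curved channel; every numerical value in the example is illustrative and not taken from the study.

```python
import math

def dean_number(density, velocity, diameter, viscosity, bend_radius):
    """Dean number De = Re * sqrt(D / (2 * Rc)) for flow in a curved channel."""
    reynolds = density * velocity * diameter / viscosity
    return reynolds * math.sqrt(diameter / (2.0 * bend_radius))

# Illustrative values only (not measurements from the study):
# ~100 um channel, ~200 um bend radius, water-like liquid at 10 m/s.
print(dean_number(density=1000.0, velocity=10.0, diameter=100e-6,
                  viscosity=1e-3, bend_radius=200e-6))
```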
ERIC Educational Resources Information Center
Ware, Ronnie J.
In an effort to increase curriculum opportunities in a rural school district, a computer project was implemented involving grade 9-12 students chosen on the basis of national percentile scores, IQ, and desire to attend college. The project offered, through programmed computer instruction, physics, French I and II, and German I. One proctor was…
1984-10-01
...is ready for engagement. The SCAS pitch, roll, and yaw engage switches energize the appropriate channels of the SCAS and the electrical solenoid valves.
Fung, Shin Yee; Tan, Nget Hong; Sim, Si Mui; Marinello, Enrico; Guerranti, Roberto; Aguiyi, John Chinyere
2011-04-01
Mucuna pruriens has been used by native Nigerians as a prophylactic for snakebite. The protective effects of M. pruriens seed extract (MPE) were investigated against the pharmacological actions of N. sputatrix (Javan spitting cobra) venom in rats. The results showed that MPE-pretreatment protected against cardiorespiratory and, to a lesser extent, neuromuscular depressant effects of N. sputatrix venom. These may be explained at least in part by the neutralisation of the cobra venom toxins by anti-MPE antibodies elicited by the MPE pretreatment.
Fung, Shin Yee; Tan, Nget Hong; Sim, Si Mui; Aguiyi, John C.
2012-01-01
Mucuna pruriens Linn. (velvet bean) has been used by native Nigerians as a prophylactic for snakebite. Rats pretreated with M. pruriens seed extract (MPE) have been shown to protect against the lethal and cardiovascular depressant effects of Naja sputatrix (Javan spitting cobra) venoms, and the protective effect involved immunological neutralization of the venom toxins. To investigate further the mechanism of the protective effect of MPE pretreatment against cobra venom toxicity, the actions of Naja sputatrix venom on spontaneously beating rat atria and aortic rings isolated from both MPE pretreated and untreated rats were studied. Our results showed that the MPE pretreatment conferred protection against cobra venom-induced depression of atrial contractility and atrial rate in the isolated atrial preparations, but it had no effect on the venom-induced contractile response of aortic ring preparation. These observations suggested that the protective effect of MPE pretreatment against cobra venom toxicity involves a direct protective action of MPE on the heart function, in addition to the known immunological neutralization mechanism, and that the protective effect does not involve action on blood vessel contraction. The results also suggest that M. pruriens seed may contain novel cardioprotective agent with potential therapeutic value. PMID:21785646
Staged Z-pinch experiments on the Mega-Ampere current driver COBRA
NASA Astrophysics Data System (ADS)
Valenzuela, Julio; Banasek, Jacob; Byvank, Thomas; Conti, Fabio; Greenly, John; Hammer, David; Potter, William; Rocco, Sophia; Ross, Michael; Wessel, Frank; Narkis, Jeff; Rahman, Hafiz; Ruskov, Emil; Beg, Farhat
2017-10-01
Experiments were conducted on Cornell's 1 MA, 100 ns current driver COBRA with the goal of better understanding Staged Z-pinch physics and validating MHD codes. We used a gas injector composed of an annular (1.2 cm radius) high-atomic-number (e.g., Ar or Kr) gas-puff and an on-axis plasma gun that delivers the ionized hydrogen target. Liner implosion velocity and stability were studied using laser shadowgraphy and interferometry as well as XUV imaging. From the data, the signature of the MRT instability and a zippering effect can be seen, but time-integrated X-ray imaging shows a stable target plasma. A key component of the experiment was the use of optical Thomson scattering (TS) diagnostics to characterize the liner and target plasmas. By fitting the experimental scattered spectra with synthetic data, the electron and ion temperatures as well as the density can be obtained. Preliminary analysis shows significant scattered-line broadening from the on-axis plasma (about 0.5 mm diameter), which can be explained either by a low-temperature H plasma with Te = Ti = 75 eV, or by a hot plasma with Ti = 3 keV and Te = 350 eV if an Ar-H mixture is present with an Ar fraction higher than 10%. Funded by the Advanced Research Projects Agency - Energy, DE-AR0000569.
From COBRA to the Seine, August 1944: A Microcosm of the Operational Art
1986-05-09
...while Patton conducted operational maneuvers in the German rear. Hill 217 was decisive terrain. The 2nd SS Pz Div reached almost to St. Hilaire by... with Eberbach as a replacement for von Funck. But the combined effects of TOTALIZE, stiffening pressure at Mortain, and Patton's cut north...
JPRS Report - Science & Technology USSR: Life Sciences.
1988-04-22
Contents include: Demand for Cobra Venom Increases; Effect of Piracetam on Resistance of Higher Nervous Activity to Informational Overloads [E.G. Chkhubianishvili, Institute of Physiology imeni I.S. Beritashvili; presented by Academician S.P. Narikashvili, 21 Apr 86]. [Abstract] Although piracetam is a cyclic derivative...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-02
... DEPARTMENT OF LABOR Employee Benefits Security Administration Proposed Extension of Information Collection Request Submitted for Public Comment; COBRA Notification Requirements--American Recovery and Reinvestment Act of 2009 as Amended AGENCY: Employee Benefits Security Administration, Department of Labor...
Conjunctival impression cytology in computer users.
Kumar, S; Bansal, R; Khare, A; Malik, K P S; Malik, V K; Jain, K; Jain, C
2013-01-01
It is known that computer users develop features of dry eye. To study the cytological changes in the conjunctiva, conjunctival impression cytology was performed in computer users and a control group. Fifteen eyes of computer users who had used computers for more than one year and ten eyes of an age- and sex-matched control group (those who had not used computers) were studied by conjunctival impression cytology. Conjunctival impression cytology (CIC) results in the control group were of stage 0 and stage I, while the computer user group showed CIC results between stage II and stage IV. Among the computer users, the majority (>90%) showed stage III and stage IV changes. We found that those who used computers daily for long hours developed more CIC changes than those who worked at the computer for a shorter daily duration. © NEPjOPH.
Damerau, B; Lege, L; Oldigs, H D; Vogt, W
1975-01-01
The effects of the direct lytic factor (DLF) and phospholipase A (phase A) of cobra venom, alone and in combination, on mast cell degranulation, histamine release and formation of prostaglandin-like activity (SRS-C) were studied in perfused guinea-pig lungs and in mast cell-containing rat peritoneal cell suspensions. For comparison, the effect of equivalent doses of whole cobra venom was investigated. 1. Cobra venom caused mast cell degranulation, histamine release and SRS-C formation in both systems. For comparable effects much higher doses had to be used in guinea-pig lungs than in rat peritoneal cell suspensions. 2. Phase A produced little degranulation of mast cells in both systems, a limited histamine release in rat peritoneal cell suspensions and none in perfused guinea-pig lungs. It caused considerable SRS-C formation in both lung tissue and peritoneal cell suspensions. 3. DLF caused histamine release, SRS-C formation and mast cell degranulation in both systems; in rat peritoneal cell suspensions it acted almost as strongly as equivalent doses of cobra venom, whereas in guinea-pig lungs it was much less active. 4. In rat peritoneal cell suspensions the effects of DLF and phase A in combination did not exceed the sum of their single effects. In guinea-pig lungs these two substances interacted in a potentiating synergism. It is concluded that DLF is the main cytotoxic principle of cobra venom, whereas phase A alone is not cytotoxic. The difference in the synergism of DLF and phase A between rat peritoneal cells and guinea-pig lungs may be due to two different actions of DLF and to species differences in sensitivity to these actions.
A calculation procedure for viscous flow in turbomachines, volume 3. [computer programs
NASA Technical Reports Server (NTRS)
Khalil, I.; Sheoran, Y.; Tabakoff, W.
1980-01-01
A method for analyzing the nonadiabatic viscous flow through turbomachine blade passages was developed. The field analysis is based upon the numerical integration of the full incompressible Navier-Stokes equations, together with the energy equation on the blade-to-blade surface. A FORTRAN IV computer program was written based on this method. The numerical code used to solve the governing equations employs a nonorthogonal boundary fitted coordinate system. The flow may be axial, radial or mixed and there may be a change in stream channel thickness in the through-flow direction. The inputs required for two FORTRAN IV programs are presented. The first program considers laminar flows and the second can handle turbulent flows. Numerical examples are included to illustrate the use of the program, and to show the results that are obtained.
1991-05-31
High Precision Nonlinear Computer Modelling Technique for Quartz Crystal Oscillators (R. Brendel, F. Djian, CNRS & E. Robert), p. 341. Results of the computations are given for resonators having circular electrodes; the model was applied to compute the resonance frequencies of the fundamental mode and of its anharmonics.
7 CFR 1980.302 - Definitions and abbreviations.
Code of Federal Regulations, 2012 CFR
2012-01-01
... codes listed in exhibit E to subpart A of part 1924 applicable to single family residential construction...: (i) Self-care, (ii) Receptive and expressive language, (iii) Learning, (iv) Mobility, (v) Self... Lender on behalf of the borrower. Lender. The organization making, holding, and/or servicing the loan...
7 CFR 1980.302 - Definitions and abbreviations.
Code of Federal Regulations, 2011 CFR
2011-01-01
... codes listed in exhibit E to subpart A of part 1924 applicable to single family residential construction...: (i) Self-care, (ii) Receptive and expressive language, (iii) Learning, (iv) Mobility, (v) Self... Lender on behalf of the borrower. Lender. The organization making, holding, and/or servicing the loan...
48 CFR 52.204-6 - Data Universal Numbering System (DUNS) Number.
Code of Federal Regulations, 2010 CFR
2010-10-01
... of business (industry). (x) Company Headquarters name and address (reporting relationship within your... office. (2) The offeror should be prepared to provide the following information: (i) Company legal.... (iii) Company physical street address, city, state and Zip Code. (iv) Company mailing address, city...
Zhi, Hui; Li, Xin; Wang, Peng; Gao, Yue; Gao, Baoqing; Zhou, Dianshuang; Zhang, Yan; Guo, Maoni; Yue, Ming; Shen, Weitao; Ning, Shangwei; Jin, Lianhong; Li, Xia
2018-01-04
Lnc2Meth (http://www.bio-bigdata.com/Lnc2Meth/), an interactive resource to identify regulatory relationships between human long non-coding RNAs (lncRNAs) and DNA methylation, is not only a manually curated collection and annotation of experimentally supported lncRNA-DNA methylation associations but also a platform that effectively integrates tools for calculating and identifying the differentially methylated lncRNAs and protein-coding genes (PCGs) in diverse human diseases. The resource provides: (i) advanced search possibilities, e.g. retrieval of the database by searching the lncRNA symbol of interest, DNA methylation patterns, regulatory mechanisms and disease types; (ii) abundant computationally calculated DNA methylation array profiles for the lncRNAs and PCGs; (iii) the prognostic values for each hit transcript calculated from the patients' clinical data; (iv) a genome browser to display the DNA methylation landscape of the lncRNA transcripts for a specific type of disease; (v) tools to re-annotate probes to lncRNA loci and identify the differential methylation patterns for lncRNAs and PCGs with user-supplied external datasets; (vi) an R package (LncDM) to complete the differentially methylated lncRNAs identification and visualization with local computers. Lnc2Meth provides a timely and valuable resource that can be applied to significantly expand our understanding of the regulatory relationships between lncRNAs and DNA methylation in various human diseases. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
NASA Astrophysics Data System (ADS)
Tinti, S.; Tonini, R.
2013-07-01
Nowadays numerical models are a powerful tool in tsunami research since they can be used (i) to reconstruct modern and historical events, (ii) to cast new light on tsunami sources by inverting tsunami data and observations, (iii) to build scenarios in the frame of tsunami mitigation plans, and (iv) to produce forecasts of tsunami impact and inundation in early warning systems. In parallel with the general recognition of the importance of numerical tsunami simulations, the demand has grown for reliable tsunami codes, validated through tests agreed upon by the tsunami community. This paper presents the tsunami code UBO-TSUFD, which has been developed at the University of Bologna, Italy, and which solves the non-linear shallow water (NSW) equations in a Cartesian frame, with inclusion of bottom friction and exclusion of the Coriolis force, by means of a leapfrog (LF) finite-difference scheme on a staggered grid, and which accounts for moving boundaries to compute sea inundation and withdrawal at the coast. Results of UBO-TSUFD applied to four classical benchmark problems are shown: two benchmarks are based on analytical solutions, one on a plane wave propagating in a flat channel with a constant-slope beach, and one on a laboratory experiment. The code is shown to perform very satisfactorily since it reproduces the benchmark theoretical and experimental data quite well. Further, the code is applied to a realistic tsunami case: a scenario of a tsunami threatening the coasts of eastern Sicily, Italy, is defined and discussed based on the historical tsunami of 11 January 1693, i.e. one of the most severe events in Italian history.
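As an illustration of the discretization family named above, the sketch below advances the linearized 1-D shallow-water equations on a staggered grid with a leapfrog-style forward-backward update. It is only a toy version assuming a flat, frictionless channel with fixed reflective ends; UBO-TSUFD itself solves the nonlinear equations with bottom friction and moving shoreline boundaries.

```python
import numpy as np

def shallow_water_1d(eta0, depth, dx, dt, n_steps, g=9.81):
    """Linearized 1-D shallow-water solver on a staggered grid.

    eta0  : initial free-surface elevation at cell centres (length N)
    depth : still-water depth at cell centres (length N)
    Fluxes q live on the N+1 cell faces; closed (reflective) ends are assumed.
    """
    eta = eta0.copy()
    q = np.zeros(len(eta) + 1)                       # volume flux at faces
    h_face = 0.5 * (depth[:-1] + depth[1:])          # depth interpolated to interior faces
    for _ in range(n_steps):
        # momentum update on faces, then continuity update on cell centres
        q[1:-1] -= g * h_face * dt / dx * (eta[1:] - eta[:-1])
        eta -= dt / dx * (q[1:] - q[:-1])
    return eta

# Example: a Gaussian hump splitting into two outgoing waves in 10 m of water.
x = np.linspace(0.0, 10000.0, 501)
eta = shallow_water_1d(np.exp(-((x - 5000.0) / 500.0) ** 2),
                       np.full_like(x, 10.0), dx=x[1] - x[0], dt=1.0, n_steps=200)
```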
Fault-tolerance in Two-dimensional Topological Systems
NASA Astrophysics Data System (ADS)
Anderson, Jonas T.
This thesis is a collection of ideas with the general goal of building, at least in the abstract, a local fault-tolerant quantum computer. The connection between quantum information and topology has proven to be an active area of research in several fields. The introduction of the toric code by Alexei Kitaev demonstrated the usefulness of topology for quantum memory and quantum computation. Many quantum codes used for quantum memory are modeled by spin systems on a lattice, with operators that extract syndrome information placed on vertices or faces of the lattice. It is natural to wonder whether the useful codes in such systems can be classified. This thesis presents work that leverages ideas from topology and graph theory to explore the space of such codes. Homological stabilizer codes are introduced and it is shown that, under a set of reasonable assumptions, any qubit homological stabilizer code is equivalent to either a toric code or a color code. Additionally, the toric code and the color code correspond to distinct classes of graphs. Many systems have been proposed as candidate quantum computers. It is very desirable to design quantum computing architectures with two-dimensional layouts and low complexity in parity-checking circuitry. Kitaev's surface codes provided the first example of codes satisfying this property. They provided a new route to fault tolerance with more modest overheads and thresholds approaching 1%. The recently discovered color codes share many properties with the surface codes, such as the ability to perform syndrome extraction locally in two dimensions. Some families of color codes admit a transversal implementation of the entire Clifford group. This work investigates color codes on the 4.8.8 lattice known as triangular codes. I develop a fault-tolerant error-correction strategy for these codes in which repeated syndrome measurements on this lattice generate a three-dimensional space-time combinatorial structure. I then develop an integer program that analyzes this structure and determines the most likely set of errors consistent with the observed syndrome values. I implement this integer program to find the threshold for depolarizing noise on small versions of these triangular codes. Because the threshold for magic-state distillation is likely to be higher than this value and because logical
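To make the stabilizer structure referred to above concrete, the sketch below builds the binary X- and Z-type parity-check matrices of Kitaev's toric code on an L x L periodic lattice and verifies that every star operator commutes with every plaquette operator. The edge-indexing convention is an arbitrary choice for illustration; this is not the thesis's decoder or its integer program.

```python
import numpy as np

def toric_code_checks(L):
    """Binary parity-check matrices for the distance-L toric code.

    Qubits sit on the 2*L*L edges of an L x L periodic square lattice:
    index (0, r, c) is the horizontal edge and (1, r, c) the vertical edge
    attached to vertex (r, c). Returns (Hx, Hz) with vertex (star) checks
    in Hx and plaquette checks in Hz.
    """
    n = 2 * L * L
    edge = lambda kind, r, c: kind * L * L + (r % L) * L + (c % L)
    Hx = np.zeros((L * L, n), dtype=np.uint8)
    Hz = np.zeros((L * L, n), dtype=np.uint8)
    for r in range(L):
        for c in range(L):
            s = r * L + c
            # star at vertex (r, c): the four incident edges
            for q in (edge(0, r, c), edge(0, r, c - 1), edge(1, r, c), edge(1, r - 1, c)):
                Hx[s, q] = 1
            # plaquette with top-left corner at (r, c): the four bounding edges
            for q in (edge(0, r, c), edge(0, r + 1, c), edge(1, r, c), edge(1, r, c + 1)):
                Hz[s, q] = 1
    return Hx, Hz

Hx, Hz = toric_code_checks(4)
assert not (Hx @ Hz.T % 2).any()   # every star/plaquette pair overlaps on an even number of qubits
```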
Design geometry and design/off-design performance computer codes for compressors and turbines
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1995-01-01
This report summarizes some NASA Lewis (i.e., government owned) computer codes capable of being used for airbreathing propulsion system studies to determine the design geometry and to predict the design/off-design performance of compressors and turbines. These are not CFD codes; velocity-diagram energy and continuity computations are performed fore and aft of the blade rows using meanline, spanline, or streamline analyses. Losses are provided by empirical methods. Both axial-flow and radial-flow configurations are included.
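Although the NASA Lewis codes themselves are not reproduced here, the meanline velocity-diagram bookkeeping they perform can be illustrated with Euler's turbomachinery equation; the numbers below are invented placeholders for a single axial turbine stage.

```python
import math

# Meanline velocity-triangle bookkeeping for one axial turbine stage (illustrative numbers only).
# Euler's turbomachinery equation: specific work = U * (Ctheta_in - Ctheta_out) at constant mean radius.

U = 340.0                     # blade speed at mean radius, m/s
Cx = 150.0                    # axial velocity, m/s (continuity would set this in a real meanline code)
alpha2 = math.radians(70.0)   # absolute flow angle at rotor inlet
alpha3 = math.radians(-10.0)  # absolute flow angle at rotor exit

Ctheta2 = Cx * math.tan(alpha2)   # tangential velocity components
Ctheta3 = Cx * math.tan(alpha3)

w_stage = U * (Ctheta2 - Ctheta3)   # specific work extracted, J/kg
cp, T01 = 1148.0, 1400.0            # assumed gas properties and inlet stagnation temperature
dT0 = w_stage / cp
print(f"stage work = {w_stage/1000:.1f} kJ/kg, stagnation temperature drop = {dT0:.1f} K")
```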
NASA Astrophysics Data System (ADS)
Boucher, Jean-Philippe; Clanet, Christophe; Quéré, David; Chevy, Frédéric
2017-08-01
The cobra wave is a popular physical phenomenon arising from the explosion of a metastable grillage made of popsicle sticks. The sticks are expelled from the mesh by releasing the elastic energy stored during the weaving of the structure. Here we analyze both experimentally and theoretically the propagation of the wave front depending on the properties of the sticks and the pattern of the mesh. We show that its velocity and its shape are directly related to the recoil imparted to the structure by the expelled sticks. Finally, we show that the cobra wave can only exist for a narrow range of parameters constrained by gravity and rupture of the sticks.
Joint Services Electronics Program Annual Progress Report.
1985-11-01
...(one-symbol memory) adaptive Huffman codes were performed, and the compression achieved was compared with that of Ziv-Lempel coding. As was expected... [Front-matter fragments: Information Systems; Real-Time Statistical Data Processing (T. Kailath); Data Compression for Computer Data Structures (J. Gill).]
[Spitting cobras: description of 2 cases in Djibouti].
Rouvin, B; Kone, M; N'diaye, M; Seck, M; Diatta, B
2010-02-01
The purpose of this report is to describe two cases involving ophthalmic exposure to venom from spitting cobras. Based on these cases, readers are reminded that eye injury can be prevented by low-cost treatment consisting of prompt, prolonged saline irrigation. This treatment also reduces pain.
An I-V analysis of irradiated Gallium Arsenide solar cells
NASA Technical Reports Server (NTRS)
Heulenberg, A.; Maurer, R. H.; Kinnison, J. D.
1991-01-01
A computer program was used to analyze the illuminated I-V characteristics of four sets of gallium arsenide (GaAs) solar cells irradiated with 1-MeV electrons and 10-MeV protons. It was concluded that junction regions (J_r) dominate nearly all GaAs cells tested, except for irradiated Mitsubishi cells, which appear to have a different doping profile. Irradiation maintains or increases the dominance by J_r. Proton irradiation increases J_r more than does electron irradiation. The U.S. cells were optimized for beginning of life (BOL) and the Japanese for end of life (EOL). I-V analysis indicates ways of improving both the BOL and EOL performance of GaAs solar cells.
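The report does not specify the analysis program, so the sketch below only illustrates the kind of model commonly used for illuminated I-V analysis: a generic single-diode equation, evaluated implicitly for the terminal current, with invented GaAs-like parameters.

```python
import numpy as np
from scipy.optimize import brentq

# Generic single-diode model of an illuminated cell (not the program used in the report):
#   I = IL - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
# Solved implicitly for I at each terminal voltage V.

q_e, k_B, T = 1.602e-19, 1.381e-23, 298.15
Vt = k_B * T / q_e
IL, I0, n, Rs, Rsh = 0.030, 1e-10, 1.8, 0.5, 500.0   # illustrative parameters only

def current(V):
    f = lambda I: IL - I0 * np.expm1((V + I * Rs) / (n * Vt)) - (V + I * Rs) / Rsh - I
    return brentq(f, -0.1, IL + 0.1)   # bracket comfortably contains the root

for V in (0.0, 0.5, 0.9, 1.0):
    print(f"V = {V:4.2f} V  ->  I = {current(V)*1e3:6.2f} mA")
```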
76 FR 39039 - Establishment of a New Drug Code for Marihuana Extract
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-05
... that have been derived from any plant of the genus cannabis and which contain cannabinols and... Nations Conventions on international drug control treat extracts from the cannabis plant differently than.... Cannabis and cannabis resin are listed in both schedule IV and schedule I of the Single Convention...
2015-09-01
[Table-of-contents fragment: List of Figures; List of Tables; 1. Introduction; 2. Device Status Data (2.1 SNMP, 2.2 NMS, 2.3 ICMP Ping); 3. Data Collection; 4. Hydra Configuration (4.1 Status Codes, 4.2 Request Time, 4.3 Hydra BLOb Metadata); 5. Data Processing (5.1 Hydra Data Processing Framework: 5.1.1 Basic Components, 5.1.2 Map Component, 5.1.3 Postmap Methods, 5.1.4 Data Flow, 5.1.5 Distributed Processing Considerations; 5.2 Specific Hydra...).]
PHASE I MATERIALS PROPERTY DATABASE DEVELOPMENT FOR ASME CODES AND STANDARDS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju; Lin, Lianshan
2013-01-01
To support the ASME Boiler and Pressure Vessel Code (BPVC) in the modern information era, development of a web-based materials property database has been initiated under the supervision of the ASME Committee on Materials. To achieve efficiency, the project draws heavily upon experience from development of the Gen IV Materials Handbook and the Nuclear System Materials Handbook. The effort is divided into two phases. Phase I is planned to deliver a materials data file warehouse that offers a depository for various files containing raw data and background information, and Phase II will provide a relational digital database that provides advanced features facilitating digital data processing and management. Population of the database will start with materials property data for nuclear applications and expand to data covering the entire ASME Codes and Standards, including the piping codes, as the database structure is continuously optimized. The ultimate goal of the effort is to establish a sound cyber infrastructure that supports ASME Codes and Standards development and maintenance.
A Spatiotemporal Clustering Approach to Maritime Domain Awareness
2013-09-01
[Reference fragment: M. E. Celebi, “Effective initialization of k-means for color quantization,” 16th IEEE International Conference on Image Processing (ICIP).] Spatiotemporal clustering is the process of grouping...
Imaging Gallium Nitride High Electron Mobility Transistors to Identify Point Defects
2014-03-01
The purpose of this thesis is to streamline the sample preparation procedure to maximize the yield of successful samples to be analyzed chemically by energy dispersive spectrometry... Keywords: scanning transmission electron microscope (STEM), sample preparation.
Accelerating next generation sequencing data analysis with system level optimizations.
Kathiresan, Nagarajan; Temanni, Ramzi; Almabrazi, Hakeem; Syed, Najeeb; Jithesh, Puthen V; Al-Ali, Rashid
2017-08-22
Next generation sequencing (NGS) data analysis is highly compute intensive. In-memory computing, vectorization, bulk data transfer, and CPU frequency scaling are some of the hardware features in modern computing architectures. To get the best execution time and utilize these hardware features, it is necessary to tune the system-level parameters before running the application. We studied GATK HaplotypeCaller, which is part of common NGS workflows and consumes more than 43% of the total execution time. Multiple GATK 3.x versions were benchmarked, and the execution time of HaplotypeCaller was optimized through various system-level parameters, which included: (i) tuning the parallel garbage collection and kernel shared memory to simulate in-memory computing; (ii) architecture-specific tuning in the PairHMM library for vectorization; (iii) including Java 1.8 features through GATK source code compilation and building a runtime environment for parallel sorting and bulk data transfer; and (iv) switching the CPU frequency governor from the default 'on-demand' mode to 'performance' mode to accelerate the Java multi-threads. As a result, the HaplotypeCaller execution time was reduced by 82.66% in GATK 3.3 and 42.61% in GATK 3.7. Overall, the execution time of the NGS pipeline was reduced to 70.60% and 34.14% for GATK 3.3 and GATK 3.7, respectively.
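As a hedged illustration of the kind of system-level tuning described above (not the authors' scripts), the sketch below assembles a GATK 3.x HaplotypeCaller command with parallel-GC JVM flags and shows the CPU-governor switch as a separate command; all file paths, heap sizes, and thread counts are placeholders.

```python
import subprocess

# Sketch of launching GATK 3.x HaplotypeCaller with a tuned JVM, in the spirit of the
# optimizations above. Paths, heap size and thread counts are placeholders.

jvm_opts = [
    "-Xmx32g",                      # large heap so the parallel collector has room to work
    "-XX:+UseParallelGC",           # parallel garbage collection (one of the tuned settings)
    "-XX:ParallelGCThreads=16",
]
gatk_cmd = [
    "java", *jvm_opts, "-jar", "GenomeAnalysisTK.jar",
    "-T", "HaplotypeCaller",        # GATK 3.x tool-selection syntax
    "-R", "reference.fasta",
    "-I", "sample.bam",
    "-o", "sample.g.vcf",
    "-nct", "8",                    # HaplotypeCaller compute threads (GATK 3.x option)
]

# Switching the CPU frequency governor ('ondemand' -> 'performance') happens outside the JVM,
# e.g. via cpupower (requires root); shown here only as the command that would be issued.
governor_cmd = ["cpupower", "frequency-set", "-g", "performance"]

for cmd in (governor_cmd, gatk_cmd):
    print("would run:", " ".join(cmd))
    # subprocess.run(cmd, check=True)   # uncomment to actually execute
```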
42 CFR 411.108 - Taking into account entitlement to Medicare.
Code of Federal Regulations, 2012 CFR
2012-10-01
...) Terminating coverage because the individual has become entitled to Medicare, except as permitted under COBRA..., instructions to bill Medicare first for services furnished to Medicare beneficiaries without stipulating that... employment status, the GHP coverage is by virtue of the COBRA law rather than by virtue of the current...
42 CFR 411.108 - Taking into account entitlement to Medicare.
Code of Federal Regulations, 2013 CFR
2013-10-01
...) Terminating coverage because the individual has become entitled to Medicare, except as permitted under COBRA..., instructions to bill Medicare first for services furnished to Medicare beneficiaries without stipulating that... employment status, the GHP coverage is by virtue of the COBRA law rather than by virtue of the current...
42 CFR 411.108 - Taking into account entitlement to Medicare.
Code of Federal Regulations, 2014 CFR
2014-10-01
...) Terminating coverage because the individual has become entitled to Medicare, except as permitted under COBRA..., instructions to bill Medicare first for services furnished to Medicare beneficiaries without stipulating that... employment status, the GHP coverage is by virtue of the COBRA law rather than by virtue of the current...
42 CFR 411.108 - Taking into account entitlement to Medicare.
Code of Federal Regulations, 2011 CFR
2011-10-01
...) Terminating coverage because the individual has become entitled to Medicare, except as permitted under COBRA..., instructions to bill Medicare first for services furnished to Medicare beneficiaries without stipulating that... employment status, the GHP coverage is by virtue of the COBRA law rather than by virtue of the current...
26 CFR 54.4980B-8 - Paying for COBRA continuation coverage.
Code of Federal Regulations, 2010 CFR
2010-04-01
... employee's family are covered under the plan. The employee experiences a qualifying event that is the termination of the employee's employment. The employee's family qualifies for the disability extension because... with respect to the employee's family for the first 18 months of COBRA continuation coverage, and the...
A MATLAB based 3D modeling and inversion code for MT data
NASA Astrophysics Data System (ADS)
Singh, Arun; Dehiya, Rahul; Gupta, Pravin K.; Israil, M.
2017-07-01
The development of a MATLAB-based computer code, AP3DMT, for modeling and inversion of 3D magnetotelluric (MT) data is presented. The code comprises two independent components: a grid generator code and a modeling/inversion code. The grid generator code performs model discretization and acts as an interface by generating various I/O files. The inversion code performs the core computations in modular form: forward modeling, data functionals, sensitivity computations and regularization. These modules can be readily extended to other similar inverse problems like Controlled-Source EM (CSEM). The modular structure of the code provides a framework useful for implementation of new applications and inversion algorithms. The use of MATLAB and its libraries makes it more compact and user friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities, the results of inversion for two complex models are presented.
Computer Code for the Determination of Ejection Seat/Man Aerodynamic Parameters.
1980-08-28
ARMS, and LES (computer code ... names), and the Seat consisted of 4 panels: SEAT, BACK, PADD, and SIDE. An ... general application of Eq. (1) is for blunt bodies at hypersonic speed, because the accuracy of this equation becomes better at higher Mach number. Therefore... the pressure coefficient is set equal to zero on those portions of the body that are invisible to a distant observer who views the body from the direction...
E-O Sensor Signal Recognition Simulation: Computer Code SPOT I.
1978-10-01
...scattering phase function PDCO, defined at the specified wavelength, given for each of the scattering angles defined. Currently, a maximum of sixty-four... [Input-description fragments: PDCO, average probability for phase matrix definition; NPROB, problem number; WLAM(N) (format 8E10.1), wavelength at which the aerosol single-scattering phase function set is defined (microns); PDCO(N,I) (format 8E10.1), average probability for... Fig. 12: flowchart for the SPOT computer code.]
BiKEGG: a COBRA toolbox extension for bridging the BiGG and KEGG databases.
Jamialahmadi, Oveis; Motamedian, Ehsan; Hashemi-Najafabadi, Sameereh
2016-10-18
Development of an interface tool between the Biochemical, Genetic and Genomic (BiGG) and KEGG databases is necessary for simultaneous access to the features of both databases. For this purpose, we present the BiKEGG toolbox, an open source COBRA toolbox extension providing a set of functions to infer the reaction correspondences between the KEGG reaction identifiers and those in the BiGG knowledgebase using a combination of manual verification and computational methods. Inferred reaction correspondences using this approach are supported by evidence from the literature, which provides a higher number of reconciled reactions between these two databases compared to the MetaNetX and MetRxn databases. This set of equivalent reactions is then used to automatically superimpose the predicted fluxes using COBRA methods on classical KEGG pathway maps or to create a customized metabolic map based on the KEGG global metabolic pathway, and to find the corresponding reactions in BiGG based on the genome annotation of an organism in the KEGG database. Customized metabolic maps can be created for a set of pathways of interest, for the whole KEGG global map or exclusively for all pathways for which there exists at least one flux carrying reaction. This flexibility in visualization enables BiKEGG to indicate reaction directionality as well as to visualize the reaction fluxes for different static or dynamic conditions in an animated manner. BiKEGG allows the user to export (1) the output visualized metabolic maps to various standard image formats or save them as a video or animated GIF file, and (2) the equivalent reactions for an organism as an Excel spreadsheet.
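A toy sketch of the mapping step that BiKEGG automates is shown below: FBA fluxes computed on BiGG reaction IDs are projected onto KEGG reaction IDs through a correspondence table. It assumes a cobrapy installation and its bundled 'textbook' E. coli model; the three-entry table is an invented illustration rather than BiKEGG's curated mapping, and the KEGG IDs should be treated as unverified placeholders.

```python
import cobra

# Project FBA fluxes from BiGG reaction IDs onto KEGG reaction IDs via a correspondence table.
# The table below is a tiny illustrative placeholder, not BiKEGG's curated mapping.

bigg_to_kegg = {
    "PGI": "R00771",   # placeholder KEGG ID for glucose-6-phosphate isomerase
    "PFK": "R00756",   # placeholder KEGG ID for phosphofructokinase
    "ENO": "R00658",   # placeholder KEGG ID for enolase
}

model = cobra.io.load_model("textbook")   # small E. coli test model shipped with cobrapy
solution = model.optimize()               # flux balance analysis

kegg_fluxes = {
    kegg_id: solution.fluxes[bigg_id]
    for bigg_id, kegg_id in bigg_to_kegg.items()
    if bigg_id in solution.fluxes.index
}
for kegg_id, flux in kegg_fluxes.items():
    print(f"{kegg_id}: {flux:8.3f} mmol/gDW/h")
```

In BiKEGG the analogous table is the evidence-backed reconciliation between the two databases, and the projected fluxes are then drawn onto KEGG pathway maps.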
Nongonierma, Alice B; Mooney, Catherine; Shields, Denis C; FitzGerald, Richard J
2014-07-01
Molecular docking of a library of all 8000 possible tripeptides to the active site of DPP-IV was used to determine their binding potential. A number of tripeptides were selected for experimental testing; however, there was no direct correlation between the Vina score and their in vitro DPP-IV inhibitory properties. While Trp-Trp-Trp, the peptide with the best docking score, was a moderate DPP-IV inhibitor (IC50 216 μM), Lineweaver and Burk analysis revealed its action to be non-competitive. This suggested that it may not bind to the active site of DPP-IV as assumed in the docking prediction. Furthermore, there was no significant link between DPP-IV inhibition and the physicochemical properties of the peptides (molecular mass, hydrophobicity, hydrophobic moment (μH), isoelectric point (pI) and charge). LIGPLOTs indicated that competitive inhibitory peptides were predicted to have both hydrophobic and hydrogen bond interactions with the active site of DPP-IV. DPP-IV inhibitory peptides generally had a hydrophobic or aromatic amino acid at the N-terminus, preferentially a Trp for non-competitive inhibitors and a broader range of residues for competitive inhibitors (Ile, Leu, Val, Phe, Trp or Tyr). Two of the potent DPP-IV inhibitors, Ile-Pro-Ile and Trp-Pro (IC50 values of 3.5 and 44.2 μM, respectively), were predicted to be gastrointestinally/intestinally stable. This work highlights the need to test the assumptions (i.e., competitive binding) of any integrated strategy of computational and experimental screening. Future strategies targeting allosteric mechanisms may need to rely more on structure-activity relationship modeling, rather than on docking, when computationally selecting peptides for screening.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, F.; Brown, K.; Flach, G.
The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was developed to link GoldSim with external codes (Smith III et al. 2010). The DLL uses a list of code inputs provided by GoldSim to create an input file for the external application, runs the external code, and returns a list of outputs (read from files created by the external application) back to GoldSim. In this way GoldSim provides: (1) a unified user interface to the applications, (2) the capability of coupling selected codes in a synergistic manner, and (3) the capability of performing probabilistic uncertainty analysis with the codes. GoldSim is made available by the GoldSim Technology Group as a free 'Player' version that allows running but not editing GoldSim models. The player version makes the software readily available to a wider community of users that would wish to use the CBP application but do not have a license for GoldSim.
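The DLL interface described above follows a common wrap-an-external-code pattern; the following sketch (generic, not the CBP DLL) shows that pattern: write an input file from a list of inputs, run the external application, and read selected outputs back. The executable name and file formats are placeholders.

```python
import subprocess
from pathlib import Path

# Generic sketch of the interface pattern: caller-supplied inputs -> input file ->
# run external code -> parse its output file -> return values to the caller.

def run_external_code(inputs, workdir="run_001", exe="external_code"):
    work = Path(workdir)
    work.mkdir(exist_ok=True)

    # 1. write the external code's input deck from the caller-supplied values
    with open(work / "input.dat", "w") as f:
        for name, value in inputs.items():
            f.write(f"{name} = {value}\n")

    # 2. run the external application in its working directory
    subprocess.run([exe, "input.dat"], cwd=work, check=True)

    # 3. read the outputs the external code wrote and hand them back to the caller
    outputs = {}
    with open(work / "output.dat") as f:
        for line in f:
            name, value = line.split("=")
            outputs[name.strip()] = float(value)
    return outputs

# e.g. outputs = run_external_code({"diffusivity": 1.0e-12, "exposure_years": 100})
```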
Love, Lonnie
2018-06-12
ORNL's newly printed 3D Shelby Cobra was showcased at the 2015 NAIAS in Detroit. This "laboratory on wheels" uses the Shelby Cobra design, celebrating the 50th anniversary of this model and honoring the first vehicle to be voted a national monument. The Shelby was printed at the Department of Energy's Manufacturing Demonstration Facility at ORNL using the BAAM (Big Area Additive Manufacturing) machine and is intended as a "plug-n-play" laboratory on wheels. The Shelby will allow research and development of integrated components to be tested and enhanced in real time, improving the use of sustainable, digital manufacturing solutions in the automotive industry.
An approach for coupled-code multiphysics core simulations from a common input
Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; ...
2014-12-10
This study describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and setup the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak Ridge National Laboratory using 1156 cores, and a synopsis of the solution results and code performance is presented. Finally, ongoing development of this approach is also briefly described.
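A schematic of the common-input idea, with invented keys and file formats rather than the actual VERAIn syntax, is sketched below: a single problem description is fanned out into mutually consistent input files for a neutronics code and a thermal-hydraulics code.

```python
# Fan a single common problem description out into consistent per-code input files.
# All dictionary keys, values and file formats are invented for illustration.

common_input = {
    "assembly": {"lattice": "17x17", "pitch_cm": 1.26, "enrichment_pct": 3.1},
    "state": {"power_MW": 17.7, "inlet_temp_K": 565.0, "pressure_MPa": 15.5},
}

def write_neutronics_input(common, path="neutronics.inp"):
    with open(path, "w") as f:
        f.write(f"lattice {common['assembly']['lattice']}\n")
        f.write(f"enrichment {common['assembly']['enrichment_pct']}\n")
        f.write(f"power {common['state']['power_MW']}\n")

def write_thermal_hydraulics_input(common, path="th.inp"):
    with open(path, "w") as f:
        f.write(f"pitch {common['assembly']['pitch_cm']}\n")
        f.write(f"inlet_temperature {common['state']['inlet_temp_K']}\n")
        f.write(f"pressure {common['state']['pressure_MPa']}\n")
        f.write(f"power {common['state']['power_MW']}\n")  # same value as neutronics, by construction

write_neutronics_input(common_input)
write_thermal_hydraulics_input(common_input)
print("wrote neutronics.inp and th.inp from one common description")
```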
Performance Measurement, Visualization and Modeling of Parallel and Distributed Programs
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Sarukkai, Sekhar R.; Mehra, Pankaj; Lum, Henry, Jr. (Technical Monitor)
1994-01-01
This paper presents a methodology for debugging the performance of message-passing programs on both tightly coupled and loosely coupled distributed-memory machines. The AIMS (Automated Instrumentation and Monitoring System) toolkit, a suite of software tools for measurement and analysis of performance, is introduced and its application illustrated using several benchmark programs drawn from the field of computational fluid dynamics. AIMS includes (i) Xinstrument, a powerful source-code instrumentor, which supports both Fortran77 and C as well as a number of different message-passing libraries including Intel's NX, Thinking Machines' CMMD, and PVM; (ii) Monitor, a library of timestamping and trace-collection routines that run on supercomputers (such as Intel's iPSC/860, Delta, and Paragon and Thinking Machines' CM5) as well as on networks of workstations (including Convex Cluster and SparcStations connected by a LAN); (iii) Visualization Kernel, a trace-animation facility that supports source-code clickback, simultaneous visualization of computation and communication patterns, as well as analysis of data movements; (iv) Statistics Kernel, an advanced profiling facility that associates a variety of performance data with various syntactic components of a parallel program; (v) Index Kernel, a diagnostic tool that helps pinpoint performance bottlenecks through the use of abstract indices; (vi) Modeling Kernel, a facility for automated modeling of message-passing programs that supports both simulation-based and analytical approaches to performance prediction and scalability analysis; (vii) Intrusion Compensator, a utility for recovering true performance from observed performance by removing the overheads of monitoring and their effects on the communication pattern of the program; and (viii) Compatibility Tools, which convert AIMS-generated traces into formats used by other performance-visualization tools, such as ParaGraph, Pablo, and certain AVS/Explorer modules.
NASA Astrophysics Data System (ADS)
Akhmedova, Sh; Semenkin, E.
2017-02-01
Previously, a meta-heuristic approach called Co-Operation of Biology-Related Algorithms, or COBRA, for solving real-parameter optimization problems was introduced and described. COBRA's basic idea is the cooperative work of five well-known bionic algorithms, namely Particle Swarm Optimization, the Wolf Pack Search, the Firefly Algorithm, the Cuckoo Search Algorithm and the Bat Algorithm, which were chosen due to the similarity of their schemes. The performance of this meta-heuristic was evaluated on a set of test functions and its workability was demonstrated; thus it was established that the idea of the algorithms' cooperative work is useful. However, it remains unclear which bionic algorithms should be included in this cooperation and how many of them. Therefore, the five above-listed algorithms, together with the Fish School Search algorithm, were used to develop five different modifications of COBRA by varying the number of component algorithms. These modifications were tested on the same set of functions and the best of them was found. Ways of further improving the COBRA algorithm are then discussed.
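As a toy illustration of the cooperation idea only, and emphatically not the COBRA algorithm itself, the sketch below lets two trivial random-perturbation heuristics share one evaluation budget and shifts population share toward whichever heuristic has recently produced more improvements.

```python
import random

# Toy cooperative search: two deliberately trivial heuristics share a budget, and the
# population share of each one grows or shrinks with its recent success. Not COBRA.

def sphere(x):                      # test function to minimise
    return sum(v * v for v in x)

def step_small(x):                  # heuristic 1: small random perturbation
    return [v + random.gauss(0, 0.1) for v in x]

def step_large(x):                  # heuristic 2: large random perturbation
    return [v + random.gauss(0, 1.0) for v in x]

heuristics = [step_small, step_large]
shares = [10, 10]                   # individuals assigned to each heuristic
best = [random.uniform(-5, 5) for _ in range(5)]
best_f = sphere(best)

for generation in range(200):
    wins = [0] * len(heuristics)
    for h, share in enumerate(shares):
        for _ in range(share):
            cand = heuristics[h](best)
            f = sphere(cand)
            if f < best_f:
                best, best_f, wins[h] = cand, f, wins[h] + 1
    # reallocate population towards the currently more successful heuristic
    if wins[0] != wins[1]:
        winner = 0 if wins[0] > wins[1] else 1
        if shares[1 - winner] > 2:
            shares[1 - winner] -= 1
            shares[winner] += 1

print("best value %.3e with final shares %s" % (best_f, shares))
```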
46 CFR 53.01-3 - Adoption of section IV of the ASME Boiler and Pressure Vessel Code.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 46 Shipping 2 2012-10-01 2012-10-01 false Adoption of section IV of the ASME Boiler and Pressure Vessel Code. 53.01-3 Section 53.01-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING HEATING BOILERS General Requirements § 53.01-3 Adoption of section IV of the ASME Boiler and Pressure Vessel Code. (a) Heating...
46 CFR 53.01-3 - Adoption of section IV of the ASME Boiler and Pressure Vessel Code.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 46 Shipping 2 2014-10-01 2014-10-01 false Adoption of section IV of the ASME Boiler and Pressure Vessel Code. 53.01-3 Section 53.01-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING HEATING BOILERS General Requirements § 53.01-3 Adoption of section IV of the ASME Boiler and Pressure Vessel Code. (a) Heating...
46 CFR 53.01-3 - Adoption of section IV of the ASME Boiler and Pressure Vessel Code.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 2 2010-10-01 2010-10-01 false Adoption of section IV of the ASME Boiler and Pressure Vessel Code. 53.01-3 Section 53.01-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING HEATING BOILERS General Requirements § 53.01-3 Adoption of section IV of the ASME Boiler and Pressure Vessel Code. (a) Heating...
46 CFR 53.01-3 - Adoption of section IV of the ASME Boiler and Pressure Vessel Code.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 46 Shipping 2 2013-10-01 2013-10-01 false Adoption of section IV of the ASME Boiler and Pressure Vessel Code. 53.01-3 Section 53.01-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING HEATING BOILERS General Requirements § 53.01-3 Adoption of section IV of the ASME Boiler and Pressure Vessel Code. (a) Heating...
46 CFR 53.01-3 - Adoption of section IV of the ASME Boiler and Pressure Vessel Code.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 2 2011-10-01 2011-10-01 false Adoption of section IV of the ASME Boiler and Pressure Vessel Code. 53.01-3 Section 53.01-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING HEATING BOILERS General Requirements § 53.01-3 Adoption of section IV of the ASME Boiler and Pressure Vessel Code. (a) Heating...
40 CFR 147.2250 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2011 CFR
2011-07-01
... of the Federal Register on June 25, 1984. (1) Utah Water Pollution Control Act, Utah Code Annotated... Executive Secretary of Utah Water Pollution Control Committee on August 16, 1990). (b) Other laws. The... Department of Health, Division of Environmental Health, Bureau of Water Pollution Control, to EPA Region VIII...
40 CFR 147.2250 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2010 CFR
2010-07-01
... of the Federal Register on June 25, 1984. (1) Utah Water Pollution Control Act, Utah Code Annotated... Executive Secretary of Utah Water Pollution Control Committee on August 16, 1990). (b) Other laws. The... Department of Health, Division of Environmental Health, Bureau of Water Pollution Control, to EPA Region VIII...
40 CFR 147.1250 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2013 CFR
2013-07-01
... program administered by the Mississippi Department of Natural Resources approved by EPA pursuant to..., 1984. (1) Mississippi Air and Water Pollution Control Law, Mississippi Code Annotated sections 49-17-1 through 49-17-29 (1972) and Supp. 1983); (2) Mississippi Department of Natural Resources, Bureau of...
40 CFR 147.1250 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2014 CFR
2014-07-01
... program administered by the Mississippi Department of Natural Resources approved by EPA pursuant to..., 1984. (1) Mississippi Air and Water Pollution Control Law, Mississippi Code Annotated sections 49-17-1 through 49-17-29 (1972) and Supp. 1983); (2) Mississippi Department of Natural Resources, Bureau of...
40 CFR 147.1250 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2010 CFR
2010-07-01
... program administered by the Mississippi Department of Natural Resources approved by EPA pursuant to..., 1984. (1) Mississippi Air and Water Pollution Control Law, Mississippi Code Annotated sections 49-17-1 through 49-17-29 (1972) and Supp. 1983); (2) Mississippi Department of Natural Resources, Bureau of...
40 CFR 147.1250 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2012 CFR
2012-07-01
... program administered by the Mississippi Department of Natural Resources approved by EPA pursuant to..., 1984. (1) Mississippi Air and Water Pollution Control Law, Mississippi Code Annotated sections 49-17-1 through 49-17-29 (1972) and Supp. 1983); (2) Mississippi Department of Natural Resources, Bureau of...
40 CFR 147.1250 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2011 CFR
2011-07-01
... program administered by the Mississippi Department of Natural Resources approved by EPA pursuant to..., 1984. (1) Mississippi Air and Water Pollution Control Law, Mississippi Code Annotated sections 49-17-1 through 49-17-29 (1972) and Supp. 1983); (2) Mississippi Department of Natural Resources, Bureau of...
75 FR 18472 - Cooperative Conservation Partnership Initiative
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-12
... machine harvested. The crop may be grasses, legumes, or a combination of both. Indian land is an inclusive... in clause (i), (ii), (iii), or (iv) of section 170(h)(4)(A) of the Internal Revenue Code of 1986; is... the historic climax plant community is predominantly grasses, grass-like plants, forbs, or shrubs and...
Tagliaferri, Luca; Kovács, György; Autorino, Rosa; Budrukkar, Ashwini; Guinot, Jose Luis; Hildebrand, Guido; Johansson, Bengt; Monge, Rafael Martìnez; Meyer, Jens E; Niehoff, Peter; Rovirosa, Angeles; Takàcsi-Nagy, Zoltàn; Dinapoli, Nicola; Lanzotti, Vito; Damiani, Andrea; Soror, Tamer; Valentini, Vincenzo
2016-08-01
The aim of the COBRA (Consortium for Brachytherapy Data Analysis) project is to create a multicenter group (consortium) and a web-based system for standardized data collection. The GEC-ESTRO (Groupe Européen de Curiethérapie - European Society for Radiotherapy & Oncology) Head and Neck (H&N) Working Group participated in the project and in the implementation of the consortium agreement, the ontology (data set) and the necessary COBRA software services, as well as the peer review of the general anatomic site-specific COBRA protocol. The ontology was defined by a multicenter task group. Eleven centers from 6 countries signed an agreement and the consortium approved the ontology. We identified 3 tiers for the data set: Registry (epidemiology analysis), Procedures (prediction models and DSS), and Research (radiomics). The COBRA-Storage System (C-SS) is not time-consuming because, thanks to the use of "brokers", data can be extracted directly from each center's storage system through a connection with a structured query language database (SQL-DB), Microsoft Access(®), FileMaker Pro(®), or Microsoft Excel(®). The system is also structured to perform automatic archiving directly from the treatment planning system or afterloading machine. The architecture is based on the concept of "on-purpose data projection". The C-SS architecture is privacy protecting because it will never make visible data that could identify an individual patient. The C-SS can also benefit from so-called "distributed learning" approaches, in which data never leave the collecting institution, while learning algorithms and proposed predictive models are commonly shared. Setting up a consortium is a feasible and practicable way to create an international, multi-system data-sharing system. COBRA C-SS seems to be well accepted by all involved parties, primarily because it does not influence each center's own data-storing technologies, procedures, and habits. Furthermore, the method preserves the privacy of all patients.
ORNL Resolved Resonance Covariance Generation for ENDF/B-VII.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leal, Luiz C.; Guber, Klaus H.; Wiarda, Dorothea
2012-12-01
Resonance-parameter covariance matrix (RPCM) evaluations in the resolved resonance region were done at the Oak Ridge National Laboratory (ORNL) for the chromium isotopes, titanium isotopes, 19F, 58Ni, 60Ni, 35Cl, 37Cl, 39K, 41K, 55Mn, 233U, 235U, 238U, and 239Pu using the computer code SAMMY. The retroactive approach of the code SAMMY was used to generate the RPCMs for 233U. For 235U, the approach used for covariance generation was similar to the retroactive approach, with the distinction that real experimental data were used as opposed to data generated from the resonance parameters. RPCMs for 238U and 239Pu were generated together with the resonance parameter evaluations. The RPCMs were then converted into the ENDF format using the FILE32 representation. Alternatively, for computer storage reasons, the FILE32 was converted into the FILE33 cross-section covariance matrix (CSCM). Both representations were processed using the computer code PUFF-IV. This paper describes the procedures used to generate the RPCM and CSCM in the resonance region for ENDF/B-VII.1. The impact of data uncertainty on nuclear reactor benchmark calculations is also presented.
1983-03-01
[Front-matter fragments: Materials Research Laboratory report, L. E. Cross et al., March 1983, contract N00014-78-C-0291; listed contributors include L. E. Cross, R. E. Newnham and J. V. Biggers; table-of-contents entries include an introduction, studies of niobates, and dielectric properties of...]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Killough, G.G.; Rohwer, P.S.
1974-03-01
INDOS1, INDOS2, and INDOS3 (the INDOS codes) are conversational FORTRAN IV programs, implemented for use in time-sharing mode on the ORNL PDP-10 System. These codes use ICRP10-10A models to estimate the radiation dose to an organ of the body of Reference Man resulting from the ingestion or inhalation of any one of various radionuclides. Two patterns of intake are simulated: intakes at discrete times and continuous intake at a constant rate. The INDOS codes provide tabular output of dose rate and dose vs time, graphical output of dose vs time, and punched-card output of organ burden and dose vs time. The models of internal dose calculation are discussed and instructions for the use of the INDOS codes are provided. The INDOS codes are available from the Radiation Shielding Information Center, Oak Ridge National Laboratory, P. O. Box X, Oak Ridge, Tennessee 37830. (auth)
Panagides, Nadya; Jackson, Timothy N.W.; Ikonomopoulou, Maria P.; Arbuckle, Kevin; Pretzler, Rudolf; Yang, Daryl C.; Ali, Syed A.; Koludarov, Ivan; Dobson, James; Sanker, Brittany; Asselin, Angelique; Santana, Renan C.; Hendrikx, Iwan; van der Ploeg, Harold; Tai-A-Pin, Jeremie; van den Bergh, Romilly; Kerkkamp, Harald M.I.; Vonk, Freek J.; Naude, Arno; Strydom, Morné A.; Jacobsz, Louis; Dunstan, Nathan; Jaeger, Marc; Hodgson, Wayne C.; Miles, John; Fry, Bryan G.
2017-01-01
The cytotoxicity of the venom of 25 species of Old World elapid snake was tested and compared with the morphological and behavioural adaptations of hooding and spitting. We determined that, contrary to previous assumptions, the venoms of spitting species are not consistently more cytotoxic than those of closely related non-spitting species. While this correlation between spitting and non-spitting was found among African cobras, it was not present among Asian cobras. On the other hand, a consistent positive correlation was observed between cytotoxicity and utilisation of the defensive hooding display that cobras are famous for. Hooding and spitting are widely regarded as defensive adaptations, but it has hitherto been uncertain whether cytotoxicity serves a defensive purpose or is somehow useful in prey subjugation. The results of this study suggest that cytotoxicity evolved primarily as a defensive innovation and that it has co-evolved twice alongside hooding behavior: once in the Hemachatus + Naja and again independently in the king cobras (Ophiophagus). There was a significant increase of cytotoxicity in the Asian Naja linked to the evolution of bold aposematic hood markings, reinforcing the link between hooding and the evolution of defensive cytotoxic venoms. In parallel, lineages with increased cytotoxicity but lacking bold hood patterns evolved aposematic markers in the form of high contrast body banding. The results also indicate that, secondary to the evolution of venom rich in cytotoxins, spitting has evolved three times independently: once within the African Naja, once within the Asian Naja, and once in the Hemachatus genus. The evolution of cytotoxic venom thus appears to facilitate the evolution of defensive spitting behaviour. In contrast, a secondary loss of cytotoxicity and reduction of the hood occurred in the water cobra Naja annulata, which possesses streamlined neurotoxic venom similar to that of other aquatic elapid snakes (e.g., hydrophiine sea snakes). The results of this study make an important contribution to our growing understanding of the selection pressures shaping the evolution of snake venom and its constituent toxins. The data also aid in elucidating the relationship between these selection pressures and the medical impact of human snakebite in the developing world, as cytotoxic cobras cause considerable morbidity including loss-of-function injuries that result in economic and social burdens in the tropics of Asia and sub-Saharan Africa. PMID:28335411
An Extensible NetLogo Model for Visualizing Message Routing Protocols
2017-08-01
...the hard sciences to the social sciences to computer-generated art. NetLogo represents the world as a set of... The code that describes the model is shown here; for the supporting methods, refer to the source code: if ticks - last-inject > time-to-inject [inject] if run# > #runs [stop] end. Next, we present some basic statistics collected for the...
MapMaker and PathTracer for tracking carbon in genome-scale metabolic models
Tervo, Christopher J.; Reed, Jennifer L.
2016-01-01
Constraint-based reconstruction and analysis (COBRA) modeling results can be difficult to interpret given the large numbers of reactions in genome-scale models. While paths in metabolic networks can be found, existing methods are not easily combined with constraint-based approaches. To address this limitation, two tools (MapMaker and PathTracer) were developed to find paths (including cycles) between metabolites, where each step transfers carbon from reactant to product. MapMaker predicts carbon transfer maps (CTMs) between metabolites using only information on molecular formulae and reaction stoichiometry, effectively determining which reactants and products share carbon atoms. MapMaker correctly assigned CTMs for over 97% of the 2,251 reactions in an Escherichia coli metabolic model (iJO1366). Using CTMs as inputs, PathTracer finds paths between two metabolites. PathTracer was applied to iJO1366 to investigate the importance of using CTMs and COBRA constraints when enumerating paths, to find active and high flux paths in flux balance analysis (FBA) solutions, to identify paths for putrescine utilization, and to elucidate a potential CO2 fixation pathway in E. coli. These results illustrate how MapMaker and PathTracer can be used in combination with constraint-based models to identify feasible, active, and high flux paths between metabolites. PMID:26771089
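A minimal illustration of path finding over carbon-transfer edges (not the MapMaker/PathTracer code) is given below: a hand-made dictionary stands in for the carbon transfer map, and a breadth-first search returns a shortest carbon-carrying path between two metabolites.

```python
from collections import deque

# Edges mean "some reaction transfers carbon from this metabolite to that one".
# The tiny graph below is hand-made for illustration only.
carbon_transfers = {
    "glucose": ["g6p"],
    "g6p": ["f6p", "6pgl"],
    "f6p": ["fdp"],
    "fdp": ["dhap", "g3p"],
    "g3p": ["pep"],
    "pep": ["pyruvate"],
    "6pgl": ["ru5p"],
}

def carbon_path(source, target):
    """Breadth-first search for a shortest carbon-carrying path."""
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in carbon_transfers.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(" -> ".join(carbon_path("glucose", "pyruvate")))
```

In the real tools the edge set comes from MapMaker's carbon transfer maps and the search is combined with constraint-based (COBRA) flux information rather than a plain BFS.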
1991-03-01
...management methodologies claim to be "expert systems" with security intelligence built into them to derive a body of both facts and speculative data... [Table-of-contents fragments: Data Administration considerations; IV. Artificial Intelligence; A. Description of Technologies.] ...such as intelligent gateways, wide area networks, and distributed databases for the distribution of logistics products. The integrity of CALS data and the...
Simulation of 2D Kinetic Effects in Plasmas using the Grid Based Continuum Code LOKI
NASA Astrophysics Data System (ADS)
Banks, Jeffrey; Berger, Richard; Chapman, Tom; Brunner, Stephan
2016-10-01
Kinetic simulation of multi-dimensional plasma waves through direct discretization of the Vlasov equation is a useful tool to study many physical interactions and is particularly attractive for situations where minimal fluctuation levels are desired, for instance, when measuring growth rates of plasma wave instabilities. However, direct discretization of phase space can be computationally expensive, and as a result there are few examples of published results using Vlasov codes in more than a single configuration space dimension. In an effort to fill this gap we have developed the Eulerian-based kinetic code LOKI that evolves the Vlasov-Poisson system in 2+2-dimensional phase space. The code is designed to reduce the cost of phase-space computation by using fully 4th order accurate conservative finite differencing, while retaining excellent parallel scalability that efficiently uses large scale computing resources. In this poster I will discuss the algorithms used in the code as well as some aspects of their parallel implementation using MPI. I will also overview simulation results of basic plasma wave instabilities relevant to laser plasma interaction, which have been obtained using the code.
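As a much-reduced sketch of the discretization style described above (not LOKI itself), the snippet below advances the free-streaming part of a 1D Vlasov slice, df/dt + v df/dx = 0, with a 4th-order central difference in x and classical RK4 in time on a periodic grid; the conservative flux form, the velocity dimension, and the Poisson coupling are all omitted.

```python
import numpy as np

nx, v = 128, 1.0
L = 2 * np.pi
dx = L / nx
x = np.arange(nx) * dx
f = np.exp(-40 * (x - np.pi) ** 2)      # initial distribution slice at fixed velocity v
dt = 0.4 * dx / abs(v)

def dfdx(f):
    # 4th-order central difference with periodic wrap-around
    return (np.roll(f, 2) - 8 * np.roll(f, 1) + 8 * np.roll(f, -1) - np.roll(f, -2)) / (12 * dx)

def rhs(f):
    return -v * dfdx(f)              # free-streaming term only

for _ in range(200):
    k1 = rhs(f)
    k2 = rhs(f + 0.5 * dt * k1)
    k3 = rhs(f + 0.5 * dt * k2)
    k4 = rhs(f + dt * k3)
    f = f + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

print("mass drift vs initial Gaussian integral:", abs(f.sum() * dx - np.sqrt(np.pi / 40)))
```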
Fung, S Y; Tan, N H; Liew, S H; Sim, S M; Aguiyi, J C
2009-04-01
The seed of Mucuna pruriens (velvet bean) has been prescribed by traditional medicine practitioners in Nigeria as a prophylactic oral antisnake remedy. In the present study, we investigated the protective effects of M. pruriens seed extract (MPE) against histopathological changes induced by intravenous injection of Naja sputatrix (Malayan cobra) venom in rats pretreated with the seed extract. Examination by light microscopy revealed that the venom induced histopathological changes in the heart and in hepatic blood vessels, but had no effect on the brain, lung, kidney and spleen. The induced changes were prevented by pretreatment of the rats with MPE. Our results suggest that MPE pretreatment protects the rat heart and hepatic blood vessels against cobra venom-induced damage.
MagLev Cobra: Test Facilities and Operational Experiments
NASA Astrophysics Data System (ADS)
Sotelo, G. G.; Dias, D. H. J. N.; de Oliveira, R. A. H.; Ferreira, A. C.; De Andrade, R., Jr.; Stephan, R. M.
2014-05-01
The superconducting MagLev technology for transportation systems is becoming mature due to the research and development effort of recent years. The Brazilian project, named MagLev-Cobra, started in 1998. It has the goal of developing a superconducting levitation vehicle for urban areas. The adopted levitation technology is based on the diamagnetic and flux-pinning properties of YBa2Cu3O7-δ (YBCO) bulk blocks in their interaction with Nd-Fe-B permanent magnets. A laboratory test facility with a permanent magnet guideway, a linear induction motor and one vehicle module has been built to investigate its operation. The state of the art of the MagLev-Cobra project is presented in this paper, describing some construction details of the new 200 m test line.
MoRe-based tunnel junctions and their characteristics
NASA Astrophysics Data System (ADS)
Shaternik, V.; Larkin, S.; Noskov, V.; Chubatyy, V.; Sizontov, V.; Miroshnikov, A.; Karmazin, A.
2008-02-01
Promising Josephson Mo-Re alloy-oxide-Pb, Mo-Re alloy-normal metal-oxide-Pb and Mo-Re alloy-normal metal-oxide-normal metal-Mo-Re alloy junctions have been fabricated and investigated. Thin (~50-100 nm) MoRe superconducting films were deposited on Al2O3 substrates by dc magnetron sputtering of a MoRe target. Normal-metal (Sn, Al) thin films were deposited on the MoRe film surfaces by thermal evaporation in vacuum and oxidized to form the junction oxide barriers. Quasiparticle I-V curves of the fabricated junctions were measured over a wide range of voltages. To investigate the spread of barrier transparencies of the fabricated junctions, computer simulations of the measured quasiparticle I-V curves were performed in the framework of a model of multiple Andreev reflections at double-barrier junction interfaces. It is demonstrated that the investigated junctions can be described as highly asymmetric double-barrier Josephson junctions with a large difference between the two barrier transparencies. The comparison of the experimental quasiparticle I-V curves with the calculated ones is presented and discussed. I-V curves of the fabricated junctions were also measured under microwave irradiation at 60 GHz; clear Shapiro steps were observed in the measured I-V curves and are discussed.
Constructing linkage maps in the genomics era with MapDisto 2.0.
Heffelfinger, Christopher; Fragoso, Christopher A; Lorieux, Mathias
2017-07-15
Genotyping by sequencing (GBS) generates datasets that are challenging to handle for current genetic mapping software with a graphical interface. Geneticists need new user-friendly computer programs that can analyze GBS data on desktop computers. This requires improvements in computation efficiency, both in terms of speed and use of random-access memory (RAM). MapDisto v.2.0 is a user-friendly computer program for construction of genetic linkage maps. It includes several major new features: (i) handling of very large genotyping datasets like the ones generated by GBS; (ii) direct importation and conversion of Variant Call Format (VCF) files; (iii) detection of linkage, i.e. construction of linkage groups, in the case of segregation distortion; (iv) data imputation on VCF files using a new approach, called LB-Impute (features i to iv operate through inclusion of new Java modules that are used transparently by MapDisto); and (v) QTL detection via a new R/qtl graphical interface. The program is available free of charge at mapdisto.free.fr. Contact: mapdisto@gmail.com. Supplementary data are available at Bioinformatics online.
Computation of a Canadian SCWR unit cell with deterministic and Monte Carlo codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrisson, G.; Marleau, G.
2012-07-01
The Canadian SCWR has the potential to achieve the goals that Generation IV nuclear reactors must meet. As part of the optimization process for this design concept, lattice cell calculations are routinely performed using deterministic codes. In this study, the first step (self-shielding treatment) of the computation scheme developed with the deterministic code DRAGON for the Canadian SCWR has been validated. Some options available in the module responsible for the resonance self-shielding calculation in DRAGON 3.06 and different microscopic cross-section libraries based on the ENDF/B-VII.0 evaluated nuclear data file have been tested and compared to a reference calculation performed with the Monte Carlo code SERPENT under the same conditions. Compared to SERPENT, DRAGON underestimates the infinite multiplication factor in all cases. In general, the original Stammler model with the Livolant-Jeanpierre approximations is the most appropriate self-shielding option to use in this case study. In addition, the 89-group WIMS-AECL library for slightly enriched uranium and the 172-group WLUP library for a mixture of plutonium and thorium give the most consistent results with those of SERPENT. (authors)
NASA Astrophysics Data System (ADS)
Barker, H. W.; Stephens, G. L.; Partain, P. T.; Bergman, J. W.; Bonnel, B.; Campana, K.; Clothiaux, E. E.; Clough, S.; Cusack, S.; Delamere, J.; Edwards, J.; Evans, K. F.; Fouquart, Y.; Freidenreich, S.; Galin, V.; Hou, Y.; Kato, S.; Li, J.; Mlawer, E.; Morcrette, J.-J.; O'Hirok, W.; Räisänen, P.; Ramaswamy, V.; Ritter, B.; Rozanov, E.; Schlesinger, M.; Shibata, K.; Sporyshev, P.; Sun, Z.; Wendisch, M.; Wood, N.; Yang, F.
2003-08-01
The primary purpose of this study is to assess the performance of 1D solar radiative transfer codes that are used currently both for research and in weather and climate models. Emphasis is on interpretation and handling of unresolved clouds. Answers are sought to the following questions: (i) How well do 1D solar codes interpret and handle columns of information pertaining to partly cloudy atmospheres? (ii) Regardless of the adequacy of their assumptions about unresolved clouds, do 1D solar codes perform as intended? One clear-sky and two plane-parallel, homogeneous (PPH) overcast cloud cases serve to elucidate 1D model differences due to varying treatments of gaseous transmittances, cloud optical properties, and basic radiative transfer. The remaining four cases involve 3D distributions of cloud water and water vapor as simulated by cloud-resolving models. Results for 25 1D codes, which included two line-by-line (LBL) models (clear and overcast only) and four 3D Monte Carlo (MC) photon transport algorithms, were submitted by 22 groups. Benchmark, domain-averaged irradiance profiles were computed by the MC codes. For the clear and overcast cases, all MC estimates of top-of-atmosphere albedo, atmospheric absorptance, and surface absorptance agree with one of the LBL codes to within ±2%. Most 1D codes underestimate atmospheric absorptance by typically 15-25 W m-2 at overhead sun for the standard tropical atmosphere regardless of clouds. Depending on assumptions about unresolved clouds, the 1D codes were partitioned into four genres: (i) horizontal variability, (ii) exact overlap of PPH clouds, (iii) maximum/random overlap of PPH clouds, and (iv) random overlap of PPH clouds. A single MC code was used to establish conditional benchmarks applicable to each genre, and all MC codes were used to establish the full 3D benchmarks. There is a tendency for 1D codes to cluster near their respective conditional benchmarks, though intragenre variances typically exceed those for the clear and overcast cases. The majority of 1D codes fall into the extreme category of maximum/random overlap of PPH clouds and thus generally disagree with full 3D benchmark values. Given the fairly limited scope of these tests and the inability of any one code to perform extremely well for all cases, a paradigm shift appears to be due for modeling 1D solar fluxes for cloudy atmospheres.
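For readers unfamiliar with the overlap genres mentioned above, the snippet below evaluates total cloud cover for a column under random, maximum, and one widely used maximum/random overlap formulation; it is illustrative only, and the participating 1D codes do not necessarily implement these exact formulas.

```python
# Total cloud cover of a column under different overlap assumptions (illustrative only).

def total_cover_random(fracs):
    clear = 1.0
    for f in fracs:
        clear *= (1.0 - f)
    return 1.0 - clear

def total_cover_maximum(fracs):
    return max(fracs) if fracs else 0.0

def total_cover_max_random(fracs, eps=1e-12):
    # maximum overlap within contiguous cloudy layers, random overlap between separated ones
    clear = 1.0
    prev = 0.0
    for f in fracs:
        clear *= (1.0 - max(f, prev)) / (1.0 - min(prev, 1.0 - eps))
        prev = f
    return 1.0 - clear

layers = [0.2, 0.3, 0.0, 0.4]   # layer cloud fractions from top to bottom (made-up values)
for name, fn in [("random", total_cover_random),
                 ("maximum", total_cover_maximum),
                 ("max/random", total_cover_max_random)]:
    print(f"{name:10s}: {fn(layers):.3f}")
```

For these made-up layers the maximum/random result falls between the maximum and random limits, which is the behaviour the genre classification above exploits.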
NASA Technical Reports Server (NTRS)
Sutliff, Daniel L.
2014-01-01
The NASA Glenn Research Center's Advanced Noise Control Fan (ANCF) was developed in the early 1990s to provide a convenient test bed to measure and understand fan-generated acoustics, duct propagation, and radiation to the farfield. A series of tests were performed primarily for the use of code validation and tool validation. Rotating Rake mode measurements were acquired for parametric sets of: (i) mode blockage, (ii) liner insertion loss, (iii) short ducts, and (iv) mode reflection.
76 FR 55268 - Chromobacterium subtsugae Strain PRAA4-1T
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-07
... (irritation symptoms cleared by 24 hours; Toxicity Category IV). 9. Dermal sensitization--guinea pig... that Chromobacterium subtsugae strain PRAA4-1\\T\\ was not a dermal sensitizer to guinea pigs. IV... production (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311...
Quantifying and Reducing Curve-Fitting Uncertainty in Isc
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campanelli, Mark; Duck, Benjamin; Emery, Keith
2015-06-14
Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
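A minimal ordinary-least-squares version of the Isc extrapolation discussed in the abstract above is sketched below: fit a straight line to synthetic I-V points near V = 0 and report the intercept as Isc together with the 1-sigma fit uncertainty from the coefficient covariance. The paper's objective Bayesian regression and automatic window selection are not reproduced here.

```python
import numpy as np

# Classical straight-line fit I = a*V + b near short circuit; Isc = b, with its standard
# error taken from the fit covariance. Data are synthetic for illustration.

rng = np.random.default_rng(1)
V = np.linspace(0.0, 0.05, 11)                       # voltages near short circuit (V)
I = 5.00 - 0.8 * V + rng.normal(0, 2e-3, V.size)     # synthetic I-V points (A)

coeffs, cov = np.polyfit(V, I, deg=1, cov=True)
slope, intercept = coeffs
isc, isc_std = intercept, np.sqrt(cov[1, 1])

print(f"Isc = {isc:.4f} A  +/- {isc_std:.4f} A (1-sigma, fit uncertainty only)")
```

As the abstract notes, this fit-only uncertainty can be made arbitrarily small by adding points and does not capture model discrepancy, which is the motivation for the evidence-based window selection in the paper.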
Putney, Joy; Hilbert, Douglas; Paskaranandavadivel, Niranchan; Cheng, Leo K.; O'Grady, Greg; Angeli, Timothy R.
2016-01-01
Objective: The aim of this study was to develop, validate, and apply a fully automated method for reducing large temporally synchronous artifacts present in electrical recordings made from the gastrointestinal (GI) serosa, which are problematic for properly assessing slow wave dynamics. Such artifacts routinely arise in experimental and clinical settings from motion, switching behavior of medical instruments, or electrode array manipulation. Methods: A novel iterative COvariance-Based Reduction of Artifacts (COBRA) algorithm sequentially reduced artifact waveforms using an updating across-channel median as a noise template, scaled and subtracted from each channel based on their covariance. Results: Application of COBRA substantially increased the signal-to-artifact ratio (12.8±2.5 dB), while minimally attenuating the energy of the underlying source signal by 7.9% on average (-11.1±3.9 dB). Conclusion: COBRA was shown to be highly effective for aiding recovery and accurate marking of slow wave events (sensitivity = 0.90±0.04; positive-predictive value = 0.74±0.08) from large segments of in vivo porcine GI electrical mapping data that would otherwise be lost due to a broad range of contaminating artifact waveforms. Significance: Strongly reducing artifacts with COBRA ultimately allowed for rapid production of accurate isochronal activation maps detailing the dynamics of slow wave propagation in the porcine intestine. Such mapping studies can help characterize differences between normal and dysrhythmic events, which have been associated with GI abnormalities, such as intestinal ischemia and gastroparesis. The COBRA method may be generally applicable for removing temporally synchronous artifacts in other biosignal processing domains. PMID:26829772
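A rough, single-pass sketch of the core idea described in the abstract: estimate the common-mode artifact as the across-channel median and subtract a covariance-scaled copy from each channel. The published COBRA algorithm is iterative and updates its template; the data, shapes, and function name below are illustrative only.

    # Covariance-scaled subtraction of an across-channel median template.
    # Simplified, single-pass illustration of the approach described above.
    import numpy as np

    def reduce_synchronous_artifacts(x):
        """x: array of shape (n_channels, n_samples); returns a cleaned copy."""
        template = np.median(x, axis=0)             # common-mode artifact estimate
        template = template - template.mean()
        cleaned = np.empty_like(x)
        for ch in range(x.shape[0]):
            sig = x[ch] - x[ch].mean()
            scale = np.dot(sig, template) / np.dot(template, template)
            cleaned[ch] = x[ch] - scale * template  # covariance-based scaling
        return cleaned

    # Hypothetical test: slow waves plus a large shared artifact
    t = np.linspace(0.0, 10.0, 2000)
    artifact = 50.0 * np.exp(-((t - 5.0) ** 2) / 0.01)
    x = np.array([np.sin(2 * np.pi * 0.05 * t + p) + artifact for p in (0.0, 0.5, 1.0, 1.5)])
    print(reduce_synchronous_artifacts(x).shape)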
40 CFR 147.2200 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2013 CFR
2013-07-01
... the in situ combustion of coal are regulated by the Rail Road Commission of Texas under a separate UIC... National Archives and Records Administration (NARA). For information on the availability of this material at NARA, call 202-741-6030, or go to: http://www.archives.gov/federal_register/code_of_federal...
40 CFR 147.2200 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the in situ combustion of coal are regulated by the Rail Road Commission of Texas under a separate UIC... National Archives and Records Administration (NARA). For information on the availability of this material at NARA, call 202-741-6030, or go to: http://www.archives.gov/federal_register/code_of_federal...
40 CFR 147.2200 - State-administered program-Class I, III, IV, and V wells.
Code of Federal Regulations, 2014 CFR
2014-07-01
... the in situ combustion of coal are regulated by the Rail Road Commission of Texas under a separate UIC... National Archives and Records Administration (NARA). For information on the availability of this material at NARA, call 202-741-6030, or go to: http://www.archives.gov/federal_register/code_of_federal...
From the Kiss of a Cobra: A Sidelong View of Snakebite, Antivenin, and Serum Sickness.
ERIC Educational Resources Information Center
Graham, Douglas H.
1992-01-01
Article tells the story of how a New York City customs man was bitten by a King Cobra from Thailand and the medical treatment he received. Describes three types of snake venoms: hemotoxins, cytotoxins, and neurotoxins. Explains how horses are used to produce antitoxins and the side effects of the antitoxins on humans. (PR)
Construction and Initial Validation of the Color-Blind Racial Attitudes Scale (CoBRAS).
ERIC Educational Resources Information Center
Neville, Helen A.; Lilly, Roderick L.; Duran, Georgia; Lee, Richard M.; Browne, LaVonne
2000-01-01
Describes development of a conceptually grounded scale to assess cognitive aspects of color-blind racial attitudes. Factor analysis suggests that the 3-factor model is a good fit to the data. States that CoBRAS was positively related to other indexes of racial attitudes, indicating that greater endorsement of color-blind racial attitudes was related to…
Lerna, Anna; Esposito, Dalila; Conson, Massimiliano; Russo, Luigi; Massagli, Angelo
2012-01-01
The Picture Exchange Communication System (PECS) is a common treatment choice for non-verbal children with autism. However, little empirical evidence is available on the usefulness of PECS in treating social-communication impairments in autism. This study tested the effects of PECS on social-communicative skills in children with autism, concurrently taking into account standardized psychometric data, standardized functional assessment of adaptive behaviour, and information on social-communicative variables coded in an unstructured setting. Eighteen preschool children (mean age = 38.78 months) were assigned to two intervention approaches, i.e. PECS and Conventional Language Therapy (CLT). Both PECS (Phases I-IV) and CLT were delivered three times per week, in 30-min sessions, for 6 months. Outcome measures were the following: Autism Diagnostic Observation Schedule (ADOS) domain scores for Communication and Reciprocal Social Interaction; Language and Personal-Social subscales of the Griffiths' Mental Developmental Scales (GMDS); Communication and Social Abilities domains of the Vineland Adaptive Behavior Scales (VABS); and several social-communicative variables coded in an unstructured setting. Results demonstrated that the two groups did not differ at Time 1 (pre-treatment assessment), whereas at Time 2 (post-test) the PECS group showed a significant improvement with respect to the CLT group on the VABS social domain score and on almost all the social-communicative abilities coded in the unstructured setting (i.e. joint attention, request, initiation, cooperative play, but not eye contact). These findings showed that PECS intervention (Phases I-IV) can improve social-communicative skills in children with autism. This improvement is especially evident in standardized measures of adaptive behaviour and measures derived from the observation of children in an unstructured setting. © 2012 Royal College of Speech and Language Therapists.
Nonparametric Statistics Test Software Package.
1983-09-01
...statistics because of their acceptance in the academic world, the availability of computer support, and flexibility in model building.
Pilot Judgment Training and Evaluation. Volume 3.
1982-06-01
...Information Manual. Flight computer. Basic navigation: aeronautical charts (sectional and world aeronautical charts); airspace... clouds, traffic, etc., when you needed to and still maintained the course... Instructor lesson plan, Part III, observable behavior sought: the student will make proper diversions from clouds to maintain basic VFR. Part IV: reinforcements.
1983-01-01
..."Influence Scaling of 2D and 3D Shock/Turbulent Boundary Layer Interactions at Compression Corners," AIAA Paper 81-334, January 1981; Kubota, H... Figures include configurations generating 3D shock wave/boundary layer interactions, the unswept sharp fin interaction and coordinate system, and cobra probe measurements of Peake at Mach 4... Measurements were made by two Druck 50 psi transducers, each installed in a computer-controlled 48-port Model 48J4 Scanivalve and referenced to vacuum...
Programming for 1.6 Million cores: Early experiences with IBM's BG/Q SMP architecture
NASA Astrophysics Data System (ADS)
Glosli, James
2013-03-01
With the stall in clock cycle improvements a decade ago, the drive for computational performance has continued along a path of increasing core counts on a processor. The multi-core evolution has been expressed in both symmetric multiprocessor (SMP) architectures and CPU/GPU architectures. Debates rage in the high performance computing (HPC) community over which architecture best serves HPC. In this talk I will not attempt to resolve that debate but perhaps fuel it. I will discuss the experience of exploiting Sequoia, a 98304-node IBM Blue Gene/Q SMP at Lawrence Livermore National Laboratory. The advantages and challenges of leveraging the computational power of BG/Q will be detailed through the discussion of two applications. The first application is a Molecular Dynamics code called ddcMD. This is a code developed over the last decade at LLNL and ported to BG/Q. The second application is a cardiac modeling code called Cardioid. This is a code that was recently designed and developed at LLNL to exploit the fine scale parallelism of BG/Q's SMP architecture. Through the lenses of these efforts I'll illustrate the need to rethink how we express and implement our computational approaches. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Scemama, Anthony; Caffarel, Michel; Oseret, Emmanuel; Jalby, William
2013-04-30
Various strategies to implement efficiently quantum Monte Carlo (QMC) simulations for large chemical systems are presented. These include: (i) the introduction of an efficient algorithm to calculate the computationally expensive Slater matrices. This novel scheme is based on the use of the highly localized character of atomic Gaussian basis functions (not the molecular orbitals as usually done), (ii) the possibility of keeping the memory footprint minimal, (iii) the important enhancement of single-core performance when efficient optimization tools are used, and (iv) the definition of a universal, dynamic, fault-tolerant, and load-balanced framework adapted to all kinds of computational platforms (massively parallel machines, clusters, or distributed grids). These strategies have been implemented in the QMC=Chem code developed at Toulouse and illustrated with numerical applications on small peptides of increasing sizes (158, 434, 1056, and 1731 electrons). Using 10-80 k computing cores of the Curie machine (GENCI-TGCC-CEA, France), QMC=Chem has been shown to be capable of running at the petascale level, thus demonstrating that for this machine a large part of the peak performance can be achieved. Implementation of large-scale QMC simulations for future exascale platforms with a comparable level of efficiency is expected to be feasible. Copyright © 2013 Wiley Periodicals, Inc.
1985-10-01
...Regions, Com-Geom region identification, GIFT materials... The combinatorial geometry (Com-Geom) data is used as input to the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom) target description data, which is the input data for the GIFT code.
He, Ying-Ying; Liu, Shu-Bai; Lee, Wen-Hui; Qian, Jin-Qiao; Zhang, Yun
2008-10-01
Snake venom Kunitz/BPTI members are good tools for understanding the structure-function relationship between serine proteases and their inhibitors. A novel dual Kunitz/BPTI serine proteinase inhibitor named OH-TCI (trypsin- and chymotrypsin-dual inhibitor from Ophiophagus hannah) was isolated from king cobra venom by three chromatographic steps of gel filtration, trypsin affinity and reverse phase HPLC. OH-TCI is composed of 58 amino acid residues with a molecular mass of 6339 Da. Successful expression of OH-TCI was performed as a maltose-binding fusion protein in E. coli DH5alpha. Much different from Oh11-1, the purified native and recombinant OH-TCI both had strong inhibitory activities against trypsin and chymotrypsin although the sequence identity (74.1%) between them is very high. The inhibitor constants (Ki) of recombinant OH-TCI were 3.91 × 10⁻⁷ M and 8.46 × 10⁻⁸ M for trypsin and chymotrypsin, respectively. To our knowledge, it was the first report of a Kunitz/BPTI serine proteinase inhibitor from snake venom that had equivalent trypsin and chymotrypsin inhibitory activities.
LLNL Mercury Project Trinity Open Science Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawson, Shawn A.
The Mercury Monte Carlo particle transport code is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. In the proposed Trinity Open Science calculations, I will investigate computer science aspects of the code which are relevant to convergence of the simulation quantities with increasing Monte Carlo particle counts.
Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmalz, Mark S
2011-07-24
Statement of Problem - Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G̲ (underlined G) for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G → G̲, which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in solution of problems related to efficient parallel computation of particle and fluid dynamics simulations. These problems occur throughout DOE, military and commercial sectors: the potential payoff is high. We plan to license or sell the solution to contractors for military and domestic applications such as disaster simulation (aerodynamic and hydrodynamic), Government agencies (hydrological and environmental simulations), and medical applications (e.g., in tomographic image reconstruction). Keywords - High-performance Computing, Graphic Processing Unit, Fluid/Particle Simulation. Summary for Members of Congress - Department of Energy has many simulation codes that must compute faster, to be effective. The Phase I research parallelized particle/fluid simulations for rocket combustion, for high-performance computing systems.
NASA Technical Reports Server (NTRS)
2004-01-01
This 3-D image taken by the left and right eyes of the panoramic camera on the Mars Exploration Rover Spirit shows the odd rock formation dubbed 'Cobra Hoods' (center). Rover scientists say this resistant rock is unlike anything they've seen on Mars so far. Spirit will investigate the rock in coming sols. The stereo pictures making up this image were captured on sol 156 (June 11, 2004).
ERIC Educational Resources Information Center
Jensen, Nathan C.
2012-01-01
Starting in the 2010-11 school year, administrators at the Fountain Lake School District implemented the Cobra Pride Incentive Program (CPIP), a merit pay program designed to financially reward all school employees with year-end bonuses primarily for significant improvements in student achievement. At the conclusion of the 2010-11 school year, over $800,000…
Polymer Light-Emitting Diode (PLED) Process Development
2003-12-01
...conclusions and recommendations for Phase II of the Flexible Display Program. Subject terms: light emitting diodes; liquid crystal display systems... space for Phase I and II confined by backplane complexity and substrate form... Figures include a semi-automated I-V curve measurement setup consisting of a Keithley power supply and computer...
Lee, Mui Li; Fung, Shin Yee; Chung, Ivy; Pailoor, Jayalakshmi; Cheah, Swee Hung; Tan, Nget Hong
2014-01-01
King cobra (Ophiophagus hannah) venom L-amino acid oxidase (OH-LAAO), a heat stable enzyme, has been shown to exhibit very potent anti-proliferative activity against human breast and lung tumorigenic cells but not in their non-tumorigenic counterparts. We further examine its in vitro and in vivo anti-tumor activity in a human prostate adenocarcinoma (PC-3) model. OH-LAAO demonstrated potent cytotoxicity against PC-3 cells with an IC50 of 0.05 µg/mL after 72 h incubation in vitro. It induced apoptosis as evidenced by an increase in caspase-3/7 cleavage and an increase in annexin V-stained cells. To examine its in vivo anti-tumor activity, we treated PC-3 tumor xenografts implanted subcutaneously in immunodeficient NU/NU (nude) mice with 1 µg/g OH-LAAO given intraperitoneally (i.p.). After 8 weeks of treatment, OH-LAAO treated PC-3 tumors were markedly inhibited when compared to the control group (P < 0.05). TUNEL staining analysis on the tumor sections showed a significant increase in apoptotic cells in the LAAO-treated animals. Histological examinations of the vital organs in these two groups showed no significant differences with normal tissues, indicating no obvious tissue damage. The treatment also did not cause any significant changes in the body weight of the mice during the study. These observations suggest that OH-LAAO cytotoxic effects may be specific to tumor xenografts and less to normal organs. Given its potent anti-tumor activities shown in vitro as well as in vivo, the king cobra venom LAAO can potentially be developed to treat prostate cancer and other solid tumors.
Tagliaferri, Luca; Gobitti, Carlo; Colloca, Giuseppe Ferdinando; Boldrini, Luca; Farina, Eleonora; Furlan, Carlo; Paiar, Fabiola; Vianello, Federica; Basso, Michela; Cerizza, Lorenzo; Monari, Fabio; Simontacchi, Gabriele; Gambacorta, Maria Antonietta; Lenkowicz, Jacopo; Dinapoli, Nicola; Lanzotti, Vito; Mazzarotto, Renzo; Russi, Elvio; Mangoni, Monica
2018-07-01
The big data approach offers a powerful alternative to evidence-based medicine. This approach could guide cancer management thanks to the application of machine learning to large-scale data. The aim of the Thyroid CoBRA (Consortium for Brachytherapy Data Analysis) project is to develop a standardized web data collection system, focused on thyroid cancer. The Metabolic Radiotherapy Working Group of the Italian Association of Radiation Oncology (AIRO) endorsed the implementation of a consortium directed to thyroid cancer management and data collection. The agreement conditions, the ontology of the collected data and the related software services were defined by a multicentre ad hoc working group (WG). Six Italian cancer centres first started the project and defined and signed the Thyroid COBRA consortium agreement. Three data set tiers were identified: Registry, Procedures and Research. The COBRA-Storage System (C-SS) appeared not to be time-consuming and to respect privacy, as data can be extracted directly from each centre's storage platforms through a secured connection that ensures reliable encryption of sensitive data. Automatic data archiving could be performed directly from the Image Hospital Storage System or the Radiotherapy Treatment Planning Systems. The C-SS architecture will allow "Cloud storage way" or "distributed learning" approaches for predictive model definition and the further development of clinical decision support tools. The development of the Thyroid COBRA data Storage System (C-SS) through a multicentre consortium approach appeared to be a feasible way to set up a complex, privacy-preserving data-sharing system oriented to the management of thyroid cancer and, in the near future, of every cancer type. Copyright © 2018 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
Liu, Chien-Chun; You, Chen-Hsien; Wang, Po-Jung; Yu, Jau-Song; Huang, Guo-Jen; Liu, Chien-Hsin; Hsieh, Wen-Chin; Lin, Chih-Chuan
2017-12-01
In Southeast Asia, envenoming resulting from cobra snakebites is an important public health issue in many regions, and antivenom therapy is the standard treatment for the snakebite. Because these cobras share a close evolutionary history, the amino acid sequences of major venom components in different snakes are very similar. Therefore, either monovalent or polyvalent antivenoms may offer paraspecific protection against envenomation of humans by several different snakes. In Taiwan, a bivalent antivenom, freeze-dried neurotoxic antivenom (FNAV), against Bungarus multicinctus and Naja atra is available. However, whether this antivenom is also capable of neutralizing the venom of other species of snakes is not known. Here, to expand the clinical application of Taiwanese FNAV, we used an animal model to evaluate the neutralizing ability of FNAV against the venoms of three common snakes in Southeast Asia, including two 'true' cobras, Naja kaouthia (Thailand) and Naja siamensis (Thailand), and the king cobra Ophiophagus hannah (Indonesia). We further applied mass spectrometry (MS)-based proteomic techniques to characterize venom proteomes and identify FNAV-recognizable antigens in the venoms of these Asian snakes. Neutralization assays in a mouse model showed that FNAV effectively neutralized the lethality of N. kaouthia and N. siamensis venoms, but not O. hannah venom. MS-based venom protein identification results further revealed that FNAV strongly recognized three-finger toxin and phospholipase A2, the major protein components of N. kaouthia and N. siamensis venoms. The characterization of venom proteomes and identification of FNAV-recognizable venom antigens may help researchers to further develop more effective antivenoms designed to block the toxicity of dominant toxic proteins, with the ultimate goal of achieving broadly therapeutic effects against these cobra snakebites.
Chang, Hui-Ching; Tsai, Tein-Shun; Tsai, Inn-Ho
2013-08-26
This study deciphers the geographic variations of king cobra (Ophiophagus hannah) venom using functional proteomics. Pooled samples of king cobra venom (abbreviated as Ohv) were obtained from Indonesia, Malaysia, Thailand, and two provinces of China, namely Guangxi and Hainan. Using two animal models to test and compare the lethal effects, we found that the Chinese Ohvs were more lethal to mice, while the Southeast Asian Ohvs were more lethal to lizards (Eutropis multifasciata). Various phospholipases A2 (PLA2s), three-finger toxins (3FTxs) and Kunitz-type inhibitors were purified from these Ohvs and compared. Besides the two Chinese Ohv PLA2s with known sequences, eight novel PLA2s were identified from the five Ohv samples and their antiplatelet activities were compared. While two 3FTxs (namely oh-55 and oh-27) were common in all the Ohvs, different sets of 3FTx markers were present in the Chinese and Southeast Asian Ohvs. All the Ohvs contain the Kunitz inhibitor OH-TCI, while only the Chinese Ohvs contain the inhibitor variant Oh11-1. Relative to the Chinese Ohvs, which contained more phospholipases, the Southeast Asian Ohvs had higher metalloproteinase, acetylcholine esterase, and alkaline phosphatase activities. Remarkable variations among the five king cobra geographic samples reveal fast evolution and dynamic translational regulation of the venom, which probably adapted to different prey ecology, as shown by the lethal tests on mice and lizards. Our results point to possible variations in king cobra envenoming of humans and to the importance of using local antivenin for snakebite treatment. Copyright © 2013 Elsevier B.V. All rights reserved.
Lee, Mui Li; Tan, Nget Hong; Fung, Shin Yee; Sekaran, Shamala Devi
2011-03-01
The major l-amino acid oxidase (LAAO, EC 1.4.3.2) of king cobra (Ophiophagus hannah) venom is known to be an unusual form of snake venom LAAO as it possesses unique structural features and unusual thermal stability. The antibacterial effects of king cobra venom LAAO were tested against several strains of clinical isolates including Staphylococcus aureus, Staphylococcus epidermidis, Pseudomonas aeruginosa, Klebsiella pneumoniae, and Escherichia coli using broth microdilution assay. For comparison, the antibacterial effects of several antibiotics (cefotaxime, kanamycin, tetracycline, vancomycin and penicillin) were also examined using the same conditions. King cobra venom LAAO was very effective in inhibiting the two Gram-positive bacteria (S. aureus and S. epidermidis) tested, with minimum inhibitory concentration (MIC) of 0.78μg/mL (0.006μM) and 1.56μg/mL (0.012μM) against S. aureus and S. epidermidis, respectively. The MICs are comparable to the MICs of the antibiotics tested, on a weight basis. However, the LAAO was only moderately effective against three Gram-negative bacteria tested (P. aeruginosa, K. pneumoniae and E. coli), with MIC ranges from 25 to 50μg/mL (0.2-0.4μM). Catalase at the concentration of 1mg/mL abolished the antibacterial effect of LAAO, indicating that the antibacterial effect of the enzyme involves generation of hydrogen peroxide. Binding studies indicated that king cobra venom LAAO binds strongly to the Gram-positive S. aureus and S. epidermidis, but less strongly to the Gram-negative E. coli and P. aeruginosa, indicating that specific binding to bacteria is important for the potent antibacterial activity of the enzyme. Copyright © 2010 Elsevier Inc. All rights reserved.
Computer Modeling and Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pronskikh, V. S.
2014-05-09
Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.
1981-01-01
...In either case, the port voltage may be related to the applied field on the segment by the constant...
1988-12-01
Contents include an introduction (background, problem, scope, approach), appendices of collected data on variables and on operations, and lists of figures and tables.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hindmarsh, A.C.; Sloan, L.J.; Dubois, P.F.
1978-12-01
This report supersedes the original version, dated June 1976. It describes four versions of a pair of subroutines for solving N x N systems of linear algebraic equations. In each case, the first routine, DEC, performs an LU decomposition of the matrix with partial pivoting, and the second, SOL, computes the solution vector by back-substitution. The first version is in Fortran IV, and is derived from routines DECOMP and SOLVE written by C.B. Moler. The second is a version for the CDC 7600 computer using STACKLIB. The third is a hand-coded (Compass) version for the 7600. The fourth is a vectorized version for the CDC STAR, renamed DECST and SOLST. Comparative tests on these routines are also described. The Compass version is faster than the others on the 7600 by factors of up to 5. The major revisions to the original report, and to the subroutines described, are an updated description of the availability of each version of DEC/SOL; correction of some errors in the Compass version, as altered so as to be compatible with FTN; and a new STAR version, which runs much faster than the earlier one. The standard Fortran version, the Fortran/STACKLIB version, and the object code generated from the Compass version and available in STACKLIB have not been changed.
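For readers unfamiliar with the DEC/SOL split, the sketch below shows the same classical pattern, an LU factorization with partial pivoting followed by substitution, in plain Python. It is illustrative only and is not a transcription of the Fortran, STACKLIB, Compass, or STAR routines described above.

    # Classical factor/solve split: dec() does LU with partial pivoting in place,
    # sol() solves A x = b using the stored factors and pivot record.
    def dec(a):
        """In-place LU factorization with partial pivoting; returns pivot order."""
        n = len(a)
        piv = list(range(n))
        for k in range(n - 1):
            p = max(range(k, n), key=lambda r: abs(a[r][k]))  # pivot row
            if p != k:
                a[k], a[p] = a[p], a[k]
                piv[k], piv[p] = piv[p], piv[k]
            for i in range(k + 1, n):
                a[i][k] /= a[k][k]                            # store multiplier
                for j in range(k + 1, n):
                    a[i][j] -= a[i][k] * a[k][j]
        return piv

    def sol(a, piv, b):
        """Forward and back substitution using the factors from dec()."""
        n = len(a)
        x = [b[p] for p in piv]                               # apply row permutation
        for i in range(1, n):                                 # forward substitution
            x[i] -= sum(a[i][j] * x[j] for j in range(i))
        for i in reversed(range(n)):                          # back substitution
            x[i] = (x[i] - sum(a[i][j] * x[j] for j in range(i + 1, n))) / a[i][i]
        return x

    A = [[2.0, 1.0, 1.0], [4.0, -6.0, 0.0], [-2.0, 7.0, 2.0]]
    piv = dec(A)
    print(sol(A, piv, [5.0, -2.0, 9.0]))  # expect [1.0, 1.0, 2.0]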
Deciphering the Diagnostic Codes: A Guide for School Counselors. Practical Skills for Counselors.
ERIC Educational Resources Information Center
Jones, W. Paul
Although school counselors have more contact with children and adolescents than most other human service professionals, they are frequently left out of discussions on diagnostic coding. Ways in which school counselors can use the codes in the Diagnostic and Statistical Manual of Mental Disorders IV (DSM-IV) are explored in this text. The book…
Modeling of current characteristics of segmented Langmuir probe on DEMETER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Imtiaz, Nadia; Marchand, Richard; Lebreton, Jean-Pierre
We model the current characteristics of the DEMETER Segmented Langmuir probe (SLP). The probe is used to measure electron density and temperature in the ionosphere at an altitude of approximately 700 km. It is also used to measure the plasma flow velocity in the satellite frame of reference. The probe is partitioned into seven collectors: six electrically insulated spherical segments and a guard electrode (the rest of the sphere and the small post). Comparisons are made between the predictions of the model and DEMETER measurements for actual ionospheric plasma conditions encountered along the satellite orbit. Segment characteristics are computed numerically with PTetra, a three-dimensional particle in cell simulation code. In PTetra, space is discretized with an unstructured tetrahedral mesh, thus enabling a good representation of the probe geometry. The model also accounts for several physical effects of importance in the interaction of spacecraft with the space environment. These include satellite charging, photoelectron, and secondary electron emissions. The model is electrostatic, but it accounts for the presence of a uniform background magnetic field. PTetra simulation results show different characteristics for the different probe segments. The current collected by each segment depends on its orientation with respect to the ram direction, the plasma composition, and the magnitude and orientation of the magnetic field. It is observed that the presence of light H+ ions leads to a significant increase in the ion current branch of the I-V curves of the negatively polarized SLP. The effect of the magnetic field is demonstrated by varying its magnitude and direction with respect to the reference magnetic field. It is found that the magnetic field appreciably affects the electron current branch of the I-V curves of certain segments on the SLP, whereas the ion current branch remains almost unaffected. PTetra simulations are validated by comparing the computed characteristics and their angular anisotropy with the DEMETER measurements; the simulation results are found to be in good agreement with the measurements.
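For orientation, the electron and ion branches discussed above can be sketched with the textbook planar-probe relations for a Maxwellian, unmagnetized, non-flowing plasma. This is only a schematic reference curve, not the PTetra kinetic model, and the density, temperature, area, and ion mass below are hypothetical values.

    # Schematic Langmuir probe I-V: retarded-electron exponential plus Bohm ion
    # saturation current. Textbook approximation, not the PTetra simulation.
    import numpy as np

    E = 1.602e-19    # elementary charge [C]
    K = 1.381e-23    # Boltzmann constant [J/K]
    ME = 9.109e-31   # electron mass [kg]

    def probe_current(v_bias, n=1e11, te_ev=0.2, v_plasma=0.0,
                      area=1e-4, m_ion=16 * 1.66e-27):
        """Collected current [A] vs. probe bias [V]; electron current positive."""
        te_k = te_ev * E / K                                           # Te in kelvin
        i_e_sat = E * n * area * np.sqrt(K * te_k / (2 * np.pi * ME))  # electron thermal flux
        i_i_sat = 0.61 * E * n * area * np.sqrt(K * te_k / m_ion)      # Bohm ion current
        dv = np.asarray(v_bias) - v_plasma
        i_e = np.where(dv < 0, i_e_sat * np.exp(dv / te_ev), i_e_sat)  # retarded electrons
        return i_e - i_i_sat

    print(probe_current(np.linspace(-3.0, 1.0, 9)))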
Near-Infrared Neuroimaging with NinPy
Strangman, Gary E.; Zhang, Quan; Zeffiro, Thomas
2009-01-01
There has been substantial recent growth in the use of non-invasive optical brain imaging in studies of human brain function in health and disease. Near-infrared neuroimaging (NIN) is one of the most promising of these techniques and, although NIN hardware continues to evolve at a rapid pace, software tools supporting optical data acquisition, image processing, statistical modeling, and visualization remain less refined. Python, a modular and computationally efficient development language, can support functional neuroimaging studies of diverse design and implementation. In particular, Python's easily readable syntax and modular architecture allow swift prototyping followed by efficient transition to stable production systems. As an introduction to our ongoing efforts to develop Python software tools for structural and functional neuroimaging, we discuss: (i) the role of non-invasive diffuse optical imaging in measuring brain function, (ii) the key computational requirements to support NIN experiments, (iii) our collection of software tools to support NIN, called NinPy, and (iv) future extensions of these tools that will allow integration of optical with other structural and functional neuroimaging data sources. Source code for the software discussed here will be made available at www.nmr.mgh.harvard.edu/Neural_SystemsGroup/software.html. PMID:19543449
Proposals to Subsidize Health Insurance for the Unemployed
1998-01-01
firms with 20 or more employees to continue offering health coverage to workers who separate from the firm. However, firms may charge former employees... employment-based health plans must make continuation coverage available to former employees and covered family members. Separated workers may continue COBRA... workers in firms of 20 or more employees who participate in an existing employer-sponsored health plan are eligible to continue coverage under COBRA
NASA Technical Reports Server (NTRS)
Fisher, Charles D.; Braun, David F.; Kaluzny, Joel V.; Seiffert, Mic D.; Dekany, Richard G.; Ellis, Richard S.; Smith, Roger S.
2012-01-01
The Prime Focus Spectrograph (PFS) is a fiber fed multi-object spectrometer for the Subaru Telescope that will conduct a variety of targeted surveys for studies of dark energy, galaxy evolution, and galactic archaeology. The key to the instrument is a high density array of fiber positioners placed at the prime focus of the Subaru Telescope. The system, nicknamed "Cobra", will be capable of rapidly reconfiguring the array of 2394 optical fibers to the image positions of astronomical targets in the focal plane with high accuracy. The system uses 2394 individual "SCARA robot" mechanisms that are 7.7mm in diameter and use 2 piezo-electric rotary motors to individually position each of the optical fibers within its patrol region. Testing demonstrates that the Cobra positioner can be moved to within 5 micrometers of an astronomical target in 6 move iterations with a success rate of 95%. The Cobra system is a key aspect of PFS that will enable its unprecedented combination of high-multiplex factor and observing efficiency on the Subaru telescope. The requirements, design, and prototyping efforts for the fiber positioner system for the PFS are described here as are the plans for modular construction, assembly, integration, functional testing, and performance validation.
Simplified gas sensor model based on AlGaN/GaN heterostructure Schottky diode
DOE Office of Scientific and Technical Information (OSTI.GOV)
Das, Subhashis, E-mail: subhashis.ds@gmail.com; Majumdar, S.; Kumar, R.
2015-08-28
Physics-based modeling of an AlGaN/GaN heterostructure Schottky diode gas sensor has been investigated for high sensitivity and linearity of the device. Here the surface and heterointerface properties are greatly exploited. The dependence of the two-dimensional electron gas (2DEG) on the surface charges is mainly utilized. The Schottky diode was simulated in a Technology Computer Aided Design (TCAD) tool and I-V curves were generated; from the I-V curves, a 76% response was recorded in the presence of 500 ppm gas at a bias voltage of 0.95 V.
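As a generic reference for how such I-V curves translate into a response figure, the thermionic-emission diode equation can be evaluated at a fixed bias for two hypothetical barrier heights (for example, with and without gas-induced barrier lowering). The constants and barrier values below are illustrative and are not taken from the paper's TCAD model.

    # Thermionic-emission Schottky I-V and a relative response at fixed bias.
    # All parameter values are hypothetical.
    import numpy as np

    Q = 1.602e-19    # elementary charge [C]
    KB = 1.381e-23   # Boltzmann constant [J/K]

    def schottky_current(v, phi_b_ev, n_ideal=1.3, area_cm2=1e-4,
                         a_star=26.4, t=300.0):
        """I = area*A_star*T^2*exp(-q*phi_B/kT)*(exp(qV/(n*kT)) - 1), phi_B in eV."""
        i_s = area_cm2 * a_star * t**2 * np.exp(-Q * phi_b_ev / (KB * t))
        return i_s * (np.exp(Q * v / (n_ideal * KB * t)) - 1.0)

    v_bias = 0.95                                     # bias point quoted above
    i_air = schottky_current(v_bias, phi_b_ev=0.90)   # hypothetical barrier in air
    i_gas = schottky_current(v_bias, phi_b_ev=0.885)  # hypothetical lowering in gas
    print(f"relative response = {(i_gas - i_air) / i_air:.0%}")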
FM Tactical Communications under Intentional Interference.
1975-06-06
...Includes sample computer runs (Figure 15) and references on AM and SSB modulation (IRE Transactions on Military Electronics, January 1961) and Heverly, "Range, Mobility and Transmission"...
Tetrahedral Hohlraum Visualization and Pointings
NASA Astrophysics Data System (ADS)
Klare, K. A.; Wallace, J. M.; Drake, D.
1997-11-01
In designing experiments for Omega, the tetrahedral hohlraum (a sphere with four holes) can make full use of all 60 beams. There are some complications: the beams must clear the laser entrance hole (LEH), must miss a central capsule, absolutely must not go out the other LEHs, and should distribute in the interior of the hohlraum to maximize the uniformity of irradiation on the capsule while keeping reasonable laser spot sizes. We created a 15-offset coordinate system with which an IDL program computes clearances, writes a file for QuickDraw 3D (QD3D) visualization, and writes input for the viewfactor code RAYNA IV. Visualizing and adjusting the parameters by eye gave more reliable results than computer optimization. QD3D images permitted quick live rotations to determine offsets. The clearances obtained insured safe operation and good physics. The viewfactor code computes the initial irradiation of the hohlraum and capsule or of a uniform hohlraum source with the loss through the four LEHs and shows a high degree of uniformity with both, better for lasers because this deposits more energy near the LEHs to compensate for the holes.
LSPRAY-IV: A Lagrangian Spray Module
NASA Technical Reports Server (NTRS)
Raju, M. S.
2012-01-01
LSPRAY-IV is a Lagrangian spray solver developed for application with parallel computing and unstructured grids. It is designed to be massively parallel and could easily be coupled with any existing gas-phase flow and/or Monte Carlo Probability Density Function (PDF) solvers. The solver accommodates the use of an unstructured mesh with mixed elements of either triangular, quadrilateral, and/or tetrahedral type for the gas flow grid representation. It is mainly designed to predict the flow, thermal and transport properties of a rapidly vaporizing spray. Some important research areas covered as a part of the code development are: (1) the extension of combined CFD/scalar-Monte- Carlo-PDF method to spray modeling, (2) the multi-component liquid spray modeling, and (3) the assessment of various atomization models used in spray calculations. The current version contains the extension to the modeling of superheated sprays. The manual provides the user with an understanding of various models involved in the spray formulation, its code structure and solution algorithm, and various other issues related to parallelization and its coupling with other solvers.
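As a zeroth-order point of reference for what modeling a rapidly vaporizing spray involves, the classic d-squared law describes how a single droplet's diameter shrinks with time. It is shown only as an illustration; LSPRAY-IV's multicomponent spray models are far more detailed, and the evaporation constant below is a hypothetical value.

    # Classic d^2-law: d^2(t) = d0^2 - K*t. Illustrative single-droplet reference,
    # not LSPRAY-IV's multicomponent model; K below is hypothetical.
    def droplet_diameter(d0, k_evap, t):
        """Droplet diameter at time t (returns 0 once the droplet is consumed)."""
        return max(d0**2 - k_evap * t, 0.0) ** 0.5

    d0 = 50e-6        # initial diameter [m]
    k_evap = 2.5e-7   # evaporation constant K [m^2/s], hypothetical
    lifetime = d0**2 / k_evap
    print(f"droplet lifetime ~ {lifetime * 1e3:.0f} ms")
    for t in (0.0, 0.5 * lifetime, lifetime):
        print(t, droplet_diameter(d0, k_evap, t))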
Comparative transcriptomics reveals similarities and differences between astrocytoma grades.
Seifert, Michael; Garbe, Martin; Friedrich, Betty; Mittelbronn, Michel; Klink, Barbara
2015-12-16
Astrocytomas are the most common primary brain tumors distinguished into four histological grades. Molecular analyses of individual astrocytoma grades have revealed detailed insights into genetic, transcriptomic and epigenetic alterations. This provides an excellent basis to identify similarities and differences between astrocytoma grades. We utilized public omics data of all four astrocytoma grades focusing on pilocytic astrocytomas (PA I), diffuse astrocytomas (AS II), anaplastic astrocytomas (AS III) and glioblastomas (GBM IV) to identify similarities and differences using well-established bioinformatics and systems biology approaches. We further validated the expression and localization of Ang2 involved in angiogenesis using immunohistochemistry. Our analyses show similarities and differences between astrocytoma grades at the level of individual genes, signaling pathways and regulatory networks. We identified many differentially expressed genes that were either exclusively observed in a specific astrocytoma grade or commonly affected in specific subsets of astrocytoma grades in comparison to normal brain. Further, the number of differentially expressed genes generally increased with the astrocytoma grade with one major exception. The cytokine receptor pathway showed nearly the same number of differentially expressed genes in PA I and GBM IV and was further characterized by a significant overlap of commonly altered genes and an exclusive enrichment of overexpressed cancer genes in GBM IV. Additional analyses revealed a strong exclusive overexpression of CX3CL1 (fractalkine) and its receptor CX3CR1 in PA I possibly contributing to the absence of invasive growth. We further found that PA I was significantly associated with the mesenchymal subtype typically observed for very aggressive GBM IV. Expression of endothelial and mesenchymal markers (ANGPT2, CHI3L1) indicated a stronger contribution of the micro-environment to the manifestation of the mesenchymal subtype than the tumor biology itself. We further inferred a transcriptional regulatory network associated with specific expression differences distinguishing PA I from AS II, AS III and GBM IV. Major central transcriptional regulators were involved in brain development, cell cycle control, proliferation, apoptosis, chromatin remodeling or DNA methylation. Many of these regulators showed directly underlying DNA methylation changes in PA I or gene copy number mutations in AS II, AS III and GBM IV. This computational study characterizes similarities and differences between all four astrocytoma grades confirming known and revealing novel insights into astrocytoma biology. Our findings represent a valuable resource for future computational and experimental studies.
1983-09-01
...General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3), the BDM Corporation, technical report, February 1981 - July 1983... the t1 and t2 directions on the source patch. Method: the electric field at a segment observation point due to the source patch j is given by...
Towards fault tolerant adiabatic quantum computation.
Lidar, Daniel A
2008-04-25
I show how to protect adiabatic quantum computation (AQC) against decoherence and certain control errors, using a hybrid methodology involving dynamical decoupling, subsystem and stabilizer codes, and energy gaps. Corresponding error bounds are derived. As an example, I show how to perform decoherence-protected AQC against local noise using at most two-body interactions.
Network Coding for Function Computation
ERIC Educational Resources Information Center
Appuswamy, Rathinakumar
2011-01-01
In this dissertation, the following "network computing problem" is considered. Source nodes in a directed acyclic network generate independent messages and a single receiver node computes a target function f of the messages. The objective is to maximize the average number of times f can be computed per network usage, i.e., the "computing…
30 CFR 206.353 - How do I determine transmission deductions?
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Depreciation under paragraphs (g) and (h) of this section and a return on undepreciated capital investment under paragraphs (g) and (i) of this section or (iv) A return on the capital investment in the..., are not allowable expenses. (g) To compute costs associated with capital investment, a lessee may use...
30 CFR 206.354 - How do I determine generating deductions?
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Depreciation under paragraphs (g) and (h) of this section and a return on undepreciated capital investment under paragraphs (g) and (i) of this section; or (iv) A return on capital investment in the power plant... allowable expenses. (g) To compute costs associated with capital investment, a lessee may use either...
Parallelization of Finite Element Analysis Codes Using Heterogeneous Distributed Computing
NASA Technical Reports Server (NTRS)
Ozguner, Fusun
1996-01-01
Performance gains in computer design are quickly consumed as users seek to analyze larger problems to a higher degree of accuracy. Innovative computational methods, such as parallel and distributed computing, seek to multiply the power of existing hardware technology to satisfy the computational demands of large applications. In the early stages of this project, experiments were performed using two large, coarse-grained applications, CSTEM and METCAN. These applications were parallelized on an Intel iPSC/860 hypercube. It was found that the overall speedup was very low, due to large, inherently sequential code segments present in the applications. The overall execution time, T_par, of the application is dependent on these sequential segments. If these segments make up a significant fraction of the overall code, the application will have a poor speedup measure.
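The limitation described above is Amdahl's law: a sequential fraction s of the code caps the achievable speedup at 1/s no matter how many processors are used. A quick illustration, with hypothetical sequential fractions rather than measured values for CSTEM or METCAN:

    # Amdahl's law: speedup = 1 / (s + (1 - s)/p) for sequential fraction s
    # and p processors. Fractions below are hypothetical, not measurements.
    def amdahl_speedup(serial_fraction, n_procs):
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

    for s in (0.05, 0.2, 0.4):
        print(f"s = {s:.2f}:",
              [round(amdahl_speedup(s, p), 2) for p in (8, 32, 128)],
              f"(limit {1.0 / s:.1f}x)")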
Boundary modelling of the stellarator Wendelstein 7-X
NASA Astrophysics Data System (ADS)
Renner, H.; Strumberger, E.; Kisslinger, J.; Nührenberg, J.; Wobig, H.
1997-02-01
To justify the design of the divertor plates in W7-X, the magnetic fields of finite-β HELIAS equilibria for the so-called high-mirror case have been computed for various average β-values up to ⟨β⟩ = 0.04 with the NEMEC free-boundary equilibrium code [S.P. Hirshman, W.I. van Rij and W.I. Merkel, Comput. Phys. Commun. 43 (1986) 143] in combination with the newly developed MFBE (magnetic field solver for finite-beta equilibria) code. In a second study the unloading of the target plates by radiation was investigated. The B2 code [B.J. Braams, Ph.D. Thesis, Rijksuniversiteit Utrecht (1986)] was applied for the first time to stellarators to provide a self-consistent modelling of the SOL, including effects of neutrals and impurities.
Local non-Calderbank-Shor-Steane quantum error-correcting code on a three-dimensional lattice
NASA Astrophysics Data System (ADS)
Kim, Isaac H.
2011-05-01
We present a family of non-Calderbank-Shor-Steane quantum error-correcting codes consisting of geometrically local stabilizer generators on a 3D lattice. We study the Hamiltonian constructed from ferromagnetic interaction of an overcomplete set of local stabilizer generators. The degenerate ground state of the system is characterized by a quantum error-correcting code whose number of encoded qubits is equal to the second Betti number of the manifold. These models (i) have solely local interactions; (ii) admit a strong-weak duality relation with an Ising model on a dual lattice; (iii) have topological order in the ground state, some of which survives at finite temperature; and (iv) behave as classical memory at finite temperature.
Hot gas ingestion effects on fuel control surge recovery and AH-1 rotor drive train torque spikes
NASA Technical Reports Server (NTRS)
Tokarski, Frank; Desai, Mihir; Books, Martin; Zagranski, Raymond
1994-01-01
This report summarizes the work accomplished through computer simulation to understand the impact of the hydromechanical turbine assembly (TA) fuel control on rocket gas ingestion induced engine surges on the AH-1 (Cobra) helicopter. These surges excite the lightly damped torsional modes of the Cobra rotor drive train and can cause overtorqueing of the tail rotor shaft. The simulation studies show that the hydromechanical TA control has a negligible effect on drive train resonances because its response is sufficiently attenuated at the resonant frequencies. However, a digital electronic control working through the TA control's separate, emergency fuel metering system has been identified as a solution to the overtorqueing problem. State-of-the-art software within the electronic control can provide active damping of the rotor drive train to eliminate excessive torque spikes due to any disturbances including engine surges and aggressive helicopter maneuvers. Modifications to the existing TA hydromechanical control are relatively minor, and existing engine sensors can be utilized by the electronic control. Therefore, it is concluded that the combination of full authority digital electronic control (FADEC) with hydromechanical backup using the existing TA control enhances flight safety, improves helicopter performance, reduces pilot workload, and provides a substantial payback for very little investment.
Di Meo, I; Marchet, S; Lamperti, C; Zeviani, M; Viscomi, C
2017-10-01
Leigh syndrome (LS) is the most common infantile mitochondrial encephalopathy. No treatment is currently available for this condition. Mice lacking Ndufs4, encoding NADH: ubiquinone oxidoreductase iron-sulfur protein 4 (NDUFS4) recapitulates the main findings of complex I (cI)-related LS, including severe multisystemic cI deficiency and progressive neurodegeneration. In order to develop a gene therapy approach for LS, we used here an AAV2/9 vector carrying the human NDUFS4 coding sequence (hNDUFS4). We administered AAV2/9-hNDUFS4 by intravenous (IV) and/or intracerebroventricular (ICV) routes to either newborn or young Ndufs4 -/- mice. We found that IV administration alone was only able to correct the cI deficiency in peripheral organs, whereas ICV administration partially corrected the deficiency in the brain. However, both treatments failed to improve the clinical phenotype or to prolong the lifespan of Ndufs4 -/- mice. In contrast, combined IV and ICV treatments resulted, along with increased cI activity, in the amelioration of the rotarod performance and in a significant prolongation of the lifespan. Our results indicate that extraneurological organs have an important role in LS pathogenesis and provide an insight into current limitations of adeno-associated virus (AAV)-mediated gene therapy in multisystem disorders. These findings warrant future investigations to develop new vectors able to efficiently target multiple organs.
1990-06-04
Bell NAH-1G (USA 70-15979 NASA-736) FLITE Cobra helicopter hovering on the Ames ramp is the successor to the original FLITE Cobra. It has been used extensively in joint NASA/Army human factors research in the areas of night vision displays and voice communications since its arrival in 1987. Note: Used in publication in Flight Research at Ames; 57 Years of Development and Validation of Aeronautical Technology, NASA SP-1998-3300, fig. 140.
A proof for loop-law constraints in stoichiometric metabolic networks
2012-01-01
Background: Constraint-based modeling is increasingly employed for metabolic network analysis. Its underlying assumption is that natural metabolic phenotypes can be predicted by adding physicochemical constraints to remove unrealistic metabolic flux solutions. The loopless-COBRA approach provides an additional constraint that eliminates thermodynamically infeasible internal cycles (or loops) from the space of solutions. This allows the prediction of flux solutions that are more consistent with experimental data. However, it is not clear if this approach over-constrains the models by removing non-loop solutions as well. Results: Here we apply Gordan's theorem from linear algebra to prove for the first time that the constraints added in loopless-COBRA do not over-constrain the problem beyond the elimination of the loops themselves. Conclusions: The loopless-COBRA constraints can be reliably applied. Furthermore, this proof may be adapted to evaluate the theoretical soundness for other methods in constraint-based modeling. PMID:23146116
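For reference, the linear-algebra result invoked above, Gordan's theorem of the alternative, can be stated (in one common form) as follows; the paper's contribution is its application to the internal-cycle constraints of loopless-COBRA, which is not reproduced here.

    % Gordan's theorem of the alternative (one common form).
    For $A \in \mathbb{R}^{m \times n}$, exactly one of the following systems has a solution:
    \[
    \text{(I)}\quad A x > 0 \ \text{(componentwise) for some } x \in \mathbb{R}^{n},
    \qquad \text{or} \qquad
    \text{(II)}\quad A^{\mathsf{T}} y = 0,\ y \ge 0,\ y \ne 0 \ \text{for some } y \in \mathbb{R}^{m}.
    \]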
Cobra Fiber-Optic Positioner Upgrade
NASA Technical Reports Server (NTRS)
Fisher, Charles D.; Braun, David F.; Kaluzny, Joel V.
2013-01-01
A prime focus spectrometer (PFS), along with corrective optics, will mount in place of the secondary mirror of the Subaru telescope on Mauna Kea, Hawaii. This will allow simultaneous observations of cosmologic targets. It will enable large-scale galactic archeology and dark energy surveys to help unlock the secrets of the universe. To perform these cosmologic surveys, an array of 2,400 optical fibers needs to be independently positioned within the 498-mm-diameter focal plane of the PFS instrument to collect light from galaxies and stars for spectrographic analyses. To allow for independent re-positioning of the fibers, a very small positioner (7.7 mm in diameter) is required. One hundred percent coverage of the focal plane is also required, so these small actuators need to cover a patrol region of 9.5 mm in diameter. To optimize the amount of light that can be collected, the fibers need to be placed within 5 micrometers of their intended target (either a star or galaxy). The Cobra Fiber Positioner was designed to meet the size and accuracy requirements stated above. Cobra is a two-degrees-of-freedom mechanism that can position an optical fiber in the focal plane of the PFS instrument to a precision of 5 micrometers. It is a theta-phi style positioner containing two rotary piezo tube motors with one offset from the other, which enables the optic fibers to be placed anywhere in a small circular patrol region. The patrol region of the actuator is such that the array of 2,400 positioners allows for full coverage of the instrument focal plane by overlapping the patrol areas. A second-generation Cobra positioner was designed based on lessons learned from the original prototype built in 2009. Improvements were made to the precision of the ceramic motor parts, and hard stops were redesigned to minimize friction and prevent jamming. These changes resulted in reducing the number of move iterations required to position the optical fiber within 5 micrometers of its target. At the time of this reporting, there are still many tests to be performed that will validate system level performance, but on an individual level, the Cobra positioner demonstrates excellent performance and will enable the PFS instrument to make unprecedented measurements of the universe. What is unique about the upgrades made to the Cobra positioner is the improved performance due to the design changes in the hard stops and the ceramic end caps of the motors. Other changes were made to reduce the unit cost of a Cobra positioner without affecting the performance, since thousands of these devices will have to be built for the PFS instrument.
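A toy forward-kinematics sketch of a theta-phi positioner of the kind described above, with two rotary stages and the second stage offset from the first, may help make the geometry concrete. The link lengths are inferred from the roughly 9.5 mm patrol diameter and are illustrative assumptions, not the flight design values.

    # Toy theta-phi (two-stage) positioner kinematics; link lengths assumed.
    import math

    L1 = 2.375   # mm, center axis to second-stage axis (assumed)
    L2 = 2.375   # mm, second-stage axis to fiber tip (assumed)

    def fiber_position(theta_deg, phi_deg):
        """Fiber tip (x, y) in mm for inner angle theta and outer angle phi."""
        theta = math.radians(theta_deg)
        phi = math.radians(phi_deg)
        x = L1 * math.cos(theta) + L2 * math.cos(theta + phi)
        y = L1 * math.sin(theta) + L2 * math.sin(theta + phi)
        return x, y

    print(fiber_position(0.0, 0.0))     # full reach: (4.75, 0.0) mm
    print(fiber_position(30.0, 180.0))  # folded back to the patrol-region center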
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, E.R.
1983-09-01
The appendixes for the Saguaro Power Plant include the following: receiver configuration selection report; operating modes and transitions; failure modes analysis; control system analysis; computer codes and simulation models; procurement package scope descriptions; responsibility matrix; solar system flow diagram component purpose list; thermal storage component and system test plans; solar steam generator tube-to-tubesheet weld analysis; pipeline listing; management control schedule; and system list and definitions.
Dr. Phil's Art Corner: Searching and Cobra Canyon.
2018-01-01
Philip Alexander, M.D., is a native Texan, retired physician, and accomplished musician and artist. After 41 years as an internal medicine physician, Dr. Phil retired from his practice in College Station in 2016. A lifelong musician and former music professor, he often performs as an oboe soloist for the Brazos Valley Symphony Orchestra. He began exploring visual art in 1980, evolving from pencil sketches (including an official White House portrait of President Ronald Reagan) to the computer-generated drawings featured in this journal. His images, which first appeared in this journal in the spring of 2012, are his own original creations.
Mesh-based Monte Carlo code for fluorescence modeling in complex tissues with irregular boundaries
NASA Astrophysics Data System (ADS)
Wilson, Robert H.; Chen, Leng-Chun; Lloyd, William; Kuo, Shiuhyang; Marcelo, Cynthia; Feinberg, Stephen E.; Mycek, Mary-Ann
2011-07-01
There is a growing need for the development of computational models that can account for complex tissue morphology in simulations of photon propagation. We describe the development and validation of a user-friendly, MATLAB-based Monte Carlo code that uses analytically-defined surface meshes to model heterogeneous tissue geometry. The code can use information from non-linear optical microscopy images to discriminate the fluorescence photons (from endogenous or exogenous fluorophores) detected from different layers of complex turbid media. We present a specific application of modeling a layered human tissue-engineered construct (Ex Vivo Produced Oral Mucosa Equivalent, EVPOME) designed for use in repair of oral tissue following surgery. Second-harmonic generation microscopic imaging of an EVPOME construct (oral keratinocytes atop a scaffold coated with human type IV collagen) was employed to determine an approximate analytical expression for the complex shape of the interface between the two layers. This expression can then be inserted into the code to correct the simulated fluorescence for the effect of the irregular tissue geometry.
Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes
NASA Astrophysics Data System (ADS)
Harrington, James William
Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g., factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present a local classical processing scheme for correcting errors on toric codes, which demonstrates that quantum information can be maintained in two dimensions by purely local (quantum and classical) resources.
Robots Learn to Recognize Individuals from Imitative Encounters with People and Avatars
NASA Astrophysics Data System (ADS)
Boucenna, Sofiane; Cohen, David; Meltzoff, Andrew N.; Gaussier, Philippe; Chetouani, Mohamed
2016-02-01
Prior to language, human infants are prolific imitators. Developmental science grounds infant imitation in the neural coding of actions, and highlights the use of imitation for learning from and about people. Here, we used computational modeling and a robot implementation to explore the functional value of action imitation. We report 3 experiments using a mutual imitation task between robots, adults, typically developing children, and children with Autism Spectrum Disorder. We show that a particular learning architecture - specifically one combining artificial neural nets for (i) extraction of visual features, (ii) the robot’s motor internal state, (iii) posture recognition, and (iv) novelty detection - is able to learn from an interactive experience involving mutual imitation. This mutual imitation experience allowed the robot to recognize the interactive agent in a subsequent encounter. These experiments using robots as tools for modeling human cognitive development, based on developmental theory, confirm the promise of developmental robotics. Additionally, findings illustrate how person recognition may emerge through imitative experience, intercorporeal mapping, and statistical learning.
Army-NASA aircrew/aircraft integration program (A3I) software detailed design document, phase 3
NASA Technical Reports Server (NTRS)
Banda, Carolyn; Chiu, Alex; Helms, Gretchen; Hsieh, Tehming; Lui, Andrew; Murray, Jerry; Shankar, Renuka
1990-01-01
The capabilities and design approach of the MIDAS (Man-machine Integration Design and Analysis System) computer-aided engineering (CAE) workstation under development by the Army-NASA Aircrew/Aircraft Integration Program is detailed. This workstation uses graphic, symbolic, and numeric prototyping tools and human performance models as part of an integrated design/analysis environment for crewstation human engineering. Developed incrementally, the requirements and design for Phase 3 (Dec. 1987 to Jun. 1989) are described. Software tools/models developed or significantly modified during this phase included: an interactive 3-D graphic cockpit design editor; multiple-perspective graphic views to observe simulation scenarios; symbolic methods to model the mission decomposition, equipment functions, pilot tasking and loading, as well as control the simulation; a 3-D dynamic anthropometric model; an intermachine communications package; and a training assessment component. These components were successfully used during Phase 3 to demonstrate the complex interactions and human engineering findings involved with a proposed cockpit communications design change in a simulated AH-64A Apache helicopter/mission that maps to empirical data from a similar study and AH-1 Cobra flight test.
NASA Astrophysics Data System (ADS)
Karriem, Veronica V.
Nuclear reactor design incorporates the study and application of nuclear physics, nuclear thermal hydraulics and nuclear safety. Theoretical models and numerical methods implemented in computer programs are utilized to analyze and design nuclear reactors. The focus of this PhD study is the development of an advanced high-fidelity multi-physics code system to perform reactor core analysis for design and safety evaluations of research TRIGA-type reactors. The fuel management and design code system TRIGSIMS was further developed to fulfill the function of a reactor design and analysis code system for the Pennsylvania State Breazeale Reactor (PSBR). TRIGSIMS, which is currently in use at the PSBR, is a fuel management tool that incorporates the depletion code ORIGEN-S (part of the SCALE system) and the Monte Carlo neutronics solver MCNP. The diffusion theory code ADMARC-H is used within TRIGSIMS to accelerate the MCNP calculations. It manages the data and fuel isotopic content and stores it for future burnup calculations. The contribution of this work is the development of an improved version of TRIGSIMS, named TRIGSIMS-TH. TRIGSIMS-TH incorporates a thermal hydraulic module based on the advanced sub-channel code COBRA-TF (CTF). CTF provides the temperature feedback needed in the multi-physics calculations as well as the thermal hydraulics modeling capability of the reactor core. The temperature feedback model uses the CTF-provided local moderator and fuel temperatures for cross-section modeling in the ADMARC-H and MCNP calculations. To perform efficient critical control rod calculations, a methodology for applying a control rod position was implemented in TRIGSIMS-TH, making this code system a modeling and design tool for future core loadings. The new TRIGSIMS-TH is a computer program that interlinks various other functional reactor analysis tools. It consists of MCNP5, ADMARC-H, ORIGEN-S, and CTF. CTF was coupled with both MCNP and ADMARC-H to provide the heterogeneous temperature distribution throughout the core. Each of these codes is written in its own computer language, performs its own function, and outputs its own set of data. TRIGSIMS-TH provides effective data manipulation and transfer between the different codes. With the implementation of the feedback and control-rod-position modeling methodologies, the TRIGSIMS-TH calculations are more accurate and in better agreement with measured data. The PSBR is unique in many ways, and there are no "off-the-shelf" codes that can model this design in its entirety. In particular, the PSBR has an open core design, which is cooled by natural convection. Combining several codes into a single system brings many challenges. It also requires substantial knowledge of both the operation and the core design of the PSBR. This reactor has been in operation for decades, and there is a substantial body of studies and developments in both PSBR thermal hydraulics and neutronics. Measured data are also available for various core loadings and can be used for validation activities. The previous studies and developments in PSBR modeling also serve as a guide for assessing the findings of the work herein. In order to incorporate new methods and codes into the existing TRIGSIMS, a re-evaluation of various components of the code was performed to ensure the accuracy and efficiency of the existing CTF/MCNP5/ADMARC-H multi-physics coupling. A new set of ADMARC-H diffusion coefficients and cross sections was generated using the SERPENT code.
This was needed because the previous data were not generated with thermal hydraulic feedback and the all-rods-out (ARO) position was used as the critical rod position. The B4C was re-evaluated for this update. The data exchange between ADMARC-H and MCNP5 was modified. The basic core model is given flexibility to allow for various changes within the core model, and this feature was implemented in TRIGSIMS-TH. The PSBR core in the new code model can be expanded and changed. This allows the new code to be used as a modeling tool for design and analyses of future core loadings.
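The coupling strategy described above (CTF temperatures feeding back into the cross sections used by ADMARC-H and MCNP5) is, at its core, a fixed-point (Picard) iteration between a neutronics solve and a thermal-hydraulics solve. The sketch below illustrates that loop with toy stand-in physics; every function and number in it is a hypothetical placeholder, not part of TRIGSIMS-TH.

```python
# Generic Picard-iteration sketch of neutronics / thermal-hydraulics coupling with
# toy stand-in physics.  TRIGSIMS-TH couples MCNP5/ADMARC-H with CTF instead; every
# function and constant below is a placeholder used only to show the feedback loop.
import numpy as np

def solve_neutronics(xs):
    # toy "flux solve": channel power scales with the normalized cross-section set
    return 100.0 * xs / xs.mean()

def solve_thermal_hydraulics(power):
    # toy heat transfer: fuel and moderator temperatures track local power
    t_mod = 300.0 + 0.2 * power
    t_fuel = t_mod + 1.5 * power
    return t_fuel, t_mod

def update_cross_sections(xs0, t_fuel):
    # toy Doppler-style feedback: absorption grows weakly with sqrt(fuel temperature)
    return xs0 * (1.0 + 1.0e-3 * (np.sqrt(t_fuel) - np.sqrt(900.0)))

xs0 = np.linspace(0.9, 1.1, 10)        # ten "channels", initial cross sections (no feedback)
xs, t_fuel_old = xs0.copy(), None
for it in range(50):                   # iterate until the temperature field settles
    power = solve_neutronics(xs)
    t_fuel, t_mod = solve_thermal_hydraulics(power)
    if t_fuel_old is not None and np.max(np.abs(t_fuel - t_fuel_old)) < 1e-3:
        print(f"converged after {it} iterations")
        break
    t_fuel_old = t_fuel
    xs = update_cross_sections(xs0, t_fuel)
```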
NASA Astrophysics Data System (ADS)
Baker, Catherine M.
Teaching people with disabilities tech skills empowers them to create solutions to problems they encounter and prepares them for careers. However, computer science is typically taught in a highly visual manner, which can present barriers for people who are blind. The goal of this dissertation is to understand and decrease those barriers. The first projects I present looked at the barriers that blind students face. I first present the results of my survey and interviews with blind students with degrees in computer science or related fields. This work highlighted the many barriers that these blind students faced. I then followed up on one of the barriers mentioned, access to technology, by doing a preliminary accessibility evaluation of six popular integrated development environments (IDEs) and code editors. I found that half were unusable and all had some inaccessible portions. As access to visual information is a barrier in computer science education, I present three projects I have done to decrease this barrier. The first project is Tactile Graphics with a Voice (TGV). This project investigated an alternative to Braille labels for those who do not know Braille and showed that TGV was a potential alternative. The next project was StructJumper, which created a modified abstract syntax tree that blind programmers could use to navigate through code with their screen reader. The evaluation showed that users could navigate more quickly and easily determine the relationships of lines of code when they were using StructJumper compared to when they were not. Finally, I present a tool for dynamic graphs (the type with nodes and edges) that had two different modes for handling focus changes when moving between graphs. I found that the modes support different approaches for exploring the graphs and therefore preferences are mixed based on the user's preferred approach. However, both modes had similar accuracy in completing the tasks. These projects are a first step towards the goal of making computer science education more accessible to blind students. By identifying the barriers that exist and creating solutions to overcome them, we can support increasing the number of blind students in computer science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sartori, E.; Roussin, R.W.
This paper presents a brief review of computer codes concerned with checking, plotting, processing and using covariances of neutron cross-section data. It concentrates on those available from the computer code information centers of the United States and the OECD/Nuclear Energy Agency. Emphasis is also placed on codes using covariances for specific applications such as uncertainty analysis, data adjustment and data consistency analysis. Recent evaluations contain neutron cross section covariance information for all isotopes of major importance for technological applications of nuclear energy. It is therefore important that the available software tools needed for taking advantage of this information are widely known, as they permit the determination of better safety margins and allow the optimization of more economical designs of nuclear energy systems.
Parallelizing a peanut butter sandwich
NASA Astrophysics Data System (ADS)
Quenette, S. M.
2005-12-01
This poster aims to demonstrate, in a novel way, why contemporary computational code development is seemingly hard for a geodynamics modeler (i.e. a non-computer-scientist). For example, to utilise contemporary computer hardware, parallelisation is required. But why do we choose the explicit approach (MPI) over an implicit (OpenMP) one? How does this relate to typical geodynamics codes? And do we face the same style of problem in everyday life? We aim to demonstrate that the little bit of complexity, forethought and effort is worthwhile.
An approach to the origin of self-replicating system. I - Intermolecular interactions
NASA Technical Reports Server (NTRS)
Macelroy, R. D.; Coeckelenbergh, Y.; Rein, R.
1978-01-01
The present paper deals with the characteristics and potentialities of a recently developed computer-based molecular modeling system. Some characteristics of current coding systems are examined and are extrapolated to the apparent requirements of primitive prebiological coding systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donald D Dudenhoeffer; Bruce P Hallbert
Instrumentation, Controls, and Human-Machine Interface (ICHMI) technologies are essential to ensuring delivery and effective operation of optimized advanced Generation IV (Gen IV) nuclear energy systems. In 1996, the Watts Bar I nuclear power plant in Tennessee was the last U.S. nuclear power plant to go on line. It was, in fact, built based on pre-1990 technology. Since this last U.S. nuclear power plant was designed, there have been major advances in the field of ICHMI systems. Computer technology employed in other industries has advanced dramatically, and computing systems are now replaced every few years as they become functionally obsolete. Functional obsolescence occurs when newer, more functional technology replaces or supersedes an existing technology, even though the existing technology may well be in working order. Although ICHMI architectures are composed of much of the same technology, they have not been updated nearly as often in the nuclear power industry. For example, some newer Personal Digital Assistants (PDAs) or handheld computers may, in fact, have more functionality than the 1996 computer control system at the Watts Bar I plant. This illustrates the need to transition and upgrade current nuclear power plant ICHMI technologies.
NASA Astrophysics Data System (ADS)
Chen, Tsoching; Kozloff, Kenneth M.; Goldstein, Steven A.; Morris, Michael D.
2004-07-01
Osteogenesis imperfecta (OI) is a genetic defect in which the genes that code for the α1(I) or α2(I) chains of type I collagen are defective. The defects often result in substitution of a bulky amino acid for glycine, causing formation of collagen that cannot form the normal triple helix. Depending on the details of the defects, the outcomes range from controllable to lethal. This study focuses on OI type IV, a more common and moderately severe form of the disease. People with the disease have a substantial increase in the risk and rate of fracture. We examine the spectroscopic consequences of these defects, using a mouse model (BRTL) that mimics OI type IV. We compare Raman images from tibial cortical tissue of wild-type mice and BRTL mice with a single copy of the mutation and show that both mineral-to-matrix ratios and collagen inter-fibril cross-links are different in wild-type and mutant mice.
Livanov, G A; Batotsyrenkov, B V; Lodiagin, A N; Andrianov, A Iu; Kuznetsov, O A; Loladze, A T; Baranov, D V
2014-01-01
This paper reports a case of severe acute intoxication with an animal poison after a bite by the monocled cobra. Combined treatment including artificial lung ventilation, infusion-detoxification and desensitizing (hormonal) therapy, hemosorption, correction of metabolic disorders with cytoflavin, and antibacterial therapy had a positive effect on the patient's condition and ensured the favourable outcome of potentially lethal poisoning without the use of a specific anti-snake venom serum.
Gowtham, Yashonandana J; Mahadeswaraswamy, Y H; Girish, K S; K, Kemparaju
2014-07-01
The venom of the largest venomous snake, the king cobra (Ophiophagus hannah), is still not included in the production of therapeutic polyvalent antivenom, nor has it been characterized immunologically in the Indian subcontinent. In the present study, the king cobra venom is comparatively studied for cross-reactivity/reactivity and toxicity neutralization by the locally available equine therapeutic polyvalent BSV and VB antivenoms, and a monovalent antivenom (OH-IgG) prepared in rabbit. Neither of the two therapeutic antivenoms procured from two different firms showed any signs of cross-reactivity in terms of antigen-antibody precipitin lines in the immunodouble diffusion assay; however, a weak and insignificant cross-reactivity pattern was observed in ELISA and Western blot studies. Further, both BSV and VB antivenoms failed to neutralize proteolytic, hyaluronidase and phospholipase activities as well as toxic properties such as edema, myotoxicity and lethality of the venom. As expected, OH-IgG showed strong reactivity in immunodouble diffusion, ELISA and Western blot analysis and also neutralized both the enzyme activities and the toxic properties of the venom. Thus, the study provides insight into the measures that are likely to be needed in cases of accidental king cobra bites, for which the Indian subcontinent is still not prepared. Copyright © 2014 Elsevier B.V. All rights reserved.
Calderas, Carlos; Condado, Jose Francisco; Condado, Jose Antonio; Flores, Alejandra; Mueller, Amy; Thomas, Jack; Nakatani, Daisaku; Honda, Yasuhiro; Waseda, Katsuhisa; Fitzgerald, Peter
2014-01-01
The Cobra-P drug-eluting stent (DES) system consists of cobalt chromium alloy with bio-absorbable siloxane sol-gel matrix coating that elutes low dose paclitaxel within 6 months. The aim of this first-in-man trial was to evaluate the safety and performance of 2 doses of the Cobra-P DES. A total of 60 lesions (54 patients) were sequentially assigned to 2 different paclitaxel doses: group A (3.7 μg/18 mm, n=30) or group B (8 μg/18 mm, n=30). The primary endpoint was MACE at 4 months defined as cardiac death, myocardial infarction, and target lesion revascularization. Patient and lesion characteristics were matched between the 2 groups except for male sex. MACE at 4 months was 3.3% and 0%, respectively (P=1.000), and at 1-year follow-up remained unchanged. In-stent late loss at 4 months was similar in both groups (0.36 ± 0.30 mm and 0.34 ± 0.20 mm; P=.773). In this FIM study, implantation of the Cobra-P low dose paclitaxel-eluting stent with a bioabsorbable sol-gel coating was proven to be feasible and safe. Moderate neointimal proliferation was observed as well as an acceptable MACE rate up to 1 year. © 2014.
Measurement Systems Advisory Group
1974-04-01
MSAG-1 Report 001, April 1974. Prepared for the Long Range Acoustic Propagation Project. Office of Naval Research, Department of the Navy, Washington, D.C. ...respect and very much appreciated. The report will be distributed to selected members of the user community for their information and comment.
Effects of Ultraviolet Radiation on the Oxygen Uptake Rate of the Rabbit Cornea
1989-07-01
...typical of a noncoherent source exposure... reciprocity (i.e., the biologic effects or endpoints)... the monochromator entrance slit by the housing optics. A 10 cm quartz-enclosed water chamber was placed ... to remove the infrared radiation. The exit optical beam was focused by a quartz lens with a beam size ... simultaneous output at 350.7 and 356.4 nm (3:1 ratio).
COM-GEOM Interactive Display Debugger (CIDD)
1984-08-01
Keywords: target description; GIFT; interactive computer graphics; solid geometry; combinatorial geometry; COM-GEOM. ...The program was written to speed up the process of formulating the Com-Geom data used by the Geometric Information for Targets (GIFT) computer code... Lawrence W. Bain and Mathew J. Reisinger, "The GIFT Code User Manual; Volume I, Introduction and Input Requirements," BRL Report No. 1802.
Sesquinaries, Magnetics and Atmospheres: Studies of the Terrestrial Moons and Exoplanets
2016-12-01
...support provided by Red Sky Research, LLC. Computational support was provided by the NASA Ames Mission Design Division (Code RD) for research... Systems Branch (Code SST), NASA Ames Research Center, provided supercomputer access and computational resources for the work in Chapter 5. I owe a huge debt of gratitude to Dr. Pete Worden, Dr. Steve Zornetzer, Dr. Alan Weston (NASA), and Col. Carol Welsch, Lt. Col. Joe Nance and Lt. Col. Brian...
Numerical Electromagnetic Code (NEC)-Basic Scattering Code. Part I. User’s Manual.
1979-09-01
Commands documented include: RT (Translate and/or Rotate Coordinates), PG, GP, CG, SG (Source Geometry Input), AM, PR, NP, Range Input, TO (Test Data Generation Options), and UN (Units of Input). ...these points and confirm the validity of the solution... The source presently considered in the computer code is an electric...
NASA Technical Reports Server (NTRS)
Komatsu, G. K.; Stellen, J. M., Jr.
1976-01-01
Measurements have been made of the high energy thrust ions, (Group I), high angle/high energy ions (Group II), and high angle/low energy ions (Group IV) of a mercury electron bombardment thruster in the angular divergence range from 0 deg to greater than 90 deg. The measurements have been made as a function of thrust ion current, propellant utilization efficiency, bombardment discharge voltage, screen and accelerator grid potential (accel-decel ratio) and neutralizer keeper potential. The shape of the Group IV (charge exchange) ion plume has remained essentially fixed within the range of variation of the engine operation parameters. The magnitude of the charge exchange ion flux scales with thrust ion current, for good propellant utilization conditions. For fixed thrust ion current, charge exchange ion flux increases for diminishing propellant utilization efficiency. Facility effects influence experimental accuracies within the range of propellant utilization efficiency used in the experiments. The flux of high angle/high energy Group II ions is significantly diminished by the use of minimum decel voltages on the accelerator grid. A computer model of charge exchange ion production and motion has been developed. The program allows computation of charge exchange ion volume production rate, total production rate, and charge exchange ion trajectories for "genuine" and "facilities effects" particles. In the computed flux deposition patterns, the Group I and Group IV ion plumes exhibit a counter motion.
Kalman Filter Time Series Analysis of Gamma-Ray Data from NaI(Tl) Detectors for the ND6620 Computer.
1985-05-08
MIDAS FORTRAN IV, 21 Dec... Washington, D.C. Approved for public release; distribution unlimited. ...assumed to be Gaussian with a mean of zero and covariance Rk, which is known or can be estimated. The behavior of the source between times k and k+1 is assumed...
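Only fragments of this abstract survive, but they state the standard Kalman filter assumption of zero-mean Gaussian measurement noise with known (or estimable) covariance Rk. As a reminder of what that filter does, the sketch below runs the textbook scalar predict/update cycle on a synthetic count rate; it is a generic illustration, not the ND6620 FORTRAN IV implementation, and all numbers are invented.

```python
# Textbook scalar Kalman filter for a slowly varying count rate observed with
# Gaussian noise of variance R; the numbers are illustrative, not from the report.
import numpy as np

rng = np.random.default_rng(0)
true_rate = 50.0
Q, R = 0.01, 4.0          # process and measurement noise variances
x, P = 0.0, 100.0         # initial state estimate and its variance

for k in range(100):
    z = true_rate + rng.normal(0.0, np.sqrt(R))   # noisy measurement
    # predict: random-walk model, state unchanged, uncertainty grows by Q
    P = P + Q
    # update: blend prediction and measurement with the Kalman gain
    K = P / (P + R)
    x = x + K * (z - x)
    P = (1.0 - K) * P

print(f"estimated rate: {x:.2f} (true {true_rate})")
```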
Approximate Single-Diode Photovoltaic Model for Efficient I-V Characteristics Estimation
Ting, T. O.; Zhang, Nan; Guan, Sheng-Uei; Wong, Prudence W. H.
2013-01-01
Precise photovoltaic (PV) behavior models are normally described by nonlinear analytical equations. To solve such equations, it is necessary to use iterative procedures. Aiming to make the computation easier, this paper proposes an approximate single-diode PV model that enables high-speed predictions for the electrical characteristics of commercial PV modules. Based on the experimental data, statistical analysis is conducted to validate the approximate model. Simulation results show that the calculated current-voltage (I-V) characteristics fit the measured data with high accuracy. Furthermore, compared with the existing modeling methods, the proposed model reduces the simulation time by approximately 30% in this work. PMID:24298205
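For context, the exact single-diode relation that such approximate models replace is implicit in the current: I = Iph - I0*[exp((V + I*Rs)/(n*Ns*Vt)) - 1] - (V + I*Rs)/Rsh. The sketch below solves it with a bracketed root finder at each voltage; the five parameter values are invented for illustration and are not taken from the paper.

```python
# Illustrative five-parameter single-diode model, solved implicitly for I at each
# voltage with a bracketed root finder.  All parameter values are invented for the
# example; they are not the paper's module data.
import numpy as np
from scipy.optimize import brentq

Iph, I0 = 8.0, 1e-9            # photocurrent and diode saturation current [A]
n, Ns, Vt = 1.2, 60, 0.02585   # ideality factor, cells in series, thermal voltage [V]
Rs, Rsh = 0.2, 300.0           # series and shunt resistances [ohm]
a = n * Ns * Vt                # modified ideality factor

def residual(I, V):
    return Iph - I0 * (np.exp((V + I * Rs) / a) - 1.0) - (V + I * Rs) / Rsh - I

voltages = np.linspace(0.0, 40.0, 81)
currents = [brentq(residual, -1.0, Iph + 2.0, args=(V,)) for V in voltages]

for V, I in zip(voltages[::20], currents[::20]):
    print(f"V = {V:5.1f} V  ->  I = {I:6.3f} A")
```

Approximate models of the kind described in the paper replace this per-point root solve with an explicit expression, which is where the reported speedup comes from.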
Research in Parallel Algorithms and Software for Computational Aerosciences
DOT National Transportation Integrated Search
1996-04-01
Phase I is complete for the development of a Computational Fluid Dynamics code with automatic grid generation and adaptation for the Euler analysis of flow over complex geometries. SPLITFLOW, an unstructured Cartesian grid code developed at Lockheed...
A computer program for simulating salinity loads in streams
Glover, Kent C.
1978-01-01
A FORTRAN IV program that simulates salinity loads in streams is described. Daily values of stream-discharge in cubic feet per second, or stream-discharge and specific conductance in micromhos, are used to estimate daily loads in tons by one of five available methods. The loads are then summarized by computing either total and mean monthly loads or various statistics for each calendar day. Results are output in tabular and, if requested, punch card format. User selection of appropriate methods for estimating and summarizing daily loads is provided through the coding of program control cards. The program is designed to interface directly with data retrieved from the U.S. Geological Survey WATSTORE Daily Values File. (Woodard-USGS)
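The arithmetic behind such load estimates is compact: load in short tons per day is roughly 0.0027 times discharge in cubic feet per second times dissolved-solids concentration in milligrams per litre, where the concentration may itself be estimated from specific conductance with a site-specific factor. The sketch below illustrates one of the simpler estimation paths; it is not the FORTRAN IV program itself, and the 0.65 conductance-to-concentration factor is an assumed, site-dependent value.

```python
# Illustrative daily salinity-load estimate from discharge (cfs) and specific
# conductance (micromhos/cm).  The 0.0027 unit-conversion constant is the standard
# cfs * mg/L -> short tons/day factor; the 0.65 conductance factor is an assumption.
discharge_cfs = [120.0, 98.0, 250.0]        # daily mean streamflow
conductance_umho = [850.0, 900.0, 600.0]    # daily specific conductance

K_SITE = 0.65        # assumed site-specific ratio of dissolved solids (mg/L) to conductance
TONS_FACTOR = 0.0027 # converts cfs * mg/L to short tons per day

loads = []
for q, sc in zip(discharge_cfs, conductance_umho):
    conc_mg_per_l = K_SITE * sc             # estimated dissolved-solids concentration
    loads.append(TONS_FACTOR * q * conc_mg_per_l)

print("daily loads (tons):", [round(x, 1) for x in loads])
print("total load  (tons):", round(sum(loads), 1))
```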
Salko, Robert K.; Schmidt, Rodney C.; Avramova, Maria N.
2014-11-23
This study describes major improvements to the computational infrastructure of the CTF subchannel code so that full-core, pincell-resolved (i.e., one computational subchannel per real bundle flow channel) simulations can now be performed in much shorter run times, either in stand-alone mode or as part of coupled-code multi-physics calculations. These improvements support the goals of the Department of Energy Consortium for Advanced Simulation of Light Water Reactors (CASL) Energy Innovation Hub to develop high-fidelity multi-physics simulation tools for nuclear energy design and analysis.
Hazard Assessment Computer System HACS/UIM Users’ Operation Manual. Volume I.
1981-09-01
...to assist in obtaining the compound recognition code used to reference data for a particular chemical, a separate set of indices has been produced ... and are given in a separate report. These indices enable a user of HACS to obtain a compound recognition code for a chemical given either the compound...
Salakij, Chaleow; Salakij, Jarernsak; Apibal, Suntaree; Narkkong, Nual-Anong; Chanhome, Lawan; Rochanapat, Nirachara
2002-01-01
King cobras (Ophiophagus hannah) have been captive-bred at Queen Saovabha Memorial Institute since 1996 to supply venom for antivenom production. Hematologic tests would be useful for evaluating the health of the snakes, however, basic hematologic data and morphology have not been described for this species. The purpose of this study was to determine basic hematologic values and evaluate light microscopic, cytochemical, and electron microscopic characteristics of king cobra blood cells. Blood samples from 13 wild-caught and 15 captive-bred king cobras were collected into EDTA from the ventral caudal vein. A CBC was done using standard methods. Significant differences between groups were determined using t-tests. Cytochemical stains (periodic acid-Schiff [PAS], Sudan black B [SBB], alpha-naphthyl acetate esterase [ANAE], acid phosphatase [AcP], and beta-glucuronidase [beta-glu]), and scanning and transmission electron microscopy were done using standard techniques. Eighteen snakes (64.3%) were positive for Hepatozoon infection. Hepatozoon organisms were detected nearly twice as frequently in wild-caught (11/13) as in captive-bred (7/15) snakes. Total WBC, azurophil, and lymphocyte counts were higher and fibrinogen concentration was lower in Hepatozoon-positive snakes. Captive-bred snakes had higher RBC values, lower azurophil, heterophil, and punctate reticulocyte percentages, and higher lymphocyte numbers compared with wild-caught snakes. Lymphocytes were the most commonly observed WBCs, and stained positive with PAS, ANAE, AcP, and beta-glu. Azurophil granules stained positive with SBB, PAS, and ANAE. Heterophils were the largest WBCs; their granules stained with SBB, ANAE, and beta-glu. Basophil granules stained with PAS, SBB, ANAE, and beta-glu. Thrombocytes were strongly positive with PAS. Transmission electron microscopic examination revealed organelles within all WBCs except eosinophils and revealed the gamonts of Hepatozoon sp in RBCs and azurophils. These results provide comparative hematologic data and a guide for identification of blood cells in wild-caught and captive-bred king cobra snakes. Hepatozoon infection was relatively common, but was not associated with severe hematologic abnormalities.
Modeling and Simulation of Explosively Driven Electromechanical Devices
NASA Astrophysics Data System (ADS)
Demmie, Paul N.
2002-07-01
Components that store electrical energy in ferroelectric materials and produce currents when their permittivity is explosively reduced are used in a variety of applications. The modeling and simulation of such devices is a challenging problem since one has to represent the coupled physics of detonation, shock propagation, and electromagnetic field generation. The high fidelity modeling and simulation of complicated electromechanical devices was not feasible prior to having the Accelerated Strategic Computing Initiative (ASCI) computers and the ASCI developed codes at Sandia National Laboratories (SNL). The EMMA computer code is used to model such devices and simulate their operation. In this paper, I discuss the capabilities of the EMMA code for the modeling and simulation of one such electromechanical device, a slim-loop ferroelectric (SFE) firing set.
Initial Ada components evaluation
NASA Technical Reports Server (NTRS)
Moebes, Travis
1989-01-01
The SAIC has the responsibility for independent test and validation of the SSE. They have been using a mathematical functions library package implemented in Ada to test the SSE IV and V process. The library package consists of elementary mathematical functions and is both machine and accuracy independent. The SSE Ada components evaluation includes code complexity metrics based on Halstead's software science metrics and McCabe's measure of cyclomatic complexity. Halstead's metrics are based on the number of operators and operands in a logical unit of code and are compiled from the number of distinct operators, distinct operands, and total number of occurrences of operators and operands. These metrics give an indication of the physical size of a program in terms of operators and operands and are used diagnostically to point to potential problems. McCabe's Cyclomatic Complexity Metrics (CCM) are compiled from flow charts transformed to equivalent directed graphs. The CCM is a measure of the total number of linearly independent paths through the code's control structure. These metrics were computed for the Ada mathematical functions library using Software Automated Verification and Validation (SAVVAS), the SSE IV and V tool. A table with selected results was shown, indicating that most of these routines are of good quality. Thresholds for the Halstead measures indicate poor quality if the length metric exceeds 260 or difficulty is greater than 190. The McCabe CCM indicated a high quality of software products.
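For readers unfamiliar with the two metric families mentioned, the sketch below computes the basic Halstead quantities and McCabe's cyclomatic complexity from already-tallied counts and flags them against the thresholds quoted above (length 260, difficulty 190). The counts themselves are invented; in practice a tool such as SAVVAS tallies them from the source.

```python
# Halstead software-science measures and McCabe cyclomatic complexity from
# pre-tallied counts; the example counts are invented for illustration.
import math

n1, n2 = 18, 31      # distinct operators / distinct operands in the routine
N1, N2 = 112, 96     # total operator / operand occurrences

vocabulary = n1 + n2
length = N1 + N2                         # Halstead program length
volume = length * math.log2(vocabulary)  # program "size" in bits
difficulty = (n1 / 2.0) * (N2 / n2)      # proneness to error
effort = difficulty * volume

# McCabe cyclomatic complexity from the control-flow graph: M = E - N + 2P
edges, nodes, components = 24, 20, 1
cyclomatic = edges - nodes + 2 * components

print(f"length={length}  difficulty={difficulty:.1f}  volume={volume:.0f}  effort={effort:.0f}")
print(f"cyclomatic complexity={cyclomatic}")
print("quality flags:", "length>260" if length > 260 else "length OK",
      "|", "difficulty>190" if difficulty > 190 else "difficulty OK")
```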
Comparative 1D and 3D numerical investigation of open-channel junction flows and energy losses
NASA Astrophysics Data System (ADS)
Luo, Hao; Fytanidis, Dimitrios K.; Schmidt, Arthur R.; García, Marcelo H.
2018-07-01
The complexity of open channel confluences stems from flow mixing, secondary circulation, post-confluence flow separation, contraction and backwater effects. These effects in turn result in a large number of parameters required to adequately quantify the junction induced hydraulic resistance and describe mean flow pattern and turbulent flow structures due to flow merging. The recent development in computing power advances the application of 3D Computational Fluid Dynamics (CFD) codes to visualize and understand the Confluence Hydrodynamic Zone (CHZ). Nevertheless, 1D approaches remain the mainstay in large drainage network or waterway system modeling considering computational efficiency and data availability. This paper presents (i) a modified 1D nonlinear dynamic model; (ii) a fully 3D non-hydrostatic, Reynolds-averaged Navier-Stokes Equations (RANS)-based, Computational Fluid Dynamics (CFD) model; (iii) an analysis of changing confluence hydrodynamics and 3D turbulent flow structure under various controls; (iv) a comparison of flow features (i.e. upstream water depths, energy losses and post-confluence contraction) predicted by 1D and 3D models; and (v) parameterization of 3D flow characteristics in 1D modeling through the computation of correction coefficients associated with contraction, energy and momentum. The present comprehensive 3D numerical investigation highlights the driving mechanisms for junction induced energy losses. Moreover, the comparative 1D and 3D study quantifies the deviation of 1D approximations and associated underlying assumptions from the 'true' resultant flow field. The study may also shed light on improving the accuracy of the 1D large network modeling through the parameterization of the complex 3D feature of the flow field and correction of interior boundary conditions at junctions of larger angles and/or with substantial lateral inflows. Moreover, the enclosed numerical investigations may enhance the understanding of the primary mechanisms contributing to hydraulic structure induced turbulent flow behavior and increased hydraulic resistance.
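The energy and momentum correction coefficients mentioned above are the classical Coriolis and Boussinesq coefficients, alpha = (1/(A U^3)) ∫ u^3 dA and beta = (1/(A U^2)) ∫ u^2 dA, which fold the 3D velocity distribution into the 1D energy and momentum fluxes. The sketch below evaluates them numerically over a cross-section; the velocity profile is a made-up stand-in for CFD output, not data from the paper.

```python
# Coriolis (energy, alpha) and Boussinesq (momentum, beta) correction coefficients
# computed from a point-velocity field over a rectangular cross-section.
# The velocity profile below is a stand-in for CFD output, not the paper's data.
import numpy as np

ny, nz = 40, 30
y = np.linspace(0.0, 2.0, ny)     # channel width  [m]
z = np.linspace(0.0, 1.0, nz)     # flow depth     [m]
Y, Z = np.meshgrid(y, z, indexing="ij")

# stand-in streamwise velocity: faster near the surface and the channel centre
u = 1.5 * (Z / Z.max()) ** (1.0 / 6.0) * np.sin(np.pi * Y / Y.max())

dA = (y[1] - y[0]) * (z[1] - z[0])            # area of one grid cell
A = dA * u.size                                # total cross-sectional area
U = u.sum() * dA / A                           # section-averaged velocity

alpha = (u ** 3).sum() * dA / (A * U ** 3)     # energy (Coriolis) coefficient
beta = (u ** 2).sum() * dA / (A * U ** 2)      # momentum (Boussinesq) coefficient
print(f"U = {U:.3f} m/s, alpha = {alpha:.3f}, beta = {beta:.3f}")
```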
A Parametric Regression of the Cost of Base Realignment Action (COBRA) Model
1993-09-20
Douglas D. Hardman, Captain, USAF, and Michael S. Nelson, Captain, USAF. AFIT/GEE/ENS/93S-03. Approved for public release; distribution unlimited. Title: A Parametric Regression of the Cost of Base Realignment Action (COBRA) Model. Defense date: 20... Thesis for the degree of Master of Science in Engineering and Environmental Management, September 1993.
Host Defense against Opportunist Microorganisms Following Trauma.
1979-06-01
Measured in all patients were total hemolytic complement (CH50), C3 conversion by inulin and cobra venom factor (CoVF), and immunochemical concentrations of C1q, C4, C2... were normal or elevated for the entire study period. C3 conversion by inulin and CoVF and the concentration of properdin were reduced in the sera of the...
2011-03-01
Figure 1. Radar image of the eye of Typhoon Cobra on 18 December 1944 from a ship located at the center of the area shown (from NOAA Library). Abbreviations: THORPEX: The Observing System Research and Predictability Experiment; T-PARC: THORPEX-Pacific Asian Regional Campaign; TS: Tropical Storm; TUTT: Tropical Upper-Tropospheric Trough.
Complement Depletion Protects Lupus-prone Mice from Ischemia-reperfusion-initiated Organ Injury
2012-10-25
...injury, we sought to evaluate whether complement inhibition mitigates organ damage. We found that complement depletion with cobra venom factor... venom factor and C5a receptor antagonist were able to protect mice from local tissue damage, treatment with C5a receptor antagonist was not able to... Complement depletion or blockage of the complement pathway using molecules such as cobra venom factor (CVF) (24, 33) and C5a receptor antagonists (C5aRA...
Cobra communications switch integration program
NASA Technical Reports Server (NTRS)
Shively, Robert J.; Haworth, Loran A.; Szoboszlay, Zoltan; Murray, F. Gerald
1989-01-01
The paper describes a design modification to reduce the visual and manual workload associated with the radio selection and communications tasks in the U.S. Army AH-1 Cobra helicopter. The modification involves the integration of the radio selection and microphone actuating tasks into a single operation controlled by the transmit-intercom switch. Ground-based and flight tests were conducted to evaluate the modified configuration during twelve flight tasks. The results show that the proposed configuration performs twice as fast as the original configuration.
34 CFR 692.71 - What activities may be funded under the SLEAP Program?
Code of Federal Regulations, 2010 CFR
2010-07-01
... achievement; or (2) Wish to enter a program of study leading to a career in— (i) Information technology; (ii) Mathematics, computer science, or engineering; (iii) Teaching; or (iv) Other fields determined by the State to...
38 CFR 17.501 - Confidential and privileged documents.
Code of Federal Regulations, 2012 CFR
2012-07-01
... facility: (i) Medical records reviews, (ii) Drug usage evaluations, (iii) Blood usage reviews, (iv... quality assurance reviews. (e) Documents which are confidential and privileged may be in written, computer... of Boards of Investigations; (5) Completed patient satisfaction survey questionnaires and findings...
38 CFR 17.501 - Confidential and privileged documents.
Code of Federal Regulations, 2014 CFR
2014-07-01
... facility: (i) Medical records reviews, (ii) Drug usage evaluations, (iii) Blood usage reviews, (iv... quality assurance reviews. (e) Documents which are confidential and privileged may be in written, computer... of Boards of Investigations; (5) Completed patient satisfaction survey questionnaires and findings...
38 CFR 17.501 - Confidential and privileged documents.
Code of Federal Regulations, 2010 CFR
2010-07-01
... facility: (i) Medical records reviews, (ii) Drug usage evaluations, (iii) Blood usage reviews, (iv... quality assurance reviews. (e) Documents which are confidential and privileged may be in written, computer... of Boards of Investigations; (5) Completed patient satisfaction survey questionnaires and findings...
38 CFR 17.501 - Confidential and privileged documents.
Code of Federal Regulations, 2011 CFR
2011-07-01
... facility: (i) Medical records reviews, (ii) Drug usage evaluations, (iii) Blood usage reviews, (iv... quality assurance reviews. (e) Documents which are confidential and privileged may be in written, computer... of Boards of Investigations; (5) Completed patient satisfaction survey questionnaires and findings...
38 CFR 17.501 - Confidential and privileged documents.
Code of Federal Regulations, 2013 CFR
2013-07-01
... facility: (i) Medical records reviews, (ii) Drug usage evaluations, (iii) Blood usage reviews, (iv... quality assurance reviews. (e) Documents which are confidential and privileged may be in written, computer... of Boards of Investigations; (5) Completed patient satisfaction survey questionnaires and findings...
Performance of a parallel code for the Euler equations on hypercube computers
NASA Technical Reports Server (NTRS)
Barszcz, Eric; Chan, Tony F.; Jesperson, Dennis C.; Tuminaro, Raymond S.
1990-01-01
The performance of hypercubes was evaluated on a computational fluid dynamics problem, and the parallel environment issues that must be addressed were considered, such as algorithm changes, implementation choices, programming effort, and programming environment. The evaluation focuses on a widely used fluid dynamics code, FLO52, which solves the two-dimensional steady Euler equations describing flow around an airfoil. The code development experience is described, including interacting with the operating system, utilizing the message-passing communication system, and code modifications necessary to increase parallel efficiency. Results from two hypercube parallel computers (a 16-node iPSC/2 and a 512-node NCUBE/ten) are discussed and compared. In addition, a mathematical model of the execution time was developed as a function of several machine and algorithm parameters. This model accurately predicts the actual run times obtained and is used to explore the performance of the code in interesting yet physically realizable regions of the parameter space. Based on this model, predictions about future hypercubes are made.
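The paper's execution-time model itself is not reproduced in the abstract. Purely for illustration, models of this general kind combine a per-processor computation term, a boundary-exchange term, and a global-reduction term, as in the hedged sketch below; the functional form and every coefficient are invented and should not be read as the authors' model.

```python
# Generic (illustrative) execution-time model for a grid-based solver on P processors:
# per-step time = compute term (grid points per node) + boundary-exchange term +
# a log(P) global-reduction term.  The form and coefficients are invented and are
# NOT the model developed in the paper.
import math

def step_time(n_points, P, t_calc=2.0e-6, t_msg=5.0e-4, t_word=1.0e-6, t_reduce=3.0e-4):
    local = n_points / P                                  # grid points per processor
    compute = t_calc * local
    exchange = 4 * (t_msg + t_word * math.sqrt(local))    # four faces of a 2-D block
    reduction = t_reduce * math.log2(max(P, 2))           # global residual reduction
    return compute + exchange + reduction

n_points = 160 * 32                                       # a modest 2-D mesh
for P in (1, 4, 16, 64, 256):
    t = step_time(n_points, P)
    print(f"P={P:4d}  est. time/step = {1e3*t:7.3f} ms  speedup = {step_time(n_points, 1)/t:6.2f}")
```

Even this toy form shows the qualitative behavior such models capture: speedup saturates once communication and reduction terms dominate the shrinking per-node compute work.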
Measurement and analysis of solar cell current-voltage characteristics
NASA Technical Reports Server (NTRS)
Olsen, Larry C.; Addis, F. William; Doyle, Dan H.; Miller, Wesley A.
1985-01-01
Approaches to measurement and analysis of solar cell current-voltage characteristics under dark and illuminated conditions are discussed. Measurements are taken with a computer based data acquisition system for temperatures in the range of -100 to +100 C. In the fitting procedure, the various I(oi) and C(i) as well as R(S) and R(SH) are determined. Application to current-voltage analyses of high efficiency silicon cells and Boeing CdS/CuInSe2 are discussed. In silicon MINP cells, it is found that at low voltages a tunneling mechanism is dominant, while at larger voltages the I-V characteristics are usually dominated by emitter recombination. In the case of Boeing cells, a current transport model based on a tunneling mechanism and interface recombination acting in series has been developed as a result of I-V analyses.
15 CFR 740.7 - Computers (APP).
Code of Federal Regulations, 2011 CFR
2011-01-01
... 4A003. (2) Technology and software. License Exception APP authorizes exports of technology and software... License Exception. (2) Access and release restrictions. (i)[Reserved] (ii) Technology and source code. Technology and source code eligible for License Exception APP may not be released to nationals of Cuba, Iran...
NASA Technical Reports Server (NTRS)
Schmidt, G.; Ruster, R.; Czechowsky, P.
1983-01-01
The SOUSY-VHF-Radar operates at a frequency of 53.5 MHz in a valley in the Harz mountains, Germany, 90 km from Hanover. The radar controller, which is programmed by a 16-bit computer, holds 1024 program steps in core and controls, via 8 channels, the whole radar system: in particular the master oscillator, the transmitter, the transmit-receive switch, the receiver, the analog-to-digital converter, and the hardware adder. The high-sensitivity receiver has a dynamic range of 70 dB and a video bandwidth of 1 MHz. Phase coding schemes are applied, in particular for investigations at mesospheric heights, in order to carry out measurements with the maximum duty cycle and the maximum height resolution. The computer takes the data from the adder and stores it on magnetic tape or disc. The radar controller is programmed by the computer using simple FORTRAN IV statements. After the program has been loaded and the computer has started the radar controller, it runs automatically, stopping at the program end. In case of errors or failures occurring during radar operation, the radar controller is shut off either by a safety circuit, a power-failure circuit, or a parity-check system.
ERIC Educational Resources Information Center
Yearout, Robert D., Ed.
This set of proceedings documents includes 407 papers representative of the 1,825 papers and posters presented at a conference on undergraduate research. Volume I contains papers on the arts and humanities. Examples of topics include collaborative art, music composition using computer technology, interpreting Roman morality, gay marriage, and…
Health information management: an introduction to disease classification and coding.
Mony, Prem Kumar; Nagaraj, C
2007-01-01
Morbidity and mortality data constitute an important component of a health information system, and their coding enables uniform data collation and analysis as well as meaningful comparisons between regions or countries. Strengthening the recording and reporting systems for health monitoring is a basic requirement for an efficient health information management system. Increased advocacy for and awareness of a uniform coding system together with adequate capacity building of physicians, coders and other allied health and information technology personnel would pave the way for a valid and reliable health information management system in India. The core requirements for the implementation of disease coding are: (i) support from national/institutional health administrators; (ii) widespread availability of the ICD-10 material for morbidity and mortality coding; (iii) enhanced human and financial resources; and (iv) optimal use of informatics. We describe the methodology of a disease classification and codification system as well as its applications for developing and maintaining an effective health information management system for India.
Di Meo, I; Marchet, S; Lamperti, C; Zeviani, M; Viscomi, C
2017-01-01
Leigh syndrome (LS) is the most common infantile mitochondrial encephalopathy. No treatment is currently available for this condition. Mice lacking Ndufs4, encoding NADH:ubiquinone oxidoreductase iron-sulfur protein 4 (NDUFS4), recapitulate the main findings of complex I (cI)-related LS, including severe multisystemic cI deficiency and progressive neurodegeneration. In order to develop a gene therapy approach for LS, we used here an AAV2/9 vector carrying the human NDUFS4 coding sequence (hNDUFS4). We administered AAV2/9-hNDUFS4 by intravenous (IV) and/or intracerebroventricular (ICV) routes to either newborn or young Ndufs4−/− mice. We found that IV administration alone was only able to correct the cI deficiency in peripheral organs, whereas ICV administration partially corrected the deficiency in the brain. However, both treatments failed to improve the clinical phenotype or to prolong the lifespan of Ndufs4−/− mice. In contrast, combined IV and ICV treatments resulted, along with increased cI activity, in the amelioration of the rotarod performance and in a significant prolongation of the lifespan. Our results indicate that extraneurological organs have an important role in LS pathogenesis and provide an insight into the current limitations of adeno-associated virus (AAV)-mediated gene therapy in multisystem disorders. These findings warrant future investigations to develop new vectors able to efficiently target multiple organs. PMID:28753212
NASA Astrophysics Data System (ADS)
Regis, Rommel G.
2014-02-01
This article develops two new algorithms for constrained expensive black-box optimization that use radial basis function surrogates for the objective and constraint functions. These algorithms are called COBRA and Extended ConstrLMSRBF and, unlike previous surrogate-based approaches, they can be used for high-dimensional problems where all initial points are infeasible. They both follow a two-phase approach where the first phase finds a feasible point while the second phase improves this feasible point. COBRA and Extended ConstrLMSRBF are compared with alternative methods on 20 test problems and on the MOPTA08 benchmark automotive problem (D.R. Jones, Presented at MOPTA 2008), which has 124 decision variables and 68 black-box inequality constraints. The alternatives include a sequential penalty derivative-free algorithm, a direct search method with kriging surrogates, and two multistart methods. Numerical results show that COBRA algorithms are competitive with Extended ConstrLMSRBF and they generally outperform the alternatives on the MOPTA08 problem and most of the test problems.
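The common ingredient of both algorithms is a radial basis function surrogate fitted to the expensive objective and constraint evaluations, which is then searched cheaply for a promising next point. The sketch below shows only that surrogate step, using SciPy's RBFInterpolator and toy black-box functions; it is a simplified illustration, not the COBRA or Extended ConstrLMSRBF algorithm.

```python
# Simplified surrogate step: fit RBF models to sampled objective/constraint values
# and choose the trial point the surrogates predict to be best and feasible.
# This illustrates the idea only; it is not the COBRA/ConstrLMSRBF algorithm itself.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)

def objective(x):          # expensive black-box objective (toy stand-in)
    return np.sum((x - 0.3) ** 2, axis=-1)

def constraint(x):         # black-box inequality constraint g(x) <= 0 (toy stand-in)
    return 0.5 - np.sum(x, axis=-1)

dim, n_init = 5, 30
X = rng.uniform(0.0, 1.0, size=(n_init, dim))        # already-evaluated sample points
f_surr = RBFInterpolator(X, objective(X), kernel="thin_plate_spline")
g_surr = RBFInterpolator(X, constraint(X), kernel="thin_plate_spline")

cand = rng.uniform(0.0, 1.0, size=(2000, dim))       # cheap trial points
feasible = g_surr(cand) <= 0.0                        # keep predicted-feasible ones
if not feasible.any():                                # fall back if none predicted feasible
    feasible[:] = True
best = cand[feasible][np.argmin(f_surr(cand[feasible]))]
print("next point to evaluate with the true functions:", np.round(best, 3))
```

In the full algorithms this selection is wrapped in a two-phase loop (find a feasible point, then improve it), with distance requirements and margins on the surrogate constraints that the toy sketch omits.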
First Results from the Cornell COBRA Accelerator for Light Ion ICF Research
NASA Astrophysics Data System (ADS)
Lindholm, F.; Krastelev, E. G.; Greenly, J. B.; Kusse, B. R.
1996-11-01
COBRA, the Cornell Beam Research Accelerator, is a four-stage linear induction adder based on the Sandia National Laboratories SABRE accelerator design. The full 4 × 1 MV, 200 kA, 40 ns COBRA was completed in June 1996, after a year of initial operation with a single stage. Accelerator operation will be described, and first experimental results of power coupling and ion beam generation using a closely-coupled (short MITL) applied-B extraction ion diode load will be presented. A diagnostic package for beam optics including local microdivergence and aiming measurements is being developed, and results from both the single-stage experiments and new experiments on the full accelerator will be presented. A 20 ns, 15% voltage precursor to the main pulse resulting from coupling through the nonlinear magnetization characteristic of the Metglas® core at high magnetization rate was seen in the single-cell experiments. This mechanism will be discussed and its consequences on the full accelerator will be investigated.
COBRA 9121: Federal liability for patient screening and transfer.
Frew, S A
1988-01-01
Health care is no longer a simple cottage industry of individual providers. Increases in competition and government regulation have transformed the old structure of health care into a fend-for-yourself marketplace dominated by multi-institutional corporations. In order to accomplish this change, health care providers have had to alter their locus of attention from the patient to the bottom line. As a result, it is not surprising to find corporate business practices interspersed among the traditional health care practices. On March 1, 1987, the federal government began an assault on a casualty of this new market-oriented philosophy, patient transfers or "dumping". COBRA 9121 is an "anti-dumping" law designed to prevent hospitals from continuing this practice. The vehicle for ensuring that the statute's broad provisions are followed is a set of "sudden death" probations. For example, under COBRA, hospitals found guilty of knowing or negligent violations may be suspended or terminated from receiving all Medicare reimbursement. One way to avoid these "sudden death" probations is to understand the implications of this law.
Chun, Eun Hee; Kim, Youn Jin; Woo, Jae Hee
2016-06-01
The aim of this study was to compare the effect of intravenous (I.V.) dexamethasone with that of perineural dexamethasone on the prolongation of analgesic duration of single-shot interscalene brachial plexus blocks (SISB) in patients undergoing arthroscopic shoulder surgery. We performed a prospective, randomized, double-blind, placebo-controlled study. Patients undergoing elective arthroscopic shoulder surgery with ultrasound-guided SISB were enrolled and randomized into 2 groups. A total volume of 12 mL of the study drug was prepared with a final concentration of 0.5% ropivacaine. In the I.V. group, patients received SISB using ropivacaine 5 mg/mL with normal saline (control), with dexamethasone 5 mg given by I.V. injection. In the perineural group, patients received SISB using ropivacaine 5 mg/mL with dexamethasone 5 mg, with normal saline 1 mL given by I.V. injection. The primary outcome was the time to the first analgesic request, defined as the time between the end of the operation and the first request of analgesics by the patient. The secondary outcomes included patient satisfaction scores, side effects, and neurological symptoms. Patients were randomly assigned to 1 of the 2 groups using a computer-generated randomization table. An anesthesiologist blinded to the group assignments prepared the solutions for injection. The patients and the investigator participating in the study were also blinded to the group assignments. One hundred patients were randomized. Data were analyzed for 99 patients. One case in the I.V. group was converted to open surgery and was therefore not included in the study. Perineural dexamethasone significantly prolonged analgesic duration (median, standard error: 1080 minutes, 117.5 minutes) compared with I.V. dexamethasone (810 minutes, 48.1 minutes) (P = 0.02). There were no significant differences in side effects, neurological symptoms, or changes in blood glucose values between the 2 groups. Our results show that perineural dexamethasone 5 mg is more effective than I.V. dexamethasone 5 mg with regard to the analgesic duration of SISB for arthroscopic shoulder surgery.
Proposed standards for peer-reviewed publication of computer code
USDA-ARS?s Scientific Manuscript database
Computer simulation models are mathematical abstractions of physical systems. In the area of natural resources and agriculture, these physical systems encompass selected interacting processes in plants, soils, animals, or watersheds. These models are scientific products and have become important i...
Culbertson, C N; Wangerin, K; Ghandourah, E; Jevremovic, T
2005-08-01
The goal of this study was to evaluate the COG Monte Carlo radiation transport code, developed and tested by Lawrence Livermore National Laboratory, for neutron capture therapy related modeling. A boron neutron capture therapy model was analyzed comparing COG calculational results to results from the widely used MCNP4B (Monte Carlo N-Particle) transport code. The approach for computing neutron fluence rate and each dose component relevant in boron neutron capture therapy is described, and calculated values are shown in detail. The differences between the COG and MCNP predictions are qualified and quantified. The differences are generally small and suggest that the COG code can be applied for BNCT research related problems.
NASA Technical Reports Server (NTRS)
Kikuchi, Hideaki; Kalia, Rajiv; Nakano, Aiichiro; Vashishta, Priya; Iyetomi, Hiroshi; Ogata, Shuji; Kouno, Takahisa; Shimojo, Fuyuki; Tsuruta, Kanji; Saini, Subhash;
2002-01-01
A multidisciplinary, collaborative simulation has been performed on a Grid of geographically distributed PC clusters. The multiscale simulation approach seamlessly combines i) atomistic simulation backed on the molecular dynamics (MD) method and ii) quantum mechanical (QM) calculation based on the density functional theory (DFT), so that accurate but less scalable computations are performed only where they are needed. The multiscale MD/QM simulation code has been Grid-enabled using i) a modular, additive hybridization scheme, ii) multiple QM clustering, and iii) computation/communication overlapping. The Gridified MD/QM simulation code has been used to study environmental effects of water molecules on fracture in silicon. A preliminary run of the code has achieved a parallel efficiency of 94% on 25 PCs distributed over 3 PC clusters in the US and Japan, and a larger test involving 154 processors on 5 distributed PC clusters is in progress.
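For readers wanting to sanity-check figures like the 94% quoted above, parallel efficiency is conventionally the serial time divided by N times the N-processor time. The sketch below evaluates that definition; the wall-clock timings in it are placeholders, not measurements from the MD/QM runs.

# Parallel efficiency E = T(1) / (N * T(N)); speedup S = T(1) / T(N).
# The timings below are illustrative placeholders only.
def parallel_efficiency(t_serial, t_parallel, n_procs):
    speedup = t_serial / t_parallel
    return speedup / n_procs

t1 = 2500.0      # hypothetical wall-clock seconds on 1 PC
t25 = 106.4      # hypothetical wall-clock seconds on 25 PCs
print(parallel_efficiency(t1, t25, 25))   # ~0.94, i.e. 94% efficiency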
NASA Astrophysics Data System (ADS)
Su, Wan-Ching; Chang, Ting-Chang; Liao, Po-Yung; Chen, Yu-Jia; Chen, Bo-Wei; Hsieh, Tien-Yu; Yang, Chung-I.; Huang, Yen-Yu; Chang, Hsi-Ming; Chiang, Shin-Chuan; Chang, Kuan-Chang; Tsai, Tsung-Ming
2017-03-01
This paper investigates the degradation behavior of InGaZnO thin film transistors (TFTs) under negative bias illumination stress (NBIS). TFT devices with two different source and drain layouts were examined: one having a parallel-format electrode and the other a UI-format electrode, where UI means that the source/drain electrodes are shaped as a forked structure. The I-V curve of the parallel electrode exhibited a symmetric degradation under forward and reverse sweeping in the saturation region after 1000 s NBIS. In contrast, the I-V curve of the UI electrode structure under similar conditions was asymmetric. The UI electrode structure also shows a stretch-out phenomenon in its C-V measurement. Finally, this work utilizes the ISE-Technology Computer Aided Design (ISE-TCAD) system simulations, which simulate the electric field and I-V curves, to analyze the mechanisms dominating the parallel and UI device degradation behaviors.
Automatically generated code for relativistic inhomogeneous cosmologies
NASA Astrophysics Data System (ADS)
Bentivegna, Eloisa
2017-02-01
The applications of numerical relativity to cosmology are on the rise, contributing insight into such cosmological problems as structure formation, primordial phase transitions, gravitational-wave generation, and inflation. In this paper, I present the infrastructure for the computation of inhomogeneous dust cosmologies which was used recently to measure the effect of nonlinear inhomogeneity on the cosmic expansion rate. I illustrate the code's architecture, provide evidence for its correctness in a number of familiar cosmological settings, and evaluate its parallel performance for grids of up to several billion points. The code, which is available as free software, is based on the Einstein Toolkit infrastructure, and in particular leverages the automated code generation capabilities provided by its component Kranc.
AH-1S communication switch integration program
NASA Technical Reports Server (NTRS)
Haworth, Loran; Szoboszlay, Zoltan; Shively, Robert; Bick, Frank J.
1989-01-01
The C-6533/ARC communication system as installed on the test AH-1E Cobra helicopter was modified to allow discrete radio selection of all aircraft radios at the cyclic radio/intercommunication system switch. The current Cobra-fleet use of the C-6533 system is cumbersome, particularly during low-altitude operations. Operationally, the current C-6533 system configuration and design requires the pilot to estimate when he can safely remove his hand from an active flight control to select radios during low-altitude flight. The pilot must then physically remove his hand from the flight control, look inside the cockpit to select and verify the radio selection, and then effect the selected radio transmission by activating the radio/ICS switch on the cyclic. This condition is potentially hazardous, especially during low-level flight at night in degraded weather. To improve pilot performance, communications effectiveness, and safety, MANPRINT principles were utilized in the selection of a design modification. The modified C-6533 design was kept as basic as possible for potential Cobra-fleet modification. The communications system was modified and the design was subsequently flight-tested by the U.S. Army Aeroflightdynamics Directorate and NASA at the NASA Ames Research Center, Mountain View, California. The design modification enables the Cobra pilot to maintain hands-on flight controls while selecting radios during nap-of-the-Earth (NOE) flight without looking inside the cockpit, which resulted in reduced pilot workload ratings, better pilot handling-quality ratings, and increased flight safety for the NOE flight environment.
Ben-Tov, Daniela; Abraham, Yael; Stav, Shira; Thompson, Kevin; Loraine, Ann; Elbaum, Rivka; de Souza, Amancio; Pauly, Markus; Kieber, Joseph J.; Harpaz-Saad, Smadar
2015-01-01
Differentiation of the maternally derived seed coat epidermal cells into mucilage secretory cells is a common adaptation in angiosperms. Recent studies identified cellulose as an important component of seed mucilage in various species. Cellulose is deposited as a set of rays that radiate from the seed upon mucilage extrusion, serving to anchor the pectic component of seed mucilage to the seed surface. Using transcriptome data encompassing the course of seed development, we identified COBRA-LIKE2 (COBL2), a member of the glycosylphosphatidylinositol-anchored COBRA-LIKE gene family in Arabidopsis (Arabidopsis thaliana), as coexpressed with other genes involved in cellulose deposition in mucilage secretory cells. Disruption of the COBL2 gene results in substantial reduction in the rays of cellulose present in seed mucilage, along with an increased solubility of the pectic component of the mucilage. Light birefringence demonstrates a substantial decrease in crystalline cellulose deposition into the cellulosic rays of the cobl2 mutants. Moreover, crystalline cellulose deposition into the radial cell walls and the columella appears substantially compromised, as demonstrated by scanning electron microscopy and in situ quantification of light birefringence. Overall, the cobl2 mutants display about 40% reduction in whole-seed crystalline cellulose content compared with the wild type. These data establish that COBL2 plays a role in the deposition of crystalline cellulose into various secondary cell wall structures during seed coat epidermal cell differentiation. PMID:25583925
Goddard Visiting Scientist Program
NASA Technical Reports Server (NTRS)
2000-01-01
Under this Indefinite Delivery Indefinite Quantity (IDIQ) contract, USRA was expected to provide short-term (from 1 day up to 1 year) personnel as required to provide a Visiting Scientists Program to support the Earth Sciences Directorate (Code 900) at the Goddard Space Flight Center. The Contractor was to have a pool, or have access to a pool, of scientific talent, both domestic and international, at all levels (graduate student to senior scientist), that would support the technical requirements of the following laboratories and divisions within Code 900: 1) Global Change Data Center (902); 2) Laboratory for Atmospheres (Code 910); 3) Laboratory for Terrestrial Physics (Code 920); 4) Space Data and Computing Division (Code 930); 5) Laboratory for Hydrospheric Processes (Code 970). The research activities described below for each organization within Code 900 were intended to comprise the general scope of effort covered under the Visiting Scientist Program.
Dewaraja, Yuni K; Ljungberg, Michael; Majumdar, Amitava; Bose, Abhijit; Koral, Kenneth F
2002-02-01
This paper reports the implementation of the SIMIND Monte Carlo code on an IBM SP2 distributed memory parallel computer. Basic aspects of running Monte Carlo particle transport calculations on parallel architectures are described. Our parallelization is based on equally partitioning photons among the processors and uses the Message Passing Interface (MPI) library for interprocessor communication and the Scalable Parallel Random Number Generator (SPRNG) to generate uncorrelated random number streams. These parallelization techniques are also applicable to other distributed memory architectures. A linear increase in computing speed with the number of processors is demonstrated for up to 32 processors. This speed-up is especially significant in Single Photon Emission Computed Tomography (SPECT) simulations involving higher energy photon emitters, where explicit modeling of the phantom and collimator is required. For (131)I, the accuracy of the parallel code is demonstrated by comparing simulated and experimental SPECT images from a heart/thorax phantom. Clinically realistic SPECT simulations using the voxel-man phantom are carried out to assess scatter and attenuation correction.
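A minimal sketch of the partitioning strategy described above is given below, written with mpi4py and NumPy's SeedSequence in place of the MPI library calls and SPRNG streams used by the authors; the photon-tracking kernel, total history count, and detection tally are placeholders, not the SIMIND code.

# Sketch: split N photon histories evenly over MPI ranks, give each rank an
# independent random stream, and reduce the tallies back to rank 0.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_TOTAL = 10_000_000                      # placeholder photon count
counts = [N_TOTAL // size + (1 if r < N_TOTAL % size else 0) for r in range(size)]
rng = np.random.default_rng(np.random.SeedSequence(12345).spawn(size)[rank])

def track_photon(rng):
    # Placeholder for the per-photon transport kernel (interaction sampling,
    # phantom/collimator geometry, detector scoring, ...).
    return rng.random() < 0.05            # dummy "detected" flag

local_detected = sum(track_photon(rng) for _ in range(counts[rank]))
total_detected = comm.reduce(local_detected, op=MPI.SUM, root=0)
if rank == 0:
    print("detected fraction:", total_detected / N_TOTAL)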
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolan, Sam R.; Barack, Leor; Wardell, Barry
2011-10-15
This is the second in a series of papers aimed at developing a practical time-domain method for self-force calculations in Kerr spacetime. The key elements of the method are (i) removal of a singular part of the perturbation field with a suitable analytic 'puncture' based on the Detweiler-Whiting decomposition, (ii) decomposition of the perturbation equations in azimuthal (m-)modes, taking advantage of the axial symmetry of the Kerr background, (iii) numerical evolution of the individual m-modes in 2+1 dimensions with a finite-difference scheme, and (iv) reconstruction of the physical self-force from the mode sum. Here we report an implementation of the method to compute the scalar-field self-force along circular equatorial geodesic orbits around a Kerr black hole. This constitutes a first time-domain computation of the self-force in Kerr geometry. Our time-domain code reproduces the results of a recent frequency-domain calculation by Warburton and Barack, but has the added advantage of being readily adaptable to include the backreaction from the self-force in a self-consistent manner. In a forthcoming paper--the third in the series--we apply our method to the gravitational self-force (in the Lorenz gauge).
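Schematically, the azimuthal decomposition and puncture split in items (i), (ii), and (iv) above can be written as follows (generic notation, not transcribed from the paper):

\Phi(t,r,\theta,\varphi) = \sum_{m=-\infty}^{\infty} e^{i m \varphi}\,\phi^{m}(t,r,\theta),
\qquad
\phi^{m} = \phi^{m}_{\mathrm{P}} + \phi^{m}_{\mathrm{R}},
\qquad
F_{\alpha}^{\mathrm{self}} = \sum_{m} F_{\alpha}^{m}\!\left[\phi^{m}_{\mathrm{R}}\right],

where \phi^{m}_{\mathrm{P}} is the analytic puncture built from the Detweiler-Whiting singular field and \phi^{m}_{\mathrm{R}} is the regular residual field evolved numerically in 2+1 dimensions.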
Brown, Peter; Pullan, Wayne; Yang, Yuedong; Zhou, Yaoqi
2016-02-01
The three-dimensional tertiary structure of a protein at near atomic level resolution provides insight alluding to its function and evolution. As protein structure decides its functionality, similarity in structure usually implies similarity in function. As such, structure alignment techniques are often useful in the classifications of protein function. Given the rapidly growing rate of new, experimentally determined structures being made available from repositories such as the Protein Data Bank, fast and accurate computational structure comparison tools are required. This paper presents SPalignNS, a non-sequential protein structure alignment tool using a novel asymmetrical greedy search technique. The performance of SPalignNS was evaluated against existing sequential and non-sequential structure alignment methods by performing trials with commonly used datasets. These benchmark datasets used to gauge alignment accuracy include (i) 9538 pairwise alignments implied by the HOMSTRAD database of homologous proteins; (ii) a subset of 64 difficult alignments from set (i) that have low structure similarity; (iii) 199 pairwise alignments of proteins with similar structure but different topology; and (iv) a subset of 20 pairwise alignments from the RIPC set. SPalignNS is shown to achieve greater alignment accuracy (lower or comparable root-mean-square distance with increased structure overlap coverage) for all datasets, and the highest agreement with reference alignments from the challenging dataset (iv) above, when compared with both sequentially constrained alignments and other non-sequential alignments. SPalignNS was implemented in C++. The source code, binary executable, and a web server version are freely available at http://sparks-lab.org. Contact: yaoqi.zhou@griffith.edu.au. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
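For readers unfamiliar with the accuracy metric quoted above, the sketch below computes the optimal-superposition RMSD between two equal-length coordinate sets with the standard Kabsch/SVD procedure. This is only the scoring metric, not the SPalignNS alignment search itself, and the test coordinates in the usage lines are randomly generated.

# Optimal-superposition RMSD between two (N, 3) coordinate arrays (Kabsch/SVD).
import numpy as np

def kabsch_rmsd(P, Q):
    P = P - P.mean(axis=0)                # center both structures
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)     # SVD of the covariance matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])            # guard against improper rotation (reflection)
    R = Vt.T @ D @ U.T                    # optimal rotation mapping P onto Q
    diff = P @ R.T - Q
    return np.sqrt((diff ** 2).sum() / len(P))

P = np.random.rand(50, 3)
Q = P @ np.array([[0, 1, 0], [-1, 0, 0], [0, 0, 1]]).T + 0.5   # rotated, translated copy
print(kabsch_rmsd(P, Q))   # near zero, as expected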
Terminal Ballistic Application of Hydrodynamic Computer Code Calculations.
1977-04-01
[Report fragment; most of the scanned page is illegible OCR text] ...this shortcoming of the code, design solutions using a combined calculational and empirical design procedure were tried. ... In this calculation, the explosive was confined on its periphery by a steel casing. The calculated liner shape is shown at 18 microseconds af...
Budera, P; Osmančík, P; Talavera, D; Fojt, R; Kraupnerová, A; Žďárská, J; Vaněk, T; Straka, Z
2017-01-01
Treatment of persistent and long-standing persistent atrial fibrillation is not successfully managed by catheter ablation or pharmacotherapy. Hybrid ablation (i.e. combination of minimally invasive surgical ablation, followed by electrophysiological assessment and subsequent endocardial catheter ablation to complete the entire intended procedure) presents an increasingly used and very promising treatment method. Patients underwent thoracoscopic ablation of pulmonary veins and posterior wall of the left atrium (the box-lesion) with use of the COBRA Fusion catheter; thoracoscopic occlusion of the left atrial appendage using the AtriClip system was also done in later patients. After 23 months, electrophysiological assessment and catheter ablation followed. In this article we summarize a strategy of the surgical part of the hybrid procedure performed in our centre. We describe the surgery itself (including possible periprocedural complications) and we also present our short-term results, especially with respect to subsequent electrophysiological findings. Data of the first 51 patients were analyzed. The first 25 patients underwent unilateral ablation; the mean time of surgery was 102 min. The subsequent 26 patients underwent the bilateral procedure with a mean surgery time of 160 min. Serious complications included 1 stroke, 1 phrenic nerve palsy and 2 surgical re-explorations for bleeding. After 1 month, 65% of patients showed sinus rhythm. The box-lesion was found complete during electrophysiological assessment in 38% of patients, and after catheter ablation, 96% of patients were discharged in sinus rhythm. The surgical part of the hybrid procedure with use of the minimally invasive approach and the COBRA Fusion catheter is a feasible method with a low number of periprocedural complications. For electrophysiologists, it provides a very good basis for successful completion of the hybrid ablation. Key words: atrial fibrillation; hybrid ablation; thoracoscopy; catheter ablation; electrophysiology assessment.
The complete process of large elastic-plastic deflection of a cantilever
NASA Astrophysics Data System (ADS)
Wu, Xiaoqiang; Yu, Tongxi
1986-11-01
An extension of the Elastica theory is developed to study the large deflection of an elastic-perfectly plastic horizontal cantilever beam subjected to a vertical concentrated force at its tip. The entire process is divided into four stages: I. elastic deformation in the whole cantilever; II. loading and development of the plastic region; III. unloading in the plastic region; and IV. reverse loading. Solutions for stages I and II are presented in a closed form. A combination of closed-form solution and numerical integration is presented for stage III. Finally, stage IV is qualitatively studied. Computed results are given and compared with those from small-deflection theory and from the Elastica theory.
NASA Astrophysics Data System (ADS)
Manjanaik, N.; Parameshachari, B. D.; Hanumanthappa, S. N.; Banu, Reshma
2017-08-01
The intra prediction process of the H.264 video coding standard is used to code the first (intra) frame of a video and achieves better coding efficiency than earlier standards in the series. Intra-frame coding reduces spatial pixel redundancy within the current frame, reduces computational complexity, and provides better rate-distortion performance. Intra frames are conventionally coded with the Rate Distortion Optimization (RDO) method, which increases computational complexity, increases bit rate, and reduces picture quality, making it difficult to implement in real-time applications; many researchers have therefore developed fast mode-decision algorithms for intra-frame coding. Previous fast mode-decision intra prediction algorithms for H.264 intra-frame coding, based on a variety of techniques, suffered from increased bit rate and degraded picture quality (PSNR) at different quantization parameters: they reduced computational complexity or saved encoding time, but at the cost of higher bit rate and lower picture quality. To avoid this increase in bit rate and loss of picture quality, this paper develops a better approach, a Gaussian pulse applied to intra-frame coding with the diagonal down-left intra prediction mode, to achieve higher coding efficiency in terms of PSNR and bit rate. In the proposed method, a Gaussian pulse is multiplied with the 4x4 frequency-domain coefficients of each 4x4 sub-macroblock of the current frame before quantization. Multiplying each 4x4 integer-transformed coefficient block by the Gaussian pulse scales the coefficient information in a reversible manner; frequency samples are modified in a known and controllable way without intermixing of coefficients, which prevents the picture from being badly degraded at higher values of the quantization parameter. The proposed work was implemented using MATLAB and the JM 18.6 reference software. The performance parameters PSNR, bit rate, and compression were measured for intra frames of YUV video sequences at QCIF resolution under different values of the quantization parameter, with the Gaussian value applied to the diagonal down-left intra prediction mode. The simulation results of the proposed algorithm are tabulated and compared with the previous algorithm of Tian et al.; the proposed algorithm reduced bit rate by an average of 30.98% while maintaining consistent picture quality for QCIF sequences compared to the Tian et al. method.
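The core idea, elementwise and reversible scaling of each 4x4 block of transform coefficients by a Gaussian-shaped weight before quantization and undoing that scaling after dequantization, can be sketched as follows. The block transform, quantizer, and Gaussian width here are simplified placeholders, not the JM 18.6 integer transform or the authors' tuned parameters.

# Sketch: reversible Gaussian weighting of a 4x4 coefficient block around
# the quantizer.  C is a 4x4 array of frequency-domain coefficients.
import numpy as np

u, v = np.meshgrid(np.arange(4), np.arange(4), indexing="ij")
SIGMA = 2.0                                    # hypothetical Gaussian width
G = np.exp(-(u**2 + v**2) / (2.0 * SIGMA**2))  # per-coefficient weights, all > 0

def encode_block(C, qstep):
    scaled = C * G                             # weight coefficients (invertible)
    return np.round(scaled / qstep)            # simplified uniform quantization

def decode_block(levels, qstep):
    dequant = levels * qstep
    return dequant / G                         # undo the Gaussian weighting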
NASA Technical Reports Server (NTRS)
Piersol, Allan G.
1991-01-01
Analytical expressions have been derived to describe the mean square error in the estimation of the maximum rms value computed from a step-wise (or running) time average of a nonstationary random signal. These analytical expressions have been applied to the problem of selecting the optimum averaging times that will minimize the total mean square errors in estimates of the maximum sound pressure levels measured inside the Titan IV payload fairing (PLF) and the Space Shuttle payload bay (PLB) during lift-off. Based on evaluations of typical Titan IV and Space Shuttle launch data, it has been determined that the optimum averaging times for computing the maximum levels are (1) T_o = 1.14 sec for the maximum overall level and T_oi = 4.88 f_i^(-0.2) sec for the maximum 1/3-octave-band levels inside the Titan IV PLF, and (2) T_o = 1.65 sec for the maximum overall level and T_oi = 7.10 f_i^(-0.2) sec for the maximum 1/3-octave-band levels inside the Space Shuttle PLB, where f_i is the 1/3-octave-band center frequency. However, the results for both vehicles indicate that the total rms error in the maximum level estimates will be within 25 percent of the minimum error for all averaging times within plus or minus 50 percent of the optimum averaging time, so a precise selection of the exact optimum averaging time is not critical. Based on these results, linear averaging times (T) are recommended for computing the maximum sound pressure level during lift-off.
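Because the optimum averaging times above are simple closed-form expressions, they can be tabulated directly. The sketch below evaluates T_oi = 4.88 f_i^(-0.2) s (Titan IV PLF) and 7.10 f_i^(-0.2) s (Shuttle PLB) for a subset of nominal 1/3-octave band center frequencies chosen here for illustration.

# Evaluate the recommended band averaging times from the closed-form results.
centers_hz = [31.5, 63, 125, 250, 500, 1000, 2000]   # illustrative 1/3-octave centers

def t_band(f_hz, coeff):
    return coeff * f_hz ** -0.2                       # T_oi = coeff * f_i^(-0.2) seconds

print("overall levels: Titan IV PLF 1.14 s, Shuttle PLB 1.65 s")
for f in centers_hz:
    print(f, round(t_band(f, 4.88), 2), round(t_band(f, 7.10), 2))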
Bhullar, Indermeet S; Frykberg, Eric R; Siragusa, Daniel; Chesire, David; Paul, Julia; Tepas, Joseph J; Kerwin, Andrew J
2012-05-01
To determine whether angioembolization (AE) in hemodynamically stable adult patients with blunt splenic trauma (BST) at high risk for failure of nonoperative management (NOM) (contrast blush [CB] on computed tomography, high-grade IV-V injuries, or decreasing hemoglobin) results in lower failure rates than reported. The records of patients with BST from July 2000 to December 2010 at a Level I trauma center were retrospectively reviewed using National Trauma Registry of the American College of Surgeons. Failure of NOM (FNOM) occurred if splenic surgery was required after attempted NOM. Logistic regression analysis was used to identify factors associated with FNOM. A total of 1,039 patients with BST were found. Pediatric patients (age <17 years), those who died in the emergency department, and those requiring immediate surgery for hemodynamic instability were excluded. Of the 539 (64% of all BST) hemodynamically stable patients who underwent NOM, 104 (19%) underwent AE and 435 (81%) were observed without AE (NO-AE). FNOM for the various groups were as follows: overall NOM (4%), NO-AE (4%), and AE (4%). There was no significant difference in FNOM for NO-AE versus AE for grades I to III: grade I (1% vs. 0%, p = 1), grade II (2% vs. 0%, p = 0.318), and grade III (5% vs. 0%, p = 0.562); however, a significant decrease in FNOM was noted with the addition of AE for grades IV to V: grade IV (23% vs. 3%, p = 0.04) and grade V (63% vs. 9%, p = 0.03). Statistically significant independent risk factors for FNOM were grade IV to V injuries and CB. Application of strictly defined selection criteria for NOM and AE in patients with BST resulted in one of the lowest overall FNOM rates (4%). Hemodynamically stable BST patients are candidates for NOM with selective AE for high-risk patients with grade IV to V injuries, CB on initial computed tomography, and/or decreasing hemoglobin levels. III, therapeutic study.
Summary Report of Working Group 2: Computation
NASA Astrophysics Data System (ADS)
Stoltz, P. H.; Tsung, R. S.
2009-01-01
The working group on computation addressed three physics areas: (i) plasma-based accelerators (laser-driven and beam-driven), (ii) high gradient structure-based accelerators, and (iii) electron beam sources and transport [1]. Highlights of the talks in these areas included new models of breakdown on the microscopic scale, new three-dimensional multipacting calculations with both finite difference and finite element codes, and detailed comparisons of new electron gun models with standard models such as PARMELA. The group also addressed two areas of advances in computation: (i) new algorithms, including simulation in a Lorentz-boosted frame that can reduce computation time by orders of magnitude, and (ii) new hardware architectures, like graphics processing units and Cell processors that promise dramatic increases in computing power. Highlights of the talks in these areas included results from the first large-scale parallel finite element particle-in-cell (PIC) code, a many-order-of-magnitude speedup, and details of porting the VPIC code to the Roadrunner supercomputer. The working group featured two plenary talks, one by Brian Albright of Los Alamos National Laboratory on the performance of the VPIC code on the Roadrunner supercomputer, and one by David Bruhwiler of Tech-X Corporation on recent advances in computation for advanced accelerators. Highlights of the talk by Albright included the first one trillion particle simulations, a sustained performance of 0.3 petaflops, and an eight times speedup of science calculations, including back-scatter in laser-plasma interaction. Highlights of the talk by Bruhwiler included simulations of 10 GeV laser wakefield accelerator stages including external injection, and new developments in electromagnetic simulations of electron guns using finite difference and finite element approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balkey, K.; Witt, F.J.; Bishop, B.A.
1995-06-01
Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC - VISA and Westinghouse (W) - PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various different codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.
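At its core, a probabilistic fracture mechanics calculation of the kind being benchmarked estimates the conditional probability that the applied stress intensity during a PTS transient exceeds a randomly distributed fracture toughness. The toy Monte Carlo below illustrates only that sampling idea; the distributions and parameters are entirely hypothetical, and this is not VISA, (W)-PFM, or any of the benchmarked codes.

# Toy Monte Carlo for a conditional probability of flaw extension:
# P(K_applied > K_Ic | PTS event), with hypothetical distributions.
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
K_applied = rng.normal(loc=60.0, scale=10.0, size=N)              # MPa*sqrt(m), placeholder
K_Ic = rng.lognormal(mean=np.log(80.0), sigma=0.25, size=N)       # placeholder toughness
p_extension = np.mean(K_applied > K_Ic)
print("conditional probability of flaw extension ~", p_extension)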
Demir, Guray; Cukurova, Zafer; Eren, Gulay; Tekdos, Yasemin; Hergunsel, Oya
2012-07-01
We aimed to investigate the effect of "multiphase sedation" (which we define as "the intended sedation level achieved with one or more agents through the same or different routes with more than one administration") on children undergoing Computed Tomography (CT) or Magnetic Resonance Imaging (MRI), their parents, and the attending anesthesiologist. One hundred children and their parents were randomly allocated to one of two study groups. In phase I, patients in Group I were given midazolam (0.5 mg.kg(-1)) in 5 mL of fruit juice, and those in the control group (Group II) were given only fruit juice. After intravenous (iv) cannulation, in phase II, boluses of propofol were given to achieve adequate sedation for imaging. Anxiety scores of the children and their parents were recorded using the Oucher scale and STAI, respectively, and parental satisfaction was evaluated by visual analogue scale (VAS). The number of attempts for iv cannulation, length of time for preparation, and amount of hypnotics were recorded. The anxiety state of the children was similar between groups before premedication, but afterwards it was lower in Group I. Before the procedure, the STAI score of the parents was similar, and later it was lower in Group I. Parental satisfaction in Group I was higher. The number of attempts for iv cannulation and the required propofol dose were lower in Group I. The "multiphase sedation" procedure allows children to feel less pain and anxiety, and decreases parental anxiety while increasing their satisfaction. It supplies a comfortable and safe sedation, as it provides a short and problem-free preparation process for the attending anesthetist as well. Copyright © 2012 Elsevier Editora Ltda. All rights reserved.
Annual Report of the ECSU Home-Institution Support Program (1993)
1993-09-30
...summer of 1992. Stephanie plans to attend graduate school at the University of Alabama at Birmingham. Deborah Jones has attended the ISSP program for... computer equipment; Component #2: a visiting lecturer series; Component #3: student pay and faculty release time; Component #4: student/sponsor travel program... S.O. CODE: 1133; DISBURSING CODE: N001 79; AGO CODE: N66005; CAGE CODE: OJLKO. PART I: A succinct narrative which should...
Three-Dimensional Modeling of Fracture Clusters in Geothermal Reservoirs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghassemi, Ahmad
The objective of this project is to develop a 3-D numerical model for simulating mode I, II, and III (tensile, shear, and out-of-plane) propagation of multiple fractures and fracture clusters to accurately predict geothermal reservoir stimulation using the virtual multi-dimensional internal bond (VMIB). Effective development of enhanced geothermal systems can significantly benefit from improved modeling of hydraulic fracturing. In geothermal reservoirs, where the temperature can reach or exceed 350°C, thermal and poro-mechanical processes play an important role in fracture initiation and propagation. In this project, hydraulic fracturing of hot subsurface rock mass will be numerically modeled by extending the virtual multiple internal bond theory and implementing it in WARP3D, a three-dimensional finite element code for solid mechanics. The new constitutive model along with the poro-thermoelastic computational algorithms will allow modeling the initiation and propagation of clusters of fractures, and extension of pre-existing fractures. The work will enable the industry to realistically model stimulation of geothermal reservoirs. The project addresses the Geothermal Technologies Office objective of accurately predicting geothermal reservoir stimulation (GTO technology priority item). The project goal will be attained by: (i) development of the VMIB method for application to 3D analysis of fracture clusters; (ii) development of poro- and thermoelastic material sub-routines for use in the 3D finite element code WARP3D; (iii) implementation of VMIB and the new material routines in WARP3D to enable simulation of clusters of fractures while accounting for the effects of pore pressure, thermal stress and inelastic deformation; (iv) simulation of 3D fracture propagation and coalescence and formation of clusters, and comparison with laboratory compression tests; and (v) application of the model to interpretation of injection experiments (planned by our industrial partner) with reference to the impact of variations in injection rate and temperature, rock properties, and in-situ stress.
2009-01-01
[Table-of-contents and abstract fragment] AH-1W/Z Cobra Capabilities and Limitations; AH-1W/Z Cobra's Role in Support of ECO; CH-53E Super Stallion Capabilities and Limitations; CH-53E Super Stallion's Role... Analysis of the roles and capabilities of the AH-1W Super Cobra, CH-53E Super Stallion, MV-22B Osprey, and the UH-1N Huey will identify...
COBRA System Engineering Processes to Achieve SLI Strategic Goals
NASA Technical Reports Server (NTRS)
Ballard, Richard O.
2003-01-01
The COBRA Prototype Main Engine Development Project was an endeavor conducted as a joint venture between Pratt & Whitney and Aerojet to conduct risk reduction in LOX/LH2 main engine technology for the NASA Space Launch Initiative (SLI). During the seventeen months of the project (April 2001 to September 2002), approximately seventy reviews were conducted, beginning with the Engine Systems Requirements Review (SRR) and ending with the Engine Systems Interim Design Review (IDR). This paper discusses some of the system engineering practices used to support the reviews and the overall engine development effort.
Optimization methods and silicon solar cell numerical models
NASA Technical Reports Server (NTRS)
Girardini, K.
1986-01-01
The goal of this project is the development of an optimization algorithm for use with a solar cell model. It is possible to simultaneously vary design variables such as impurity concentrations, front junction depth, back junctions depth, and cell thickness to maximize the predicted cell efficiency. An optimization algorithm has been developed and interfaced with the Solar Cell Analysis Program in 1 Dimension (SCAPID). SCAPID uses finite difference methods to solve the differential equations which, along with several relations from the physics of semiconductors, describe mathematically the operation of a solar cell. A major obstacle is that the numerical methods used in SCAPID require a significant amount of computer time, and during an optimization the model is called iteratively until the design variables converge to the value associated with the maximum efficiency. This problem has been alleviated by designing an optimization code specifically for use with numerically intensive simulations, to reduce the number of times the efficiency has to be calculated to achieve convergence to the optimal solution. Adapting SCAPID so that it could be called iteratively by the optimization code provided another means of reducing the cpu time required to complete an optimization. Instead of calculating the entire I-V curve, as is usually done in SCAPID, only the efficiency is calculated (maximum power voltage and current) and the solution from previous calculations is used to initiate the next solution.
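The coupling described above, an optimizer repeatedly calling the cell model and asking it only for the efficiency at the maximum-power point, can be sketched generically as below. The scapid_efficiency function is a stand-in for the call into the device simulator, and the design variables, bounds, and starting point are invented for illustration; the actual SCAPID coupling is not reproduced here.

# Sketch: wrap a black-box cell-efficiency evaluation in a bounded optimizer.
from scipy.optimize import minimize

def scapid_efficiency(x):
    # Placeholder for the simulator call; x packs the design variables
    # (doping level, front/back junction depths, cell thickness, ...).
    doping, front_xj, back_xj, thickness = x
    return 0.18 - 1e-3 * ((front_xj - 0.4) ** 2 + 1e-4 * (thickness - 250.0) ** 2)

bounds = [(1e15, 1e18), (0.1, 2.0), (0.5, 5.0), (50.0, 400.0)]   # illustrative only
res = minimize(lambda x: -scapid_efficiency(x),                  # maximize efficiency
               x0=[1e16, 0.5, 1.0, 200.0], bounds=bounds, method="L-BFGS-B")
print("optimal design:", res.x, "predicted efficiency:", -res.fun)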
ERIC Educational Resources Information Center
Department of the Treasury, Washington, DC.
The report contains legal studies on important tax code provisions related to philanthropic giving. This is Volume IV in a five volume series examining the relationship between nonprofit institutions and their donors. Seventeen papers comprise the report. Tax code provisions which are discussed include eligibility for tax exemption; distinctions…
Massive Symbolic Mathematical Computations and Their Applications
1988-08-16
[Report documentation fragment] AFOSR Contract F49620-87-C-0113. Title: Massive Symbolic Mathematical Computations and Their Applications. DARPA R&D Status Report (quarterly)...
Nuclear Weapon Environment Model. Volume II. Computer Code User’s Guide.
1979-02-01
[Report documentation and flowchart fragment] Performing organization: TRW Defense and Space Systems Group. Recoverable flowchart steps: check particle size and density (density zero or time too large?), call SIZER, set up the grid, optional diagnostic print, increment the Y loop...
User's manual for the BNW-I optimization code for dry-cooled power plants. Volume I
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braun, D.J.; Daniel, D.J.; De Mier, W.V.
1977-01-01
This User's Manual provides information on the use and operation of three versions of BNW-I, a computer code developed by Battelle, Pacific Northwest Laboratory (PNL) as a part of its activities under the ERDA Dry Cooling Tower Program. These three versions of BNW-I were used as reported elsewhere to obtain comparative incremental costs of electrical power production by two advanced concepts (one using plastic heat exchangers and one using ammonia as an intermediate heat transfer fluid) and a state-of-the-art system. The computer program offers a comprehensive method of evaluating the cost savings potential of dry-cooled heat rejection systems and components for power plants. This method goes beyond simple "figure-of-merit" optimization of the cooling tower and includes such items as the cost of replacement capacity needed on an annual basis and the optimum split between plant scale-up and replacement capacity, as well as the purchase and operating costs of all major heat rejection components. Hence, the BNW-I code is a useful tool for determining potential cost savings of new heat transfer surfaces, new piping or other components as part of an optimized system for a dry-cooled power plant.
Tang, Jie; Li, Wenxiu; Lv, Faqin; Zhang, Huiqin; Zhang, Lihai; Wang, Yuexiang; Li, Junlai; Yang, Li
2009-04-01
To compare the diagnostic value of contrast-enhanced ultrasonography (CEUS) with contrast-enhanced computed tomography (CECT) for the detection of different grades of solid organ injuries in blunt abdominal trauma in animals. Self-made miniature tools were used to simulate blunt hepatic or splenic trauma in 16 and 14 anesthetized dogs, respectively. Baseline ultrasound, CEUS and CECT were used to detect traumatic injuries of the livers and spleens. The degree of injury was determined by CEUS according to the American Association for the Surgery of Trauma (AAST) scale, and the results were compared with the injury scale based on CECT evaluation. CEUS showed 22 hepatic injury sites in 16 animals and 17 splenic injury sites in the other 14 animals. According to the AAST scale, 2 grade I, 4 grade II, 3 grade III, 5 grade IV and 2 grade V hepatic lesions were present in 16 animals; 2 grade I, 4 grade II, 6 grade III and 2 grade IV splenic lesions in 14 animals. On CECT scan, 21 hepatic and 17 splenic injuries were demonstrated. According to the Becker CT scale for hepatic injury, 1 grade I, 2 grade II, 4 grade III, 5 grade IV and 2 grade V hepatic injuries were present. On the basis of the Buntain spleen scale, 2 grade I, 5 grade II, 5 grade III, and 2 grade IV splenic injuries were shown. After Spearman rank correlation analysis, the agreement of CEUS with CECT on the degree of hepatic and splenic injury was 93.3% and 92.9%, respectively. CT is currently considered the reference method for grading blunt abdominal trauma; according to the experimental results, CEUS grading showed high levels of concordance with CECT. CEUS can accurately determine the degree of injury and will play an important role in clinical application.
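The rank-correlation analysis mentioned above is a standard library call; the sketch below shows the form of the computation, with dummy paired grades used purely for illustration, not the study data.

# Spearman rank correlation between paired injury grades (CEUS vs. CECT).
from scipy.stats import spearmanr

ceus_grade = [1, 2, 2, 3, 4, 4, 5, 3, 2, 1]   # dummy AAST grades from CEUS
cect_grade = [1, 2, 3, 3, 4, 4, 5, 3, 2, 1]   # dummy grades from CECT
rho, p_value = spearmanr(ceus_grade, cect_grade)
print("Spearman rho:", round(rho, 3), "p:", round(p_value, 4))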
Friedman tongue position and cone beam computed tomography in patients with obstructive sleep apnea.
Harvey, Rebecca; O'Brien, Louise; Aronovich, Sharon; Shelgikar, Anita; Hoff, Paul; Palmisano, John; Stanley, Jeffrey
2017-10-01
Evaluate the correlation between Friedman Tongue Position (FTP) and airway cephalometrics in patients with obstructive sleep apnea (OSA). Retrospective review of adult patients with OSA undergoing Cone Beam Computed Tomography (CBCT). Collected data included age, sex, body mass index, apnea hypopnea index, FTP, and airway cephalometric parameters. Data analyses were performed using ANOVA, dichotomous t-testing, and linear regression. A total of 203 patients were included in the analysis (M:F 132:71). The mean posterior airway space (PAS) was inversely correlated with higher FTP grades (p = 0.001, r = .119), with means of 12.3 mm, 7.9 mm, 6.6 mm, and 4.3 mm for FTP I-IV, respectively. The minimal cross-sectional area for patients with FTP I-IV was 245.7, 179.8, 137.6, and 74.2 mm(2), respectively (p = 0.002, r = .095). The mean hyoid-mandibular plane distance (H-MP) for FTP I-IV was 20.6 mm, 20.4 mm, 24.7 mm, and 28.9 mm, respectively. There was no statistically significant difference between H-MP values when comparing patients with FTP I or II (p = 0.22), but there were statistically significant differences when these two groups were individually compared to FTP III and IV (p = 0.002). Linear regression analysis confirmed an independent association between FTP and PAS (β = -2.06, p < 0.001), minimal cross-sectional area (β = -45.07, p = 0.02), and H-MP (β = 3.03, p = 0.01), controlling for BMI, age, AHI, and sex. Use of FTP is supported by objective CBCT cephalometric results, in particular the PAS, minimal cross-sectional area, and H-MP. Understanding the correlation between objective measurements of retroglossal collapse should allow otolaryngologists to more confidently select patients who may require surgery to address the retroglossal area, particularly when cephalometric analysis is not possible. 4.
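A regression of the form reported above (a cephalometric outcome on FTP grade, adjusting for BMI, age, AHI, and sex) can be fit with ordinary least squares; the data frame below contains dummy rows and assumed column names for illustration only, not the study's dataset.

# Sketch: PAS regressed on FTP grade while controlling for covariates.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({                       # dummy illustrative records
    "PAS": [12.1, 8.3, 6.9, 4.5, 11.0, 7.2, 6.1, 3.9],
    "FTP": [1, 2, 3, 4, 1, 2, 3, 4],
    "BMI": [27.0, 31.2, 29.5, 33.1, 26.4, 30.0, 32.2, 34.5],
    "age": [41, 52, 47, 58, 39, 50, 55, 61],
    "AHI": [12.0, 25.5, 31.0, 48.2, 10.1, 22.3, 35.7, 52.4],
    "sex": ["M", "F", "M", "M", "F", "M", "F", "M"],
})
model = smf.ols("PAS ~ FTP + BMI + age + AHI + C(sex)", data=df).fit()
print(model.params["FTP"], model.pvalues["FTP"])   # adjusted slope of PAS per FTP grade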
76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-07
...--Cloud Computing Forum & Workshop IV AGENCY: National Institute of Standards and Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop IV to be held on... to help develop open standards in interoperability, portability and security in cloud computing. This...
Cobra venom cytotoxins; apoptotic or necrotic agents?
Ebrahim, Karim; Shirazi, Farshad H; Mirakabadi, Abbas Zare; Vatanpour, Hossein
2015-12-15
Organ homeostasis is controlled by a dynamic balance between cell proliferation and apoptosis. Failure to induce apoptosis has been implicated in tumor development. Cytotoxin-I (CTX-I) and cytotoxin-II (CTX-II) are two physiologically active polypeptides found in Caspian cobra venom. The anticancer activity and mechanism of cell death induced by these toxins have been studied. The toxins were purified by different chromatographic steps, and their cytotoxicity and pattern of cell death were determined by MTT, LDH release, acridine orange/ethidium bromide (AO/EtBr) double staining, flow cytometric analysis, caspase-3 activity and neutral red assays. The IC50 of CTX-II in MCF-7, HepG2, DU-145 and HL-60 cells was 4.1 ± 1.3, 21.2 ± 4.4, 9.4 ± 1.8, and 16.3 ± 1.9 μg/mL, respectively, while the IC50 of this toxin in the normal MDCK cell line was 54.5 ± 3.9 μg/mL. LDH release increased suddenly above a specific toxin concentration in all cell lines. AO/EtBr double staining, flow cytometric analysis and the caspase-3 activity assay confirmed dose- and time-dependent induction of apoptosis by both toxins. CTX-I and CTX-II treated cells lost their lysosomal membrane integrity and could not take up neutral red dye. CTX-I and CTX-II showed significant anticancer activity with minimal effects on normal cells and better IC50 values compared to the current anticancer drug cisplatin. They induce their apoptotic effect via lysosomal pathways and the release of cathepsins into the cytosol. These effects were seen over a limited range of toxin concentrations, and the pattern of cell death rapidly changed to necrosis as the toxin concentration increased. In conclusion, the significant apoptogenic effects of these toxins make them candidates as possible anticancer agents. Copyright © 2015 Elsevier Ltd. All rights reserved.
Haralur, Satheesh B.; Al-Qahtani, Ali S.; Al-Qarni, Marie M.; Al-Homrany, Rami M.; Aboalkhair, Ayyob E.; Madalakote, Sujatha S.
2015-01-01
Aim: To study the awareness, attitude, practice and facilities among the different categories of dental laboratories in Abha city. Materials and Methods: A total of 80 dental technicians were surveyed in the study. The dental laboratories included in the study were teaching institute (Group I), Government Hospital (Group II), Private Dental Clinic (Group III) and Independent laboratory (Group IV). The pre-tested anonymous questionnaire was used to understand knowledge, attitude, facilities, practice and orientation regarding biomedical waste management. Results: The knowledge of biomedical waste categories, colour coding and segregation was better among Group I (55-65%) and Group II (65-75%). The lowest standard of waste disposal was practiced at Group IV (15-20%) and Group III (25-35%). The availability of disposal facilities was poor at Group IV. The continuous education on biomedical waste management lacked in all the Groups. Conclusion: The significant improvement in disposal facilities was required at Group III and Group IV laboratories. All dental technicians were in need of regular training of biomedical waste management. Clinical Significance: The dental laboratories are an integral part of dental practice. The dental laboratories are actively involved in the generation, handling and disposal of biomedical waste. Hence, it is important to assess the biomedical waste management knowledge, attitude, facilities and practice among different categories of dental laboratories. PMID:26962373
A high performance scientific cloud computing environment for materials simulations
NASA Astrophysics Data System (ADS)
Jorissen, K.; Vila, F. D.; Rehr, J. J.
2012-09-01
We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
Massive Data, the Digitization of Science, and Reproducibility of Results
Stodden, Victoria
2018-04-27
As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the "Reproducible Research Standard" (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.
Delpeuch, Amina; Ruivard, Marc; Abergel, Armand; Aumaitre, Olivier; Boisgard, Stéphane; Bagel, Sandrine; Sautou, Valérie
2018-03-08
Background Intravenous (IV) iron preparations bypass the difficulties (malabsorption and side effects) associated with oral iron for the treatment of iron deficiency anaemia (IDA). Ferric carboxymaltose (FCM) can be administered as a single infusion over short periods of time but is more expensive than iron sucrose (IS) when the patients are hospitalized. Objectives To evaluate the appropriateness of FCM prescriptions and to establish the economic impact of this management (including disease coding) compared to the use of IV IS. Setting This study was conducted for inpatients in all departments (orthopaedic department, gastroenterology department and two units of the internal medicine department) where FCM was widely prescribed. Method We retrospectively identified 224 patients, diagnosed with IDA using laboratory parameters and/or disease coding, who received FCM between January and December 2014. Main outcome measure The primary outcome was the rate of appropriateness of FCM prescriptions and the financial impact compared to IV IS. Results 89 Patients were included. The total additional cost for an inappropriate prescription of IV FCM (68% of cases) was of 6053 €. The total incremental cost of unsuitable disease coding was estimated at 31,688 €. Indications for IV FCM were categorized: intestinal bleeding (31%), malabsorption (17%), intolerance (9%) and refractory to oral iron (7%). The majority of patients (62%) received 1000 mg of FCM per week. The average length of hospital stay was of 10 days. Conclusion The prescription of IV iron was appropriate in most cases but did not necessarily require FCM. The use of IV IS, in many cases, could present a cost-saving option for inpatients with IDA. The lack of an IDA coding generated incremental costs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Ford, W.E. III; Petrie, L.M.
AMPX-77 is a modular system of computer programs that pertain to nuclear analyses, with a primary emphasis on tasks associated with the production and use of multigroup cross sections. All basic cross-section data are to be input in the formats used by the Evaluated Nuclear Data Files (ENDF/B), and output can be obtained in a variety of formats, including its own internal and very general formats, along with a variety of other useful formats used by major transport, diffusion theory, and Monte Carlo codes. Processing is provided for both neutron and gamma-ray data. The present release contains codes all written in the FORTRAN-77 dialect of FORTRAN and will process ENDF/B-V and earlier evaluations, though major modules are being upgraded in order to process ENDF/B-VI and will be released when a complete collection of usable routines is available.
NASA Technical Reports Server (NTRS)
Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)
2000-01-01
This report describes work performed on Contract NAS3-27720 AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical/analytical and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.
General-Purpose Serial Interface For Remote Control
NASA Technical Reports Server (NTRS)
Busquets, Anthony M.; Gupton, Lawrence E.
1990-01-01
Computer controls remote television camera. General-purpose controller developed to serve as interface between host computer and pan/tilt/zoom/focus functions on series of automated video cameras. Interface port based on 8251 programmable communications-interface circuit configured for tristated outputs, and connects controller system to any host computer with RS-232 input/output (I/O) port. Accepts byte-coded data from host, compares them with prestored codes in read-only memory (ROM), and closes or opens appropriate switches. Six output ports control opening and closing of as many as 48 switches. Operator controls remote television camera by speaking commands, in system including general-purpose controller.
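The compare-against-stored-codes behavior described above can be mimicked in software with a simple lookup table; the byte codes and switch actions below are invented for illustration and are not the controller's actual ROM contents or command set.

# Sketch: dispatch single-byte commands to switch actions, mimicking the
# ROM compare-and-switch logic of the camera controller.
SWITCH_TABLE = {            # hypothetical byte codes -> (switch bank, bit, state)
    0x10: ("pan",  0, True),
    0x11: ("pan",  0, False),
    0x20: ("tilt", 1, True),
    0x30: ("zoom", 2, True),
}

def handle_byte(code, set_switch):
    action = SWITCH_TABLE.get(code)
    if action is None:
        return False            # unknown code: ignore, as a real controller might
    bank, bit, state = action
    set_switch(bank, bit, state)
    return True

# Example: feed a received byte stream through the dispatcher.
for b in bytes([0x10, 0x20, 0x7F]):
    handle_byte(b, lambda bank, bit, state: print(bank, bit, state))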
26 CFR 1.382-1 - Table of contents.
Code of Federal Regulations, 2010 CFR
2010-04-01
... certain issuances of stock. (1) Introduction. (2) Small issuance exception. (i) In general. (ii) Small...) Adjustments for stock splits and similar transactions. (D) Exception. (iv) Short taxable years. (3) Other...) Computation of value. (c) Short taxable year. (d) Successive ownership changes and absorption of a section 382...
NASA Technical Reports Server (NTRS)
Pratt, D. T.
1984-01-01
An interactive computer code for simulation of a high-intensity turbulent combustor as a single-point inhomogeneous stirred reactor was developed from an existing batch-processing computer code, CDPSR. The interactive CDPSR code was used as a guide for interpretation and direction of DOE-sponsored companion experiments utilizing a xenon tracer with optical laser diagnostic techniques to experimentally determine the appropriate mixing frequency, and for validation of CDPSR as a mixing-chemistry model for a laboratory jet-stirred reactor. The coalescence-dispersion model for finite-rate mixing was incorporated into an existing interactive code, AVCO-MARK I, to enable simulation of a combustor as a modular array of stirred-flow and plug-flow elements, each having a prescribed finite mixing frequency, or axial distribution of mixing frequency, as appropriate. The speed and reliability of the batch kinetics integrator code CREKID were further increased by rewriting it in vectorized form for execution on a vector or parallel processor, and by incorporating numerical techniques that enhance execution speed by permitting specification of a very low accuracy tolerance.
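The coalescence-dispersion approach to finite-rate mixing can be illustrated with a Curl-type step: randomly pair notional fluid particles and replace each pair's scalars with their mean, at a prescribed mixing frequency. The sketch below is a generic illustration under those assumptions, not the CDPSR or AVCO-MARK I implementation:

```python
import random

def coalescence_dispersion_step(phi, mix_freq, dt, rng=random.Random(0)):
    """One Curl-type mixing step: pair random particles and average their scalars.

    phi      : list of scalar compositions carried by notional fluid particles
    mix_freq : mixing frequency (1/s), assumed constant here
    dt       : time step (s)
    """
    n = len(phi)
    # Expected number of pairings this step (a simple, commonly used estimate).
    n_pairs = int(round(0.5 * mix_freq * dt * n))
    for _ in range(n_pairs):
        i, j = rng.randrange(n), rng.randrange(n)
        mean = 0.5 * (phi[i] + phi[j])
        phi[i] = phi[j] = mean          # coalesce, then redisperse at the mean
    return phi

# Example: an initially segregated scalar field homogenizes over time.
field = [0.0] * 50 + [1.0] * 50
for _ in range(100):
    field = coalescence_dispersion_step(field, mix_freq=200.0, dt=1e-3)
print(min(field), max(field))
```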
NASA Technical Reports Server (NTRS)
Pratt, D. T.; Radhakrishnan, K.
1986-01-01
The design of a very fast, automatic black-box code for homogeneous, gas-phase chemical kinetics problems requires an understanding of the physical and numerical sources of computational inefficiency. Some major sources reviewed in this report are stiffness of the governing ordinary differential equations (ODE's) and its detection, choice of appropriate method (i.e., integration algorithm plus step-size control strategy), nonphysical initial conditions, and too frequent evaluation of thermochemical and kinetic properties. Specific techniques are recommended (and some advised against) for improving or overcoming the identified problem areas. It is argued that, because reactive species increase exponentially with time during induction, and all species exhibit asymptotic, exponential decay with time during equilibration, exponential-fitted integration algorithms are inherently more accurate for kinetics modeling than classical, polynomial-interpolant methods for the same computational work. But current codes using the exponential-fitted method lack the sophisticated stepsize-control logic of existing black-box ODE solver codes, such as EPISODE and LSODE. The ultimate chemical kinetics code does not exist yet, but the general characteristics of such a code are becoming apparent.
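The case for exponential fitting can be seen on the linear relaxation model y' = -lambda*(y - y_eq), whose exact solution an exponentially fitted step reproduces while a classical polynomial (explicit Euler) step fails badly once lambda*h is large. A minimal sketch, not taken from the report:

```python
import math

def exp_fitted_step(y, y_eq, lam, h):
    """Exponentially fitted step for y' = -lam*(y - y_eq): exact for this model problem."""
    return y_eq + (y - y_eq) * math.exp(-lam * h)

def explicit_euler_step(y, y_eq, lam, h):
    """Classical first-order polynomial (explicit Euler) step, for comparison."""
    return y - h * lam * (y - y_eq)

# A stiff relaxation: lam*h = 5 is far outside explicit Euler's stability region.
y0, y_eq, lam, h = 1.0, 0.0, 500.0, 0.01
print("exact          :", y_eq + (y0 - y_eq) * math.exp(-lam * h))
print("exp-fitted step:", exp_fitted_step(y0, y_eq, lam, h))
print("explicit Euler :", explicit_euler_step(y0, y_eq, lam, h))   # overshoots to -4
```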
Ye, Yong; Li, Yue; Fang, Fei
2014-05-05
Cobra neurotoxin (NT) has central analgesic effects, but it is difficult for it to pass through the blood-brain barrier (BBB). A novel method of red-light induction was designed to help NT cross the BBB; it is based on photosensitizer activation by red light to generate reactive oxygen species (ROS) that open the BBB. The effects were evaluated in cell models and in animals in vivo, with illumination by a semiconductor laser at 670 nm of the photosensitizer pheophorbide isolated from silkworm excrement. Brain microvascular endothelial cells and astrocytes were co-cultured to build a BBB cell model. The radioactivity of (125)I-NT was measured in cells and tissues to assess NT permeation. Three irradiation routes (cranial, nasal cavity, and intravascular) were tested with combined injection of (125)I-NT 20μg/kg and pheophorbide 100μg/kg into rats, and the organs of the rats were separated and their radioactivity determined. The paw pressure test in rats and the hot plate and writhing tests in mice were applied to appraise the analgesic effects. NT transport across the BBB cell model increased with illumination time and reached a stable level after 60 min, as did ROS in the cells. NT was mainly distributed in the liver and kidney of rats, increased significantly in the brain after illumination, and showed improved analgesic effects. Excitation of pheophorbide by red light produces ROS that open the BBB, helping NT enter the brain and enhancing its central action. This research provides a new method for moving drugs across the BBB to improve their central role. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Keenan, F. P.; Conlon, E. S.; Bowden, D. A.; Feibelman, W. A.; Pradhan, Anil K.
1992-01-01
Theoretical O IV electron density sensitive emission line ratios, determined using electron impact excitation rates calculated with the R-matrix code, are presented for R(sub 1) = I(1407.4 A)/I(1401.2 A), R(sub 2) = I(1404.8 A)/I(1401.2 A), R(sub 3) = I(1399.8 A)/I(1401.2 A), and R(sub 4) = I(1397.2 A)/I(1401.2 A). The observed values of R(sub 1)-R(sub 4), measured from high resolution spectra obtained with the International Ultraviolet Explorer (IUE) satellite, lead to electron densities that are compatible, and which are also in good agreement with those deduced from line ratios in other species. This provides observational support for the accuracy of the atomic data adopted in the present calculations.
A decoding procedure for the Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Lim, R. S.
1978-01-01
A decoding procedure is described for the (n,k) t-error-correcting Reed-Solomon (RS) code, and an implementation of the (31,15) RS code for the I4-TENEX central system. This code can be used for error correction in large archival memory systems. The principal features of the decoder are a Galois field arithmetic unit implemented by microprogramming a microprocessor, and syndrome calculation by using the g(x) encoding shift register. Complete decoding of the (31,15) code is expected to take less than 500 microsecs. The syndrome calculation is performed by hardware using the encoding shift register and a modified Chien search. The error location polynomial is computed by using Lin's table, which is an interpretation of Berlekamp's iterative algorithm. The error location numbers are calculated by using the Chien search. Finally, the error values are computed by using Forney's method.
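Syndrome calculation, the first stage of the decoder described above, amounts to evaluating the received polynomial at consecutive powers of the field's primitive element. The sketch below works over GF(2^5), the field used by the (31,15) code; the primitive polynomial and the root range alpha^1..alpha^(2t) are assumptions for illustration, not details taken from the paper:

```python
# Minimal sketch of Reed-Solomon syndrome calculation over GF(2^5).
# Assumptions: primitive polynomial x^5 + x^2 + 1, roots at alpha^1..alpha^(2t).

PRIM_POLY = 0b100101          # x^5 + x^2 + 1 (assumed primitive polynomial)
FIELD_SIZE = 31               # number of nonzero elements in GF(2^5)

# Build log/antilog tables for GF(2^5).
exp_table = [0] * (2 * FIELD_SIZE)
log_table = [0] * (FIELD_SIZE + 1)
x = 1
for i in range(FIELD_SIZE):
    exp_table[i] = x
    log_table[x] = i
    x <<= 1
    if x & 0b100000:          # reduce modulo the primitive polynomial
        x ^= PRIM_POLY
for i in range(FIELD_SIZE, 2 * FIELD_SIZE):
    exp_table[i] = exp_table[i - FIELD_SIZE]

def gf_mul(a, b):
    """Multiply two elements of GF(2^5) using the log/antilog tables."""
    if a == 0 or b == 0:
        return 0
    return exp_table[log_table[a] + log_table[b]]

def syndromes(received, n_parity):
    """Evaluate the received polynomial (coefficients, highest degree first)
    at alpha^1 .. alpha^(n_parity); all-zero syndromes mean no detected error."""
    synd = []
    for i in range(1, n_parity + 1):
        alpha_i = exp_table[i]
        s = 0
        for coeff in received:          # Horner's rule: s = s*alpha^i + coeff
            s = gf_mul(s, alpha_i) ^ coeff
        synd.append(s)
    return synd

# Example: a (31,15) codeword has 2t = 16 parity symbols.
print(syndromes([0] * 31, 16))   # -> sixteen zeros for the all-zero word
```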
Chiou, Victor Y-Neng
2008-07-01
Immunotherapy for treatment of snake bites has been based on mammalian IgG. Recently, polyvalent ovine Fab has become available. However, papain, used in the Fab fragmentation process, is a human allergen. Avian eggs are a source of antibodies, and a truncated version of IgY, IgY(DeltaFc), is found in ducks. In this study, we induced duck antibodies by using detoxified cobra and krait venoms and then purified IgY(DeltaFc) antibodies from the hyperimmune duck egg yolk. Ducks were used for immunization and their eggs were collected for antibody production. ICR strain female mice were used in the in vivo neutralization test. Monovalent antivenoms to Formosan cobra venom and Formosan multi-banded krait venom were raised and purified individually from hyperimmune duck egg yolk. The LD(50) values of the venoms were determined by subcutaneous injection of different venom doses into the mice. The survival/death ratios were recorded after 24 hours. The antibody purified from egg yolk showed a high-titer response to its immunogen (cobra or krait venom) by ELISA. Overall, the antibodies from duck eggs efficiently protected mice from envenomation. The antivenoms purified from the egg yolk of ducks immunized with cobra venom and krait venom neutralized the lethal effects of these venoms with good efficacy in a mouse model. The antivenoms were effective in neutralizing lethality in mice injected at 4xLD(50) of venoms. These results indicate that antibodies derived from ducks can serve as a new source for the generation of antivenoms.
COBRA accelerator for Sandia ICF diode research at Cornell University
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, D.L.; Ingwersen, P.; Bennett, L.F.
1995-05-01
The new COBRA accelerator is being built in stages at the Laboratory of Plasma Studies at Cornell University, where its applications will include extraction diode and ion beam research in support of the light ion inertial confinement fusion (ICF) program at Sandia National Laboratories. The 4- to 5-MV, 125- to 250-kA accelerator is based on a four-cavity inductive voltage adder (IVA) design. It is a combination of new ferromagnetically-isolated cavities and self magnetically insulated transmission line (MITL) hardware and components from existing Sandia and Cornell facilities: Marx generator capacitors, hardware, and power supply from the DEMON facility; water pulse forming lines (PFL) and gas switch from the Subsystem Test Facility (STF); a HERMES-III intermediate store capacitor (ISC); and a modified ion diode from Cornell's LION. The present accelerator consists of a single modified cavity similar to those of the Sandia SABRE accelerator and will be used to establish an operating system for the first-stage initial lower voltage testing. Four new cavities will be fabricated and delivered in the first half of FY96 to complete the COBRA accelerator. COBRA is unique in the sense that each cavity is driven by a single pulse forming line, and the IVA output polarity may be reversed by rotating the cavities 180° about their vertical axis. The site preparations, tank construction, and diode design and development are taking place at Cornell with growing enthusiasm as this machine becomes a reality. Preliminary results with the single cavity and short positive inner cylinder MITL configuration will soon be available.
COBRA: a Bayesian approach to pulsar searching
NASA Astrophysics Data System (ADS)
Lentati, L.; Champion, D. J.; Kramer, M.; Barr, E.; Torne, P.
2018-02-01
We introduce COBRA, a GPU-accelerated Bayesian analysis package for performing pulsar searching, that uses candidates from traditional search techniques to set the prior used for the periodicity of the source, and performs a blind search in all remaining parameters. COBRA incorporates models for both isolated and accelerated systems, as well as both Keplerian and relativistic binaries, and exploits pulse phase information to combine search epochs coherently, over time, frequency or across multiple telescopes. We demonstrate the efficacy of our approach in a series of simulations that challenge typical search techniques, including highly aliased signals, and relativistic binary systems. In the most extreme case, we simulate an 8 h observation containing 24 orbits of a pulsar in a binary with a 30 M⊙ companion. Even in this scenario we show that we can build up from an initial low-significance candidate, to fully recovering the signal. We also apply the method to survey data of three pulsars from the globular cluster 47Tuc: PSRs J0024-7204D, J0023-7203J and J0024-7204R. This final pulsar is in a 1.6 h binary, the shortest of any pulsar in 47Tuc, and additionally shows significant scintillation. By allowing the amplitude of the source to vary as a function of time, however, we show that we are able to obtain optimal combinations of such noisy data. We also demonstrate the ability of COBRA to perform high-precision pulsar timing directly on the single pulse survey data, and obtain a 95 per cent upper limit on the eccentricity of PSR J0024-7204R of εb < 0.0007.
Ben-Tov, Daniela; Abraham, Yael; Stav, Shira; Thompson, Kevin; Loraine, Ann; Elbaum, Rivka; de Souza, Amancio; Pauly, Markus; Kieber, Joseph J; Harpaz-Saad, Smadar
2015-03-01
Differentiation of the maternally derived seed coat epidermal cells into mucilage secretory cells is a common adaptation in angiosperms. Recent studies identified cellulose as an important component of seed mucilage in various species. Cellulose is deposited as a set of rays that radiate from the seed upon mucilage extrusion, serving to anchor the pectic component of seed mucilage to the seed surface. Using transcriptome data encompassing the course of seed development, we identified COBRA-LIKE2 (COBL2), a member of the glycosylphosphatidylinositol-anchored COBRA-LIKE gene family in Arabidopsis (Arabidopsis thaliana), as coexpressed with other genes involved in cellulose deposition in mucilage secretory cells. Disruption of the COBL2 gene results in substantial reduction in the rays of cellulose present in seed mucilage, along with an increased solubility of the pectic component of the mucilage. Light birefringence demonstrates a substantial decrease in crystalline cellulose deposition into the cellulosic rays of the cobl2 mutants. Moreover, crystalline cellulose deposition into the radial cell walls and the columella appears substantially compromised, as demonstrated by scanning electron microscopy and in situ quantification of light birefringence. Overall, the cobl2 mutants display about 40% reduction in whole-seed crystalline cellulose content compared with the wild type. These data establish that COBL2 plays a role in the deposition of crystalline cellulose into various secondary cell wall structures during seed coat epidermal cell differentiation. © 2015 American Society of Plant Biologists. All Rights Reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crockett, D.P.; Smith, W.K.; Proshansky, E.
1989-10-08
We report on computer-assisted three-dimensional reconstruction of spinal cord activity associated with stimulation of the plantar cushion (PC) as revealed by (14C)-2-deoxy-D-glucose (2-DG) serial autoradiographs. Moderate PC stimulation in cats elicits a reflex phasic plantar flexion of the toes. Four cats were chronically spinalized at about T6 under barbiturate anesthesia. Four to 11 days later, the cats were injected (i.v.) with 2-DG (100 microCi/kg) and the PC was electrically stimulated with needle electrodes at 2-5 times threshold for eliciting a reflex. Following stimulation, the spinal cord was processed for autoradiography. Subsequently, autoradiographs, representing approximately 8-18 mm from spinal segments L6-S1, were digitized for computer analysis and 3-D reconstruction. Several strategies of analysis were employed: (1) Three-dimensional volume images were color-coded to represent different levels of functional activity. (2) On the reconstructed volumes, virtual sections were made in the horizontal, sagittal, and transverse planes to view regions of 2-DG activity. (3) In addition, we were able to sample different regions within the grey and white matter semi-quantitatively (i.e., pixel intensity) from section to section to reveal differences between ipsi- and contralateral activity, as well as possible variation between sections. These analyses revealed 2-DG activity associated with moderate PC stimulation, not only in the ipsilateral dorsal horn as we had previously demonstrated, but also in both the ipsilateral and contralateral ventral horns, as well as in the intermediate grey matter. The use of novel computer analysis techniques--combined with an unanesthetized preparation--enabled us to demonstrate that the increased metabolic activity in the lumbosacral spinal cord associated with PC stimulation was much more extensive than had heretofore been observed.
Beltrán-Valero de Bernabé, D; Granadino, B; Chiarelli, I; Porfirio, B; Mayatepek, E; Aquaron, R; Moore, M M; Festen, J J; Sanmartí, R; Peñalva, M A; de Córdoba, S R
1998-01-01
Alkaptonuria (AKU), a rare hereditary disorder of phenylalanine and tyrosine catabolism, was the first disease to be interpreted as an inborn error of metabolism. AKU patients are deficient for homogentisate 1,2 dioxygenase (HGO); this deficiency causes homogentisic aciduria, ochronosis, and arthritis. We cloned the human HGO gene and characterized two loss-of-function mutations, P230S and V300G, in the HGO gene in AKU patients. Here we report haplotype and mutational analysis of the HGO gene in 29 novel AKU chromosomes. We identified 12 novel mutations: 8 (E42A, W97G, D153G, S189I, I216T, R225H, F227S, and M368V) missense mutations that result in amino acid substitutions at positions conserved in HGO in different species, 1 (F10fs) frameshift mutation, 2 intronic mutations (IVS9-56G-->A, IVS9-17G-->A), and 1 splice-site mutation (IVS5+1G-->T). We also report characterization of five polymorphic sites in HGO and describe the haplotypic associations of alleles at these sites in normal and AKU chromosomes. One of these sites, HGO-3, is a variable dinucleotide repeat; IVS2+35T/A, IVS5+25T/C, and IVS6+46C/A are intronic sites at which single nucleotide substitutions (dimorphisms) have been detected; and c407T/A is a relatively frequent nucleotide substitution in the coding sequence, exon 4, resulting in an amino acid change (H80Q). These data provide insight into the origin and evolution of the various AKU alleles. PMID:9529363
Computational Fluid Dynamics Requirements at the Naval Postgraduate School.
1986-10-01
Only fragments of this scanned abstract are recoverable. It surveys CFD codes in use at the Naval Postgraduate School, including a field-analysis code for a wing-fuselage configuration; PROFILE, the Eppler program for the design and analysis of low-speed airfoils; the Keller box method for boundary layers; viscid-inviscid interaction on airfoils; and flow over a wing-body junction (Vrije Universiteit Brussels, C. Hirsch; report NPS-67-86-007CR). The remainder is unreadable OCR residue.
An Efficient Method for Verifying Gyrokinetic Microstability Codes
NASA Astrophysics Data System (ADS)
Bravenec, R.; Candy, J.; Dorland, W.; Holland, C.
2009-11-01
Benchmarks for gyrokinetic microstability codes can be developed through successful ``apples-to-apples'' comparisons among them. Unlike previous efforts, we perform the comparisons for actual discharges, rendering the verification efforts relevant to existing experiments and future devices (ITER). The process requires i) assembling the experimental analyses at multiple times, radii, discharges, and devices, ii) creating the input files ensuring that the input parameters are faithfully translated code-to-code, iii) running the codes, and iv) comparing the results, all in an organized fashion. The purpose of this work is to automate this process as much as possible: At present, a python routine is used to generate and organize GYRO input files from TRANSP or ONETWO analyses. Another routine translates the GYRO input files into GS2 input files. (Translation software for other codes has not yet been written.) Other python codes submit the multiple GYRO and GS2 jobs, organize the results, and collect them into a table suitable for plotting. (These separate python routines could easily be consolidated.) An example of the process -- a linear comparison between GYRO and GS2 for a DIII-D discharge at multiple radii -- will be presented.
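The input-translation step described above is essentially a keyword mapping between two codes' input files. The sketch below shows only that mechanics; the parameter names are placeholders and are not the actual GYRO or GS2 input variables:

```python
# Illustrative sketch of a code-to-code input translator. The parameter names
# below are placeholders, NOT real GYRO or GS2 input variables; only the
# mechanics of a keyword-mapping translation are shown.

ILLUSTRATIVE_MAP = {
    # hypothetical "GYRO-like" name : hypothetical "GS2-like" name
    "safety_factor": "q_local",
    "magnetic_shear": "shear_local",
    "temp_gradient_scale": "temp_grad",
    "dens_gradient_scale": "dens_grad",
}

def translate(gyro_like_inputs: dict) -> dict:
    """Map one code's input dictionary onto another code's naming convention."""
    gs2_like = {}
    for key, value in gyro_like_inputs.items():
        if key in ILLUSTRATIVE_MAP:
            gs2_like[ILLUSTRATIVE_MAP[key]] = value
        else:
            # Parameters without a counterpart need case-by-case handling,
            # which is where most of the real translation effort goes.
            print(f"warning: no mapping for '{key}', skipping")
    return gs2_like

print(translate({"safety_factor": 2.1, "magnetic_shear": 0.8, "unmapped": 1.0}))
```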
Dynamics of face and annular seals with two-phase flow
NASA Technical Reports Server (NTRS)
Hughes, William F.; Basu, Prithwish; Beatty, Paul A.; Beeler, Richard M.; Lau, Stephen
1988-01-01
A detailed study was made of face and annular seals under conditions where boiling, i.e., phase change of the leaking fluid, occurs within the seal. Many seals operate in this mode because of flashing due to pressure drop and/or heat input from frictional heating. Some of the distinctive behavior characteristics of two phase seals are discussed, particularly their axial stability. The main conclusions are that seals with two phase flow may be unstable if improperly balanced. Detailed theoretical analyses of low (laminar) and high (turbulent) leakage seals are presented along with computer codes, parametric studies, and in particular a simplified PC based code that allows for rapid performance prediction: calculations of stiffness coefficients, temperature and pressure distributions, and leakage rates for parallel and coned face seals. A simplified combined computer code for the performance prediction over the laminar and turbulent ranges of a two phase flow is described and documented. The analyses, results, and computer codes are summarized.
1977-06-01
Only a fragment of this scanned abstract is recoverable: for spheroidal buoys, in the limiting case of zero reduced frequency the free surface behaves as a rigid plane. The remainder is unreadable OCR residue.
Exome sequencing identifies complex I NDUFV2 mutations as a novel cause of Leigh syndrome.
Cameron, Jessie M; MacKay, Nevena; Feigenbaum, Annette; Tarnopolsky, Mark; Blaser, Susan; Robinson, Brian H; Schulze, Andreas
2015-09-01
Two siblings with hypertrophic cardiomyopathy and brain atrophy were diagnosed with Complex I deficiency based on low enzyme activity in muscle and high lactate/pyruvate ratio in fibroblasts. Whole exome sequencing results of fibroblast gDNA from one sibling was narrowed down to 190 SNPs or In/Dels in 185 candidate genes by selecting non-synonymous coding sequence base pair changes that were not present in the SNP database. Two compound heterozygous mutations were identified in both siblings in NDUFV2, encoding the 24 kDa subunit of Complex I. The intronic mutation (c.IVS2 + 1delGTAA) is disease causing and has been reported before. The other mutation is novel (c.669_670insG, p.Ser224Valfs*3) and predicted to cause a pathogenic frameshift in the protein. Subsequent investigation of 10 probands with complex I deficiency from different families revealed homozygosity for the intronic c.IVS2 + 1delGTAA mutation in a second, consanguineous family. In this family three of five siblings were affected. Interestingly, they presented with Leigh syndrome but no cardiac involvement. The same genotype had been reported previously in a two families but presenting with hypertrophic cardiomyopathy, trunk hypotonia and encephalopathy. We have identified NDUFV2 mutations in two families with Complex I deficiency, including a novel mutation. The diagnosis of Leigh syndrome expands the clinical phenotypes associated with the c.IVS2 + 1delGTAA mutation in this gene. Copyright © 2015 European Paediatric Neurology Society. Published by Elsevier Ltd. All rights reserved.
Szarpak, Łukasz; Kurowski, Andrzej; Truszewski, Zenon; Robak, Oliver; Frass, Michael
2015-08-01
Ensuring an open airway during cardiopulmonary resuscitation is fundamental. The aim of this study was to determine the success rate of blind intubation during simulated cardiopulmonary resuscitation by untrained personnel. Four devices were compared in a simulated resuscitation scenario: ILMA (Intavent Direct Ltd, Buckinghamshire, United Kingdom), Cobra PLA (Engineered Medical Systems Inc, Indianapolis, IN), Supraglottic Airway Laryngopharyngeal Tube (SALT) (ECOLAB, St. Paul, MN), and Air-Q (Mercury Medical, Clearwater, FL). A group of 210 paramedics intubated a manikin with continuous chest compressions. The mean times to intubation were 40.46 ± 4.64, 33.96 ± 6.23, 17.2 ± 4.63, and 49.23 ± 13.19 seconds (SALT vs ILMA, Cobra PLA, and Air-Q; P < .05). The success ratios of blind intubation for the devices were 86.7%, 85.7%, 100%, and 71.4% (SALT vs ILMA, Cobra PLA, and Air-Q; P < .05). The study showed that the most efficient device with the shortest blind intubation time was the SALT device. Copyright © 2015 Elsevier Inc. All rights reserved.
A verification of the gyrokinetic microstability codes GEM, GYRO, and GS2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravenec, R. V.; Chen, Y.; Wan, W.
2013-10-15
A previous publication [R. V. Bravenec et al., Phys. Plasmas 18, 122505 (2011)] presented favorable comparisons of linear frequencies and nonlinear fluxes from the Eulerian gyrokinetic codes GYRO [J. Candy and R. E. Waltz, J. Comput. Phys. 186, 545 (2003)] and GS2 [W. Dorland et al., Phys. Rev. Lett. 85, 5579 (2000)]. The motivation was to verify the codes, i.e., demonstrate that they correctly solve the gyrokinetic-Maxwell equations. The premise was that it is highly unlikely for both codes to yield the same incorrect results. In this work, we add the Lagrangian particle-in-cell code GEM [Y. Chen and S. Parker, J. Comput. Phys. 220, 839 (2007)] to the comparisons, not simply to add another code, but also to demonstrate that the codes' algorithms do not matter. We find good agreement of GEM with GYRO and GS2 for the plasma conditions considered earlier, thus establishing confidence that the codes are verified and that ongoing validation efforts for these plasma parameters are warranted.
Ho, Chi-Kung; Chen, Fu-Cheng; Chen, Yung-Lung; Wang, Hui-Ting; Lee, Chien-Ho; Chung, Wen-Jung; Lin, Cheng-Jui; Hsueh, Shu-Kai; Hung, Shin-Chiang; Wu, Kuan-Han; Liu, Chu-Feng; Kung, Chia-Te; Cheng, Cheng-I
2017-01-01
This study evaluated the impact on clinical outcomes using a cloud computing system to reduce percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through protocol while the other 107 patients were transferred through the traditional referral process. There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, and pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in transferred through protocol group compared to in traditional referral process group (both p < 0.05). There were also no remarkable differences in the complication rate and 30-day mortality between two groups. The multivariate analysis revealed that the independent predictors of 30-day mortality were elderly patients, advanced Killip score, and higher level of troponin-I. This study showed that patients transferred through our present protocol could reduce pain to electrocardiography and catheterization laboratory to balloon time in Killip I/II and III/IV patients separately. However, this study showed that using a cloud computing system in our present protocol did not reduce DTB time.
Gstat: a program for geostatistical modelling, prediction and simulation
NASA Astrophysics Data System (ADS)
Pebesma, Edzer J.; Wesseling, Cees G.
1998-01-01
Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ascii and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
1963-10-04
Only fragments of this scanned record are recoverable: the document, TRACOR Document Number 63..., is titled "Tolerances of Transducer Elements and Preamplifiers on Beam Formation and SSI Performance in the AN/SQS-26 Sonar Equipment (U)" and was prepared for the Bureau of Ships (Code 688E). The remainder is unreadable OCR and declassification-stamp residue.
1979-10-01
Only fragments of this scanned abstract are recoverable: it covers prescribed as well as alternative personnel and equipment configurations, and notes that this user's guide is a companion to ARI Technical Report 413 (Volume IV). The remainder is distribution-list OCR residue.
Logistics Automation Master Plan (LAMP). Better Logistics Support through Automation.
1983-06-01
office micro-computers, positioned throughout the command chain, by providing real-time links between LCA and all users. 2. Goals: Assist HQDA staff in...field, i.e., AirLand Battle 2000. Section V: Concept of Execution. Supply (Retail): A. System Description. 1. The Division Logistics Property Book...7. Divisional Direct Support Unit Automated Supply System (DDASS)/Direct Support Level Supply Automation (DLSA). DDASS and DLSA are system development
CPMIP: measurements of real computational performance of Earth system models in CMIP6
NASA Astrophysics Data System (ADS)
Balaji, Venkatramani; Maisonnave, Eric; Zadeh, Niki; Lawrence, Bryan N.; Biercamp, Joachim; Fladrich, Uwe; Aloisio, Giovanni; Benson, Rusty; Caubel, Arnaud; Durachta, Jeffrey; Foujols, Marie-Alice; Lister, Grenville; Mocavero, Silvia; Underwood, Seth; Wright, Garrett
2017-01-01
A climate model represents a multitude of processes on a variety of timescales and space scales: a canonical example of multi-physics multi-scale modeling. The underlying climate system is physically characterized by sensitive dependence on initial conditions, and natural stochastic variability, so very long integrations are needed to extract signals of climate change. Algorithms generally possess weak scaling and can be I/O and/or memory-bound. Such weak-scaling, I/O, and memory-bound multi-physics codes present particular challenges to computational performance. Traditional metrics of computational efficiency such as performance counters and scaling curves do not tell us enough about real sustained performance from climate models on different machines. They also do not provide a satisfactory basis for comparative information across models. We introduce a set of metrics that can be used for the study of computational performance of climate (and Earth system) models. These measures do not require specialized software or specific hardware counters, and should be accessible to anyone. They are independent of platform and underlying parallel programming models. We show how these metrics can be used to measure actually attained performance of Earth system models on different machines, and identify the most fruitful areas of research and development for performance engineering. We present results for these measures for a diverse suite of models from several modeling centers, and propose to use these measures as a basis for a CPMIP, a computational performance model intercomparison project (MIP).
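Two measures in the spirit of such platform-independent metrics are throughput (simulated years per wall-clock day) and cost (core-hours per simulated year); the exact CPMIP definitions should be taken from the paper itself. A minimal sketch:

```python
def simulated_years_per_day(simulated_years: float, wallclock_hours: float) -> float:
    """Throughput: simulated years completed per wall-clock day."""
    return simulated_years / (wallclock_hours / 24.0)

def core_hours_per_simulated_year(cores: int, wallclock_hours: float,
                                  simulated_years: float) -> float:
    """Cost: core-hours consumed per simulated year."""
    return cores * wallclock_hours / simulated_years

# Example (assumed numbers): a 10-year run taking 30 wall-clock hours on 1152 cores.
print(simulated_years_per_day(10.0, 30.0))                # 8.0 simulated years/day
print(core_hours_per_simulated_year(1152, 30.0, 10.0))    # 3456 core-hours/year
```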
Porting plasma physics simulation codes to modern computing architectures using the
NASA Astrophysics Data System (ADS)
Germaschewski, Kai; Abbott, Stephen
2015-11-01
Available computing power has continued to grow exponentially even after single-core performance saturated in the last decade. The increase has since been driven by more parallelism, both using more cores and having more parallelism in each core, e.g. in GPUs and Intel Xeon Phi. Adapting existing plasma physics codes is challenging, in particular as there is no single programming model that covers current and future architectures. We will introduce the open-source
Evolutionary fuzzy modeling human diagnostic decisions.
Peña-Reyes, Carlos Andrés
2004-05-01
Fuzzy CoCo is a methodology, combining fuzzy logic and evolutionary computation, for constructing systems able to accurately predict the outcome of a human decision-making process, while providing an understandable explanation of the underlying reasoning. Fuzzy logic provides a formal framework for constructing systems exhibiting both good numeric performance (accuracy) and linguistic representation (interpretability). However, fuzzy modeling--meaning the construction of fuzzy systems--is an arduous task, demanding the identification of many parameters. To solve it, we use evolutionary computation techniques (specifically cooperative coevolution), which are widely used to search for adequate solutions in complex spaces. We have successfully applied the algorithm to model the decision processes involved in two breast cancer diagnostic problems, the WBCD problem and the Catalonia mammography interpretation problem, obtaining systems both of high performance and high interpretability. For the Catalonia problem, an evolved system was embedded within a Web-based tool, called COBRA, for aiding radiologists in mammography interpretation.
Comparisons for ESTA-Task3: ASTEC, CESAM and CLÉS
NASA Astrophysics Data System (ADS)
Christensen-Dalsgaard, J.
The ESTA activity under the CoRoT project aims at testing the tools for computing stellar models and oscillation frequencies that will be used in the analysis of asteroseismic data from CoRoT and other large-scale upcoming asteroseismic projects. Here I report results of comparisons between calculations using the Aarhus code (ASTEC) and two other codes, for models that include diffusion and settling. It is found that there are likely deficiencies, requiring further study, in the ASTEC computation of models including convective cores.
Analog system for computing sparse codes
Rozell, Christopher John; Johnson, Don Herrick; Baraniuk, Richard Gordon; Olshausen, Bruno A.; Ortman, Robert Lowell
2010-08-24
A parallel dynamical system for computing sparse representations of data, i.e., where the data can be fully represented in terms of a small number of non-zero code elements, and for reconstructing compressively sensed images. The system is based on the principles of thresholding and local competition that solves a family of sparse approximation problems corresponding to various sparsity metrics. The system utilizes Locally Competitive Algorithms (LCAs), in which nodes in a population continually compete with neighboring units using (usually one-way) lateral inhibition to calculate coefficients representing an input in an overcomplete dictionary.
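A locally competitive algorithm of this kind can be sketched in a few lines: each node leakily integrates its feedforward drive while being inhibited by the thresholded outputs of overlapping nodes. The following is an illustrative numpy discretization with a soft threshold, under assumed parameters, not the patented analog circuit itself:

```python
import numpy as np

def lca_sparse_code(x, Phi, lam=0.05, step=0.01, n_steps=500):
    """Soft-threshold LCA: nodes compete via lateral inhibition (Phi^T Phi - I)."""
    n_atoms = Phi.shape[1]
    u = np.zeros(n_atoms)                 # internal node states
    drive = Phi.T @ x                     # feedforward input to each node
    inhibit = Phi.T @ Phi - np.eye(n_atoms)
    for _ in range(n_steps):
        a = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)   # thresholded outputs
        u += step * (drive - u - inhibit @ a)                # leaky integration + inhibition
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

# Example: recover a 3-sparse code from an overcomplete random dictionary.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((64, 128))
Phi /= np.linalg.norm(Phi, axis=0)        # unit-norm dictionary atoms
a_true = np.zeros(128)
a_true[[5, 40, 99]] = [1.0, -0.8, 0.6]
x = Phi @ a_true
a_hat = lca_sparse_code(x, Phi)
print(np.flatnonzero(np.abs(a_hat) > 0.1))   # largely overlaps {5, 40, 99}
```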
NASA Technical Reports Server (NTRS)
Brinson, Thomas E.; Kopasakis, George
2004-01-01
The Controls and Dynamics Technology Branch at NASA Glenn Research Center is interested in combining a solid oxide fuel cell (SOFC) to operate in conjunction with a gas turbine engine. A detailed engine model currently exists in the Matlab/Simulink environment. The idea is to incorporate a SOFC model within the turbine engine simulation and observe the hybrid system's performance. The fuel cell will be heated to its appropriate operating condition by the engine's combustor. Once the fuel cell is operating at its steady-state temperature, the gas burner will back down slowly until the engine is fully operating on the hot gases exhausted from the SOFC. The SOFC code is based on a steady-state model developed by the U.S. Department of Energy (DOE). In its current form, the DOE SOFC model exists in Microsoft Excel and uses Visual Basic to create an I-V (current-voltage) profile. For the project's application, the main issue with this model is that the gas path flow and fuel flow temperatures are used as input parameters instead of outputs. The objective is to create a SOFC model based on the DOE model that takes the fuel cell's flow rates as inputs and outputs the temperatures of the flow streams, thereby creating a temperature profile as a function of fuel flow rate. This will be done by applying the First Law of Thermodynamics for a flow system to the fuel cell. Validation of this model will be done in two procedures. First, for a given flow rate, the exit stream temperature will be calculated and compared to the DOE SOFC temperature as a point comparison. Next, an I-V curve and temperature curve will be generated, where the I-V curve will be compared with the DOE SOFC I-V curve. Matching I-V curves will suggest validation of the temperature curve because voltage is a function of temperature. Once the temperature profile is created and validated, the model will be placed into the turbine engine simulation for system analysis.
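The planned temperature calculation is a steady-flow First Law balance: inlet enthalpy flow plus heat released by the cell reactions, minus electrical power drawn, equals outlet enthalpy flow. A minimal constant-cp sketch with assumed illustrative numbers (not DOE model values):

```python
def exit_temperature(m_dot, cp, T_in, q_reaction, p_electric):
    """Steady-flow energy balance with constant cp:
       m_dot*cp*(T_out - T_in) = q_reaction - p_electric."""
    return T_in + (q_reaction - p_electric) / (m_dot * cp)

# Assumed illustrative inputs (kg/s, J/(kg*K), K, W); not values from the DOE model.
T_out = exit_temperature(m_dot=0.05, cp=1100.0, T_in=1000.0,
                         q_reaction=30_000.0, p_electric=12_000.0)
print(f"{T_out:.1f} K")   # about 1327.3 K for these assumed inputs
```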
GUIDE-0: An Experimental Information System.
ERIC Educational Resources Information Center
Murai, Shinnichi
A description is provided of GUIDE-0, an experimental information system. The system serves as a bibliographic aid for students who are taking introductory computer science courses whose material is at least partially implemented via PLATO-IV lessons. Following a brief introduction to the system in Chapter I, the second Chapter describes the…
78 FR 2912 - Prohibition on Personal Use of Electronic Devices on the Flight Deck
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-15
....C. 332(c)(7)(C)(i). In general, wireless telecommunications is the transfer of information between... personal wireless communications device or laptop computer for personal use while at their duty station on.... Personal Wireless Communications Device IV. Regulatory Notices and Analyses A. Regulatory Evaluation B...
System Integration and Interface Transition Issues.
1977-04-01
Only reference fragments of this scanned report are recoverable, including: Systems Design and Documentation: An Introduction to the HIPO Method, Van Nostrand Reinhold Co. (1976); [34] Peter Freeman, "Toward Improved Review of Software Design," Proc. National Computer Conf. 44, AFIPS Press (1975), pp. 329-334; and [35] Peter G. Neumann, "Software Development & Proofs of Multi-Level..." The remainder is unreadable OCR residue.
An Overview of Ares-I CFD Ascent Aerodynamic Data Development And Analysis Based on USM3D
NASA Technical Reports Server (NTRS)
Abdol-Hamid, Khaled S.; Ghaffari, Farhad; Parlette, Edward B.
2011-01-01
An overview is presented of the computational results obtained from USM3D, the NASA Langley-developed unstructured-grid, Reynolds-averaged Navier-Stokes flow solver, in support of the Ares-I project within NASA's Constellation program. The numerical data are obtained for representative flow conditions pertinent to the ascent phase of the trajectory at both wind tunnel and flight Reynolds number without including any propulsion effects. The USM3D flow solver has been designated to have the primary role within the Ares-I project in developing the computational aerodynamic data for the vehicle, while other flow solvers, namely OVERFLOW and FUN3D, have supporting roles to provide complementary results for fewer cases as part of the verification process to ensure code-to-code solution consistency. Similarly, as part of the solution validation efforts, the predicted numerical results are correlated with the aerodynamic wind tunnel data that have been generated within the project in the past few years. Sample aerodynamic results and the processes established for the computational solution/data development for the evolving Ares-I design cycles are presented.
Toward a harmonized approach to animal welfare law in Canada.
Fraser, David; Koralesky, Katherine E; Urton, Geoff
2018-03-01
Animal protection law in Canada varies across the country. Federal animal protection law exists in the Criminal Code, in regulations for the transport of animals, and in regulations for humane handling and slaughter at abattoirs that are inspected by the Canadian Food Inspection Agency. Provincial animal protection laws often include provisions that i) describe a duty of care toward animals; ii) prohibit causing or permitting animal "distress;" iii) specify exemptions from prosecution; and iv) reference various national and other standards. Inconsistencies lead to duplication of effort, create difficulty in working across jurisdictions, and may erode public trust. A more consistent approach might be achieved by i) referencing a common suite of standards in provincial statutes; ii) citing the federal transport and humane slaughter regulations in provincial regulations; iii) establishing agreements so provincial authorities may enforce federal regulations; iv) wider and more uniform adoption of enforcement tools that require people to take immediate action to protect animal welfare; v) developing new standards; and vi) national consultation to define frequently used terms.
Trellis coding techniques for mobile communications
NASA Technical Reports Server (NTRS)
Divsalar, D.; Simon, M. K.; Jedrey, T.
1988-01-01
A criterion for designing optimum trellis codes to be used over fading channels is given. A technique is shown for reducing certain multiple trellis codes, optimally designed for the fading channel, to conventional (i.e., multiplicity one) trellis codes. The computational cutoff rate R0 is evaluated for MPSK transmitted over fading channels. Examples of trellis codes optimally designed for the Rayleigh fading channel are given and compared with respect to R0. Two types of modulation/demodulation techniques are considered, namely coherent (using pilot tone-aided carrier recovery) and differentially coherent with Doppler frequency correction. Simulation results are given for end-to-end performance of two trellis-coded systems.
BEARCLAW: Boundary Embedded Adaptive Refinement Conservation LAW package
NASA Astrophysics Data System (ADS)
Mitran, Sorin
2011-04-01
The BEARCLAW package is a multidimensional, Eulerian AMR-capable computational code written in Fortran to solve hyperbolic systems for astrophysical applications. It is part of AstroBEAR, a hydrodynamic & magnetohydrodynamic code environment designed for a variety of astrophysical applications which allows simulations in 2, 2.5 (i.e., cylindrical), and 3 dimensions, in either cartesian or curvilinear coordinates.
2012-06-01
assets; or Cobra Gold, a six-week exercise conducted jointly with the Royal Thai Armed Forces (U.S. Army, Pacific, 2012). Because these operations do...point of use. An example of this type of mobilization is Cobra Gold, a six-week exercise conducted jointly with the Royal Thai Armed Forces...in the same theatre, and to discontinue the loss of maintenance man-hours in packing and unpacking the entire support package upon each deployment
Davidson, R W
1985-01-01
The increasing need to communicate and exchange data can be handled by personal microcomputers. The need to transfer information stored in one type of personal computer to another type of personal computer is often encountered when integrating multiple sources of information stored in different and incompatible computers in medical research and practice. A practical example is demonstrated with two relatively inexpensive, commonly used computers, the IBM PC jr. and the Apple IIe. The basic input/output (I/O) interface chips for serial communication in each computer are joined together using a null connector and cable to form a communications link. Using the BASIC (Beginner's All-purpose Symbolic Instruction Code) computer language and the Disk Operating System (DOS), the communications handshaking protocol and file transfer are established between the two computers. The BASIC programming languages used are Applesoft (Apple personal computer) and PC BASIC (IBM personal computer).
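A modern analogue of the serial link described above can be written with the pyserial package rather than BASIC; the port name, baud rate, framing, and chunk size below are assumptions for illustration:

```python
import serial  # pyserial; the port names and settings below are assumptions

def send_file(path: str, port: str = "/dev/ttyUSB0", baud: int = 9600) -> None:
    """Send a file over an RS-232 link in small chunks
    (8 data bits, no parity, 1 stop bit assumed)."""
    with serial.Serial(port, baud, bytesize=serial.EIGHTBITS,
                       parity=serial.PARITY_NONE, stopbits=serial.STOPBITS_ONE,
                       timeout=1) as link, open(path, "rb") as f:
        while chunk := f.read(256):
            link.write(chunk)
            link.flush()           # wait for the chunk to leave the output buffer

def receive_file(path: str, port: str = "/dev/ttyUSB0", baud: int = 9600) -> None:
    """On the other computer: write incoming bytes until the sender goes quiet."""
    with serial.Serial(port, baud, timeout=2) as link, open(path, "wb") as f:
        while data := link.read(256):
            f.write(data)
```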
Exploring Asynchronous Many-Task Runtime Systems toward Extreme Scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knight, Samuel; Baker, Gavin Matthew; Gamell, Marc
2015-10-01
Major exascale computing reports indicate a number of software challenges to meet the dramatic change of system architectures in near future. While several-orders-of-magnitude increase in parallelism is the most commonly cited of those, hurdles also include performance heterogeneity of compute nodes across the system, increased imbalance between computational capacity and I/O capabilities, frequent system interrupts, and complex hardware architectures. Asynchronous task-parallel programming models show a great promise in addressing these issues, but are not yet fully understood nor developed sufficiently for computational science and engineering application codes. We address these knowledge gaps through quantitative and qualitative exploration of leading candidate solutions in the context of engineering applications at Sandia. In this poster, we evaluate MiniAero code ported to three leading candidate programming models (Charm++, Legion and UINTAH) to examine the feasibility of these models that permits insertion of new programming model elements into an existing code base.
Walters, D J; Solomkin, J S; Paladino, J A
1999-11-01
To compare the cost effectiveness of sequential intravenous (i.v.) to oral ciprofloxacin plus metronidazole (CIP/MTZ i.v./PO) with that of i.v. ciprofloxacin plus i.v. metronidazole (CIP/MTZ i.v.) and i.v. imipenem-cilastatin (IMI i.v.) in patients with intra-abdominal infections. Patients enrolled in a double-blind randomised clinical trial were eligible for inclusion into this cost-effectiveness analysis. Decision analysis was used to characterise the economic outcomes between groups and provide a structure upon which to base the sensitivity analyses. 1996 cost values were used throughout. The economic perspective of the analysis was that of a hospital provider. Among 446 economically evaluable patients, 176 could be switched from i.v. to oral administration. The 51 patients randomised to CIP/MTZ i.v./PO who received active oral therapy had a success rate of 98%, mean duration of therapy of 9.1 days and mean cost of $US7678. There were 125 patients randomized to either CIP/MTZ i.v. or IMI i.v. who received oral placebo while continuing on active i.v. antibacterials; their success rate was 94%, mean duration of therapy was 10.1 days and mean cost was $US8774 (p = 0.029 vs CIP/MTZ i.v./PO). Of the 270 patients who were unable to receive oral administration, 97 received IMI i.v. and had a success rate of 75%, mean duration of therapy of 13.8 days and a mean cost of $US12,418, and 173 received CIP/MTZ i.v. and had a success rate of 77%, mean duration of therapy of 13.4 days and mean cost of $US12,219 (p = 0.26 vs IMI i.v.). In patients able to receive oral therapy, sequential i.v. to oral treatment with ciprofloxacin plus metronidazole was cost effective compared with full i.v. courses of ciprofloxacin plus metronidazole or imipenem-cilastatin. In patients unable to receive oral therapy, no difference in mean cost was found between i.v. imipenem-cilastatin or i.v. ciprofloxacin plus i.v. metronidazole.
NSTX-U Control System Upgrades
Erickson, K. G.; Gates, D. A.; Gerhardt, S. P.; ...
2014-06-01
The National Spherical Torus Experiment (NSTX) is undergoing a wealth of upgrades (NSTX-U). These upgrades, especially including an elongated pulse length, require broad changes to the control system that has served NSTX well. A new fiber serial Front Panel Data Port input and output (I/O) stream will supersede the aging copper parallel version. Driver support for the new I/O and cyber security concerns require updating the operating system from Redhat Enterprise Linux (RHEL) v4 to RedHawk (based on RHEL) v6. While the basic control system continues to use the General Atomics Plasma Control System (GA PCS), the effort to forward port the entire software package to run under 64-bit Linux instead of 32-bit Linux included PCS modifications subsequently shared with GA and other PCS users. Software updates focused on three key areas: (1) code modernization through coding standards (C99/C11), (2) code portability and maintainability through use of the GA PCS code generator, and (3) support of 64-bit platforms. Central to the control system upgrade is the use of a complete real time (RT) Linux platform provided by Concurrent Computer Corporation, consisting of a computer (iHawk), an operating system and drivers (RedHawk), and RT tools (NightStar). Strong vendor support coupled with an extensive RT toolset influenced this decision. The new real-time Linux platform, I/O, and software engineering will foster enhanced capability and performance for NSTX-U plasma control.
NASA Technical Reports Server (NTRS)
Prabhu, Ramadas K.
1994-01-01
This paper presents a nonequilibrium flow solver, implementation of the algorithm on unstructured meshes, and application to hypersonic flow past blunt bodies. Air is modeled as a mixture of five chemical species, namely O2, N2, O, NO, and N, having two temperatures, namely translational and vibrational. The solution algorithm is a cell-centered, point-implicit upwind scheme that employs Roe's flux difference splitting technique. Implementation of this algorithm on unstructured meshes is described. The computer code is applied to solve Mach 15 flow with and without a Type IV shock interference on a cylindrical body of 2.5 mm radius representing a cowl lip. Adaptively generated meshes are employed, and the meshes are refined several times until the solution exhibits detailed flow features and surface pressure and heat flux distributions. Effects of a catalytic wall on surface heat flux distribution are studied. For the Mach 15 Type IV shock interference flow, present results showed a peak heat flux of 544 MW/m^2 for a fully catalytic wall and 431 MW/m^2 for a noncatalytic wall. Some of the results are compared with available computational data.
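For reference, Roe's flux-difference-splitting interface flux has the standard form below (a general textbook form, not reproduced from the paper; the five-species, two-temperature model extends the state vector with species and vibrational-energy equations):

```latex
% Standard Roe flux-difference-splitting interface flux.
F_{i+1/2} \;=\; \tfrac{1}{2}\left[\,F(U_L) + F(U_R)\,\right]
          \;-\; \tfrac{1}{2}\,\bigl|\tilde{A}(U_L,U_R)\bigr|\,\bigl(U_R - U_L\bigr),
\qquad
\bigl|\tilde{A}\bigr| \;=\; \tilde{R}\,\bigl|\tilde{\Lambda}\bigr|\,\tilde{R}^{-1},
```

where the tilde quantities are evaluated at the Roe-averaged state: R-tilde holds the right eigenvectors of the flux Jacobian and Lambda-tilde the absolute values of its eigenvalues.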
DOE Office of Scientific and Technical Information (OSTI.GOV)
Favorite, Jeffrey A.
SENSMG is a tool for computing first-order sensitivities of neutron reaction rates, reaction-rate ratios, leakage, k_eff, and α using the PARTISN multigroup discrete-ordinates code. SENSMG computes sensitivities to all of the transport cross sections and data (total, fission, nu, chi, and all scattering moments), two edit cross sections (absorption and capture), and the density for every isotope and energy group. It also computes sensitivities to the mass density for every material and derivatives with respect to all interface locations. The tool can be used for one-dimensional spherical (r) and two-dimensional cylindrical (r-z) geometries. The tool can be used for fixed-source and eigenvalue problems. The tool implements Generalized Perturbation Theory (GPT) as discussed by Williams and Stacey. Section II of this report describes the theory behind adjoint-based sensitivities, gives the equations that SENSMG solves, and defines the sensitivities that are output. Section III describes the user interface, including the input file and command line options. Section IV describes the output. Section V gives some notes about the coding that may be of interest. Section VI discusses verification, which is ongoing. Section VII lists needs and ideas for future work. Appendix A lists all of the input files whose results are presented in Sec. VI.
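The first-order relative sensitivities that tools of this kind report are conventionally defined as below; this is the standard GPT-style convention, stated here as an assumption about SENSMG's output rather than a quotation from its documentation:

```latex
% Conventional first-order relative sensitivity of a response R (e.g., a
% reaction-rate ratio, leakage, k_eff, or alpha) to a data parameter x
% (a cross section, nu, chi, or a density):
S_{R,x} \;=\; \frac{x}{R}\,\frac{\partial R}{\partial x}
\qquad\Longrightarrow\qquad
\frac{\delta R}{R} \;\approx\; S_{R,x}\,\frac{\delta x}{x}.
```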
Clinical application of antenatal genetic diagnosis of osteogenesis imperfecta type IV.
Yuan, Jing; Li, Song; Xu, YeYe; Cong, Lin
2015-04-02
Clinical analysis and genetic testing of a family with osteogenesis imperfecta type IV were conducted, with the aim of discussing antenatal genetic diagnosis of osteogenesis imperfecta type IV. Preliminary genotyping was performed based on clinical characteristics of the family members, and high-throughput sequencing was then applied to rapidly and accurately detect the changes in candidate genes. Genetic testing of the III5 fetus and other family members revealed a missense mutation (c.2746G>A, p.Gly916Arg) in the COL1A2 gene coding region and missense and synonymous mutations in the COL1A1 gene coding region. Application of antenatal genetic diagnosis provides fast and accurate genetic counseling and eugenics suggestions for patients with osteogenesis imperfecta type IV and their families.
Optimum Vessel Performance in Evolving Nonlinear Wave Fields
2012-11-01
TEMPEST, the new, nonlinear, time-domain ship motion code being developed by the Navy. ...domain ship motion code TEMPEST. The radiation and diffraction forces in the level 3.0 version of TEMPEST will be computed by the body-exact strip theory...nonlinear responses of a ship to a seaway are being incorporated into version 3 of TEMPEST, the new, nonlinear, time-domain ship motion code that
The Continual Intercomparison of Radiation Codes: Results from Phase I
NASA Technical Reports Server (NTRS)
Oreopoulos, Lazaros; Mlawer, Eli; Delamere, Jennifer; Shippert, Timothy; Cole, Jason; Iacono, Michael; Jin, Zhonghai; Li, Jiangnan; Manners, James; Raisanen, Petri;
2011-01-01
The computer codes that calculate the energy budget of solar and thermal radiation in Global Climate Models (GCMs), our most advanced tools for predicting climate change, have to be computationally efficient in order to not impose undue computational burden to climate simulations. By using approximations to gain execution speed, these codes sacrifice accuracy compared to more accurate, but also much slower, alternatives. International efforts to evaluate the approximate schemes have taken place in the past, but they have suffered from the drawback that the accurate standards were not validated themselves for performance. The manuscript summarizes the main results of the first phase of an effort called "Continual Intercomparison of Radiation Codes" (CIRC) where the cases chosen to evaluate the approximate models are based on observations and where we have ensured that the accurate models perform well when compared to solar and thermal radiation measurements. The effort is endorsed by international organizations such as the GEWEX Radiation Panel and the International Radiation Commission and has a dedicated website (i.e., http://circ.gsfc.nasa.gov) where interested scientists can freely download data and obtain more information about the effort's modus operandi and objectives. In a paper published in the March 2010 issue of the Bulletin of the American Meteorological Society only a brief overview of CIRC was provided with some sample results. In this paper the analysis of submissions of 11 solar and 13 thermal infrared codes relative to accurate reference calculations obtained by so-called "line-by-line" radiation codes is much more detailed. We demonstrate that, while performance of the approximate codes continues to improve, significant issues still remain to be addressed for satisfactory performance within GCMs. We hope that by identifying and quantifying shortcomings, the paper will help establish performance standards to objectively assess radiation code quality, and will guide the development of future phases of CIRC
Does Prop-2-ynylideneamine, HC≡CCH=NH, Exist in Space? A Theoretical and Computational Investigation
Osman, Osman I.; Elroby, Shaaban A.; Aziz, Saadullah G.; Hilal, Rifaat H.
2014-01-01
MP2, DFT and CCSD methods with 6-311++G** and aug-cc-pvdz basis sets have been used to probe the structural changes and relative energies of E-prop-2-ynylideneamine (I), Z-prop-2-ynylideneamine (II), prop-1,2-diene-1-imine (III) and vinyl cyanide (IV). The energy near-equivalence and provenance of preference of isomers and tautomers were investigated by NBO calculations using HF and B3LYP methods with 6-311++G** and aug-cc-pvdz basis sets. All substrates have Cs symmetry. The optimized geometries were found to be mainly theoretical method dependent. All elected levels of theory have computed I/II total energy of isomerization (ΔE) of 1.707 to 3.707 kJ/mol in favour of II at 298.15 K. MP2 and CCSD methods have indicated clearly the preference of II over III; while the B3LYP functional predicted nearly similar total energies. All tested levels of theory yielded a global II/IV tautomerization total energy (ΔE) of 137.3–148.4 kJ/mol in support of IV at 298.15 K. The negative values of ΔS indicated that IV is favoured at low temperature. At high temperature, a reverse tautomerization becomes spontaneous and II is preferred. The existence of II in space was debated through the interpretation and analysis of the thermodynamic and kinetic studies of this tautomerization reaction and the presence of similar compounds in the Interstellar Medium (ISM). PMID:24950178
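The temperature argument above is the standard Gibbs-energy relation; taking ΔH ≈ ΔE < 0 and ΔS < 0 for the II→IV tautomerization, the crossover temperature follows directly:

```latex
\Delta G \;=\; \Delta H - T\,\Delta S
\qquad\Longrightarrow\qquad
\Delta G < 0 \ \text{(IV favoured)}\ \text{for } T < \frac{\Delta H}{\Delta S},
\qquad
\Delta G > 0 \ \text{(II favoured)}\ \text{for } T > \frac{\Delta H}{\Delta S}.
```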
Coupling MHD and PIC models in 2 dimensions
NASA Astrophysics Data System (ADS)
Daldorff, L.; Toth, G.; Sokolov, I.; Gombosi, T. I.; Lapenta, G.; Brackbill, J. U.; Markidis, S.; Amaya, J.
2013-12-01
Even for extended fluid plasma models, such as Hall MHD, anisotropic ion pressure, and multi-fluid MHD, there are still many plasma phenomena that are not well captured. For this reason, we have coupled the Implicit Particle-In-Cell (iPIC3D) code with the BATSRUS global MHD code. The PIC solver is applied in a part of the computational domain, for example, in the vicinity of reconnection sites, and overwrites the MHD solution. On the other hand, the fluid solver provides the boundary conditions for the PIC code. To demonstrate the use of the coupled codes for magnetospheric applications, we perform a 2D magnetosphere simulation, where BATSRUS solves for Hall MHD in the whole domain except for the tail reconnection region, which is handled by iPIC3D.
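The coupling cycle described above (the fluid code advances everywhere, supplies boundary values to the embedded kinetic region, and is overwritten there by the kinetic solution) is illustrated by the toy 1D sketch below. It reproduces only the coupling pattern on a linear advection problem; it is not the BATSRUS or iPIC3D interface:

```python
import numpy as np

# Toy coupling pattern: a grid "fluid" solver advances 1-D linear advection
# everywhere; a marker-particle solver handles a sub-region, takes its upstream
# boundary value from the fluid, and its deposited solution overwrites the
# fluid inside that region each step. Speed c = 1, periodic domain [0, 1).
nx, c = 400, 1.0
dx = 1.0 / nx
dt = 0.5 * dx / c                              # CFL = 0.5 for the upwind fluid step
x = (np.arange(nx) + 0.5) * dx
rho = np.exp(-((x - 0.25) / 0.05) ** 2)        # initial density pulse
lo, hi = 200, 300                              # embedded particle region

# Marker particles inside the region: position and carried density value.
xp = x[lo:hi].copy()
fp = rho[lo:hi].copy()

for step in range(300):
    # 1. Fluid step (first-order upwind) over the whole periodic domain.
    rho = rho - c * dt / dx * (rho - np.roll(rho, 1))

    # 2. Particle step: advect markers; inject a new marker at the upstream
    #    boundary carrying the fluid value there; drop markers that leave.
    xp = np.append(xp + c * dt, x[lo])
    fp = np.append(fp, rho[lo - 1])
    keep = xp < x[hi - 1] + 0.5 * dx
    xp, fp = xp[keep], fp[keep]

    # 3. Deposit markers onto the grid (nearest cell) and overwrite the fluid.
    cells = np.clip(((xp - 0.5 * dx) / dx).round().astype(int), lo, hi - 1)
    dep_sum = np.bincount(cells - lo, weights=fp, minlength=hi - lo)
    dep_cnt = np.bincount(cells - lo, minlength=hi - lo)
    mask = dep_cnt > 0
    rho[lo:hi][mask] = dep_sum[mask] / dep_cnt[mask]

print(f"pulse peak now near x = {x[np.argmax(rho)]:.3f}")   # about 0.625 after 300 steps
```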
Autobiography as a tool for self-construction: a study of patients with mental disorders.
Smorti, Andrea; Risaliti, Francesco; Pananti, Bianca; Cipriani, Valentina
2008-07-01
The aim of the present study was to explore how the autobiographical process can lead to a transformation in the quality of psychiatric patients' self-narrative. Fifteen participants, with ages ranging from 25 to 40 years and affected by axis I psychiatric disorders (DSM IV), were selected to participate. A 10-question interview referring to 10 autobiographical cruxes was used to collect autobiographical data; the interview was readministered 2 weeks later. A coding system (the N.O.I.S.) was used to analyze each participant's 2 autobiographical productions. Results from the second interviews showed significant and positive transformations in the quality of patients' autobiographical representation.
Variation in clinical coding lists in UK general practice: a barrier to consistent data entry?
Tai, Tracy Waize; Anandarajah, Sobanna; Dhoul, Neil; de Lusignan, Simon
2007-01-01
Routinely collected general practice computer data are used for quality improvement; poor data quality, including inconsistent coding, can reduce their usefulness. To document the diversity of data entry systems currently in use in UK general practice and highlight possible implications for data quality. General practice volunteers provided screen shots of the clinical coding screen they would use to code a diagnosis or problem title in the clinical consultation. The six clinical conditions examined were: depression, cystitis, type 2 diabetes mellitus, sore throat, tired all the time, and myocardial infarction. We looked at the picking lists generated for these problem titles in the EMIS, IPS, GPASS and iSOFT general practice clinical computer systems, using the Triset browser as a gold standard for comparison. A mean of 19.3 codes is offered in the picking list after entering a diagnosis or problem title. EMIS produced the longest picking lists and GPASS the shortest, with mean numbers of choices of 35.2 and 12.7, respectively. Approximately three-quarters (73.5%) of codes are diagnoses, one-eighth (12.5%) are symptom codes, and the remainder come from a range of Read chapters. There was no readily detectable consistent order in which codes were displayed. Velocity coding, whereby commonly used codes are placed higher in the picking list, results in variation between practices even where they have the same brand of computer system. Current systems for clinical coding promote diversity rather than consistency of clinical coding. As the UK moves towards an integrated health IT system, consistency of coding will become more important. A standardised, limited list of codes for primary care might help address this need.
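Velocity coding, as described above, reorders a fixed code list by each practice's own usage counts, which is why picking lists diverge even on identical systems; the sketch below makes this concrete. The codes and usage counts in it are invented for illustration and are not drawn from the study data.

    # Minimal illustration of "velocity coding": the same underlying code list is
    # re-ordered by each practice's local usage counts, so picking lists diverge
    # even on identical systems. Codes and counts are invented for illustration.
    base_list = ["E112.", "Eu32.", "E2B..", "E200.", "1465."]   # hypothetical Read-style codes

    usage_practice_a = {"E2B..": 120, "E112.": 45, "1465.": 3}
    usage_practice_b = {"E200.": 80, "E112.": 70}

    def picking_list(codes, usage):
        # sort by descending local usage; ties keep the original order
        return sorted(codes, key=lambda c: (-usage.get(c, 0), codes.index(c)))

    print(picking_list(base_list, usage_practice_a))
    print(picking_list(base_list, usage_practice_b))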
1985-05-01
[Report abstract garbled in extraction. The recoverable fragments identify a May 1985 MIT Artificial Intelligence Laboratory report on routines and process representation, noting that regularities in the world give rise to regularities in the way tasks are carried out, and that small changes to source code can wreak substantial havoc.]
NASA Astrophysics Data System (ADS)
Fitch, W. Tecumseh
2014-09-01
Progress in understanding cognition requires a quantitative, theoretical framework, grounded in the other natural sciences and able to bridge between implementational, algorithmic and computational levels of explanation. I review recent results in neuroscience and cognitive biology that, when combined, provide key components of such an improved conceptual framework for contemporary cognitive science. Starting at the neuronal level, I first discuss the contemporary realization that single neurons are powerful tree-shaped computers, which implies a reorientation of computational models of learning and plasticity to a lower, cellular, level. I then turn to predictive systems theory (predictive coding and prediction-based learning) which provides a powerful formal framework for understanding brain function at a more global level. Although most formal models concerning predictive coding are framed in associationist terms, I argue that modern data necessitate a reinterpretation of such models in cognitive terms: as model-based predictive systems. Finally, I review the role of the theory of computation and formal language theory in the recent explosion of comparative biological research attempting to isolate and explore how different species differ in their cognitive capacities. Experiments to date strongly suggest that there is an important difference between humans and most other species, best characterized cognitively as a propensity by our species to infer tree structures from sequential data. Computationally, this capacity entails generative capacities above the regular (finite-state) level; implementationally, it requires some neural equivalent of a push-down stack. I dub this unusual human propensity "dendrophilia", and make a number of concrete suggestions about how such a system may be implemented in the human brain, about how and why it evolved, and what this implies for models of language acquisition. I conclude that, although much remains to be done, a neurally-grounded framework for theoretical cognitive science is within reach that can move beyond polarized debates and provide a more adequate theoretical future for cognitive biology.
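The supra-regular capacity referred to above is conventionally illustrated by languages such as a^n b^n, which no finite-state machine can recognize but which a push-down stack handles trivially. The sketch below shows that contrast concretely; it is a standard textbook illustration, not a model of the neural implementation proposed in the paper.

    # A stack suffices to recognize the supra-regular language a^n b^n, which no
    # finite-state machine can do; this is the kind of capacity the abstract
    # associates with a "push-down stack" (illustrative sketch, not a brain model).
    def accepts_anbn(s: str) -> bool:
        stack = []
        seen_b = False
        for ch in s:
            if ch == "a":
                if seen_b:              # an 'a' after a 'b' breaks the pattern
                    return False
                stack.append(ch)
            elif ch == "b":
                seen_b = True
                if not stack:           # more b's than a's
                    return False
                stack.pop()
            else:
                return False
        return not stack and seen_b     # all a's matched; at least one pair

    assert accepts_anbn("aaabbb")
    assert not accepts_anbn("aabbb")
    assert not accepts_anbn("abab")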
Antiproliferative activity of king cobra (Ophiophagus hannah) venom L-amino acid oxidase.
Li Lee, Mui; Chung, Ivy; Yee Fung, Shin; Kanthimathi, M S; Hong Tan, Nget
2014-04-01
King cobra (Ophiophagus hannah) venom L-amino acid oxidase (LAAO), a heat-stable enzyme, is an extremely potent antiproliferative agent against cancer cells when compared with LAAO isolated from other snake venoms. King cobra venom LAAO was shown to exhibit very strong antiproliferative activities against MCF-7 (human breast adenocarcinoma) and A549 (human lung adenocarcinoma) cells, with IC50 values of 0.04±0.00 and 0.05±0.00 μg/mL, respectively, after 72-hr treatment. In comparison, its cytotoxicity was about 3-4 times lower when tested against human non-tumourigenic breast (184B5) and lung (NL 20) cells, suggesting selective antitumour activity. Furthermore, its potency in MCF-7 and A549 cell lines was greater than the effects of doxorubicin, a clinically established cancer chemotherapeutic agent, which showed IC50 values of 0.18±0.03 and 0.63±0.21 μg/mL, respectively, against the two cell lines. The selective cytotoxic action of the LAAO was confirmed by phycoerythrin (PE) annexin V/7-amino-actinomycin (AAD) apoptotic assay, in which a significantly greater increase in apoptotic cells was observed in LAAO-treated tumour cells compared with their non-tumourigenic counterparts. The ability of LAAO to induce apoptosis in tumour cells was further demonstrated using caspase-3/7 and DNA fragmentation assays. We also determined that this enzyme may target oxidative stress in its killing of tumour cells, as its cytotoxicity was significantly reduced in the presence of catalase (a H2O2 scavenger). In view of its heat stability and selective and potent cytotoxic action on cancer cells, king cobra venom LAAO can be potentially developed for treating solid tumours. © 2013 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).
Pickering, R. J.; Wolfson, M. R.; Good, R. A.; Gewurz, H.
1969-01-01
The studies presented here indicate that activation of the complement (C′) system by a foreign protein will cause membrane injury and passive lysis of unsensitized erythrocytes present at the time of the reaction. These observations suggest that in addition to the classical antibody-C′-induced cytolysis, there are alternative pathways or mechanisms for activation and participation of the terminal C′ components in the production of cell membrane injury. We have shown that a substance derived from cobra venom and eluted from a single protein band on polyacrylamide can promote lysis of unsensitized autologous or heterologous erythrocytes in the presence of fresh guinea pig serum and that this lysis-inducing activity and C′-inhibiting activity appear to reside in the same fractions. The lytic activity is prevented by several agents known to impair classical C′3 activity, but is unaffected by certain procedures which interfere with the function of C′ components C′1 and C′2, a suggestion that this reaction involves chiefly C′3-C′9. Further, the cobra venom (CV) factor depletes C′ activity in cobra serum, and the CV factor (with its 5S serum cofactor) converts purified C′3 to its inactive form, indicating that the reaction of this complex with the complement system occurs without participation of antibody. Therefore, since the lysis-inducing and C′-inhibiting activity of the CV factor appear to result from similar interactions with the complement system, these observations suggest that cell membrane damage and cell lysis can be accomplished through activation of the complement system by a mechanism involving little or no participation of classical antibody or C′ components C′1, 4, or 2. PMID:4978744
Deferred Compilation: The Automation of Run-Time Code Generation
1993-12-01
[Report abstract garbled in extraction. The recoverable fragments note that the cost of run-time code generation can be amortized over many later computations, citing as an example a standard ML implementation of a network communications system; the remaining text is reference residue for works by Biagioni, Harper, and Lee (Science of Computer Programming 16(2):151-195, September 1991; Standard ML signatures for a protocol stack, technical report).]
Iterative Demodulation and Decoding of Non-Square QAM
NASA Technical Reports Server (NTRS)
Li, Lifang; Divsalar, Dariush; Dolinar, Samuel
2004-01-01
It has been shown that a non-square (NS) 2^(2n+1)-ary (where n is a positive integer) quadrature amplitude modulation [(NS) 2^(2n+1)-QAM] has inherent memory that can be exploited to obtain coding gains. Moreover, it should not be necessary to build new hardware to realize these gains. The present scheme is a product of theoretical calculations directed toward reducing the computational complexity of decoding coded 2^(2n+1)-QAM. In the general case of 2^(2n+1)-QAM, the signal constellation is not square and it is impossible to have independent in-phase (I) and quadrature-phase (Q) mapping and demapping. However, independent I and Q mapping and demapping are desirable for reducing the complexity of computing the log-likelihood ratio (LLR) between a bit and a received symbol (such computations are essential operations in iterative decoding). This is because, in modulation schemes that include independent I and Q mapping and demapping, each bit of a signal point is involved in only one-dimensional mapping and demapping. As a result, the computation of the LLR is equivalent to that of a one-dimensional pulse amplitude modulation (PAM) system. Therefore, it is desirable to find a signal constellation that enables independent I and Q mapping and demapping for 2^(2n+1)-QAM.
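With independent I and Q mapping, each received component can be demapped on its own, so the per-bit LLR is exactly the one-dimensional PAM computation mentioned above. The sketch below works this out for a Gray-labelled 4-PAM component under an additive white Gaussian noise assumption; the constellation, bit labels, and noise variance are illustrative choices, not the paper's specific constellation.

    # LLR of each bit for a 1-D PAM component under AWGN, illustrating why
    # independent I/Q mapping reduces QAM demapping to per-dimension PAM
    # computations. Constellation, Gray labels and noise level are illustrative.
    import math

    levels = {-3: (0, 0), -1: (0, 1), +1: (1, 1), +3: (1, 0)}  # 4-PAM, Gray-labelled (b1, b0)
    sigma2 = 0.5                                               # noise variance (assumed)

    def llr(r, bit_index):
        """log[ P(bit=0 | r) / P(bit=1 | r) ] assuming equiprobable symbols."""
        num = sum(math.exp(-(r - a) ** 2 / (2 * sigma2))
                  for a, bits in levels.items() if bits[bit_index] == 0)
        den = sum(math.exp(-(r - a) ** 2 / (2 * sigma2))
                  for a, bits in levels.items() if bits[bit_index] == 1)
        return math.log(num / den)

    r = 0.7   # received I (or Q) component
    print(f"LLR(b1) = {llr(r, 0):+.2f}, LLR(b0) = {llr(r, 1):+.2f}")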
GPU Computing in Bayesian Inference of Realized Stochastic Volatility Model
NASA Astrophysics Data System (ADS)
Takaishi, Tetsuya
2015-01-01
The realized stochastic volatility (RSV) model, which uses the realized volatility as additional information, has been proposed to infer the volatility of financial time series. We consider Bayesian inference of the RSV model by the Hybrid Monte Carlo (HMC) algorithm. The HMC algorithm can be parallelized and thus performed on the GPU for speedup. The GPU code is developed with CUDA Fortran. We compare the computational time of the HMC algorithm on a GPU (GTX 760) and a CPU (Intel i7-4770, 3.4 GHz) and find that the GPU can be up to 17 times faster than the CPU. We also code the program with OpenACC and find that appropriate coding can achieve a speedup similar to that of CUDA Fortran.
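For readers unfamiliar with HMC, a single update consists of a leapfrog trajectory in an auxiliary momentum variable followed by a Metropolis accept/reject step; the per-parameter arithmetic inside the trajectory is what parallelizes naturally on a GPU. Below is a minimal single-chain sketch for a toy Gaussian target in Python with NumPy; it is not the paper's RSV posterior or its CUDA Fortran/OpenACC implementation.

    # Minimal Hamiltonian (Hybrid) Monte Carlo step for a toy standard-normal
    # target; a real application would replace U/grad_U with its own
    # log-posterior, and the per-parameter updates map naturally onto a GPU.
    import numpy as np

    rng = np.random.default_rng(0)

    def U(q):                 # negative log target density (toy standard normal)
        return 0.5 * np.sum(q ** 2)

    def grad_U(q):
        return q

    def hmc_step(q, eps=0.1, n_leap=20):
        p = rng.standard_normal(q.shape)            # resample momenta
        q_new, p_new = q.copy(), p.copy()
        p_new -= 0.5 * eps * grad_U(q_new)          # leapfrog integration
        for _ in range(n_leap - 1):
            q_new += eps * p_new
            p_new -= eps * grad_U(q_new)
        q_new += eps * p_new
        p_new -= 0.5 * eps * grad_U(q_new)
        dH = (U(q_new) - U(q)) + 0.5 * (np.sum(p_new ** 2) - np.sum(p ** 2))
        return q_new if rng.random() < np.exp(-dH) else q   # Metropolis accept/reject

    q = np.zeros(100)          # e.g. a vector of latent volatilities
    for _ in range(1000):
        q = hmc_step(q)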
ERIC Educational Resources Information Center
Falloon, G.
2016-01-01
Recent government moves in many countries have seen coding included in school curricula, or promoted as part of computing, mathematics or science programmes. While these moves have generally been associated with a need to engage more young people in technology study, research has hinted at possible benefits from learning to program including…
VizieR Online Data Catalog: FAMA code for stellar parameters and abundances (Magrini+, 2013)
NASA Astrophysics Data System (ADS)
Magrini, L.; Randich, S.; Friel, E.; Spina, L.; Jacobson, H.; Cantat-Gaudin, T.; Donati, P.; Baglioni, R.; Maiorca, E.; Bragaglia, A.; Sordo, R.; Vallenari, A.
2013-07-01
FAMA v.1, July 2013, distributed with MOOG v2013 and Kurucz models.
Perl codes: read_out2.pl, read_final.pl, driver.pl, sclipping_26.0.pl, sclipping_final.pl, sclipping_26.1.pl, confronta.pl, fama.pl.
MODEL_ATMO: model atmospheres and interpolator (Kurucz models).
MOOG_files: files to compile MOOG (the most recent version of MOOG can be obtained from http://www.as.utexas.edu/~chris/moog.html).
FAMAmoogfiles: files to update when compiling MOOG.
OUTPUT: directory in which the results will be stored; contains an sm macro to produce final plots.
automoog.par: parameter file for FAMA with the following entries: 1) OUTPUTdir; 2) MOOGdir; 3) modelsdir; 4) 1.0 (default), the fraction of the dispersion of Fe I abundances used to compute the errors on the stellar parameters (1.0 means 100%; e.g., the error on Teff is obtained by allowing the code to find the Teff corresponding to a slope of σ(FeI)/range(EP)); 5) 1.2 (default), σ clipping for Fe I lines; 6) 1.0 (default), σ clipping for Fe II lines; 7) 1.0 (default), σ clipping for the other elements; 8) 1.0 (default), value of the QP parameter (higher values mean weaker convergence criteria).
star.iron: EWs in the correct format to test the code.
sun.par: initial parameters for the test (1 data file).
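The Teff criterion mentioned in item 4 above rests on excitation equilibrium: the slope of Fe I abundance versus excitation potential should be consistent with zero within a tolerance of σ(FeI)/range(EP). The sketch below illustrates just that check with invented line data; the actual implementation lives in the distributed Perl scripts and MOOG, not in Python.

    # Sketch of the excitation-equilibrium check behind the Teff convergence
    # criterion: the slope of Fe I abundance vs. excitation potential (EP) is
    # compared with sigma(FeI)/range(EP). Line data are invented placeholders.
    import numpy as np

    ep = np.array([0.9, 1.6, 2.2, 2.8, 3.4, 4.1, 4.6])              # eV, hypothetical Fe I lines
    abund = np.array([7.48, 7.50, 7.46, 7.52, 7.47, 7.51, 7.49])    # A(Fe I), hypothetical

    slope = np.polyfit(ep, abund, 1)[0]
    tolerance = np.std(abund) / (ep.max() - ep.min())   # sigma(FeI)/range(EP), with the 100% default

    if abs(slope) <= tolerance:
        print(f"excitation equilibrium satisfied (|slope| = {abs(slope):.4f} <= {tolerance:.4f})")
    else:
        print("adjust Teff and iterate")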
DOE Office of Scientific and Technical Information (OSTI.GOV)
D'Azevedo, Eduardo; Abbott, Stephen; Koskela, Tuomas
The XGC fusion gyrokinetic code combines state-of-the-art, portable computational and algorithmic technologies to enable complicated multiscale simulations of turbulence and transport dynamics in the ITER edge plasma on the largest US open-science computer, the CRAY XK7 Titan, at its maximal heterogeneous capability. Such simulations have not been possible before because of a more than tenfold shortfall in time-to-solution for completing one physics case within 5 days of wall-clock time. Frontier techniques such as nested OpenMP parallelism, adaptive parallel I/O, staging I/O and data reduction using dynamic and asynchronous application interactions, dynamic repartitioning for balancing the computational work in pushing particles and in grid-related work, scalable and accurate discretization algorithms for non-linear Coulomb collisions, and communication-avoiding subcycling technology for pushing particles on both CPUs and GPUs are also utilized to dramatically improve the scalability and time-to-solution, hence enabling the difficult kinetic ITER edge simulation on a present-day leadership-class computer.
1974-09-01
[Report abstract truncated and garbled in extraction. The recoverable fragments state that the introduction of modifications involving flashcards and audio was also unsuccessful and that further progress is felt to require additional work; the cited references include course materials (Books I and II, Navy Personnel Research and Development Center, San Diego, September 1973) and Main, R. E., on the effectiveness of flashcards.]
ERIC Educational Resources Information Center
Kline, Lanaii
A computer program that produces three reports based on asset inventory data--i.e. facilities and equipment data--is described. Written in FORTRAN IV (Level G), the program was used on the IBM 360 Model 91 at the University of California at Los Angeles (UCLA). The first report is a listing of data sorted by local, user-assigned identification…
Simulink Model of the Ares I Upper Stage Main Propulsion System
NASA Technical Reports Server (NTRS)
Burchett, Bradley T.
2008-01-01
A numerical model of the Ares I upper stage main propulsion system is formulated based on first principles. The equations are written as non-linear ordinary differential equations. The GASP Fortran code is used to compute thermophysical properties of the working fluids, and complicated algebraic constraints are solved numerically. The model is implemented in Simulink and provides a rudimentary simulation of the time history of important pressures and temperatures during re-pressurization, boost and upper stage firing. The model is validated against an existing reliable code, and typical results are shown.
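To give the flavor of the kind of ordinary differential equation such a model integrates, the sketch below advances a single toy state, isothermal ullage pressure during re-pressurization, with dP/dt = (R T / V) mdot_in. It is not the Ares I model: the ideal-gas assumption stands in for the GASP property routines, and every number is an illustrative assumption.

    # Toy ullage re-pressurization ODE in the spirit of the model described above.
    # Ideal gas replaces the GASP property routines; all numbers are illustrative.
    from scipy.integrate import solve_ivp

    R_he = 2077.0      # J/(kg K), helium gas constant
    T = 290.0          # K, assumed constant ullage temperature
    V = 3.0            # m^3, assumed ullage volume
    mdot_in = 0.05     # kg/s, assumed pressurant inflow

    def dPdt(t, P):
        # dP/dt = (R*T/V) * mdot_in for an isothermal, ideal-gas ullage
        return [R_he * T * mdot_in / V]

    sol = solve_ivp(dPdt, (0.0, 60.0), [2.0e5], max_step=1.0)   # 60 s from 2 bar
    print(f"final ullage pressure ~ {sol.y[0, -1] / 1e5:.2f} bar")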
An empirical analysis of journal policy effectiveness for computational reproducibility.
Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun
2018-03-13
A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy, author remission of data and code postpublication upon request, an improvement over no policy, but currently insufficient for reproducibility.
Callahan, Michael J; Servaes, Sabah; Lee, Edward Y; Towbin, Alexander J; Westra, Sjirk J; Frush, Donald P
2014-04-01
There are limited data available on the use of i.v. contrast media for CT studies in the pediatric population. The purpose of this study is to determine the practice patterns of i.v. contrast media usage for pediatric CT by members of the Society for Pediatric Radiology (SPR). SPR members were surveyed regarding the use of i.v. contrast media for pediatric CT studies. Questions pertained to information required before administering i.v. contrast media, types of central catheters for injecting i.v. contrast media, injection rates based on angiocatheter size and study type, and management of i.v. contrast media extravasation. The response rate of 6% (88/1545) represented practice patterns of 26% (401/1545) of the SPR membership. Most respondents thought the following clinical information was mandatory before i.v. contrast media administration: allergy to i.v. contrast media (97%), renal insufficiency (97%), current metformin use (72%), significant allergies (61%), diabetes (54%), and asthma (52%). Most administered i.v. contrast media through nonimplanted central venous catheters (78%), implanted venous ports (78%), and peripherally inserted central catheters (72%). The most common maximum i.v. contrast media injection rates were 5.0 mL/s or greater for a 16-gauge angiocatheter, 4.0 mL/s for an 18-gauge angiocatheter, 3.0 mL/s for a 20-gauge angiocatheter, and 2.0 mL/s for a 22-gauge angiocatheter. For soft-tissue extravasation of i.v. contrast media, 95% elevate the affected extremity, 76% use ice, and 45% use heat. The results of this survey illustrate the collective opinion of a subset of SPR members relating to the use of i.v. contrast media in pediatric CT, providing guidelines for clinical histories needed before i.v. contrast media, maximum i.v. contrast injection rates for standard angiocatheters, contrast media injection rates for specific CT studies, and management of i.v. contrast media soft-tissue extravasation.
Computer Aided Design of Polyhedron Solids to Model Air in Com-Geom Descriptions
1983-08-01
[Report abstract garbled in extraction. The recoverable fragments cite "The GIFT Code User Manual, Volume I, Introduction and Input Requirements," BRL Report No. 1802, July 1975, and Kuehl, Bain and Reisinger, "The GIFT Code User Manual, Volume II, The Output Options," BRL Report ARBRL-TR-02189, September 1979, and note that the GIFT code's XSECT option produces plot files defining cross-sectional views of the COM-GEOM description.]
Jahnke, A; Hirschberger, J; Fischer, C; Brill, T; Köstlin, R; Plank, C; Küchenhoff, H; Krieger, S; Kamenica, K; Schillinger, U
2007-12-01
Despite aggressive pre- or postoperative treatment, feline fibrosarcomas have a high relapse rate. In this study, a new treatment option based on immune stimulation by intra-tumoral delivery of three feline cytokine genes was performed. The objective of this phase-I dose-escalation study was to determine a safe dose for further evaluation in a subsequent phase-II trial. Twenty-five client-owned cats with a clinical diagnosis of fibrosarcoma - primary tumours as well as recurrences - entered the study. Four increasing doses of plasmids coding for feIL-2, feIFN-gamma or feGM-CSF, respectively, were previously defined. In groups I, II, III and IV these doses were 15, 50, 150 and 450 microg per plasmid and a corresponding amount of magnetic nanoparticles. Two preoperative intra-tumoral injections of the magnetic DNA solution were followed by magnetofection. A group of four control cats received only surgical treatment. Side effects were registered and graded according to the VCOG-CTCAE scale and correlated to treatment. Statistical analyses included one-way ANOVA, post hoc and Kruskal-Wallis tests. ELISA tests detecting plasma feIFN-gamma and plasma feGM-CSF were performed. One cat out of group IV (450 microg per plasmid) showed adverse events probably related to gene delivery. As these side effects were self-limiting and occurred in only one of eight cats in group IV, this dose was determined to be well tolerated. Altogether six cats developed local recurrences during a 1-year observation period. Four of these cats had been treated with dose IV. In view of these observations, a subsequent phase-II trial including a representative number of cats should test the efficacy of dose IV as well as dose III.
Integrated Idl Tool For 3d Modeling And Imaging Data Analysis
NASA Astrophysics Data System (ADS)
Nita, Gelu M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A. A.; Kontar, E. P.
2012-05-01
Addressing many key problems in solar physics requires detailed analysis of non-simultaneous imaging data obtained in various wavelength domains with different spatial resolution and their comparison with each other supplied by advanced 3D physical models. To facilitate achieving this goal, we have undertaken a major enhancement and improvements of IDL-based simulation tools developed earlier for modeling microwave and X-ray emission. The greatly enhanced object-based architecture provides interactive graphic user interface that allows the user i) to import photospheric magnetic field maps and perform magnetic field extrapolations to almost instantly generate 3D magnetic field models, ii) to investigate the magnetic topology of these models by interactively creating magnetic field lines and associated magnetic field tubes, iii) to populate them with user-defined nonuniform thermal plasma and anisotropic nonuniform nonthermal electron distributions; and iv) to calculate the spatial and spectral properties of radio and X-ray emission. The application integrates DLL and Shared Libraries containing fast gyrosynchrotron emission codes developed in FORTRAN and C++, soft and hard X-ray codes developed in IDL, and a potential field extrapolation DLL produced based on original FORTRAN code developed by V. Abramenko and V. Yurchishin. The interactive interface allows users to add any user-defined IDL or external callable radiation code, as well as user-defined magnetic field extrapolation routines. To illustrate the tool capabilities, we present a step-by-step live computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data produced by NORH and RHESSI instruments. This work was supported in part by NSF grants AGS-0961867, AST-0908344, AGS-0969761, and NASA grants NNX10AF27G and NNX11AB49G to New Jersey Institute of Technology, by a UK STFC rolling grant, the Leverhulme Trust, UK, and by the European Commission through the Radiosun and HESPE Networks.
Iqbal, Junaid; Sagheer, Mehwish; Tabassum, Nazneen; Siddiqui, Ruqaiyyah; Khan, Naveed Ahmed
2014-01-01
Using morphological analysis and biochemical testing, here for the first time, we determined the culturable gut bacterial flora (aerobes and facultative anaerobes) of the venomous Black Cobra (Naja naja karachiensis) from South Asia. The findings revealed that these snakes harbour potentially pathogenic bacteria including Serratia marcescens, Pseudomonas aeruginosa, Shewanella putrefaciens, Aeromonas hydrophila, Salmonella sp., Moraxella sp., Bacillus sp., Ochrobactrum anthropi, and Providencia rettgeri. These findings are of concern, as injury from snake bite can result in wound infections and tissue necrosis leading to sepsis/necrotizing fasciitis and/or expose consumers of snake meat/medicine in the community to infections. PMID:25002979
Skel: Generative Software for Producing Skeletal I/O Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Logan, J.; Klasky, S.; Lofstead, J.
2011-01-01
Massively parallel computations consist of a mixture of computation, communication, and I/O. As part of the co-design for the inevitable progress towards exascale computing, we must apply lessons learned from past work to succeed in this new age of computing. Of the three components listed above, implementing an effective parallel I/O solution has often been overlooked by application scientists and was usually added to large scale simulations only when existing serial techniques had failed. As scientist teams scaled their codes to run on hundreds of processors, it was common to call on an I/O expert to implement a set of more scalable I/O routines. These routines were easily separated from the calculations and communication, and in many cases, an I/O kernel was derived from the application which could be used for testing I/O performance independent of the application. These I/O kernels developed a life of their own, used as a broad measure for comparing different I/O techniques. Unfortunately, as years passed and computation and communication changes required changes to the I/O, the separate I/O kernel used for benchmarking remained static, no longer providing an accurate indicator of the I/O performance of the simulation and making I/O research less relevant for the application scientists. In this paper we describe a new approach to this problem where I/O kernels are replaced with skeletal I/O applications automatically generated from an abstract set of simulation I/O parameters. We realize this abstraction by leveraging the ADIOS middleware's XML I/O specification with additional runtime parameters. Skeletal applications offer all of the benefits of I/O kernels, including allowing I/O optimizations to focus on useful I/O patterns. Moreover, since they are automatically generated, it is easy to produce an updated I/O skeleton whenever the simulation's I/O changes. In this paper we analyze the performance of automatically generated I/O skeletal applications for the S3D and GTS codes. We show that these skeletal applications achieve performance comparable to that of the production applications. We wrap up the paper with a discussion of future changes to make the skeletal application better approximate the actual I/O performed in the simulation.
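The generative idea, reading an abstract description of what an application writes and emitting a stand-alone program that reproduces only that write pattern, can be illustrated in a few lines of Python. The sketch below uses a made-up miniature XML format; it is not the ADIOS XML schema or the actual Skel tool.

    # Minimal illustration of the "skeletal I/O application" idea: read a tiny XML
    # description of the variables an application writes and emit a stand-alone
    # writer that reproduces only the I/O pattern. This is a sketch, not the
    # actual ADIOS/Skel toolchain or its XML schema.
    import xml.etree.ElementTree as ET

    spec = """<io group="restart">
      <var name="temperature" type="double" count="1000000"/>
      <var name="pressure"    type="double" count="1000000"/>
    </io>"""

    root = ET.fromstring(spec)
    lines = ["import numpy as np", ""]
    for var in root.findall("var"):
        n = var.get("count")
        lines.append(f"{var.get('name')} = np.zeros({n})  # payload values are irrelevant, only size matters")
    lines.append("with open('skeleton.out', 'wb') as f:")
    for var in root.findall("var"):
        lines.append(f"    f.write({var.get('name')}.tobytes())")

    with open("skel_writer.py", "w") as f:
        f.write("\n".join(lines) + "\n")
    print("generated skel_writer.py reproducing the write pattern of group", root.get("group"))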
Fully kinetic 3D simulations of the Hermean magnetosphere under realistic conditions: a new approach
NASA Astrophysics Data System (ADS)
Amaya, Jorge; Gonzalez-Herrero, Diego; Lembège, Bertrand; Lapenta, Giovanni
2017-04-01
Simulations of the magnetospheres of planets are usually performed using the MHD and the hybrid approaches. However, these two methods still rely on approximations for the computation of the pressure tensor, and require the neutrality of the plasma at every point of the domain by construction. These approximations undermine the role of electrons in the emergence of plasma features in the magnetospheres of planets. The high mobility of electrons, their characteristic time and space scales, and the lack of perfect neutrality are the source of many observed phenomena in magnetospheres, including the turbulence energy cascade, magnetic reconnection, particle acceleration in the shock front and the formation of current systems around the magnetosphere. Fully kinetic codes are extremely demanding of computing time, and have been unable to perform simulations of the full magnetosphere at the real scales of a planet with realistic plasma conditions. There are two main reasons for this: 1) explicit codes must resolve the electron scales, limiting the time and space discretisation, and 2) current versions of semi-implicit codes are unstable for cell sizes larger than a few Debye lengths. In this work we present new simulations performed with ECsim, an Energy Conserving semi-implicit method [1], that can overcome these two barriers. We compare the solutions obtained with ECsim with those obtained by the classic semi-implicit code iPic3D [2]. The new simulations with ECsim demand a larger computational effort, but the time and space discretisations are larger than those in iPic3D, allowing for a faster simulation of the full planetary environment. The new code, ECsim, can reach a resolution that captures significant large-scale physics without losing kinetic electron information, such as wave-electron interaction and non-Maxwellian electron velocity distributions [3]. The code is able to better capture the thickness of the different boundary layers of the magnetosphere of Mercury. Electron kinetics are consistent with the spatial and temporal scale resolutions. Simulations are compared with measurements from the MESSENGER spacecraft, showing a better fit than the classic fully kinetic code iPic3D. These results show that the new generation of Energy Conserving semi-implicit codes can be used for an accurate analysis and interpretation of particle data from magnetospheric missions like BepiColombo and MMS, including electron velocity distributions and electron temperature anisotropies. [1] Lapenta, G. (2016). Exactly Energy Conserving Implicit Moment Particle in Cell Formulation. arXiv preprint arXiv:1602.06326. [2] Markidis, S., & Lapenta, G. (2010). Multi-scale simulations of plasma with iPIC3D. Mathematics and Computers in Simulation, 80(7), 1509-1519. [3] Lapenta, G., Gonzalez-Herrero, D., & Boella, E. (2016). Multiple scale kinetic simulations with the energy conserving semi implicit particle in cell (PIC) method. arXiv preprint arXiv:1612.08289.
Computational Approaches for Designing Protein/Inhibitor Complexes and Membrane Protein Variants
NASA Astrophysics Data System (ADS)
Vijayendran, Krishna Gajan
Drug discovery of small-molecule protein inhibitors is a vast enterprise that involves several scientific disciplines (i.e. genomics, cell biology, x-ray crystallography, chemistry, computer science, statistics), with each discipline focusing on a particular aspect of the process. In this thesis, I use computational and experimental approaches to explore the most fundamental aspect of drug discovery: the molecular interactions of small-molecule inhibitors with proteins. In Part I (Chapters I and II), I describe how computational docking approaches can be used to identify structurally diverse molecules that can inhibit multiple protein targets in the brain. I illustrate this approach using the examples of microtubule-stabilizing agents and inhibitors of cyclooxygenase (COX)-I and 5-lipoxygenase (5-LOX). In Part II (Chapters III and IV), I focus on membrane proteins, which are notoriously difficult to work with due to their low natural abundances, low yields for heterologous overexpression, and propensities toward aggregation. I describe a general approach for designing water-soluble variants of membrane proteins, for the purpose of developing cell-free, label-free, detergent-free, solution-phase studies of protein structure and small-molecule binding. I illustrate this approach through the design of a water-soluble variant of the membrane protein Smoothened, wsSMO. This wsSMO stands to serve as a first step towards developing membrane protein analogs of this important signaling protein and drug target.
Ho, Chi-Kung; Wang, Hui-Ting; Lee, Chien-Ho; Chung, Wen-Jung; Lin, Cheng-Jui; Hsueh, Shu-Kai; Hung, Shin-Chiang; Wu, Kuan-Han; Liu, Chu-Feng; Kung, Chia-Te
2017-01-01
Background: This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce the percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). Methods: A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 patients were transferred through the traditional referral process. Results: There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, or pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in the protocol group compared with the traditional referral group (both p < 0.05). There were also no remarkable differences in complication rate and 30-day mortality between the two groups. The multivariate analysis revealed that the independent predictors of 30-day mortality were advanced age, advanced Killip score, and higher troponin-I level. Conclusions: This study showed that patients transferred through the present protocol had reduced pain to electrocardiography times in Killip I/II patients and reduced catheterization laboratory to balloon times in Killip III/IV patients. However, using the cloud computing system in the present protocol did not reduce DTB time. PMID:28900621
A hydrodynamic approach to cosmology - Methodology
NASA Technical Reports Server (NTRS)
Cen, Renyue
1992-01-01
The present study describes an accurate and efficient hydrodynamic code for evolving self-gravitating cosmological systems. The hydrodynamic code is a flux-based mesh code originally designed for engineering hydrodynamical applications. A variety of checks were performed which indicate that the resolution of the code is a few cells, providing accuracy for integral energy quantities in the present simulations of 1-3 percent over the whole runs. Six species (H I, H II, He I, He II, He III) are tracked separately, and relevant ionization and recombination processes, as well as line and continuum heating and cooling, are computed. The background radiation field is simultaneously determined in the range 1 eV to 100 keV, allowing for absorption, emission, and cosmological effects. It is shown how the inevitable numerical inaccuracies can be estimated and to some extent overcome.
1979-08-21
[Report front matter garbled in extraction. The recoverable fragments are table-of-contents and introduction text for an appendix titled "Outline and Draft Material for Proposed Triservice Interim Guideline on Application of Software Acceptance Criteria," whose listed deliverables include the computer program contract item (CPCI) code, the CPCI test plan, test procedures, test report, and handbooks and manuals.]
Jeevanandan, Ganesh; Thomas, Eapen
2018-01-01
The present study was conducted to analyze the volumetric change in the root canal space and the instrumentation time between hand files, hand files in reciprocating motion, and three rotary file systems in primary molars. One hundred primary mandibular molars were randomly allotted to one of five groups. Instrumentation was done using Group I: nickel-titanium (Ni-Ti) hand files; Group II: Ni-Ti hand files in reciprocating motion; Group III: Race rotary files; Group IV: Prodesign pediatric rotary files; and Group V: ProTaper rotary files. The mean volumetric changes were assessed using pre- and post-operative spiral computed tomography scans, and instrumentation time was recorded. Statistical analysis of the intergroup comparisons of mean canal volume and instrumentation time was done using the Bonferroni-adjusted Mann-Whitney test and the Mann-Whitney test, respectively. Intergroup comparison of mean canal volume showed statistically significant differences between Groups II versus IV, Groups III versus V, and Groups IV versus V. Intergroup comparison of mean instrumentation time showed statistically significant differences among all the groups except Groups IV versus V. Among the various instrumentation techniques available, rotary instrumentation is considered to be the better technique for canal preparation in primary teeth.
The Mediterranean Crucible, 1942-1943: Did Technology or Tenets Achieve Air Superiority
2012-06-01
[Abstract partially garbled in extraction. The recoverable fragments discuss the decryption, analysis, and dissemination of critical Luftwaffe communications from the German Enigma coding machine, which facilitated the ability to "read the Luftwaffe [Enigma] keys in North Africa from the first day of their introduction" in the theater, and the air superiority this system enabled its possessor to pursue; the remaining text is citation residue (IRIS no. 118168, USAF Collection, AFHRA; AWPD-42, Part IV).]
NASA Technical Reports Server (NTRS)
Thompson, Daniel
2004-01-01
Coming into the Combustion Branch of the Turbomachinery and Propulsion Systems Division, there was no set project planned out for me to work on. This was understandable, considering I am only in my sophomore year in college. Also, my mentor was a division chief and it was expected that I would be passed down the line. It took about a week for me to be placed with somebody who could use me. My first project was to write a macro for TecPlot. Commonly, a person would have a 3D contour volume modeling something such as a combustion engine. This 3D volume needed to have slices extracted from it and made into 2D scientific plots with all of the appropriate axes and titles. This was very tedious to do by hand, and my macro needed to automate the process. There was some education I needed before I could start, however. First, TecPlot ran on Unix and Linux, like a growing majority of scientific applications. I knew a little about Linux, but I would need to know more to use the software at hand. I took two classes at the Learning Center on Unix and am now comfortable with Linux and Unix. I had already taken Computer Science I and II, and had undergone the transformation from computer programmer to procedural epistemologist. I knew how to design efficient algorithms; I just needed to learn the macro language. After a little less than a week, I had learned the basics of the language. Like most languages, the best way to learn more of it was by using it. It was decided that it was best that I write the macro in layers, starting simple and adding features as I went. The macro started out slicing with respect to only one axis, and did not make 2D plots out of the slices; instead, it lined them up inside the solid. Next, I allowed for more than one axis and placed each slice in a separate frame. After this, I added code that transformed each individual slice-frame into a scientific plot. I also made frames for composite volumes, which showed all of the slices in the same XYZ space. I then designed an additional companion macro that exported each frame into its own image file. I then distributed the macros to a test group, and am awaiting feedback. In the meantime, I am researching the possible applications of distributed computing on the National Combustor Code. Many of our Linux boxes were idle for most of the day. The department thinks that it would be wonderful if we could get all of these idle processors to work on a problem under the NCC code. The client software would have to be easily distributed, such as in screensaver format or as a program that only ran when the computer was not in use. This project proves to be an interesting challenge.
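The slicing workflow described above was implemented as a TecPlot macro; a rough Python analogue of the same idea (extract 2D slices from a 3D field along a chosen axis, render each as a labelled plot, and export one image per frame) is sketched below. Matplotlib stands in for TecPlot and the 3D field is a synthetic placeholder, not combustion data.

    # Rough Python analogue of the TecPlot slicing macro described above;
    # matplotlib stands in for TecPlot and the 3D field is a synthetic placeholder.
    import numpy as np
    import matplotlib.pyplot as plt

    field = np.random.default_rng(1).random((40, 40, 40))   # stand-in for a 3D solution volume

    def export_slices(volume, axis, positions, prefix="slice"):
        for k in positions:
            plane = np.take(volume, k, axis=axis)            # extract a 2D slice
            fig, ax = plt.subplots()
            im = ax.contourf(plane)
            ax.set_title(f"{prefix}: axis {axis}, index {k}")
            ax.set_xlabel("cell index")                      # axis labels for the 2D plot
            ax.set_ylabel("cell index")
            fig.colorbar(im, ax=ax)
            fig.savefig(f"{prefix}_axis{axis}_{k}.png")      # one image file per frame
            plt.close(fig)

    export_slices(field, axis=2, positions=[5, 20, 35])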
Computer Systems Acquisition Metrics Handbook. Volume II. Quality Factor Modules.
1982-05-01
[Abstract garbled in extraction; the recoverable fragments are worksheet headings from the quality factor modules (e.g., the efficiency module), keyed to life-cycle phases such as preliminary design and implementation.]
Metal oxidation states in biological water splitting.
Krewald, Vera; Retegan, Marius; Cox, Nicholas; Messinger, Johannes; Lubitz, Wolfgang; DeBeer, Serena; Neese, Frank; Pantazis, Dimitrios A
2015-03-01
A central question in biological water splitting concerns the oxidation states of the manganese ions that comprise the oxygen-evolving complex of photosystem II. Understanding the nature and order of oxidation events that occur during the catalytic cycle of five Si states (i = 0-4) is of fundamental importance both for the natural system and for artificial water oxidation catalysts. Despite the widespread adoption of the so-called "high-valent scheme"-where, for example, the Mn oxidation states in the S2 state are assigned as III, IV, IV, IV-the competing "low-valent scheme" that differs by a total of two metal unpaired electrons (i.e. III, III, III, IV in the S2 state) is favored by several recent studies for the biological catalyst. The question of the correct oxidation state assignment is addressed here by a detailed computational comparison of the two schemes using a common structural platform and theoretical approach. Models based on crystallographic constraints were constructed for all conceivable oxidation state assignments in the four (semi)stable S states of the oxygen evolving complex, sampling various protonation levels and patterns to ensure comprehensive coverage. The models are evaluated with respect to their geometric, energetic, electronic, and spectroscopic properties against available experimental EXAFS, XFEL-XRD, EPR, ENDOR and Mn K pre-edge XANES data. New 2.5 K 55Mn ENDOR data of the S2 state are also reported. Our results conclusively show that the entire S state phenomenology can only be accommodated within the high-valent scheme by adopting a single motif and protonation pattern that progresses smoothly from S0 (III, III, III, IV) to S3 (IV, IV, IV, IV), satisfying all experimental constraints and reproducing all observables. By contrast, it was impossible to construct a consistent cycle based on the low-valent scheme for all S states. Instead, the low-valent models developed here may provide new insight into the over-reduced S states and the states involved in the assembly of the catalytically active water oxidizing cluster.
Computing Challenges in Coded Mask Imaging
NASA Technical Reports Server (NTRS)
Skinner, Gerald
2009-01-01
This slide presentation reviews the complications and challenges in developing computer systems for coded mask imaging telescopes. The coded mask technique is used when there is no other way to create the telescope, i.e., when there are wide fields of view, energies too high for focusing optics or too low for Compton/tracker techniques, and a need for very good angular resolution. The coded mask telescope is described, and the mask is reviewed. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, and a chart showing the types of position-sensitive detectors used for coded mask telescopes is also reviewed. Slides describe the mechanism of recovering an image from the masked pattern, including the correlation with the mask pattern. The matrix approach is reviewed, and other approaches to image reconstruction are described. Included in the presentation is a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, a comparison of the EXIST/HET with the SWIFT/BAT, and details of the design of the EXIST/HET.
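The correlation step mentioned above can be demonstrated in a few lines: the detector shadowgram is the sky convolved with the mask's open/closed pattern, and cross-correlating the shadowgram with the mean-subtracted mask recovers a peak at the source position. The one-dimensional sketch below uses a random mask and a single point source purely for illustration; it is not the EXIST/HET mask pattern or reconstruction pipeline.

    # Toy illustration of coded-mask image recovery by correlation: the detector
    # shadowgram (sky convolved with the mask's open/closed pattern) is
    # cross-correlated with the mask to recover the source position. The random
    # mask here is illustrative, not an actual instrument pattern.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 64
    mask = rng.integers(0, 2, n)               # 1-D open/closed mask pattern
    sky = np.zeros(n)
    sky[17] = 100.0                            # a single point source

    # shadowgram: each sky direction projects the mask shifted by its position
    shadow = np.zeros(n)
    for i, flux in enumerate(sky):
        shadow += flux * np.roll(mask, i)
    shadow += rng.poisson(5, n)                # background counts

    # correlation with the mean-subtracted mask peaks at the source position
    decoded = np.array([np.sum(shadow * np.roll(mask - mask.mean(), i)) for i in range(n)])
    print("reconstructed source position:", int(np.argmax(decoded)))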