Analysis of Methods to Excite Head-Tail Motion Within the Cornell Electron Storage Ring
NASA Astrophysics Data System (ADS)
Gendler, Naomi; Billing, Mike; Shanks, Jim
The main accelerator complex at Cornell consists of two rings around which electrons and positrons move: the synchrotron, where the particles are accelerated to 5 GeV, and the Storage Ring, where the particles circulate at a fixed energy, guided by quadrupole and dipole magnets, with their energy held steady by a sinusoidal voltage source. Keeping the beam stable in the Storage Ring is crucial for its lifetime. A long-lasting, invariable beam means more accurate experiments, as well as brighter, more focused X-rays for use at the Cornell High Energy Synchrotron Source (CHESS). The stability of the electron and positron beams in the Cornell Electron Storage Ring (CESR) is important for the development of accelerators and for use of the beam in X-ray science and accelerator physics. Bunch oscillations tend to enlarge the beam's cross section, making it less stable. We believe that one such oscillation is "head-tail motion," in which the bunch rocks back and forth about a pivot located at the central particle. In this project, we write a simulation of the bunch that induces head-tail motion with a vertical driver. We also excite this motion physically in the Storage Ring and observe a definite head-tail signal. In the experiment, we saw a definite persistence of the drive-damp signal within a small band around the head-tail frequency, indicating that the head-tail frequency is a natural vertical mode of the bunch that was being excited. The signal seen in the experiment matched the signal seen in the simulation to within an order of magnitude.
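The drive-damp behavior described in this abstract, where the response persists only within a narrow band around the natural frequency, can be illustrated with a toy damped-oscillator model. The following Python sketch is purely schematic: the frequencies, damping value, and integrator are assumptions for illustration, not the project's actual bunch simulation.

```python
import numpy as np

def drive_damp_response(f_drive, f_natural=1.0, zeta=0.02,
                        n_steps=2000, dt=0.05):
    """Toy drive-damp experiment: sinusoidally drive a damped oscillator
    at f_drive, then measure how much amplitude persists at the end.
    This is a schematic stand-in for a bunch's head-tail mode, not the
    CESR simulation. Integration is semi-implicit (symplectic) Euler."""
    w0 = 2 * np.pi * f_natural   # natural angular frequency
    wd = 2 * np.pi * f_drive     # drive angular frequency
    y, v = 0.0, 0.0
    for n in range(n_steps):
        t = n * dt
        a = -w0**2 * y - 2 * zeta * w0 * v + np.sin(wd * t)
        v += a * dt              # update velocity first (symplectic Euler)
        y += v * dt
    # rough amplitude estimate combining position and scaled velocity
    return abs(y) + abs(v) / w0

# Driving on resonance leaves a much larger persistent signal than
# driving well outside the resonance band.
on_res = drive_damp_response(1.0)
off_res = drive_damp_response(1.5)
print(on_res > 3 * off_res)
```

The on-resonance response saturates near the steady-state amplitude 1/(2ζω0²), while off-resonance drive leaves only a small forced response, mirroring the narrow frequency band of persistence reported in the experiment.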
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, Yong Ho; Chao, Alexander Wu; Blaskiewicz, Michael M.
Effects of the chromaticity on head-tail instabilities for broadband impedances are comprehensively studied, using the two particle model, the Vlasov analysis and computer simulations. We show both in the two particle model and the Vlasov analysis with the trapezoidal (semiconstant) wake model that we can derive universal contour plots for the growth factor as a function of the two dimensionless parameters: the wakefield strength, Υ, and the difference of the betatron phase advances between the head and the tail, χ. They reveal how the chromaticity affects strong head-tail instabilities and excites head-tail instabilities. We also apply the LEP (Large Electron-Positron Collider) broadband resonator model to the Vlasov approach and find that the results are in very good agreement with those of the trapezoidal wake model. The theoretical findings are also reinforced by the simulation results. In conclusion, the trapezoidal wake model turns out to be a very useful tool since it significantly simplifies the time domain analysis and provides well-behaved impedance at the same time.
2017-07-28
NASA Astrophysics Data System (ADS)
Gan, Zhaoming; Li, Hui; Li, Shengtai; Yuan, Feng
2017-04-01
The distinctive morphology of head-tail radio galaxies reveals strong interactions between the radio jets and their intra-cluster environment. The general consensus on the origin of the head-tail morphology is that the radio jets are bent by violent intra-cluster weather. We demonstrate in this paper that such strong interactions provide a great opportunity to study the jet properties as well as the dynamics of the intra-cluster medium (ICM). Using three-dimensional magnetohydrodynamical simulations, we analyze the detailed bending process of a magnetically dominated jet, based on the magnetic tower jet model. We use stratified atmospheres modulated by wind/shock to mimic the violent intra-cluster weather. Core sloshing is found to be inevitable during the wind-cluster core interaction; it induces significant shear motion and can ultimately drive ICM turbulence around the jet, making it difficult for the jet to survive. We perform a detailed comparison between the behavior of purely hydrodynamical jets and the magnetic tower jet and find that the jet-lobe morphology cannot survive the violent disruption in any of our purely hydrodynamical jet models. On the other hand, the head-tail morphology is well reproduced by a magnetic tower jet bent by wind, in which hydrodynamical instabilities are naturally suppressed and the jet always keeps its integrity under the protection of its internal magnetic fields. Finally, we also check the possibility of jet bending by a shock alone. We find that a shock cannot bend the jet significantly, and thus cannot be expected to explain the observed long tails in head-tail radio galaxies.
Two particle model for studying the effects of space-charge force on strong head-tail instabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, Yong Ho; Chao, Alexander Wu; Blaskiewicz, Michael M.
In this paper, we present a new two particle model for studying the strong head-tail instabilities in the presence of the space-charge force. It is a simple expansion of the well-known two particle model for strong head-tail instability and is still analytically solvable. No chromaticity effect is included. It leads to a formula for the growth rate as a function of the two dimensionless parameters: the space-charge tune shift parameter (normalized by the synchrotron tune) and the wakefield strength, Υ. The three-dimensional contour plot of the growth rate as a function of those two dimensionless parameters reveals stopband structures. Many simulation results generally indicate that a strong head-tail instability can be damped by a weak space-charge force, but the beam becomes unstable again when the space-charge force is further increased. The new two particle model indicates a similar behavior. In weak space-charge regions, additional tune shifts by the space-charge force dissolve the mode coupling. As the space-charge force is increased, they conversely restore the mode coupling, but then a further increase of the space-charge force decouples the modes again. Lastly, this mode coupling/decoupling behavior creates the stopband structures.
2016-01-19
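The chromaticity-free two particle model that this paper extends can be sketched numerically in a few lines. The Python sketch below is an illustration of the classic strong head-tail two-particle model only; it omits the paper's space-charge term, and the normalization of the wake kick is an assumption chosen so the mode-coupling threshold sits at Υ = 2. Over each half synchrotron period the trailing macroparticle receives a wake kick proportional to the leading one's complex betatron amplitude, and stability is read off from the eigenvalues of the one-period map.

```python
import numpy as np

def one_period_map(upsilon):
    """One synchrotron period of the two-particle strong head-tail model,
    written in the frame rotating at the betatron frequency.
    State: complex betatron amplitudes (y1_tilde, y2_tilde).
    First half period: particle 1 leads, particle 2 gets the wake kick;
    second half period: the particles swap roles."""
    first_half = np.array([[1.0, 0.0],
                           [1j * upsilon, 1.0]])  # kick on trailing particle 2
    second_half = np.array([[1.0, 1j * upsilon],
                            [0.0, 1.0]])          # kick on trailing particle 1
    return second_half @ first_half

def growth_factor(upsilon):
    """Largest eigenvalue magnitude per synchrotron period (1 = stable)."""
    return np.max(np.abs(np.linalg.eigvals(one_period_map(upsilon))))

# Below the threshold the motion stays bounded; above it the amplitudes
# grow every synchrotron period (strong head-tail instability).
print(growth_factor(1.0))  # ~1: stable
print(growth_factor(3.0))  # >1: unstable
```

In this normalization the one-period map has trace 2 − Υ², so the modes decouple and the eigenvalues leave the unit circle once Υ exceeds 2. The space-charge tune shift of the paper would enter, roughly speaking, as an additional relative phase advance between the two macroparticles, which is what produces the stopband structure described in the abstract.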
High fidelity 3-dimensional models of beam-electron cloud interactions in circular accelerators
NASA Astrophysics Data System (ADS)
Feiz Zarrin Ghalam, Ali
Electron cloud is a low-density electron profile created inside the vacuum chamber of circular machines with positively charged beams. The electron cloud limits the peak current of the beam and degrades the beam quality through luminosity degradation, emittance growth, and head-tail or bunch-to-bunch instability. The adverse effects of the electron cloud on long-term beam dynamics become more and more important as beams go to higher and higher energies. This problem has become a major concern in the design of many future circular machines, such as the Large Hadron Collider (LHC) under construction at the European Organization for Nuclear Research (CERN). Because of the importance of the problem, several simulation models have been developed to model the long-term beam-electron cloud interaction. These models are based on a "single kick approximation," in which the electron cloud is assumed to be concentrated in one thin slab around the ring. While this model is efficient in terms of computational cost, it does not reflect the real physical situation, since the forces from the electron cloud on the beam are nonlinear, contrary to the model's assumption. To address this limitation of the existing codes, a new model is developed in this thesis to model the beam-electron cloud interaction continuously. The code is derived from a 3-D parallel particle-in-cell (PIC) model (QuickPIC) originally used for plasma wakefield acceleration research. To make the original model fit the circular-machine environment, betatron and synchrotron equations of motion have been added to the code, and the effects of chromaticity and lattice structure have been included. QuickPIC is then benchmarked against one of the codes based on the single kick approximation (HEAD-TAIL) for the transverse spot size of the beam in the CERN LHC. The growth predicted by QuickPIC is less than that predicted by HEAD-TAIL.
The code is then used to investigate the effect of electron cloud image charges on the long-term beam dynamics, particularly on the transverse tune shift of the beam at the CERN Super Proton Synchrotron (SPS) ring. The force from the electron cloud image charges on the beam cancels the force due to the cloud compression formed on the beam axis, and therefore the tune shift is mainly due to the uniform electron cloud density. (Abstract shortened by UMI.)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bassi, Gabriele; Blednykh, Alexei; Smalyuk, Victor
A novel algorithm for self-consistent simulations of long-range wakefield effects has been developed and applied to the study of both longitudinal and transverse coupled-bunch instabilities at NSLS-II. The algorithm is implemented in the new parallel tracking code SPACE (Self-consistent Parallel Algorithm for Collective Effects) discussed in the paper. The code is applicable for accurate beam dynamics simulations in cases where both bunch-to-bunch and intrabunch motions need to be taken into account, such as chromatic head-tail effects on the coupled-bunch instability of a beam with a nonuniform filling pattern, or multibunch and single-bunch effects of a passive higher-harmonic cavity. The numerical simulations have been compared with analytical studies. For a beam with an arbitrary filling pattern, intensity-dependent complex frequency shifts have been derived starting from a system of coupled Vlasov equations. The analytical formulas and numerical simulations confirm that the analysis is reduced to the formulation of an eigenvalue problem based on the known formulas of the complex frequency shifts for the uniform filling pattern case.
Insights into Head-Tailed Viruses Infecting Extremely Halophilic Archaea
Pietilä, Maija K.; Laurinmäki, Pasi; Russell, Daniel A.; Ko, Ching-Chung; Jacobs-Sera, Deborah; Butcher, Sarah J.
2013-01-01
Extremophilic archaea, both hyperthermophiles and halophiles, dominate in habitats where rather harsh conditions are encountered. Like all other organisms, archaeal cells are susceptible to viral infections, and to date, about 100 archaeal viruses have been described. Among them, there are extraordinary virion morphologies as well as the common head-tailed viruses. Although approximately half of the isolated archaeal viruses belong to the latter group, no three-dimensional virion structures of these head-tailed viruses are available. Thus, rigorous comparisons with bacteriophages are not yet warranted. In the present study, we determined the genome sequences of two such viruses of halophiles and solved their capsid structures by cryo-electron microscopy and three-dimensional image reconstruction. We show that these viruses are inactivated, yet remain intact, at low salinity and that their infectivity is regained when high salinity is restored. This enabled us to determine their three-dimensional capsid structures at low salinity to a ∼10-Å resolution. The genetic and structural data showed that both viruses belong to the same T-number class, but one of them has enlarged its capsid to accommodate a larger genome than typically associated with a T=7 capsid by inserting an additional protein into the capsid lattice. PMID:23283946
2016-02-24
Lyu, Xiaolin; Xiao, Anqi; Zhang, Wei; Hou, Pingping; Gu, Kehua; Tang, Zhehao; Pan, Hongbing; Wu, Fan; Shen, Zhihao; Fan, Xinghe
2018-06-08
In this report, Im-3m and Pn-3m polymer cubosomes and p6mm polymer hexasomes are obtained through the self-assembly of a rod-coil amphiphilic block copolymer (ABCP). This is the first time that these structures have been observed in a rod-coil system. By varying the hydrophobic chain length, the initial concentration of the polymer solution, or the solubility parameter of the mixed solvent, the head-tail asymmetry is adjusted to control the formation of polymer cubosomes or hexasomes. The formation mechanism of the polymer cubosomes was also studied. This research opens a new avenue for further study of the bicontinuous and inverse phases in different ABCP systems. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Users manual for the improved NASA Lewis ice accretion code LEWICE 1.6
NASA Technical Reports Server (NTRS)
Wright, William B.
1995-01-01
This report is intended as an update/replacement to NASA CR 185129, 'User's Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE),' and as an update to NASA CR 195387, 'Update to the NASA Lewis Ice Accretion Code LEWICE.' In addition to describing the changes specifically made for this version, information from previous manuals will be duplicated so that the user will not need three manuals to use this code.
Single molecule FRET observation of kinesin-1’s head-tail interaction on microtubule
Aoki, Takahiro; Tomishige, Michio; Ariga, Takayuki
2013-01-01
Kinesin-1 (conventional kinesin) is a molecular motor that transports various cargo such as endoplasmic reticulum and mitochondria in cells. Its two head domains walk along the microtubule by hydrolyzing ATP, while the tail domains at the end of the long stalk bind to the cargo. When a kinesin is not carrying cargo, its motility and ATPase activity are inhibited by direct interactions between the tail and head. However, the mechanism of this tail regulation is not well understood. Here, we apply single molecule fluorescence resonance energy transfer (smFRET) to observe this interaction in stalk-truncated kinesin. We found that kinesin with two tails forms a folded conformation and dissociates from microtubules, whereas kinesin with one tail remains bound to the microtubule and is immobile even in the presence of ATP. We further investigated the head-tail interaction as well as head-head coordination on the microtubule at various nucleotide conditions. From these results, we propose a two-step inhibition model for kinesin motility. PMID:27493553
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-23
... Plan to update the Commission's human health and aquatic life stream quality objectives (also called... DELAWARE RIVER BASIN COMMISSION 18 CFR Part 410 Amendments to the Water Quality Regulations, Water Code and Comprehensive Plan To Update Water Quality Criteria for Toxic Pollutants in the Delaware...
Update to the NASA Lewis Ice Accretion Code LEWICE
NASA Technical Reports Server (NTRS)
Wright, William B.
1994-01-01
This report is intended as an update to NASA CR-185129 'User's Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE).' It describes modifications and improvements made to this code as well as changes to the input and output files, interactive input, and graphics output. The comparison of this code to experimental data is shown to have improved as a result of these modifications.
76 FR 11339 - Update to NFPA 101, Life Safety Code, for State Home Facilities
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-02
... DEPARTMENT OF VETERANS AFFAIRS 38 CFR Part 51 RIN 2900-AN59 Update to NFPA 101, Life Safety Code..., Life Safety Code. The change is designed to assure that State Home facilities meet current industry-wide standards regarding life safety and fire safety. DATES: Effective Date: This final rule is...
NASA Technical Reports Server (NTRS)
Shapiro, Wilbur
1996-01-01
This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. The KTK labyrinth seal code handles straight or stepped seals. DYSEAL provides dynamics for the seal geometry.
Efficiency of feedbacks for suppression of transverse instabilities of bunched beams
Burov, Alexey
2016-08-05
What gain and phase have to be set for a bunch-by-bunch transverse damper, and at which chromaticity is it better to stay? These questions are considered for three models: the two-particle model with a possible quadrupole wake, the author's Nested Head-Tail Vlasov solver with a broadband impedance, and the same solver with the LHC impedance model.
Casein Kinase 2 Reverses Tail-Independent Inactivation of Kinesin-1
NASA Astrophysics Data System (ADS)
Xu, Jing
2013-03-01
Kinesin-1 is a plus-end microtubule-based motor, and defects in kinesin-based transport are linked to diseases including neurodegeneration. Kinesin can auto-inhibit via a head-tail interaction, but is believed to be active otherwise. Here we report a tail-independent inactivation of kinesin, reversible by the disease-relevant signalling protein, casein kinase 2 (CK2). The majority of initially active kinesin (native or tail-less) loses its ability to interact with microtubules in vitro, and CK2 reverses this inactivation (approximately fourfold) without altering kinesin's single motor properties. This activation pathway does not require motor phosphorylation, and is independent of head-tail auto-inhibition. In cultured mammalian cells, reducing CK2 expression, but not its kinase activity, decreases the force required to stall lipid droplet transport, consistent with a decreased number of active kinesin motors. Our results (Nat. Commun., 3:754, 2012) provide the first direct evidence of a protein kinase upregulating kinesin-based transport, and suggest a novel pathway for regulating the activity of cargo-bound kinesin. Work supported by NIGMS grants GM64624 to SPG, GM74830-06A1 to LH, GM76516 to LB, NS048501 to SJK, and AHA grant 825278F to JX.
Build-up Approach to Updating the Mock Quiet Spike™ Beam Model
NASA Technical Reports Server (NTRS)
Herrera, Claudia Y.; Pak, Chan-gi
2007-01-01
A crucial part of aircraft design is ensuring that the required margin for flutter is satisfied. A trustworthy flutter analysis, which begins by possessing an accurate dynamics model, is necessary for this task. Traditionally, a model was updated manually by fine-tuning specific stiffness parameters until the analytical results matched test data. This is a time-consuming iterative process. NASA Dryden Flight Research Center has developed a mode matching code to execute this process in a more efficient manner. Recently, this code was implemented in the F-15B/Quiet Spike™ (Gulfstream Aerospace Corporation, Savannah, Georgia) model update. A build-up approach requiring several ground vibration test configurations and a series of model updates was implemented in order to determine the connection stiffness between the aircraft and the test article. The mode matching code successfully updated various models for the F-15B/Quiet Spike™ project to within 1 percent error in frequency, and the modal assurance criteria values ranged from 88.51 to 99.42 percent.
An efficient decoding for low density parity check codes
NASA Astrophysics Data System (ADS)
Zhao, Ling; Zhang, Xiaolin; Zhu, Manjie
2009-12-01
Low density parity check (LDPC) codes are a class of forward-error-correction codes. They are among the best-known codes capable of achieving low bit error rates (BER) approaching Shannon's capacity limit. Recently, LDPC codes have been adopted by the European Digital Video Broadcasting (DVB-S2) standard, and have also been proposed for the emerging IEEE 802.16 fixed and mobile broadband wireless-access standard. The Consultative Committee for Space Data Systems (CCSDS) has also recommended using LDPC codes in deep-space and near-Earth communications. It is clear that LDPC codes will be widely used in wired and wireless communication, magnetic recording, optical networking, DVB, and other fields in the near future. Efficient hardware implementation of LDPC codes is of great interest since they are being considered for a wide range of applications. This paper presents an efficient partially parallel decoder architecture suited for quasi-cyclic (QC) LDPC codes, using the belief propagation algorithm for decoding. Algorithmic transformation and architectural-level optimization are incorporated to reduce the critical path. First, the parity check matrix of the LDPC code is analyzed to find the relationship between the row weight and the column weight. The sharing level of the check node updating units (CNU) and the variable node updating units (VNU) is then determined according to this relationship. After that, the CNUs and VNUs are rearranged and divided into several smaller parts; with the help of some auxiliary logic circuits, these smaller parts can be grouped into CNUs during check node update processing and into VNUs during variable node update processing. These smaller parts are called node update kernel units (NKU), and the auxiliary logic circuits are called node update auxiliary units (NAU). With the NAUs' help, the two steps of the iteration operation are completed by the NKUs, which brings a great reduction in hardware resources.
Meanwhile, efficient techniques have been developed to reduce the computation delay of the node processing units and to minimize hardware overhead for parallel processing. This method applies not only to regular LDPC codes but also to irregular ones. Based on the proposed architecture, a (7493, 6096) irregular QC-LDPC decoder is described in the Verilog hardware description language and implemented on an Altera Stratix II EP2S130 field-programmable gate array (FPGA). The implementation results show that over 20% of the logic core size can be saved compared with conventional partially parallel decoder architectures, without any performance degradation. At a decoding clock of 100 MHz, the proposed decoder achieves a maximum (source data) decoding throughput of 133 Mb/s at 18 iterations.
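The check node and variable node updates that the architecture above partitions into hardware units can be illustrated in software. Below is a minimal, hedged Python sketch of iterative min-sum decoding (a common hardware-friendly approximation of belief propagation); the tiny (7,4) Hamming parity check matrix is a stand-in for a real QC-LDPC matrix, and the CNU/VNU hardware sharing is not modeled.

```python
import numpy as np

# Toy parity check matrix (Hamming(7,4)) standing in for a QC-LDPC matrix.
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])

def minsum_decode(H, llr, max_iters=10):
    """Iterative min-sum decoding. `llr` holds channel log-likelihood
    ratios (positive means bit 0 is more likely). Returns hard decisions."""
    m, n = H.shape
    v2c = H * llr                # variable-to-check messages, init to channel LLRs
    bits = (llr < 0).astype(int)
    for _ in range(max_iters):
        c2v = np.zeros((m, n))
        for i in range(m):       # check node update (the CNU step)
            idx = np.flatnonzero(H[i])
            msgs = v2c[i, idx]
            for k, j in enumerate(idx):
                others = np.delete(msgs, k)
                # min-sum approximation of the tanh rule
                c2v[i, j] = np.prod(np.sign(others)) * np.min(np.abs(others))
        total = llr + c2v.sum(axis=0)   # variable node update (the VNU step)
        v2c = H * (total - c2v)         # exclude each check's own message
        bits = (total < 0).astype(int)
        if not np.any(H @ bits % 2):    # all parity checks satisfied
            break
    return bits

# All-zero codeword sent; bit 0 received weakly wrong (negative LLR).
llr = np.array([-1.0, 3, 3, 3, 3, 3, 3])
print(minsum_decode(H, llr))  # the single error is corrected: all zeros
```

A partially parallel hardware decoder schedules these same two update steps across shared node-processing units rather than looping over rows in software, which is where the CNU/VNU sharing described in the abstract comes in.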
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-25
... DELAWARE RIVER BASIN COMMISSION 18 CFR Part 410 Proposed Amendments to the Water Quality Regulations, Water Code and Comprehensive Plan To Update Water Quality Criteria for pH AGENCY: Delaware River... public hearing to receive comments on proposed amendments to the Commission's Water Quality Regulations...
Collective Effects in a Diffraction Limited Storage Ring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagaoka, Ryutaro; Bane, Karl L.F.
Our paper gives an overview of collective effects that are likely to appear and possibly limit the performance in a diffraction-limited storage ring (DLSR) that stores a high-intensity ultra-low-emittance beam. Beam instabilities and other intensity-dependent effects that may significantly impact the machine performance are covered. The latter include beam-induced machine heating, Touschek scattering, intra-beam scattering, as well as incoherent tune shifts. The general trend that the efforts to achieve ultra-low emittance result in increasing the machine coupling impedance and the beam sensitivity to instability is reviewed. The nature of coupling impedance in a DLSR is described, followed by a series of potentially dangerous beam instabilities driven by the former, such as resistive-wall, TMCI (transverse mode coupling instability), head-tail and microwave instabilities. Additionally, beam-ion and CSR (coherent synchrotron radiation) instabilities are also treated. Means to fight against collective effects such as lengthening of the bunch with passive harmonic cavities and bunch-by-bunch transverse feedback are introduced. Numerical codes developed and used to evaluate the machine coupling impedance, as well as to simulate beam instability using the former as inputs, are described.
2015-10-20
Programmable Pulse-Position-Modulation Encoder
NASA Technical Reports Server (NTRS)
Zhu, David; Farr, William
2006-01-01
A programmable pulse-position-modulation (PPM) encoder has been designed for use in testing an optical communication link. The encoder includes a programmable state machine and an electronic code book that can be updated to accommodate different PPM coding schemes. The encoder includes a field-programmable gate array (FPGA) that is programmed to step through the stored state machine and code book and that drives a custom high-speed serializer circuit board that is capable of generating subnanosecond pulses. The stored state machine and code book can be updated by means of a simple text interface through the serial port of a personal computer.
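The slot-mapping idea behind such an encoder can be sketched in a few lines of Python. This is a hypothetical illustration of M-ary PPM with a programmable code book, not the FPGA design described above; the identity code book and the one-pulse-per-frame layout are assumptions for illustration.

```python
def make_codebook(bits_per_symbol):
    """Programmable code book: symbol value -> pulse slot index.
    Here an identity mapping; a real code book could permute slots
    or implement a different PPM coding scheme."""
    return {s: s for s in range(2 ** bits_per_symbol)}

def ppm_encode(data_bits, bits_per_symbol, codebook):
    """Group the bit stream into symbols and emit one frame per symbol:
    a frame of 2**bits_per_symbol slots containing a single pulse whose
    position encodes the symbol. Assumes len(data_bits) is a multiple
    of bits_per_symbol."""
    slots = 2 ** bits_per_symbol
    frames = []
    for i in range(0, len(data_bits), bits_per_symbol):
        chunk = data_bits[i:i + bits_per_symbol]
        symbol = int("".join(map(str, chunk)), 2)
        frame = [0] * slots
        frame[codebook[symbol]] = 1
        frames.append(frame)
    return frames

# 4-ary PPM: bits 01 -> slot 1, bits 10 -> slot 2
book = make_codebook(2)
print(ppm_encode([0, 1, 1, 0], 2, book))  # [[0, 1, 0, 0], [0, 0, 1, 0]]
```

Swapping in a different `make_codebook` is the software analogue of updating the encoder's electronic code book to accommodate a different PPM coding scheme.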
[An update of the diagnostic coding system by the Spanish Society of Pediatric Emergencies].
Benito Fernández, J; Luaces Cubells, C; Gelabert Colomé, G; Anso Borda, I
2015-06-01
The Quality Working Group of the Spanish Society of Pediatric Emergencies (SEUP) presents an update of the diagnostic coding list. The original list was prepared and published in Anales de Pediatría in 2000, based on the ICD-9-CM international coding system current at that time. Following the same methodology used at that time and based on the 2014 edition of the ICD-9-CM, 35 new codes have been added to the list, 15 have been updated, and a list of the most frequent trauma diagnoses in pediatrics has been provided. In the current list of diagnoses, SEUP reflects the significant changes that have taken place in Pediatric Emergency Services in the last decade. Copyright © 2014 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.
75 FR 17644 - Update to NFPA 101, Life Safety Code, for State Home Facilities
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-07
... DEPARTMENT OF VETERANS AFFAIRS 38 CFR Part 51 RIN 2900-AN59 Update to NFPA 101, Life Safety Code... certain provisions of the 2009 edition of the National Fire Protection Association's NFPA 101, Life Safety... standards regarding life safety and fire safety. DATES: Written comments must be received by VA on or before...
Boltzmann Transport Code Update: Parallelization and Integrated Design Updates
NASA Technical Reports Server (NTRS)
Heinbockel, J. H.; Nealy, J. E.; DeAngelis, G.; Feldman, G. A.; Chokshi, S.
2003-01-01
The ongoing effort to develop a web site for radiation analysis is expected to increase usage of the High Charge and Energy Transport Code HZETRN, so the requested calculations should be performed quickly and efficiently. This raised the question: could the implementation of parallel processing speed up the required calculations? To answer this question, two modifications of the HZETRN computer code were created. The first modification used a layered shield of Al(2219), then polyethylene, then Al(2219); the modified Fortran code was labeled 1SSTRN.F. The second modification considered a shield of CO2 and Martian regolith; this modified Fortran code was labeled MARSTRN.F.
DOE Office of Scientific and Technical Information (OSTI.GOV)
K.Y. Ng
2003-08-25
The lecture covers mainly Sections 2.VIII and 3.VII of the book ''Accelerator Physics'' by S.Y. Lee, plus mode-coupling instabilities and the chromaticity-driven head-tail instability. Besides giving more detailed derivations of many equations, simple interpretations of many collective instabilities are included, with the intention that the phenomena can be understood more easily without going into too much mathematics. The notation of Lee's book, as well as the e^{jωt} convention, is followed.
Lee, Ming-Tsung; Vishnyakov, Aleksey; Neimark, Alexander V
2013-09-05
Micelle formation in surfactant solutions is a self-assembly process governed by complex interplay of solvent-mediated interactions between hydrophilic and hydrophobic groups, which are commonly called heads and tails. However, the head-tail repulsion is not the only factor affecting the micelle formation. For the first time, we present a systematic study of the effect of chain rigidity on critical micelle concentration and micelle size, which is performed with the dissipative particle dynamics simulation method. Rigidity of the coarse-grained surfactant molecule was controlled by the harmonic bonds set between the second-neighbor beads. Compared to flexible molecules with the nearest-neighbor bonds being the only type of bonded interactions, rigid molecules exhibited a lower critical micelle concentration and formed larger and better-defined micelles. By varying the strength of head-tail repulsion and the chain rigidity, we constructed two-dimensional diagrams presenting how the critical micelle concentration and aggregation number depend on these parameters. We found that the solutions of flexible and rigid molecules that exhibited approximately the same critical micelle concentration could differ substantially in the micelle size and shape depending on the chain rigidity. With the increase of surfactant concentration, primary micelles of more rigid molecules were found less keen to agglomeration and formation of nonspherical aggregates characteristic of flexible molecules.
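The rigidity control described above, harmonic bonds between second-neighbor beads, can be sketched as an energy term. The spring constant and equilibrium distance below are illustrative assumptions, not parameters from the study.

```python
import math


def harmonic_energy(positions, k, r0, neighbor=2):
    """Total harmonic-bond energy between `neighbor`-th neighbor beads.

    E = 0.5 * k * (r - r0)**2, summed over bead pairs (i, i+neighbor).
    With neighbor=2, a straight chain at spacing r0/2 costs nothing,
    while bending shortens second-neighbor distances and is penalized,
    which is how second-neighbor bonds stiffen a coarse-grained chain.
    """
    total = 0.0
    for i in range(len(positions) - neighbor):
        r = math.dist(positions[i], positions[i + neighbor])
        total += 0.5 * k * (r - r0) ** 2
    return total
```

In a dissipative particle dynamics run, this term would be added to the nearest-neighbor bonded interactions; stronger k makes the molecule behave as a more rigid rod.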
Muller, Sara; Hider, Samantha L; Raza, Karim; Stack, Rebecca J; Hayward, Richard A; Mallen, Christian D
2015-01-01
Objective: Rheumatoid arthritis (RA) is a multisystem inflammatory disorder associated with increased levels of morbidity and mortality. While much research into the condition is conducted in the secondary care setting, routinely collected primary care databases provide an important source of research data. This study aimed to update an algorithm to define RA that was previously developed and validated in the General Practice Research Database (GPRD). Methods: The original algorithm consisted of two criteria; individuals meeting at least one were considered to have RA. Criterion 1: ≥1 RA Read code and a disease-modifying antirheumatic drug (DMARD) without an alternative indication. Criterion 2: ≥2 RA Read codes, with at least one ‘strong’ code and no alternative diagnoses. Lists of codes for consultations and prescriptions were obtained from the authors of the original algorithm where available, or compiled based on the original description and clinical knowledge. 4161 people with a first Read code for RA between 1 January 2010 and 31 December 2012 were selected from the Clinical Practice Research Datalink (CPRD, successor to the GPRD), and the criteria applied. Results: Code lists were updated for the introduction of new Read codes and biological DMARDs. 3577/4161 (86%) of people met the updated algorithm for RA, compared to 61% in the original development study. 62.8% of people fulfilled both Criterion 1 and Criterion 2. Conclusions: Those wishing to define RA in the CPRD should consider using this updated algorithm, rather than a single RA code, if they wish to identify only those who are most likely to have RA. PMID:26700281
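The two-criteria case definition above translates directly into code. This is a minimal sketch of the logic only; the published Read-code and DMARD lists are not reproduced, so all code lists are supplied by the caller and the example values in the usage note are hypothetical.

```python
def meets_ra_definition(patient_codes, patient_drugs,
                        ra_codes, strong_ra_codes,
                        dmards_no_alt_indication, alternative_diagnoses):
    """Apply the two-criteria RA case definition (a sketch).

    Criterion 1: >=1 RA Read code plus a DMARD prescription that has
    no alternative indication.
    Criterion 2: >=2 RA Read codes, at least one 'strong' code, and
    no alternative diagnosis recorded.
    A patient meeting either criterion is classified as having RA.
    """
    ra_hits = [c for c in patient_codes if c in ra_codes]
    criterion1 = bool(ra_hits) and any(
        d in dmards_no_alt_indication for d in patient_drugs)
    criterion2 = (len(ra_hits) >= 2
                  and any(c in strong_ra_codes for c in ra_hits)
                  and not any(c in alternative_diagnoses
                              for c in patient_codes))
    return criterion1 or criterion2
```

For example, with hypothetical lists, a patient with one RA code plus a DMARD satisfies Criterion 1, while a patient with two RA codes (one strong) and no alternative diagnosis satisfies Criterion 2.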
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... for Residential Construction in High Wind Regions. ICC 700: National Green Building Standard The..., coordinated, and necessary to regulate the built environment. Federal agencies frequently use these codes and... International Codes and Standards consist of the following: ICC Codes International Building Code. International...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-16
... for Residential Construction in High Wind Areas. ICC 700: National Green Building Standard. The... Codes and Standards that are comprehensive, coordinated, and necessary to regulate the built environment... International Codes and Standards consist of the following: ICC Codes International Building Code. International...
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This document contains the State Building Energy Codes Status prepared by Pacific Northwest National Laboratory for the U.S. Department of Energy under Contract DE-AC06-76RL01830 and dated September 1996. The U.S. Department of Energy's Office of Codes and Standards has developed this document to provide an information resource for individuals interested in energy efficiency of buildings and the relevant building energy codes in each state and U.S. territory. This is considered to be an evolving document and will be updated twice a year. In addition, special state updates will be issued as warranted.
Integration of a supersonic unsteady aerodynamic code into the NASA FASTEX system
NASA Technical Reports Server (NTRS)
Appa, Kari; Smith, Michael J. C.
1987-01-01
A supersonic unsteady aerodynamic loads prediction method based on the constant pressure method was integrated into the NASA FASTEX system. The updated FASTEX code can be employed for aeroelastic analyses in subsonic and supersonic flow regimes. A brief description of the supersonic constant pressure panel method, as applied to lifting surfaces and body configurations, is followed by a documentation of updates required to incorporate this method in the FASTEX code. Test cases showing correlations of predicted pressure distributions, flutter solutions, and stability derivatives with available data are reported.
Development and verification of NRC's single-rod fuel performance codes FRAPCON-3 and FRAPTRAN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyer, C.E.; Cunningham, M.E.; Lanning, D.D.
1998-03-01
The FRAPCON and FRAP-T code series, developed in the 1970s and early 1980s, are used by the US Nuclear Regulatory Commission (NRC) to predict fuel performance during steady-state and transient power conditions, respectively. Both code series are now being updated by Pacific Northwest National Laboratory to improve their predictive capabilities at high burnup levels. The newest versions of the codes are called FRAPCON-3 and FRAPTRAN. The updates to fuel property and behavior models are focusing on providing best-estimate predictions under steady-state and fast transient power conditions up to extended fuel burnups (> 55 GWd/MTU). Both codes will be assessed against a database independent of the one used for code benchmarking, and an estimate of code predictive uncertainties will be made based on comparisons to the benchmark and independent databases.
Disposal Notifications and Quarterly Membership Updates for the Utility Solid Waste Group Members’ Risk-Based Approvals to Dispose of Polychlorinated Biphenyl (PCB) Remediation Waste Under Title 40 of the Code of Federal Regulations Section 761.61(c)
How unrealistic optimism is maintained in the face of reality.
Sharot, Tali; Korn, Christoph W; Dolan, Raymond J
2011-10-09
Unrealistic optimism is a pervasive human trait that influences domains ranging from personal relationships to politics and finance. How people maintain unrealistic optimism, despite frequently encountering information that challenges those biased beliefs, is unknown. We examined this question and found a marked asymmetry in belief updating. Participants updated their beliefs more in response to information that was better than expected than to information that was worse. This selectivity was mediated by a relative failure to code for errors that should reduce optimism. Distinct regions of the prefrontal cortex tracked estimation errors when those called for positive update, both in individuals who scored high and low on trait optimism. However, highly optimistic individuals exhibited reduced tracking of estimation errors that called for negative update in right inferior prefrontal gyrus. These findings indicate that optimism is tied to a selective update failure and diminished neural coding of undesirable information regarding the future.
Updates to building-code maps for the 2015 NEHRP recommended seismic provisions
Luco, Nicolas; Bachman, Robert; Crouse, C.B; Harris, James R.; Hooper, John D.; Kircher, Charles A.; Caldwell, Phillp; Rukstales, Kenneth S.
2015-01-01
With the 2014 update of the U.S. Geological Survey (USGS) National Seismic Hazard Model (NSHM) as a basis, the Building Seismic Safety Council (BSSC) has updated the earthquake ground motion maps in the National Earthquake Hazards Reduction Program (NEHRP) Recommended Seismic Provisions for New Buildings and Other Structures, with partial funding from the Federal Emergency Management Agency. Anticipated adoption of the updated maps into the American Society of Civil Engineers Minimum Design Loads for Buildings and Other Structures and the International Building and Residential Codes is underway. Relative to the ground motions in the prior edition of each of these documents, most of the updated values are within a ±20% change. The larger changes are, in most cases, due to the USGS NSHM updates, reasons for which are given in companion publications. In some cases, the larger changes are partly due to a BSSC update of the slope of the fragility curve that is used to calculate the risk-targeted ground motions, and/or the introduction by BSSC of a quantitative definition of “active faults” used to calculate deterministic ground motions.
Bumper 3 Update for IADC Protection Manual
NASA Technical Reports Server (NTRS)
Christiansen, Eric L.; Nagy, Kornel; Hyde, Jim
2016-01-01
The Bumper code has been the standard in use by NASA and contractors to perform meteoroid/debris risk assessments since 1990. It has undergone extensive revisions and updates [NASA JSC HITF website; Christiansen et al., 1992, 1997]. NASA Johnson Space Center (JSC) has applied BUMPER to risk assessments for Space Station, Shuttle, Mir, Extravehicular Mobility Units (EMU) space suits, and other spacecraft (e.g., LDEF, Iridium, TDRS, and Hubble Space Telescope). Bumper continues to be updated with changes in the ballistic limit equations describing failure thresholds of various spacecraft components, as well as changes in the meteoroid and debris environment models. Significant efforts are expended to validate Bumper and benchmark it to other meteoroid/debris risk assessment codes. Bumper 3 is a refactored version of Bumper II. The structure of the code was extensively modified to improve maintenance, performance and flexibility. The architecture was changed to separate the frequently updated ballistic limit equations from the relatively stable common core functions of the program. These updates allow NASA to produce specific editions of Bumper 3 that are tailored for specific customer requirements. The core consists of common code necessary to process the Micrometeoroid and Orbital Debris (MMOD) environment models, assess shadowing and calculate MMOD risk. The library of target response subroutines includes a broad range of different types of MMOD shield ballistic limit equations, as well as equations describing damage to various spacecraft subsystems or hardware (thermal protection materials, windows, radiators, solar arrays, cables, etc.). The core and library of ballistic response subroutines are maintained under configuration control. A change in the core will affect all editions of the code, whereas a change in one or more of the response subroutines will affect all editions of the code that contain the particular response subroutines which are modified.
Note that the Bumper II program is no longer maintained or distributed by NASA.
Building a Better Campus: An Update on Building Codes.
ERIC Educational Resources Information Center
Madden, Michael J.
2002-01-01
Discusses the implications for higher education institutions in terms of facility planning, design, construction, and renovation of the move from regionally-developed model-building codes to two international sets of codes. Also addresses the new performance-based design option within the codes. (EV)
Vectorized schemes for conical potential flow using the artificial density method
NASA Technical Reports Server (NTRS)
Bradley, P. F.; Dwoyer, D. L.; South, J. C., Jr.; Keen, J. M.
1984-01-01
A method is developed to determine solutions to the full-potential equation for steady supersonic conical flow using the artificial density method. Various update schemes used generally for transonic potential solutions are investigated. The schemes are compared for speed and robustness. All versions of the computer code have been vectorized and are currently running on the CYBER-203 computer. The update schemes are vectorized, where possible, either fully (explicit schemes) or partially (implicit schemes). Since each version of the code differs only by the update scheme and elements other than the update scheme are completely vectorizable, comparisons of computational effort and convergence rate among schemes are a measure of the specific scheme's performance. Results are presented for circular and elliptical cones at angle of attack for subcritical and supercritical crossflows.
Update of GRASP/Ada reverse engineering tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1993-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) successfully created and prototyped a new algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional pretty-printed Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype CSD generator (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, two update phases were completed. Update'92 focused on the initial analysis of evaluation data collected from software engineering students at Auburn University and the addition of significant enhancements to the user interface. Update'93 (the current update) focused on the statistical analysis of the data collected in the previous update and preparation of Version 3.4 of the prototype for limited distribution to facilitate further evaluation.
The current prototype provides the capability for the user to generate CSD's from Ada PDL or source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical application. An overview of the GRASP/Ada project with an emphasis on the current update is provided.
Reed-Solomon error-correction as a software patch mechanism.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pendley, Kevin D.
This report explores how error-correction data generated by a Reed-Solomon code may be used as a mechanism to apply changes to an existing installed codebase. Using the Reed-Solomon code to generate error-correction data for a changed or updated codebase will allow the error-correction data to be applied to an existing codebase to both validate and introduce changes or updates from some upstream source to the existing installed codebase.
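The patch mechanism above relies on a systematic code: transmit only the parity of the updated data, and the receiver can "correct" its old copy into the new one. As a minimal stand-in for Reed-Solomon (which works symbol-wise over much larger blocks), the sketch below uses a single-error-correcting Hamming(7,4) code to show the same idea; it succeeds when old and new data differ by at most one bit per 4-bit block.

```python
def hamming_parity(d):
    """Parity bits (p1, p2, p4) of a 4-bit data block, Hamming(7,4).

    These three bits are the 'patch': they describe the new data
    well enough to repair a one-bit-different old copy.
    """
    d3, d5, d6, d7 = d
    return (d3 ^ d5 ^ d7, d3 ^ d6 ^ d7, d5 ^ d6 ^ d7)


def hamming_correct(data, parity):
    """Combine (possibly stale) data with fresh parity and decode.

    The syndrome locates at most one flipped bit among positions 1-7;
    flipping it recovers the data block the parity was computed from.
    """
    c = [parity[0], parity[1], data[0], parity[2],
         data[1], data[2], data[3]]          # codeword positions 1..7
    s = ((c[0] ^ c[2] ^ c[4] ^ c[6])
         + 2 * (c[1] ^ c[2] ^ c[5] ^ c[6])
         + 4 * (c[3] ^ c[4] ^ c[5] ^ c[6]))  # syndrome = error position
    if s:
        c[s - 1] ^= 1
    return (c[2], c[4], c[5], c[6])
```

With Reed-Solomon the same scheme tolerates multi-symbol differences per block and simultaneously validates the result, which is what makes it attractive as a patch format.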
Rhodes, Gillian; Nishimura, Mayu; de Heering, Adelaide; Jeffery, Linda; Maurer, Daphne
2017-05-01
Faces are adaptively coded relative to visual norms that are updated by experience, and this adaptive coding is linked to face recognition ability. Here we investigated whether adaptive coding of faces is disrupted in individuals (adolescents and adults) who experience face recognition difficulties following visual deprivation from congenital cataracts in infancy. We measured adaptive coding using face identity aftereffects, where smaller aftereffects indicate less adaptive updating of face-coding mechanisms by experience. We also examined whether the aftereffects increase with adaptor identity strength, consistent with norm-based coding of identity, as in typical populations, or whether they show a different pattern indicating some more fundamental disruption of face-coding mechanisms. Cataract-reversal patients showed significantly smaller face identity aftereffects than did controls (Experiments 1 and 2). However, their aftereffects increased significantly with adaptor strength, consistent with norm-based coding (Experiment 2). Thus we found reduced adaptability but no fundamental disruption of norm-based face-coding mechanisms in cataract-reversal patients. Our results suggest that early visual experience is important for the normal development of adaptive face-coding mechanisms. © 2016 John Wiley & Sons Ltd.
Losing the rose tinted glasses: neural substrates of unbiased belief updating in depression
Garrett, Neil; Sharot, Tali; Faulkner, Paul; Korn, Christoph W.; Roiser, Jonathan P.; Dolan, Raymond J.
2014-01-01
Recent evidence suggests that a state of good mental health is associated with biased processing of information that supports a positively skewed view of the future. Depression, on the other hand, is associated with unbiased processing of such information. Here, we use brain imaging in conjunction with a belief update task administered to clinically depressed patients and healthy controls to characterize brain activity that supports unbiased belief updating in clinically depressed individuals. Our results reveal that unbiased belief updating in depression is mediated by strong neural coding of estimation errors in response to both good news (in left inferior frontal gyrus and bilateral superior frontal gyrus) and bad news (in right inferior parietal lobule and right inferior frontal gyrus) regarding the future. In contrast, intact mental health was linked to a relatively attenuated neural coding of bad news about the future. These findings identify a neural substrate mediating the breakdown of biased updating in major depressive disorder, which may be essential for mental health. PMID:25221492
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-03
... description of comorbidity for chronic renal failure. In addition, we inadvertently omitted from Table 11 the comorbidity code ``V4511'' for chronic renal failure. These changes are not substantive changes to the... heading ``Diagnoses codes,'' for the renal failure, chronic diagnoses codes, replace code ``V451'' with...
2014-01-01
Background: The pediatric complex chronic conditions (CCC) classification system, developed in 2000, requires revision to accommodate the International Classification of Diseases, 10th Revision (ICD-10). To update the CCC classification system, we incorporated ICD-9 diagnostic codes that had been either omitted or incorrectly specified in the original system, and then translated between ICD-9 and ICD-10 using General Equivalence Mappings (GEMs). We further reviewed all codes in the ICD-9 and ICD-10 systems to include both diagnostic and procedural codes indicative of technology dependence or organ transplantation. We applied the provisional CCC version 2 (v2) system to death certificate information and 2 databases of health utilization, reviewed the resulting CCC classifications, and corrected any misclassifications. Finally, we evaluated performance of the CCC v2 system by assessing: 1) the stability of the system between ICD-9 and ICD-10 codes, using data which included both ICD-9 and ICD-10 codes; 2) the year-to-year stability before and after ICD-10 implementation; and 3) the proportions of patients classified as having a CCC in both the v1 and v2 systems. Results: The CCC v2 classification system consists of diagnostic and procedural codes that incorporate a new neonatal CCC category as well as domains of complexity arising from technology dependence or organ transplantation. CCC v2 demonstrated close comparability between ICD-9 and ICD-10 and did not detect significant discontinuity in temporal trends of death in the United States. Compared to the original system, CCC v2 resulted in a 1.0% absolute (10% relative) increase in the number of patients identified as having a CCC in a national hospitalization dataset, and a 0.4% absolute (24% relative) increase in a national emergency department dataset. Conclusions: The updated CCC v2 system is comprehensive and multidimensional, and provides a necessary update to accommodate widespread implementation of ICD-10.
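The GEM-based translation step described above amounts to a one-to-many lookup from ICD-9 codes to candidate ICD-10 codes, with unmapped codes flagged for manual review. The sketch below illustrates that workflow only; the mapping entries shown are placeholders, not verified entries from the actual GEM files.

```python
# Placeholder GEM-style table: each ICD-9 code maps to one or more
# candidate ICD-10 codes. Entries here are illustrative only.
GEM_9_TO_10 = {
    "343.9": ["G80.9"],
    "V45.11": ["Z99.2"],
}


def translate(icd9_codes, gem=GEM_9_TO_10):
    """Map ICD-9 codes to candidate ICD-10 codes via a GEM-style table.

    Returns (mapped, unmapped): mapped is {icd9: [icd10, ...]}, and
    unmapped collects codes with no GEM entry, which in the CCC v2
    workflow would be set aside for manual review and correction.
    """
    mapped, unmapped = {}, []
    for code in icd9_codes:
        if code in gem:
            mapped[code] = gem[code]
        else:
            unmapped.append(code)
    return mapped, unmapped
```

Because GEM translations are not one-to-one, the reviewed-and-corrected step in the abstract is essential; a raw table lookup alone can both over- and under-classify.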
PMID:25102958
Recent Updates to the MELCOR 1.8.2 Code for ITER Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merrill, Brad J
This report documents recent changes made to the MELCOR 1.8.2 computer code for application to the International Thermonuclear Experimental Reactor (ITER), as required by ITER Task Agreement ITA 81-18. There are four areas of change documented by this report. The first is the addition of a model for transporting HTO. The second is the updating of the material oxidation correlations to match those specified in the ITER Safety Analysis Data List (SADL). The third replaces a modification to an aerosol transport subroutine that specified the nominal aerosol density internally with one that allows the user to specify this density through user input. The fourth corrects an error that existed in an air condensation subroutine of previous versions of this modified MELCOR code. The appendices of this report contain FORTRAN listings of the coding for these modifications.
Ethics and the Early Childhood Educator: Using the NAEYC Code. 2005 Code Edition
ERIC Educational Resources Information Center
Freeman, Nancy; Feeney, Stephanie
2005-01-01
With updated language and references to the 2005 revision of the Code of Ethical Conduct, this book, like the NAEYC Code of Ethical Conduct, seeks to inform, not prescribe, answers to tough questions that teachers face as they work with children, families, and colleagues. To help everyone become well acquainted with the Code and use it in one's…
Standard interface files and procedures for reactor physics codes, version III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmichael, B.M.
Standards and procedures for promoting the exchange of reactor physics codes are updated to Version-III status. Standards covering program structure, interface files, file handling subroutines, and card input format are included. The implementation status of the standards in codes and the extension of the standards to new code areas are summarized. (15 references) (auth)
Coding in Stroke and Other Cerebrovascular Diseases.
Korb, Pearce J; Jones, William
2017-02-01
Accurate coding is critical for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of coding principles for patients with strokes and other cerebrovascular diseases and includes an illustrative case as a review of coding principles in a patient with acute stroke.
ON UPGRADING THE NUMERICS IN COMBUSTION CHEMISTRY CODES. (R824970)
A method of updating and reusing legacy FORTRAN codes for combustion simulations is presented using the DAEPACK software package. The procedure is demonstrated on two codes that come with the CHEMKIN-II package, CONP and SENKIN, for the constant-pressure batch reactor simulati...
X-ray emission associated with radio galaxies in the Perseus cluster
NASA Technical Reports Server (NTRS)
Rhee, George; Burns, Jack O.; Kowalski, Michael P.
1994-01-01
In this paper, we report on new x-ray observations of the Perseus cluster made using four separate pointings of the Roentgen Satellite (ROSAT) Position Sensitive Proportional Counter (PSPC). We searched for x-ray emission associated with 16 radio galaxies and detected six above 3 sigma. We made use of the PSPC spectra to determine if the x-ray emission associated with radio galaxies in Perseus is thermal or nonthermal in origin (i.e., hot gas or an active galactic nucleus (AGN)). For the head-tail radio galaxy IC 310, we find that the data are best fit by a power law model with an unusually large spectral index alpha = 2.7. This is consistent with its unresolved spatial structure. On the other hand, a second resolved x-ray source associated with another radio galaxy 2.3 Mpc from the Perseus center (V Zw 331) is best fit by a thermal model. For three sources with insufficient flux for a full spectral analysis, we calculated hardness ratios. On this basis, the x-ray emission associated with the well-known head-tail source NGC 1265 is consistent with thermal radiation. The x-ray spectra of UGC 2608 and UGC 2654 probably arise from hot gas, although very steep power-law spectra (alpha greater than 3.2) are also possible. The spectrum of NGC 1275 is quite complex due to the presence of an AGN and the galaxy's location at the center of a cluster cooling flow.
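The hardness ratio used above for faint sources is a simple count statistic. A minimal sketch follows; the soft/hard band boundaries are instrument-specific and not given in the abstract, so the caller supplies band counts.

```python
def hardness_ratio(soft_counts, hard_counts):
    """X-ray hardness ratio HR = (H - S) / (H + S).

    Used when a source has too few counts for full spectral fitting:
    thermal (soft) sources push HR toward -1, while flat power-law
    sources such as AGN push it toward +1. Band definitions are an
    instrument choice, not specified here.
    """
    total = soft_counts + hard_counts
    if total == 0:
        raise ValueError("no counts in either band")
    return (hard_counts - soft_counts) / total
```

For example, a source with three times as many soft counts as hard counts has HR = -0.5, consistent with a thermal origin.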
An update on the BQCD Hybrid Monte Carlo program
NASA Astrophysics Data System (ADS)
Haar, Taylor Ryan; Nakamura, Yoshifumi; Stüben, Hinnerk
2018-03-01
We present an update of BQCD, our Hybrid Monte Carlo program for simulating lattice QCD. BQCD is one of the main production codes of the QCDSF collaboration and is used by CSSM and in some Japanese finite-temperature and finite-density projects. Since the first publication of the code at Lattice 2010, the program has been extended in various ways. New features of the code include: dynamical QED; action modification in order to compute matrix elements by using the Feynman-Hellmann theorem; more trace measurements (like Tr(D^-n) for κ, c_SW and chemical potential reweighting); a more flexible integration scheme; polynomial filtering; term-splitting for RHMC; and a portable implementation of performance-critical parts employing SIMD.
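At the heart of any Hybrid Monte Carlo trajectory is a reversible, area-preserving integrator, most simply the leapfrog scheme. The sketch below is a generic single-timescale leapfrog for a scalar degree of freedom, not BQCD's production integrator (which supports multiple time scales, filtering, and term-splitting as listed above).

```python
def leapfrog(q, p, grad_u, eps, n_steps):
    """Leapfrog integration of Hamiltonian dynamics for HMC.

    q, p are position and momentum; grad_u is the gradient of the
    potential (in lattice QCD, the force from the action). The
    half-step/full-step structure makes the map exactly reversible,
    which HMC needs for detailed balance.
    """
    p = p - 0.5 * eps * grad_u(q)          # initial half-step in p
    for _ in range(n_steps - 1):
        q = q + eps * p                    # full step in q
        p = p - eps * grad_u(q)            # full step in p
    q = q + eps * p
    p = p - 0.5 * eps * grad_u(q)          # final half-step in p
    return q, p
```

Reversibility can be checked directly: integrating forward, flipping the momentum, and integrating again returns the starting point up to floating-point error.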
NASA Technical Reports Server (NTRS)
Harp, J. L., Jr.; Oatway, T. P.
1975-01-01
A research effort was conducted with the goal of reducing the computer time of a Navier-Stokes computer code for prediction of viscous flow fields about lifting bodies. A two-dimensional, time-dependent, laminar, transonic computer code (STOKES) was modified to incorporate a non-uniform time-step procedure, which updates a zone only as often as required by its own stability criterion or that of its immediate neighbors. In the uniform time-step scheme, each zone is updated as often as required by the least stable zone of the finite-difference mesh. Because program variables are updated less frequently, the non-uniform time-step was expected to reduce execution time by a factor of five to ten. Available funding was exhausted prior to a successful demonstration of the benefits of the non-uniform time-step method.
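The non-uniform time-step idea, advancing each zone on its own clock at the largest locally stable step, can be sketched for a 1-D explicit diffusion update. This is a conceptual sketch only, not the STOKES implementation; in particular it glosses over interpolating neighbor values between their differing local times.

```python
def local_time_step(u, nu, dx, t_end):
    """Advance a 1-D diffusion field with per-zone time steps.

    Each zone i carries its own clock t[i] and takes the largest step
    its explicit stability limit allows (dt <= 0.5*dx**2/nu[i]),
    instead of every zone marching at the global minimum step. The
    most-lagged zone is always updated next.
    """
    n = len(u)
    t = [0.0] * n                           # per-zone clocks
    while min(t) < t_end:
        i = t.index(min(t))                 # most-lagged zone
        dt = min(0.5 * dx * dx / nu[i], t_end - t[i])
        left = u[i - 1] if i > 0 else u[i]          # zero-flux boundary
        right = u[i + 1] if i < n - 1 else u[i]
        u[i] += dt * nu[i] * (left - 2.0 * u[i] + right) / (dx * dx)
        t[i] += dt
    return u
```

When diffusivity varies strongly across the mesh, stiff zones take many small steps while benign zones take few, which is the source of the hoped-for five-to-tenfold saving.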
Updating of visual orientation in a gravity-based reference frame.
Niehof, Nynke; Tramper, Julian J; Doeller, Christian F; Medendorp, W Pieter
2017-10-01
The brain can use multiple reference frames to code line orientation, including head-, object-, and gravity-centered references. If these frames change orientation, their representations must be updated to keep register with actual line orientation. We tested this internal updating during head rotation in roll, exploiting the rod-and-frame effect: The illusory tilt of a vertical line surrounded by a tilted visual frame. If line orientation is stored relative to gravity, these distortions should also affect the updating process. Alternatively, if coding is head- or frame-centered, updating errors should be related to the changes in their orientation. Ten subjects were instructed to memorize the orientation of a briefly flashed line, surrounded by a tilted visual frame, then rotate their head, and subsequently judge the orientation of a second line relative to the memorized first while the frame was upright. Results showed that updating errors were mostly related to the amount of subjective distortion of gravity at both the initial and final head orientation, rather than to the amount of intervening head rotation. In some subjects, a smaller part of the updating error was also related to the change of visual frame orientation. We conclude that the brain relies primarily on a gravity-based reference to remember line orientation during head roll.
Discontinued Codes in The USDA Food and Nutrient Database for Dietary Studies
USDA-ARS?s Scientific Manuscript database
For each new version of the Food and Nutrient Database for Dietary Studies (FNDDS), foods and beverages, portions, and nutrient values are reviewed and updated. New food and beverage codes are added based on changes in consumption and the marketplace; additionally, codes are discontinued. To date,...
Optimization and Control of Burning Plasmas Through High Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pankin, Alexei
This project has revived the FACETS code, which was developed under SciDAC funding in 2008-2012. The code had been dormant for a number of years after the SciDAC funding stopped. FACETS depends on external packages, and the external packages and libraries included in FACETS, such as PETSc, FFTW, HDF5 and NETCDF, have evolved during these years. Some packages in FACETS are also parts of other codes, such as PlasmaState, NUBEAM, GACODES, and UEDGE; these packages have evolved together with their host codes, which include TRANSP, TGYRO and XPTOR. Finally, there is also a set of packages in FACETS that are being developed and maintained by Tech-X, including BILDER, SciMake, and FcioWrappers. Many of these packages evolved significantly during the last several years, and FACETS had to be updated to synchronize with the recent progress in the external packages. The PI has introduced new changes to the BILDER package to support the updated interfaces to the external modules. During the last year of the project, the FACETS version of the UEDGE code was extracted from FACETS as a standalone package. The PI collaborates with scientists from LLNL on the updated UEDGE model in FACETS; Drs. T. Rognlien, M. Umansky and A. Dimits from LLNL are contributing to this task.
76 FR 4113 - Federal Procurement Data System Product Service Code Manual Update
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-24
... reported in the Federal Procurement Data System (FPDS). GSA, which maintains the PSC Manual, is in the... codes as necessary, and adding environmental/sustainability attributes required for reporting to the...
Jones, Lyell K; Ney, John P
2016-12-01
Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.
An update of input instructions to TEMOD
NASA Technical Reports Server (NTRS)
1973-01-01
The theory and operation of a FORTRAN IV computer code, designated TEMOD, used to calculate tubular thermoelectric generator performance, is described in WANL-TME-1906. The original version of TEMOD was developed in 1969. A description is given of additions to the mathematical model and an update of the input instructions to the code. Although the basic mathematical model described in WANL-TME-1906 has remained unchanged, a substantial number of input/output options were added to allow completion of module performance parametrics as required in support of the compact thermoelectric converter system technology program.
Additional development of the XTRAN3S computer program
NASA Technical Reports Server (NTRS)
Borland, C. J.
1989-01-01
Additional developments and enhancements to the XTRAN3S computer program, a code for calculation of steady and unsteady aerodynamics, and associated aeroelastic solutions, for 3-D wings in the transonic flow regime are described. Algorithm improvements for the XTRAN3S program were provided including an implicit finite difference scheme to enhance the allowable time step and vectorization for improved computational efficiency. The code was modified to treat configurations with a fuselage, multiple stores/nacelles/pylons, and winglets. Computer program changes (updates) for error corrections and updates for version control are provided.
Probabilistic Seismic Hazard Assessment for Iraq
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onur, Tuna; Gok, Rengin; Abdulnaby, Wathiq
Probabilistic Seismic Hazard Assessments (PSHA) form the basis for most contemporary seismic provisions in building codes around the world. The current building code of Iraq was published in 1997; an update to this edition is in the process of being released. However, there are no national PSHA studies in Iraq for the new building code to refer to for seismic loading in terms of spectral accelerations. As an interim solution, the new draft building code was considering referring to PSHA results produced in the late 1990s as part of the Global Seismic Hazard Assessment Program (GSHAP; Giardini et al., 1999). However, these results are: a) more than 15 years out of date; b) PGA-based only, necessitating rough conversion factors to calculate the spectral accelerations at 0.3 s and 1.0 s needed for seismic design; and c) at a probability level of 10% chance of exceedance in 50 years, not the 2% that the building code requires. Hence there is a pressing need for a new, updated PSHA for Iraq.
Data Assimilation - Advances and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Brian J.
2014-07-30
This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
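The ensemble Kalman filter update mentioned in the abstract can be illustrated with a minimal sketch (Python/NumPy). This is a generic stochastic EnKF, not the presentation's implementation; all variable names and values are illustrative:

```python
import numpy as np

# Stochastic ensemble Kalman filter update: a generic sketch, not code
# from the presentation. Each ensemble member is shifted toward a
# perturbed copy of the observation by the sample Kalman gain.
def enkf_update(ensemble, obs, H, obs_cov, rng):
    """ensemble: (n_members, n_state); obs: (n_obs,); H: (n_obs, n_state)."""
    n = ensemble.shape[0]
    pred = ensemble @ H.T                       # forecast observations
    X = ensemble - ensemble.mean(axis=0)        # state anomalies
    Y = pred - pred.mean(axis=0)                # predicted-observation anomalies
    K = (X.T @ Y / (n - 1)) @ np.linalg.inv(Y.T @ Y / (n - 1) + obs_cov)
    perturbed = obs + rng.multivariate_normal(np.zeros(len(obs)), obs_cov, n)
    return ensemble + (perturbed - pred) @ K.T  # updated (posterior) ensemble

rng = np.random.default_rng(0)
prior = rng.normal(0.0, 1.0, size=(500, 1))     # prior: N(0, 1), 500 members
post = enkf_update(prior, np.array([1.0]), np.array([[1.0]]), np.array([[0.25]]), rng)
# posterior mean moves toward the observation; posterior spread shrinks
```

For this scalar case the gain is roughly 1/(1 + 0.25) = 0.8, so the posterior mean lands near 0.8, between the prior mean and the observation.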
40 CFR 52.824 - Original identification of plan section.
Code of Federal Regulations, 2014 CFR
2014-07-01
... rules, “Iowa Administrative Code,” effective February 22, 1995. This revision approves new definitions... definition updates. (E) “Iowa Administrative Code,” section 567-31.1, effective February 22, 1995. This rule... Quality and replaced the Iowa air pollution control statute which appeared as Chapter 136B of the Code of...
40 CFR 52.824 - Original identification of plan section.
Code of Federal Regulations, 2011 CFR
2011-07-01
... rules, “Iowa Administrative Code,” effective February 22, 1995. This revision approves new definitions... definition updates. (E) “Iowa Administrative Code,” section 567-31.1, effective February 22, 1995. This rule... Quality and replaced the Iowa air pollution control statute which appeared as Chapter 136B of the Code of...
40 CFR 52.824 - Original identification of plan section.
Code of Federal Regulations, 2013 CFR
2013-07-01
... rules, “Iowa Administrative Code,” effective February 22, 1995. This revision approves new definitions... definition updates. (E) “Iowa Administrative Code,” section 567-31.1, effective February 22, 1995. This rule... Quality and replaced the Iowa air pollution control statute which appeared as Chapter 136B of the Code of...
40 CFR 52.824 - Original identification of plan section.
Code of Federal Regulations, 2012 CFR
2012-07-01
... rules, “Iowa Administrative Code,” effective February 22, 1995. This revision approves new definitions... definition updates. (E) “Iowa Administrative Code,” section 567-31.1, effective February 22, 1995. This rule... Quality and replaced the Iowa air pollution control statute which appeared as Chapter 136B of the Code of...
Hispanics/Latinos & Cardiovascular Disease: Statistical Fact Sheet
Statistical Fact Sheet 2013 Update Hispanics/Latinos & Cardiovascular Diseases Cardiovascular Disease (CVD) (ICD/10 codes I00-I99, Q20-Q28) (ICD/9 codes 390-459, 745-747) Among Mexican-American adults age 20 ...
Predictive codes of familiarity and context during the perceptual learning of facial identities
NASA Astrophysics Data System (ADS)
Apps, Matthew A. J.; Tsakiris, Manos
2013-11-01
Face recognition is a key component of successful social behaviour. However, the computational processes that underpin perceptual learning and recognition as faces transition from unfamiliar to familiar are poorly understood. In predictive coding, learning occurs through prediction errors that update stimulus familiarity, but recognition is a function of both stimulus and contextual familiarity. Here we show that behavioural responses on a two-option face recognition task can be predicted by the level of contextual and facial familiarity in a computational model derived from predictive-coding principles. Using fMRI, we show that activity in the superior temporal sulcus varies with the contextual familiarity in the model, whereas activity in the fusiform face area covaries with the prediction error parameter that updated facial familiarity. Our results characterize the key computations underpinning the perceptual learning of faces, highlighting that the functional properties of face-processing areas conform to the principles of predictive coding.
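The prediction-error updating of familiarity described above follows a standard delta rule, which can be shown in a toy sketch (Python; the learning rate and the 0-to-1 familiarity scale are illustrative assumptions, not parameters from the paper):

```python
# Delta-rule (prediction-error) learning of stimulus familiarity, in the
# spirit of the predictive-coding account above. A toy sketch; alpha and
# the 0-to-1 familiarity scale are illustrative, not the paper's model.
def update_familiarity(familiarity, outcome, alpha=0.3):
    """One step: familiarity moves toward the outcome by alpha * error."""
    prediction_error = outcome - familiarity
    return familiarity + alpha * prediction_error

F = 0.0                      # the face starts fully unfamiliar
for _ in range(10):          # ten exposures; outcome = 1 means "face present"
    F = update_familiarity(F, 1.0)
# the prediction error shrinks with each exposure, so F approaches 1
```

Each exposure reduces the remaining error by a factor (1 - alpha), so after n exposures F = 1 - 0.7**n.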
Artificial Boundary Conditions for Finite Element Model Update and Damage Detection
2017-03-01
ARTIFICIAL BOUNDARY CONDITIONS FOR FINITE ELEMENT MODEL UPDATE AND DAMAGE DETECTION, by Emmanouil Damanakis, March 2017. Master's thesis; thesis advisor: Joshua H. Gordis. Approved for public release; distribution is unlimited. Abstract: In structural engineering, a finite element model is often
Specifying and Verifying the Correctness of Dynamic Software Updates
2011-11-15
... additional branching introduced by update points and the need to analyze the state transformer code. As tools become faster and more effective, our... It shows the effectiveness of merging-based verification on practical examples, including Redis [20], a widely deployed server program.... Gupta's reachability while side-stepping the problem that reachability can leave behavior under-constrained. For example, for the vsftpd update...
78 FR 36738 - Signal System Reporting Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-19
... by updating an outdated statutory citation. DATES: Written comments must be received by August 19... interested parties of the date, time, and location of any such hearing. ADDRESSES: You may submit comments.... Updating U.S. Code Citations in Part 233 Administrative amendments are sometimes necessary to address...
Central Heat Plant Modernization: FY98 Update and Recommendations.
1999-12-01
... Boiler and Pressure Vessel Code suggests an inspection frequency of 12 months for... (HQDA, 28 April 1997). ASME International, Boiler and Pressure Vessel Code (ASME International, New York, NY, 1995). Bloomquist, R.G., J.D. Nimmons, and K...
On the effect of updated MCNP photon cross section data on the simulated response of the HPA TLD.
Eakins, Jonathan
2009-02-01
The relative response of the new Health Protection Agency thermoluminescence dosimeter (TLD) has been calculated for Narrow Series X-ray distribution and (137)Cs photon sources using the Monte Carlo code MCNP5, and the results compared with those obtained during its design stage using the predecessor code, MCNP4c2. The results agreed at intermediate energies (approximately 0.1 MeV to (137)Cs), but differed at low energies (<0.1 MeV) by up to approximately 10%. This disparity has been ascribed to differences in the default photon interaction data used by the two codes, and derives ultimately from the effect on absorbed dose of the recent updates to the photoelectric cross sections. The sources of these data have been reviewed.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-30
... Disorders Fourth Edition--Text Revision. DRGs Diagnosis-related groups. FY Federal fiscal year. ICD-9-CM...) coding and diagnosis-related groups (DRGs) classification changes discussed in the annual update to the... for the following patient-level characteristics: Medicare Severity diagnosis related groups (MS-DRGs...
The Nuremberg Code-A critique.
Ghooi, Ravindra B
2011-04-01
The Nuremberg Code, drafted at the end of the Doctors' Trial in Nuremberg in 1947, has been hailed as a landmark document in medical and research ethics. Close examination of this code reveals that it was based on the Guidelines for Human Experimentation of 1931. The resemblance between these documents is uncanny. It is unfortunate that the authors of the Nuremberg Code passed it off as their original work. There is evidence that the defendants at the trial did request that their actions be judged on the basis of the 1931 Guidelines, in force in Germany. The prosecutors, however, ignored the request and tried the defendants for crimes against humanity, and the judges included the Nuremberg Code as a part of the judgment. Six of the ten principles in the Nuremberg Code are derived from the 1931 Guidelines, and two of the four newly inserted principles are open to misinterpretation. There is little doubt that the Code was prepared after studying the Guidelines, but no reference was made to the Guidelines, for reasons that are not known. Using the Guidelines as a base document without giving due credit is plagiarism; as per our understanding of ethics today, this would be considered unethical. The Nuremberg Code has fallen by the wayside since, unlike the Declaration of Helsinki, it is not regularly reviewed and updated. The regular updating of some ethics codes is evidence of the evolving nature of human ethics.
Build-Up Approach to Updating the Mock Quiet Spike Beam Model
NASA Technical Reports Server (NTRS)
Herrera, Claudia Y.; Pak, Chan-gi
2007-01-01
When a new aircraft is designed or an existing aircraft is modified, its aeroelastic properties should be examined to ensure the aircraft is flightworthy. Evaluating the aeroelastic properties of a new or modified aircraft can include performing a variety of analyses, such as modal and flutter analyses. In order to produce accurate results from these analyses, it is imperative to work with finite element models (FEMs) that have been validated by, or correlated to, ground vibration test (GVT) data. Updating an analytical model using measured data is a challenge in the area of structural dynamics. The analytical model update process encompasses a series of optimizations that match analytical frequencies and mode shapes to the measured modal characteristics of the structure. In the past, the method used to update a model to test data was trial and error. This is an inefficient method: running a modal analysis, comparing the analytical results to the GVT data, manually modifying one or more structural parameters (mass, CG, inertia, area, etc.), rerunning the analysis, and comparing the new analytical modal characteristics to the GVT modal data. If the match is close enough (as defined by the analyst's updating requirements), then the updating process is complete. If the match does not meet the updating requirements, the parameters are changed again and the process is repeated. Clearly, this manual optimization process is highly inefficient for large FEMs and/or a large number of structural parameters. NASA Dryden Flight Research Center (DFRC) has developed, in-house, a Mode Matching Code that automates the above-mentioned optimization process. DFRC's in-house Mode Matching Code reads mode shapes and frequencies acquired from GVT to create the target model. It also reads the current analytical model, as well as the design variables and their upper and lower limits.
It performs a modal analysis on this model and modifies it to create an updated model that has mode shapes and frequencies similar to those of the target model. The Mode Matching Code outputs frequencies and modal assurance criterion (MAC) values that allow for a quantified comparison of the updated model versus the target model. A recent application of this code is the F-15B supersonic flight testing platform: NASA DFRC possesses a modified F-15B that is used as a test bed aircraft for supersonic flight experiments. Traditionally, the finite element model of the test article is generated, and a GVT is done on the test article to validate and update its FEM. This FEM is then mated to the F-15B model, which was correlated to GVT data in the fall of 2004. A GVT is conducted with the test article mated to the aircraft, and this mated F-15B/test article FEM is correlated to this final GVT.
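The modal assurance criterion (MAC) used for these quantified comparisons is a standard formula; a small sketch follows (Python/NumPy; the mode-shape vectors are made up for illustration, and this is not DFRC's in-house code):

```python
import numpy as np

# Modal assurance criterion between an analytical and a test (GVT) mode
# shape: the normalized squared inner product, equal to 1.0 for
# identical (or proportional) shapes and near 0 for unrelated shapes.
def mac(phi_a, phi_t):
    return abs(phi_a @ phi_t) ** 2 / ((phi_a @ phi_a) * (phi_t @ phi_t))

analytical = np.array([1.0, 0.8, 0.3])     # illustrative analytical mode shape
test = np.array([0.98, 0.82, 0.28])        # nearly the same shape from "GVT"
correlation = mac(analytical, test)        # close to 1 for well-matched modes
```

MAC is insensitive to scaling and sign, so a mode shape and any scaled copy of it score exactly 1.0.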
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacFarlane, Joseph J.; Golovkin, I. E.; Woodruff, P. R.
2009-08-07
This Final Report summarizes work performed under DOE STTR Phase II Grant No. DE-FG02-05ER86258 during the project period from August 2006 to August 2009. The project, “Development of Spectral and Atomic Models for Diagnosing Energetic Particle Characteristics in Fast Ignition Experiments,” was led by Prism Computational Sciences (Madison, WI), and involved collaboration with subcontractors University of Nevada-Reno and Voss Scientific (Albuquerque, NM). In this project, we have: Developed and implemented a multi-dimensional, multi-frequency radiation transport model in the LSP hybrid fluid-PIC (particle-in-cell) code [1,2]. Updated the LSP code to support the use of accurate equation-of-state (EOS) tables generated by Prism’s PROPACEOS [3] code to compute more accurate temperatures in high energy density physics (HEDP) plasmas. Updated LSP to support the use of Prism’s multi-frequency opacity tables. Generated equation of state and opacity data for LSP simulations for several materials being used in plasma jet experimental studies. Developed and implemented parallel processing techniques for the radiation physics algorithms in LSP. Benchmarked the new radiation transport and radiation physics algorithms in LSP and compared simulation results with analytic solutions and results from numerical radiation-hydrodynamics calculations. Performed simulations using Prism radiation physics codes to address issues related to radiative cooling and ionization dynamics in plasma jet experiments. Performed simulations to study the effects of radiation transport and radiation losses due to electrode contaminants in plasma jet experiments. Updated the LSP code to generate output using NetCDF to provide a better, more flexible interface to SPECT3D [4] in order to post-process LSP output. Updated the SPECT3D code to better support the post-processing of large-scale 2-D and 3-D datasets generated by simulation codes such as LSP.
Updated atomic physics modeling to provide for more comprehensive and accurate atomic databases that feed into the radiation physics modeling (spectral simulations and opacity tables). Developed polarization spectroscopy modeling techniques suitable for diagnosing energetic particle characteristics in HEDP experiments. A description of these items is provided in this report. The above efforts lay the groundwork for utilizing the LSP and SPECT3D codes in providing simulation support for DOE-sponsored HEDP experiments, such as plasma jet and fast ignition physics experiments. We believe that taken together, the LSP and SPECT3D codes have unique capabilities for advancing our understanding of the physics of these HEDP plasmas. Based on conversations early in this project with our DOE program manager, Dr. Francis Thio, our efforts emphasized developing radiation physics and atomic modeling capabilities that can be utilized in the LSP PIC code, and performing radiation physics studies for plasma jets. A relatively minor component focused on the development of methods to diagnose energetic particle characteristics in short-pulse laser experiments related to fast ignition physics. The period of performance for the grant was extended by one year to August 2009 with a one-year no-cost extension, at the request of subcontractor University of Nevada-Reno.
Correct coding for laboratory procedures during assisted reproductive technology cycles.
2016-04-01
This document provides updated coding information for services related to assisted reproductive technology procedures. This document replaces the 2012 ASRM document of the same name. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
New Mandates and Imperatives in the Revised "ACA Code of Ethics"
ERIC Educational Resources Information Center
Kaplan, David M.; Kocet, Michael M.; Cottone, R. Rocco; Glosoff, Harriet L.; Miranti, Judith G.; Moll, E. Christine; Bloom, John W.; Bringaze, Tammy B.; Herlihy, Barbara; Lee, Courtland C.; Tarvydas, Vilia M.
2009-01-01
The first major revision of the "ACA Code of Ethics" in a decade occurred in late 2005, with the updated edition containing important new mandates and imperatives. This article provides interviews with members of the Ethics Revision Task Force that flesh out seminal changes in the revised "ACA Code of Ethics" in the areas of confidentiality,…
Mothers as Mediators of Cognitive Development: A Coding Manual. Updated.
ERIC Educational Resources Information Center
Friedman, Sarah L.; Sherman, Tracy L.
Coding systems developed for a study of the way mothers influence the cognitive development of their 2- to 4-year-old children are described in this report. The coding systems were developed for the analysis of data recorded on videotapes of 3 mother-child situations: 8 minutes of interaction starting with a reunion between mother and child, 5…
Second Generation Integrated Composite Analyzer (ICAN) Computer Code
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Ginty, Carol A.; Sanfeliz, Jose G.
1993-01-01
This manual updates the original 1986 NASA TP-2515, Integrated Composite Analyzer (ICAN) Users and Programmers Manual. The various enhancements and newly added features are described to enable the user to prepare the appropriate input data to run this updated version of the ICAN code. For reference, the micromechanics equations are provided in an appendix and should be compared to those in the original manual for modifications. A complete output for a sample case is also provided in a separate appendix. The input to the code includes constituent material properties, factors reflecting the fabrication process, and laminate configuration. The code performs micromechanics, macromechanics, and laminate analyses, including the hygrothermal response of polymer-matrix-based fiber composites. The output includes the various ply and composite properties, the composite structural response, and the composite stress analysis results with details on failure. The code is written in FORTRAN 77 and can be used efficiently as a self-contained package (or as a module) in complex structural analysis programs. The input-output format has changed considerably from the original version of ICAN and is described extensively through the use of a sample problem.
Coding update of the SMFM definition of low risk for cesarean delivery from ICD-9-CM to ICD-10-CM.
Armstrong, Joanne; McDermott, Patricia; Saade, George R; Srinivas, Sindhu K
2017-07-01
In 2015, the Society for Maternal-Fetal Medicine developed a low risk for cesarean delivery definition based on administrative claims-based diagnosis codes described by the International Classification of Diseases, Ninth Revision, Clinical Modification. The Society for Maternal-Fetal Medicine definition is a clinical enrichment of 2 available measures from the Joint Commission and the Agency for Healthcare Research and Quality measures. The Society for Maternal-Fetal Medicine measure excludes diagnosis codes that represent clinically relevant risk factors that are absolute or relative contraindications to vaginal birth while retaining diagnosis codes such as labor disorders that are discretionary risk factors for cesarean delivery. The introduction of the International Statistical Classification of Diseases, 10th Revision, Clinical Modification in October 2015 expanded the number of available diagnosis codes and enabled a greater depth and breadth of clinical description. These coding improvements further enhance the clinical validity of the Society for Maternal-Fetal Medicine definition and its potential utility in tracking progress toward the goal of safely lowering the US cesarean delivery rate. This report updates the Society for Maternal-Fetal Medicine definition of low risk for cesarean delivery using International Statistical Classification of Diseases, 10th Revision, Clinical Modification coding. Copyright © 2017. Published by Elsevier Inc.
Building Codes and Regulations.
ERIC Educational Resources Information Center
Fisher, John L.
The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)
NASA Astrophysics Data System (ADS)
Frisoni, Manuela
2017-09-01
ANITA-IEAF is an activation package (code and libraries) developed previously at ENEA-Bologna to assess the activation of materials exposed to neutrons with energies greater than 20 MeV. An updated version of the ANITA-IEAF activation code package has been developed. It is suitable for application to the study of irradiation effects on materials in facilities like the International Fusion Materials Irradiation Facility (IFMIF) and the DEMO Oriented Neutron Source (DONES), in which a considerable number of neutrons with energies above 20 MeV is produced. The present paper summarizes the main characteristics of the updated version of ANITA-IEAF, which is able to use decay and cross section data based on more recent evaluated nuclear data libraries, i.e. the JEFF-3.1.1 Radioactive Decay Data Library and the EAF-2010 neutron activation cross section library. In this paper the validation effort, based on comparison between the code predictions and the activity measurements obtained from the Karlsruhe Isochronous Cyclotron, is presented. In this integral experiment, samples of two different steels, SS-316 and F82H, pure vanadium, and a vanadium alloy, structural materials of interest in fusion technology, were activated in a neutron spectrum similar to the IFMIF neutron field.
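The kind of calculation an activation code performs can be sketched with the textbook single-reaction activation-decay formula (Python; the flux, cross section, and half-life below are illustrative numbers, not data from the ANITA-IEAF validation):

```python
import math

# Textbook single-reaction activation estimate: constant-flux irradiation
# builds up activity toward saturation, then the sample decays during
# cooling. All input values are illustrative, not benchmark data.
def activity_bq(n_atoms, sigma_cm2, flux_cm2_s, t_irr_s, t_cool_s, half_life_s):
    lam = math.log(2) / half_life_s                  # decay constant (1/s)
    rate = n_atoms * sigma_cm2 * flux_cm2_s          # production rate (1/s)
    a_end = rate * (1.0 - math.exp(-lam * t_irr_s))  # activity at end of irradiation
    return a_end * math.exp(-lam * t_cool_s)         # activity after cooling

# 1 barn cross section, one half-life of irradiation, two half-lives of cooling:
a = activity_bq(1e22, 1e-24, 1e14, 3600.0, 7200.0, 3600.0)
# one half-life of irradiation reaches half of saturation (5e11 Bq), and
# two half-lives of cooling quarter that, giving 1.25e11 Bq
```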
Updates to the CMAQ Post Processing and Evaluation Tools for 2016
In the spring of 2016, the evaluation tools distributed with the CMAQ model code were updated and new tools were added to the existing set of tools. Observation data files, compatible with the AMET software, were also made available on the CMAS website for the first time with the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-20
... Medical Program of the Uniformed Services; Calendar Year 2013 TRICARE Young Adult Program Premium Update... Young Adult Premiums for Calendar Year 2013. SUMMARY: This notice provides the updated TRICARE Young... to implement the TRICARE Young Adult (TYA) program as required by Title 10, United States Code...
Code of Federal Regulations, 2010 CFR
2010-10-01
... the methodology and data used to calculate the updated Federal per diem base payment amount. (b)(1... maintain the appropriate outlier percentage. (e) Describe the ICD-9-CM coding changes and DRG... psychiatric facilities for which the fiscal intermediary obtains inaccurate or incomplete data with which to...
77 FR 38717 - Updating Regulations Issued Under the Fair Labor Standards Act
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-29
... DEPARTMENT OF LABOR Wage and Hour Division 29 CFR Parts 531 and 553 Updating Regulations Issued Under the Fair Labor Standards Act CFR Correction In Title 29 of the Code of Federal Regulations, Parts 500 to 899, revised as of July 1, 2011, the following corrections are made: [[Page 38718
Use of Spacecraft Command Language for Advanced Command and Control Applications
NASA Technical Reports Server (NTRS)
Mims, Tikiela L.
2008-01-01
The purpose of this work is to evaluate the use of SCL in building and monitoring command and control applications in order to determine its fitness for space operations. Approximately 24,325 lines of PCG2 code were converted to SCL, yielding a 90% reduction in the number of lines of code, as many of the functions and scripts utilized in SCL could be ported and reused. Automated standalone testing, simulating the actual production environment, was performed in order to generalize and gauge the relative time it takes for SCL to update and write a given display. The use of SCL rules, functions, and scripts allowed the creation of several test cases permitting measurement of the amount of time it takes to update a given set of measurements given the change in a globally existing CUI. It took the SCL system an average of 926.09 ticks to update the entire display of 323 measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William
The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
Discriminative object tracking via sparse representation and online dictionary learning.
Xie, Yuan; Zhang, Wensheng; Li, Cuihua; Lin, Shuyang; Qu, Yanyun; Zhang, Yinghua
2014-04-01
We propose a robust tracking algorithm based on local sparse coding with discriminative dictionary learning and a new keypoint matching scheme. This algorithm consists of two parts: local sparse coding with an online-updated discriminative dictionary for tracking (the SOD part), and keypoint matching refinement for enhancing tracking performance (the KP part). In the SOD part, the local image patches of the target object and background are represented by their sparse codes using an over-complete discriminative dictionary. Such a discriminative dictionary, which encodes information about both the foreground and the background, may provide more discriminative power. Furthermore, in order to adapt the dictionary to variation of the foreground and background during tracking, an online learning method is employed to update the dictionary. The KP part utilizes a refined keypoint matching scheme to improve the performance of the SOD part. With the help of sparse representation and the online-updated discriminative dictionary, the KP part is more robust than traditional methods at rejecting incorrect matches and eliminating outliers. The proposed method is embedded into a Bayesian inference framework for visual tracking. Experimental results on several challenging video sequences demonstrate the effectiveness and robustness of our approach.
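The two ingredients of the SOD part, sparse coding of a patch against a dictionary and an online dictionary update, can be sketched as follows (Python/NumPy; ISTA for the sparse code and a plain gradient step for the dictionary are simplifications, not the authors' exact algorithm, and all sizes and parameters are illustrative):

```python
import numpy as np

# Sketch of local sparse coding plus an online dictionary update, in the
# spirit of the SOD part. ISTA solves the lasso; the dictionary takes one
# gradient step toward the new patch. Simplified, not the paper's method.
def sparse_code(patch, D, lam=0.1, steps=100, lr=0.1):
    """ISTA for min_a ||patch - D a||^2 / 2 + lam * ||a||_1."""
    a = np.zeros(D.shape[1])
    for _ in range(steps):
        a = a - lr * (D.T @ (D @ a - patch))                    # gradient step
        a = np.sign(a) * np.maximum(np.abs(a) - lr * lam, 0.0)  # soft-threshold
    return a

def update_dictionary(D, patch, a, lr=0.05):
    """One online step reducing reconstruction error; atoms renormalized."""
    D = D + lr * np.outer(patch - D @ a, a)
    return D / np.linalg.norm(D, axis=0, keepdims=True)

rng = np.random.default_rng(1)
D = rng.normal(size=(16, 8))                        # 16-pixel patches, 8 atoms
D /= np.linalg.norm(D, axis=0)
patch = 2.0 * D[:, 0] + 0.01 * rng.normal(size=16)  # patch built from atom 0
code = sparse_code(patch, D)                        # code dominated by atom 0
D = update_dictionary(D, patch, code)               # adapt dictionary online
```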
Bohm, Tim D; DeLuca, Paul M; DeWerd, Larry A
2003-04-01
Permanent implantation of low energy (20-40 keV) photon emitting radioactive seeds to treat prostate cancer is an important treatment option for patients. In order to produce accurate implant brachytherapy treatment plans, the dosimetry of a single source must be well characterized. Monte Carlo based transport calculations can be used for source characterization, but must have up to date cross section libraries to produce accurate dosimetry results. This work benchmarks the MCNP code and its photon cross section library for low energy photon brachytherapy applications. In particular, we calculate the emitted photon spectrum, air kerma, depth dose in water, and radial dose function for both 125I and 103Pd based seeds and compare to other published results. Our results show that MCNP's cross section library differs from recent data primarily in the photoelectric cross section for low energies and low atomic number materials. In water, differences as large as 10% in the photoelectric cross section and 6% in the total cross section occur at 125I and 103Pd photon energies. This leads to differences in the dose rate constant of 3% and 5%, and differences as large as 18% and 20% in the radial dose function for the 125I and 103Pd based seeds, respectively. Using a partially updated photon library, calculations of the dose rate constant and radial dose function agree with other published results. Further, the use of the updated photon library allows us to verify air kerma and depth dose in water calculations performed using MCNP's perturbation feature to simulate updated cross sections. We conclude that in order to most effectively use MCNP for low energy photon brachytherapy applications, we must update its cross section library. Following this update, the MCNP code system will be a very effective tool for low energy photon brachytherapy dosimetry applications.
Monte Carlo simulation of Ising models by multispin coding on a vector computer
NASA Astrophysics Data System (ADS)
Wansleben, Stephan; Zabolitzky, John G.; Kalle, Claus
1984-11-01
Rebbi's efficient multispin coding algorithm for Ising models is combined with the use of the vector computer CDC Cyber 205. A speed of 21.2 million updates per second is reached. This is comparable to that obtained by special-purpose computers.
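The essence of multispin coding, packing many spins into one machine word so a single bitwise operation acts on all of them simultaneously, can be sketched as follows. This is an illustrative example only, not the Cyber 205 implementation: the 1-D periodic chain and the word size are assumptions for the demo.

```python
# Pack 64 Ising spins of a 1-D periodic chain into one integer.
# Bit i = 1 means spin up; one XOR compares every neighbour pair at once.
N = 64
MASK = (1 << N) - 1

def energy(spins):
    """E = -J * sum_i s_i s_{i+1} with J = 1, via one rotate, one XOR,
    and a population count."""
    left = ((spins << 1) | (spins >> (N - 1))) & MASK  # cyclic shift by one site
    antiparallel = bin(spins ^ left).count("1")        # 1-bits mark unlike pairs
    return -(N - 2 * antiparallel)                     # parallel minus unlike pairs

all_up = MASK                                      # ferromagnetic ground state
alternating = sum(1 << i for i in range(0, N, 2))  # antiferromagnetic state
```

A full Metropolis sweep in this style builds the flip-acceptance mask with the same kind of bitwise logic, which is what vectorizes so well on machines like the Cyber 205.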
76 FR 64924 - Updating State Residential Building Energy Efficiency Codes
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-19
...) considers high-rise (greater than three stories) multifamily residential buildings and hotel, motel, and..., duplexes, townhouses, row houses, and low-rise multifamily buildings (not greater than three stories) such... pumps as compared to other electric heating technologies, this code change is expected to increase the...
NASA Technical Reports Server (NTRS)
Harper, Warren
1989-01-01
Two electromagnetic scattering codes, NEC-BSC and ESP3, were delivered and installed on a NASA VAX computer for use by Marshall Space Flight Center antenna design personnel. The tasks were to update the existing codes and certain supplementary software, install the codes on a computer to be delivered to the customer, provide the capability for graphic display of the data computed by the codes, and assist the customer in solving specific problems that demonstrate the use of the codes. With the exception of one code revision, all of these tasks were performed.
Recent update of the RPLUS2D/3D codes
NASA Technical Reports Server (NTRS)
Tsai, Y.-L. Peter
1991-01-01
The development of the RPLUS2D/3D codes is summarized. These codes use LU algorithms to solve chemical non-equilibrium flows in a body-fitted coordinate system. The motivation behind their development is the need to numerically predict chemical non-equilibrium flows for the National AeroSpace Plane Program. Recent improvements include a vectorization method, blocking algorithms for geometric flexibility, out-of-core storage for large problems, and an LU-SW/UP combination for CPU-time efficiency and solution quality.
1981-12-01
file.library-unit{.subunit}.SYMAP Statement Map: library-file.library-unit{.subunit}.SMAP Type Map: library-file.library-unit{.subunit}.TMAP The library...generator SYMAP Symbol Map code generator SMAP Updated Statement Map code generator TMAP Type Map code generator A.3.5 The PUNIT Command The PUNIT...Core.Stmtmap) NAME Tmap (Core.Typemap) END Example A-3 Compiler Command Stream for the Code Generator Texas Instruments A-5 Ada Optimizing Compiler
Time-integrated directional detection of dark matter
NASA Astrophysics Data System (ADS)
O'Hare, Ciaran A. J.; Kavanagh, Bradley J.; Green, Anne M.
2017-10-01
The analysis of signals in directional dark matter (DM) detectors typically assumes that the directions of nuclear recoils can be measured in the Galactic rest frame. However, this is not possible with all directional detection technologies. In nuclear emulsions, for example, the recoil events must be detected and measured after the exposure time of the experiment. Unless the entire detector is mounted and rotated with the sidereal day, the recoils cannot be reoriented in the Galactic rest frame. We examine the effect of this "time integration" on the primary goals of directional detection, namely: (1) confirming that the recoils are anisotropic; (2) measuring the median recoil direction to confirm their Galactic origin; and (3) probing below the neutrino floor. We show that after time integration the DM recoil distribution retains a preferred direction and is distinct from that of Solar neutrino-induced recoils. Many of the advantages of directional detection are therefore preserved and it is not crucial to mount and rotate the detector. Rejecting isotropic backgrounds requires a factor of 2 more signal events compared with an experiment with event time information, whereas a factor of 1.5-3 more events are needed to measure a median direction in agreement with the expectation for DM. We also find that there is still effectively no neutrino floor in a time-integrated directional experiment. However to reach a cross section an order of magnitude below the floor, a factor of ˜8 larger exposure is required than with a conventional directional experiment. We also examine how the sensitivity is affected for detectors with only 2D recoil track readout, and/or no head-tail measurement. As for non-time-integrated experiments, 2D readout is not a major disadvantage, though a lack of head-tail sensitivity is.
LOFAR discovery of an ultra-steep radio halo and giant head-tail radio galaxy in Abell 1132
NASA Astrophysics Data System (ADS)
Wilber, A.; Brüggen, M.; Bonafede, A.; Savini, F.; Shimwell, T.; van Weeren, R. J.; Rafferty, D.; Mechev, A. P.; Intema, H.; Andrade-Santos, F.; Clarke, A. O.; Mahony, E. K.; Morganti, R.; Prandoni, I.; Brunetti, G.; Röttgering, H.; Mandal, S.; de Gasperin, F.; Hoeft, M.
2018-01-01
Low-Frequency Array (LOFAR) observations at 144 MHz have revealed large-scale radio sources in the unrelaxed galaxy cluster Abell 1132. The cluster hosts diffuse radio emission on scales of ∼650 kpc near the cluster centre and a head-tail (HT) radio galaxy, extending up to 1 Mpc, south of the cluster centre. The central diffuse radio emission is not seen in NRAO VLA FIRST Survey, Westerbork Northern Sky Survey, nor in C & D array VLA observations at 1.4 GHz, but is detected in our follow-up Giant Meterwave Radio Telescope (GMRT) observations at 325 MHz. Using LOFAR and GMRT data, we determine the spectral index of the central diffuse emission to be α = -1.75 ± 0.19 (S ∝ να). We classify this emission as an ultra-steep spectrum radio halo and discuss the possible implications for the physical origin of radio haloes. The HT radio galaxy shows narrow, collimated emission extending up to 1 Mpc and another 300 kpc of more diffuse, disturbed emission, giving a full projected linear size of 1.3 Mpc - classifying it as a giant radio galaxy (GRG) and making it the longest HT found to date. The head of the GRG coincides with an elliptical galaxy (SDSS J105851.01+564308.5) belonging to Abell 1132. In our LOFAR image, there appears to be a connection between the radio halo and the GRG. The turbulence that may have produced the halo may have also affected the tail of the GRG. In turn, the GRG may have provided seed electrons for the radio halo.
NASA Astrophysics Data System (ADS)
Imai, Rieko; Sugitani, Koji; Miao, Jingqi; Fukuda, Naoya; Watanabe, Makoto; Kusune, Takayoshi; Pickles, Andrew J.
2017-08-01
We carried out near-infrared (IR) observations to examine star formation toward the bright-rimmed cloud SFO 12, whose main exciting star is an O7V star in W5-W. We found a small cluster of six young stellar objects (YSOs) embedded in the head of SFO 12 facing its exciting star, aligned along the direction of incident UV radiation from that star. High-resolution near-IR observations with the Subaru adaptive optics (AO) system revealed that three of the cluster members appear to have circumstellar envelopes, one of which shows an arm-like structure. Our near-IR and L′-band photometry and Spitzer IRAC data suggest that, under our adopted assumptions, the two members at the tip side formed in advance of the other members toward the central part. Our near-IR data and previous studies imply that more YSOs are distributed in the region just outside the cloud head on the side of the main exciting star, but there is little sign of star formation toward the opposite side. We infer that star formation has been occurring sequentially from the exciting-star side toward the central part. Archival far-infrared and CO (J = 3-2) data reveal that, unlike in the optical image, SFO 12 has a head-tail structure aligned with the UV incident direction, suggesting that SFO 12 is affected by strong UV from the main exciting star. We discuss the formation of this head-tail structure and the star formation there by comparison with a radiation-driven implosion (RDI) model.
A new model for the surface arrangement of myosin molecules in tarantula thick filaments.
Offer, G; Knight, P J; Burgess, S A; Alamo, L; Padrón, R
2000-04-28
Three-dimensional reconstructions of the negatively stained thick filaments of tarantula muscle with a resolution of 50 A have previously suggested that the helical tracks of myosin heads are zigzagged, short diagonal ridges being connected by nearly axial links. However, surface views of lower contour levels reveal an additional J-shaped feature approximately the size and shape of a myosin head. We have modelled the surface array of myosin heads on the filaments using as a building block a model of a two-headed regulated myosin molecule in which the regulatory light chains of the two heads together form a compact head-tail junction. Four parameters defining the radius, orientation and rotation of each myosin molecule were varied. In addition, the heads were allowed independently to bend in a plane perpendicular to the coiled-coil tail at three sites, and to tilt with respect to the tail and to twist at one of these sites. After low-pass filtering, models were aligned with the reconstruction, scored by cross-correlation and refined by simulated annealing. Comparison of the geometry of the reconstruction and the distance between domains in the myosin molecule narrowed the choice of models to two main classes. A good match to the reconstruction was obtained with a model in which each ridge is formed from the motor domain of a head pointing to the bare zone together with the head-tail junction of a neighbouring molecule. The heads pointing to the Z-disc intermittently occupy the J-position. Each motor domain interacts with the essential and regulatory light chains of the neighbouring heads. A near-radial spoke in the reconstruction connecting the backbone to one end of the ridge can be identified as the start of the coiled-coil tail. Copyright 2000 Academic Press.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kenney, Jeffrey D. P.; Geha, Marla; Jáchym, Pavel
We present optical imaging and spectroscopy and H I imaging of the Virgo Cluster galaxy IC 3418, which is likely a 'smoking gun' example of the transformation of a dwarf irregular into a dwarf elliptical galaxy by ram pressure stripping. IC 3418 has a spectacular 17 kpc length UV-bright tail comprised of knots, head-tail, and linear stellar features. The only Hα emission arises from a few H II regions in the tail, the brightest of which are at the heads of head-tail UV sources whose tails point toward the galaxy ('fireballs'). Several of the elongated tail sources have Hα peaks outwardly offset by ∼80-150 pc from the UV peaks, suggesting that gas clumps continue to accelerate through ram pressure, leaving behind streams of newly formed stars which have decoupled from the gas. Absorption line strengths, measured from Keck DEIMOS spectra, together with UV colors, show star formation stopped 300 ± 100 Myr ago in the main body, and a strong starburst occurred prior to quenching. While neither Hα nor H I emission is detected in the main body of the galaxy, we have detected 4 × 10^7 M_⊙ of H I from the tail with the Very Large Array. The velocities of tail H II regions, measured from Keck LRIS spectra, extend only a small fraction of the way to the cluster velocity, suggesting that star formation does not happen in more distant parts of the tail. Stars in the outer tail have velocities exceeding the escape speed, but some in the inner tail should fall back into the galaxy, forming halo streams.
The Smith Cloud: surviving a high-speed transit of the Galactic disc
NASA Astrophysics Data System (ADS)
Tepper-García, Thor; Bland-Hawthorn, Joss
2018-02-01
The origin and survival of the Smith high-velocity H I cloud has so far defied explanation. This object has several remarkable properties: (i) its prograde orbit is ≈100 km s^-1 faster than the underlying Galactic rotation; (ii) its total gas mass (≳ 4 × 10^6 M_⊙) exceeds the mass of all other high-velocity clouds (HVCs) outside of the Magellanic Stream; (iii) its head-tail morphology extends to the Galactic H I disc, indicating some sort of interaction. The Smith Cloud's kinetic energy rules out models based on ejection from the disc. We construct a dynamically self-consistent, multi-phase model of the Galaxy with a view to exploring whether the Smith Cloud can be understood in terms of an infalling, compact HVC that has transited the Galactic disc. We show that while a dark-matter (DM) free HVC of sufficient mass and density can reach the disc, it does not survive the transit. The most important ingredient to survival during a transit is a confining DM subhalo around the cloud; radiative gas cooling and high spatial resolution (≲ 10 pc) are also essential. In our model, the cloud develops a head-tail morphology within ∼10 Myr before and after its first disc crossing; after the event, the tail is left behind and accretes on to the disc within ∼400 Myr. In our interpretation, the Smith Cloud corresponds to a gas 'streamer' that detaches, falls back and fades after the DM subhalo, distorted by the disc passage, has moved on. We conclude that subhaloes with M_DM ≲ 10^9 M_⊙ have accreted ∼10^9 M_⊙ of gas into the Galaxy over cosmic time - a small fraction of the total baryon budget.
Peak Performance for Healthy Schools
ERIC Educational Resources Information Center
McKale, Chuck; Townsend, Scott
2012-01-01
Far from the limelight of LEED, Energy Star or Green Globes certifications are the energy codes developed and updated by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) and the International Code Council (ICC) through the support of the Department of Energy (DOE) as minimum guidelines for building envelope,…
Evolution of plastic anisotropy for high-strain-rate computations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiferl, S.K.; Maudlin, P.J.
1994-12-01
A model for anisotropic material strength, and for changes in the anisotropy due to plastic strain, is described. This model has been developed for use in high-rate, explicit, Lagrangian multidimensional continuum-mechanics codes. The model handles anisotropies in single-phase materials, in particular the anisotropies due to crystallographic texture--preferred orientations of the single-crystal grains. Textural anisotropies, and the changes in these anisotropies, depend overwhelmingly on the crystal structure of the material and on the deformation history. The changes, particularly for complex deformations, are not amenable to simple analytical forms. To handle this problem, the material model described here includes a texture code, or micromechanical calculation, coupled to a continuum code. The texture code updates grain orientations as a function of tensor plastic strain, and calculates the yield strength in different directions. A yield function is fitted to these yield points. For each computational cell in the continuum simulation, the texture code tracks a particular set of grain orientations. The orientations will change due to the tensor strain history, and the yield function will change accordingly. Hence, the continuum code supplies a tensor strain to the texture code, and the texture code supplies an updated yield function to the continuum code. Since significant texture changes require relatively large strains--typically, a few percent or more--the texture code is not called very often, and the increase in computer time is not excessive. The model was implemented, using a finite-element continuum code and a texture code specialized for hexagonal-close-packed crystal structures. The results for several uniaxial stress problems and an explosive-forming problem are shown.
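The coupling pattern described above, where the continuum code supplies strain increments and the expensive texture calculation is invoked only after a few percent of accumulated strain, can be sketched schematically. This is a toy illustration of the scheduling logic only; the TextureModel interface, step sizes, and hardening rule are invented for the example and are not from the paper.

```python
class TextureModel:
    """Stand-in for the micromechanical texture code (hypothetical interface)."""
    def __init__(self):
        self.calls = 0
        self.yield_scale = 1.0

    def update(self, accumulated_strain):
        # The real code re-orients grains for the strain history and refits
        # the yield function; here a toy hardening law plays that role.
        self.calls += 1
        self.yield_scale = 1.0 + 0.5 * accumulated_strain
        return self.yield_scale

def run_continuum(n_steps=1000, d_eps=1e-4, retexture_at=0.02):
    """Continuum loop that calls the texture code only after ~2% added strain."""
    tex = TextureModel()
    yield_scale = tex.update(0.0)            # initial fit of the yield function
    total, since_update = 0.0, 0.0
    for _ in range(n_steps):
        total += d_eps                       # continuum supplies strain increment
        since_update += d_eps
        if since_update >= retexture_at:
            yield_scale = tex.update(total)  # texture supplies new yield function
            since_update = 0.0
    return tex.calls, yield_scale
```

The point of the threshold is visible in the counts: a thousand continuum steps trigger only a handful of texture updates, which is why the paper reports the extra computer time as modest.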
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, Maria N.; Salko, Robert K.
Coolant-Boiling in Rod Arrays - Two Fluids (COBRA-TF) is a thermal/hydraulic (T/H) simulation code designed for light water reactor (LWR) vessel analysis. It uses a two-fluid, three-field (i.e., fluid film, fluid drops, and vapor) modeling approach. Both sub-channel and 3D Cartesian forms of the nine conservation equations are available for LWR modeling. The code was originally developed by Pacific Northwest Laboratory in 1980 and has been used and modified by several institutions over the last few decades. COBRA-TF also found use at the Pennsylvania State University (PSU) by the Reactor Dynamics and Fuel Management Group (RDFMG), where it has been improved, updated, and subsequently re-branded as CTF. As part of the improvement process, it was necessary to generate sufficient documentation for the open-source code, which had lacked such material upon being adopted by RDFMG. This document serves mainly as a theory manual for CTF, detailing the many two-phase heat transfer, drag, and important accident scenario models contained in the code as well as the numerical solution process utilized. Coding of the models is also discussed, all with consideration for updates made in transitioning from COBRA-TF to CTF. Further documentation outside this manual is also available at RDFMG, focusing on code input deck generation and source code global variable and module listings.
NASA Astrophysics Data System (ADS)
Duluc, Matthieu; Bardelay, Aurélie; Celik, Cihangir; Heinrichs, Dave; Hopper, Calvin; Jones, Richard; Kim, Soon; Miller, Thomas; Troisne, Marc; Wilson, Chris
2017-09-01
AWE (UK), IRSN (France), LLNL (USA) and ORNL (USA) began a long-term collaborative effort in 2015 to update the nuclear criticality Slide Rule for emergency response to a nuclear criticality accident. This document, published almost 20 years ago, gives order-of-magnitude estimates of key parameters, such as the number of fissions and the neutron and gamma doses, useful to emergency response teams and public authorities. This paper first presents the motivation and long-term objectives for this update, then an overview of the initial configurations for the updated calculations and preliminary results obtained with modern 3D codes.
Peterson, M.D.; Mueller, C.S.
2011-01-01
The USGS National Seismic Hazard Maps are updated about every six years by incorporating newly vetted science on earthquakes and ground motions. The 2008 hazard maps for the central and eastern United States region (CEUS) were updated by using revised New Madrid and Charleston source models, an updated seismicity catalog and an estimate of magnitude uncertainties, a distribution of maximum magnitudes, and several new ground-motion prediction equations. The new models resulted in significant ground-motion changes at 5 Hz and 1 Hz spectral acceleration with 5% damping compared to the 2002 version of the hazard maps. The 2008 maps have now been incorporated into the 2009 NEHRP Recommended Provisions, the 2010 ASCE-7 Standard, and the 2012 International Building Code. The USGS is now planning the next update of the seismic hazard maps, which will be provided to the code committees in December 2013. Science issues that will be considered for introduction into the CEUS maps include: 1) updated recurrence models for New Madrid sources, including new geodetic models and magnitude estimates; 2) new earthquake sources and techniques considered in the 2010 model developed by the nuclear industry; 3) new NGA-East ground-motion models (currently under development); and 4) updated earthquake catalogs. We will hold a regional workshop in late 2011 or early 2012 to discuss these and other issues that will affect the seismic hazard evaluation in the CEUS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee
2016-09-01
This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.
Mutation Update of ARSA and PSAP Genes Causing Metachromatic Leukodystrophy.
Cesani, Martina; Lorioli, Laura; Grossi, Serena; Amico, Giulia; Fumagalli, Francesca; Spiga, Ivana; Filocamo, Mirella; Biffi, Alessandra
2016-01-01
Metachromatic leukodystrophy is a neurodegenerative disorder characterized by progressive demyelination. The disease is caused by variants in the ARSA gene, which codes for the lysosomal enzyme arylsulfatase A, or, more rarely, in the PSAP gene, which codes for the activator protein saposin B. In this Mutation Update, an extensive review of all the ARSA- and PSAP-causative variants published in the literature to date, accounting for a total of 200 ARSA and 10 PSAP allele types, is presented. The detailed ARSA and PSAP variant lists are freely available on the Leiden Online Variation Database (LOVD) platform at http://www.LOVD.nl/ARSA and http://www.LOVD.nl/PSAP, respectively. © 2015 WILEY PERIODICALS, INC.
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
The work performed in the following areas is summarized: (1) a realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package, which includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes; (2) techniques were developed to improve the automated data transfer in the coupled computation method of the code package and to improve its utilization on the Univac-1108 computer system; and (3) the MSFC master data libraries were updated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, Ray Alden; Zou, Ling; Zhao, Haihua
This document summarizes the physical models and mathematical formulations used in the RELAP-7 code. The MOOSE-based RELAP-7 code development is an ongoing effort; the MOOSE framework enables rapid development, and the efforts and results to date demonstrate that the RELAP-7 project is on a path to success. This theory manual documents the main features implemented in the RELAP-7 code. Because the code is under active development, this RELAP-7 Theory Manual will evolve with periodic updates to keep it current with the state of development, implementation, and model additions and revisions.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-07
...] Draft Guidance for Industry: Bar Code Label Requirements-- Questions and Answers (Question 12 Update... Administration (FDA) is announcing the availability of a draft document entitled ``Guidance for Industry: Bar... guidance provides you, manufacturers of a licensed vaccine, with advice concerning compliance with the bar...
76 FR 32989 - Request for Certification of Compliance-Rural Industrialization Loan and Grant Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-07
.../Purpose: The loan, guarantee, or grant application is to obtain financing for infrastructure updates... will be located in Burlington, Iowa. The NAICS industry code for this enterprise is: 311812 (commercial..., Assistant Secretary for Employment and Training. [FR Doc. 2011-13937 Filed 6-6-11; 8:45 am] BILLING CODE...
The Italian Code of Medical Deontology: characterizing features of its 2014 edition.
Conti, Andrea Alberto
2015-09-14
The latest edition of the Italian Code of Medical Deontology was released by the Italian Federation of the Registers of Physicians and Dentists in May 2014 (1). The previous edition of the Code dated back to 2006 (2); it has been integrated and updated by a multi-professional, inter-disciplinary panel involving, besides physicians, representatives of scientific societies and trade unions, jurisconsults and experts in bioethics....
NASA Glenn Steady-State Heat Pipe Code GLENHP: Compilation for 64- and 32-Bit Windows Platforms
NASA Technical Reports Server (NTRS)
Tower, Leonard K.; Geng, Steven M.
2016-01-01
A new version of the NASA Glenn Steady State Heat Pipe Code, designated "GLENHP," is introduced here. This represents an update to the disk operating system (DOS) version LERCHP reported in NASA/TM-2000-209807. The new code operates on 32- and 64-bit Windows-based platforms from within the 32-bit command prompt window. An additional evaporator boundary condition and other features are provided.
Probabilistic seismic hazard zonation for the Cuban building code update
NASA Astrophysics Data System (ADS)
Garcia, J.; Llanes-Buron, C.
2013-05-01
A probabilistic seismic hazard assessment has been performed in response to a revision and update of the Cuban building code (NC-46-99) for earthquake-resistant building construction. The assessment was done according to the standard probabilistic approach (Cornell, 1968), importing the procedures adopted by other nations in revising and updating their national building codes. Problems of earthquake catalogue treatment, attenuation of peak and spectral ground acceleration, and seismic source definition were rigorously analyzed, and a logic-tree approach was used to represent the inevitable uncertainties encountered throughout the seismic hazard estimation process. The seismic zonation proposed here consists of a map of spectral acceleration values for short (0.2 s) and long (1.0 s) periods on rock conditions with a 1642-year return period, considered the maximum credible earthquake (ASCE 07-05). In addition, three other design levels are proposed: a severe earthquake with an 808-year return period, an ordinary earthquake with a 475-year return period, and a minimum earthquake with a 225-year return period. The proposed zonation complies with international standards (IBC-ICC) as well as worldwide trends in this field.
Sum of the Magnitude for Hard Decision Decoding Algorithm Based on Loop Update Detection.
Meng, Jiahui; Zhao, Danfeng; Tian, Hai; Zhang, Liang
2018-01-15
In order to improve the performance of the hard-decision decoding algorithm for non-binary low-density parity-check (LDPC) codes and to reduce decoding complexity, a sum-of-magnitudes hard-decision decoding algorithm based on loop update detection is proposed; this also supports the reliability, stability and high transmission rates required by 5G mobile communication. The algorithm is based on the hard-decision decoding algorithm (HDA) and uses soft information from the channel to calculate reliability, while the sum of the variable nodes' (VN) magnitudes is excluded when computing the reliability of the parity checks. At the same time, the reliability information of the variable nodes is considered and a loop update detection algorithm is introduced. The bits corresponding to the erroneous code word are flipped multiple times, searched in order of decreasing error probability, until the correct code word is found. Simulation results show that one of the improved schemes outperforms the weighted symbol flipping (WSF) algorithm by about 2.2 dB and 2.35 dB, for different field sizes, at a bit error rate (BER) of 10^-5 over an additive white Gaussian noise (AWGN) channel. Furthermore, the average number of decoding iterations is significantly reduced.
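For context, the basic hard-decision bit-flipping idea that such algorithms refine can be illustrated for a binary code. This is a simplified Gallager-style sketch, not the proposed non-binary loop-update scheme; the (7,4) Hamming parity-check matrix is just a convenient small example.

```python
import numpy as np

def bit_flip_decode(H, r, max_iters=20):
    """Hard-decision bit flipping: while the syndrome is nonzero, flip the
    bit that participates in the most unsatisfied parity checks."""
    y = r.copy()
    for _ in range(max_iters):
        syndrome = H @ y % 2
        if not syndrome.any():
            return y, True                 # all parity checks satisfied
        votes = H.T @ syndrome             # unsatisfied-check count per bit
        y[int(np.argmax(votes))] ^= 1      # flip the most-suspect bit
    return y, False

# Parity-check matrix of the (7,4) Hamming code
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
codeword = np.zeros(7, dtype=int)          # the all-zero codeword
received = codeword.copy()
received[2] = 1                            # inject a single bit error
decoded, ok = bit_flip_decode(H, received)
```

The non-binary scheme in the abstract extends this loop with soft channel reliabilities and loop update detection to decide which symbol to flip and when to stop.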
Ning, Shangwei; Yue, Ming; Wang, Peng; Liu, Yue; Zhi, Hui; Zhang, Yan; Zhang, Jizhou; Gao, Yue; Guo, Maoni; Zhou, Dianshuang; Li, Xin; Li, Xia
2017-01-04
We describe LincSNP 2.0 (http://bioinfo.hrbmu.edu.cn/LincSNP), an updated database that is used specifically to store and annotate disease-associated single nucleotide polymorphisms (SNPs) in human long non-coding RNAs (lncRNAs) and their transcription factor binding sites (TFBSs). In LincSNP 2.0, we have updated the database with more data and several new features, including (i) expanding disease-associated SNPs in human lncRNAs; (ii) identifying disease-associated SNPs in lncRNA TFBSs; (iii) updating LD-SNPs from the 1000 Genomes Project; and (iv) collecting more experimentally supported SNP-lncRNA-disease associations. Furthermore, we developed three flexible online tools to retrieve and analyze the data. Linc-Mart is a convenient way for users to customize their own data. Linc-Browse is a tool for all data visualization. Linc-Score predicts the associations between lncRNA and disease. In addition, we provided users a newly designed, user-friendly interface to search and download all the data in LincSNP 2.0 and we also provided an interface to submit novel data into the database. LincSNP 2.0 is a continually updated database and will serve as an important resource for investigating the functions and mechanisms of lncRNAs in human diseases. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Posttest calculation of the PBF LOC-11B and LOC-11C experiments using RELAP4/MOD6. [PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendrix, C.E.
Comparisons between RELAP4/MOD6 Update 4 code calculations and measured experimental data are presented for the PBF LOC-11C and LOC-11B experiments. Independent code verification techniques are now being developed, and this study represents a preliminary effort applying structured criteria for developing computer models, selecting code input, and performing base-run analyses. Where deficiencies are indicated in the base-case representation of the experiment, methods of code and criteria improvement are developed and appropriate recommendations are made.
Photoionization and High Density Gas
NASA Technical Reports Server (NTRS)
Kallman, T.; Bautista, M.; White, Nicholas E. (Technical Monitor)
2002-01-01
We present results of calculations using the XSTAR version 2 computer code. This code is loosely based on the XSTAR v.1 code, which has been available for public use for some time. However, it represents an improvement and update in several major respects, including atomic data, code structure, user interface, and an improved physical description of ionization/excitation. In particular, it is now applicable to high-density situations in which significant excited atomic level populations are likely to occur. We describe the computational techniques and assumptions, and present sample runs with particular emphasis on high-density situations.
2017-04-13
modelling code, a parallel benchmark, and a communication-avoiding version of the QR algorithm. Further, several improvements to the OmpSs model were... movement; and a port of the dynamic load balancing library to OmpSs. Finally, several updates to the tools infrastructure were accomplished, including: an... OmpSs: a basic algorithm on image processing applications, a mini application representative of an ocean modelling code, a parallel benchmark, and a
TEMPUS: Simulating personnel and tasks in a 3-D environment
NASA Technical Reports Server (NTRS)
Badler, N. I.; Korein, J. D.
1985-01-01
The latest TEMPUS installation occurred in March, 1985. Another update is slated for early June, 1985. An updated User's Manual is in preparation and will be delivered approximately mid-June, 1985. NASA JSC has full source code listings and internal documentation for installed software. NASA JSC staff has received instruction in the use of TEMPUS. Telephone consultations have augmented on-site instruction.
ERIC Educational Resources Information Center
Food and Drug Administration (DHHS/PHS), Rockville, MD.
This document provides information, standards, and behavioral objectives for standardization and certification of retail food inspection personnel in the Food and Drug Administration (FDA). The procedures described in the document are based on the FDA Food Code, updated to reflect current Food Code provisions and to include a more refined focus on…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Womeldorff, Geoffrey Alan; Payne, Joshua Estes; Bergen, Benjamin Karl
These are slides for a presentation on PARTISN Research and FleCSI Updates. The following topics are covered: SNAP vs PARTISN, Background Research, Production Code (structural design and changes, kernel design and implementation, lessons learned), NuT IMC Proxy, FleCSI Update (design and lessons learned). It can all be summarized in the following manner: Kokkos was shown to be effective in FY15 in implementing a C++ version of SNAP's kernel. This same methodology was applied to a production IC code, PARTISN. This was a much more complex endeavour than in FY15 for many reasons: a C++ kernel embedded in Fortran, overloading Fortran memory allocations, general language interoperability, and a fully fleshed out production code versus a simplified proxy code. Lessons learned are legion. In no particular order: interoperability between Fortran and C++ was really not that hard, and a useful engineering effort. Tracking down all necessary memory allocations for a kernel in a production code is pretty hard. Modifying a production code to work for more than a handful of use cases is also pretty hard. Figuring out the toolchain that will allow a successful implementation of design decisions is quite hard, if making use of "bleeding edge" design choices. In terms of performance, production code concurrency architecture can be a virtual showstopper, being too complex to easily rewrite and test in a short period of time, or depending on tool features which do not exist yet. Ultimately, while the tools used in this work were not successful in speeding up the production code, they helped to identify how work would be done, and provide requirements to tools.
Gao, Y Nina
2018-04-06
The Resource-Based Relative Value Scale Update Committee (RUC) submits recommended reimbursement values for physician work (wRVUs) under Medicare Part B. The RUC includes rotating representatives from medical specialties. To identify changes in physician reimbursements associated with RUC rotating seat representation. Relative Value Scale Update Committee members 1994-2013; Medicare Part B Relative Value Scale 1994-2013; Physician/Supplier Procedure Summary Master File 2007; Part B National Summary Data File 2000-2011. I match service and procedure codes to specialties using 2007 Medicare billing data. Subsequently, I model wRVUs as a function of RUC rotating committee representation and level of code specialization. An annual RUC rotating seat membership is associated with a statistically significant 3-5 percent increase in Medicare expenditures for codes billed to that specialty. For codes that are performed by a small number of physicians, the association between reimbursement and rotating subspecialty representation is positive, 0.177 (SE = 0.024). For codes that are performed by a large number of physicians, the association is negative, -0.183 (SE = 0.026). Rotating representation on the RUC is correlated with overall reimbursement rates. The resulting differential changes may exacerbate existing reimbursement discrepancies between generalist and specialist practitioners. © Health Research and Educational Trust.
Infant Mortality: Development of a Proposed Update to the Dollfus Classification of Infant Deaths
Dove, Melanie S.; Minnal, Archana; Damesyn, Mark; Curtis, Michael P.
2015-01-01
Objective Identifying infant deaths with common underlying causes and potential intervention points is critical to infant mortality surveillance and the development of prevention strategies. We constructed an International Classification of Diseases 10th Revision (ICD-10) parallel to the Dollfus cause-of-death classification scheme first published in 1990, which organized infant deaths by etiology and their amenability to prevention efforts. Methods Infant death records for 1996, dual-coded to the ICD Ninth Revision (ICD-9) and ICD-10, were obtained from the CDC public-use multiple-cause-of-death file on comparability between ICD-9 and ICD-10. We used the underlying cause of death to group 27,821 infant deaths into the nine categories of the ICD-9-based update to Dollfus' original coding scheme, published by Sowards in 1999. Comparability ratios were computed to measure concordance between ICD versions. Results The Dollfus classification system updated with ICD-10 codes had limited agreement with the 1999 modified classification system. Although prematurity, congenital malformations, Sudden Infant Death Syndrome, and obstetric conditions were the first through fourth most common causes of infant death under both systems, most comparability ratios were significantly different from one system to the other. Conclusion The Dollfus classification system can be adapted for use with ICD-10 codes to create a comprehensive, etiology-based profile of infant deaths. The potential benefits of using Dollfus logic to guide perinatal mortality reduction strategies, particularly to maternal and child health programs and other initiatives focused on improving infant health, warrant further examination of this method's use in perinatal mortality surveillance. PMID:26556935
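Comparability ratios like those used in the study above compare how often a cause group is assigned under each ICD revision for the same set of dual-coded deaths. A minimal sketch of that calculation, with invented category labels and counts (not the study's data):

```python
# Hypothetical illustration of a comparability ratio between ICD revisions:
# for dual-coded death records, CR = (deaths assigned to a cause group under
# ICD-10) / (deaths assigned to the same group under ICD-9). The cause-group
# names and counts below are invented for the example.

def comparability_ratio(records, group):
    """records: list of (icd9_group, icd10_group) pairs for dual-coded deaths."""
    icd9_count = sum(1 for r9, _ in records if r9 == group)
    icd10_count = sum(1 for _, r10 in records if r10 == group)
    if icd9_count == 0:
        raise ValueError("no ICD-9 deaths in this group")
    return icd10_count / icd9_count

records = [
    ("SIDS", "SIDS"), ("SIDS", "SIDS"), ("SIDS", "unexplained"),
    ("prematurity", "prematurity"), ("prematurity", "prematurity"),
]
print(round(comparability_ratio(records, "SIDS"), 2))  # 2 of 3 SIDS deaths retained -> 0.67
```

A ratio near 1.0 indicates the two revisions assign the group at similar rates; ratios significantly different from 1.0 are what the abstract reports between the ICD-9 and ICD-10 versions of the scheme.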
Evaluation of the DRAGON code for VHTR design analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taiwo, T. A.; Kim, T. K.; Nuclear Engineering Division
2006-01-12
This letter report summarizes three activities that were undertaken in FY 2005 to gather information on the DRAGON code and to perform limited evaluations of the code performance when used in the analysis of the Very High Temperature Reactor (VHTR) designs. These activities include: (1) Use of the code to model the fuel elements of the helium-cooled and liquid-salt-cooled VHTR designs. Results were compared to those from another deterministic lattice code (WIMS8) and a Monte Carlo code (MCNP). (2) The preliminary assessment of the nuclear data library currently used with the code and libraries that have been provided by the IAEA WIMS-D4 Library Update Project (WLUP). (3) A DRAGON workshop held to discuss the code capabilities for modeling the VHTR.
Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckers, Koenraad J; McCabe, Kevin
This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal Energy for Production of Heat and electricity (IR) Economically Simulated). GEOPHIRES combines reservoir, wellbore, surface plant and economic models to estimate the capital, and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy of a geothermal plant. The available end-use options are electricity, direct-use heat and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to couple to an external reservoir simulator, updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. An overview of all the updates and two case studies to illustrate the tool's new capabilities are provided in this paper.
Faunt, C.C.; Hanson, R.T.; Martin, P.; Schmid, W.
2011-01-01
California's Central Valley has been one of the most productive agricultural regions in the world for more than 50 years. To better understand the groundwater availability in the valley, the U.S. Geological Survey (USGS) developed the Central Valley hydrologic model (CVHM). Because of recent water-level declines and renewed subsidence, the CVHM is being updated to better simulate the geohydrologic system. The CVHM updates and refinements can be grouped into two general categories: (1) model code changes and (2) data updates. The CVHM updates and refinements will require that the model be recalibrated. The updated CVHM will provide a detailed transient analysis of changes in groundwater availability and flow paths in relation to climatic variability, urbanization, stream flow, and changes in irrigated agricultural practices and crops. The updated CVHM is particularly focused on more accurately simulating the locations and magnitudes of land subsidence. The intent of the updated CVHM is to help scientists better understand the availability and sustainability of water resources and the interaction of groundwater levels with land subsidence. © 2011 ASCE.
AeroDyn V15.04: Design tool for wind and MHK turbines
Murray, Robynne; Hayman, Greg; Jonkman, Jason
2017-04-28
AeroDyn is a time-domain wind and MHK turbine aerodynamics module that can be coupled into the FAST version 8 multi-physics engineering tool to enable aero-elastic simulation of horizontal-axis wind turbines. AeroDyn V15.04 has been updated to include a cavitation check for MHK turbines, and can be driven as a standalone code to compute wind turbine aerodynamic response uncoupled from FAST. Note that while AeroDyn has been updated to v15.04, FAST v8.16 has not yet been updated and still uses AeroDyn v15.03.
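A cavitation check of the kind mentioned above typically compares a local cavitation number against the blade section's minimum pressure coefficient. The sketch below illustrates that general criterion only; it is not AeroDyn's actual source code, and all parameter names and values are illustrative assumptions:

```python
# Hedged sketch of a generic MHK blade-section cavitation check (not AeroDyn's
# implementation): cavitation is predicted when the local cavitation number
# sigma falls below the magnitude of the section's minimum pressure
# coefficient. Densities, pressures, and depths below are example values.

RHO = 1025.0   # seawater density, kg/m^3 (assumed)
G = 9.81       # gravitational acceleration, m/s^2

def cavitation_number(p_atm, p_vapor, depth, v_rel):
    """sigma = (p_atm + rho*g*depth - p_vapor) / (0.5 * rho * v_rel**2)."""
    return (p_atm + RHO * G * depth - p_vapor) / (0.5 * RHO * v_rel**2)

def cavitates(sigma, cp_min):
    """Cavitation predicted when sigma < -cp_min (cp_min is negative)."""
    return sigma < -cp_min

# Example: blade section 10 m deep with 12 m/s relative flow speed.
sigma = cavitation_number(p_atm=101325.0, p_vapor=2339.0, depth=10.0, v_rel=12.0)
print(round(sigma, 2), cavitates(sigma, cp_min=-3.0))
```

A section with a strongly negative minimum pressure coefficient (here an assumed cp_min of -3.0) would be flagged, while a milder suction peak would pass the check.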
2014-08-06
This final rule will update the prospective payment rates for Medicare inpatient hospital services provided by inpatient psychiatric facilities (IPFs). These changes will be applicable to IPF discharges occurring during the fiscal year (FY) beginning October 1, 2014 through September 30, 2015. This final rule will also address implementation of ICD-10-CM and ICD-10-PCS codes; finalize a new methodology for updating the cost of living adjustment (COLA); and finalize new quality measures and reporting requirements under the IPF quality reporting program.
A High Altitude Ionization Structure and Scintillation Model.
1979-02-19
structure and convection model into an existing systems code. The purpose is to update estimates of the scintillation effects of the structuring of the... Fig. 16: Isodensity contours of plasma density at t = 0 sec; the initial distribution for N/N0 is a gaussian in y, centered at y = 12.1 km. ...striations is a task for future work. 5. IMPLEMENTATION INTO AN EXISTING CODE: In any existing systems code that uses RANC phenomenology, there
India: Chronology of Recent Events
2007-02-13
Order Code RS21589, Updated February 13, 2007. India: Chronology of Recent Events. K. Alan Kronstadt, Specialist in Asian Affairs, Foreign Affairs, Defense, and Trade Division. Summary: This report provides a reverse chronology of recent events involving India and India-U.S. relations. Sources include... India-U.S. Relations. This report will be updated regularly. 02/13/07: Commerce Secretary Gutierrez began a two-day visit to New Delhi, where he
Curran, Vernon; Fleet, Lisa; Greene, Melanie
2012-01-01
Resuscitation and life support skills training comprises a significant proportion of continuing education programming for health professionals. The purpose of this study was to explore the perceptions and attitudes of certified resuscitation providers toward the retention of resuscitation skills, regular skills updating, and methods for enhancing retention. A mixed-methods, explanatory study design was undertaken utilizing focus groups and an online survey-questionnaire of rural and urban health care providers. Rural providers reported less experience with real codes and lower abilities across a variety of resuscitation areas. Mock codes, practice with an instructor and a team, self-practice with a mannequin, and e-learning were popular methods for skills updating. Aspects of team performance that were felt to influence resuscitation performance included: discrepancies in skill levels, lack of communication, and team leaders not up to date on their skills. Confidence in resuscitation abilities was greatest after one had recently practiced or participated in an update or an effective debriefing session. Lowest confidence was reported when team members did not work well together, there was no clear leader of the resuscitation code, or if team members did not communicate. The study findings highlight the importance of access to update methods for improving providers' confidence and abilities, and the need for emphasis on teamwork training in resuscitation. An eclectic approach combining methods may be the best strategy for addressing the needs of health professionals across various clinical departments and geographic locales. Copyright © 2012 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on CME, Association for Hospital Medical Education.
2015-02-01
monitoring of veterans with major depressive disorder (MDD) and whether those who are prescribed an antidepressant receive recommended care, we... determined that VA data may underestimate the prevalence of major depressive disorder among veterans and that a lack of training for VA clinicians on... not always appropriately coded encounters with veterans they diagnosed as having MDD, instead using a less specific diagnostic code for "depression
Acquisition Handbook - Update. Comprehensive Approach to Reusable Defensive Software (CARDS)
1994-03-25
designs, and implementation components (source code, test plans, procedures and results, and system/software documentation). This handbook provides a...activities where software components are acquired, evaluated, tested and sometimes modified. In addition to serving as a facility for the acquisition and...systems from such components [1]. Implementation components are at the lowest level and consist of: specifications; detailed designs; code, test
Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and public release
NASA Astrophysics Data System (ADS)
Natale, Giovanni; Popescu, Cristina C.; Tuffs, Richard J.; Clarke, Adam J.; Debattista, Victor P.; Fischera, Jörg; Pasetto, Stefano; Rushton, Mark; Thirlwall, Jordan J.
2017-11-01
We present an extensively updated version of the purely ray-tracing 3D dust radiation transfer code DART-Ray. The new version includes five major upgrades: 1) a series of optimizations for the ray-angular density and the scattered radiation source function; 2) the implementation of several data and task parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust self-heating; 4) the ability to produce surface brightness maps for observers within the models in HEALPix format; 5) the possibility to set the expected numerical accuracy already at the start of the calculation. We tested the updated code with benchmark models where the dust self-heating is not negligible. Furthermore, we performed a study of the extent of the source influence volumes, using galaxy models, which are critical in determining the efficiency of the DART-Ray algorithm. The new code is publicly available, documented for both users and developers, and accompanied by several programmes to create input grids for different model geometries and to import the results of N-body and SPH simulations. These programmes can be easily adapted to different input geometries, and for different dust models or stellar emission libraries.
Aeroacoustic Analysis of Turbofan Noise Generation
NASA Technical Reports Server (NTRS)
Meyer, Harold D.; Envia, Edmane
1996-01-01
This report provides an updated version of analytical documentation for the V072 Rotor Wake/Stator Interaction Code. It presents the theoretical derivation of the equations used in the code and, where necessary, it documents the enhancements and changes made to the original code since its first release. V072 is a package of FORTRAN computer programs which calculate the in-duct acoustic modes excited by a fan/stator stage operating in a subsonic mean flow. Sound is generated by the stator vanes interacting with the mean wakes of the rotor blades. In this updated version, only the tonal noise produced at the blade passing frequency and its harmonics is described. The broadband noise component analysis, which was part of the original report, is not included here. The code provides outputs of modal pressure and power amplitudes generated by the rotor-wake/stator interaction. The rotor/stator stage is modeled as an ensemble of blades and vanes of zero camber and thickness enclosed within an infinite hard-walled annular duct. The amplitude of each propagating mode is computed and summed to obtain the harmonics of sound power flux within the duct for both upstream and downstream propagating modes.
Minucci, Angelo; Moradkhani, Kamran; Hwang, Ming Jing; Zuppi, Cecilia; Giardina, Bruno; Capoluongo, Ettore
2012-03-15
In the present paper we have updated the G6PD mutations database, including all of the most recently discovered G6PD genetic variants. We note that the most recent database was published by Vulliamy et al. [1], who analytically reported 140 G6PD mutations; along with Vulliamy's database, there are two main sites, http://202.120.189.88/mutdb/ and www.LOVD.nl/MR, where almost all G6PD mutations can be found. Compared to the previous mutation reports, in our paper we have included some additional information for each mutation, such as: the secondary structure and the enzyme 3D position involved in the mutation, the creation or abolition of a restriction site (with the enzyme involved), and the conservation score associated with each amino acid position. The mutations reported in the present table have been divided according to the gene region involved (coding and non-coding), and mutations affecting the coding region into: single, multiple (at least two bases involved), and deletion. We note that for the listed mutations reported in italics, the literature does not provide all of the biochemical or bio-molecular information or the research data. Finally, for the "old" mutations, we tried to verify features previously reported and, when subsequently modified, we updated the specific information using the latest literature data. Copyright © 2012 Elsevier Inc. All rights reserved.
Seluge++: A Secure Over-the-Air Programming Scheme in Wireless Sensor Networks
Doroodgar, Farzan; Razzaque, Mohammad Abdur; Isnin, Ismail Fauzi
2014-01-01
Over-the-air dissemination of code updates in wireless sensor networks have been researchers' point of interest in the last few years, and, more importantly, security challenges toward the remote propagation of code updating have occupied the majority of efforts in this context. Many security models have been proposed to establish a balance between the energy consumption and security strength, having their concentration on the constrained nature of wireless sensor network (WSN) nodes. For authentication purposes, most of them have used a Merkle hash tree to avoid using multiple public cryptography operations. These models mostly have assumed an environment in which security has to be at a standard level. Therefore, they have not investigated the tree structure for mission-critical situations in which security has to be at the maximum possible level (e.g., military applications, healthcare). Considering this, we investigate existing security models used in over-the-air dissemination of code updates for possible vulnerabilities, and then, we provide a set of countermeasures, correspondingly named Security Model Requirements. Based on the investigation, we concentrate on Seluge, one of the existing over-the-air programming schemes, and we propose an improved version of it, named Seluge++, which complies with the Security Model Requirements and replaces the use of the inefficient Merkle tree with a novel method. Analytical and simulation results show the improvements in Seluge++ compared to Seluge. PMID:24618781
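The Merkle hash tree that Seluge-style schemes use for packet authentication can be sketched as follows. This is a generic illustration of the data structure, not the Seluge or Seluge++ implementation: code-image packets are hashed into leaves, internal nodes hash their children, and a sensor node can verify any packet against a single signed root instead of checking one signature per packet.

```python
# Generic Merkle hash tree sketch (illustrative; not Seluge/Seluge++ code).

import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_levels(packets):
    """Return the tree as a list of levels, leaves first, root level last."""
    level = [h(p) for p in packets]
    levels = [level]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def proof(levels, index):
    """Sibling hashes needed to recompute the root for leaf `index`."""
    path = []
    for level in levels[:-1]:
        path.append(level[index ^ 1])
        index //= 2
    return path

def verify(packet, index, path, root):
    digest = h(packet)
    for sibling in path:
        digest = h(digest + sibling) if index % 2 == 0 else h(sibling + digest)
        index //= 2
    return digest == root

packets = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]  # power-of-two count, for simplicity
levels = build_levels(packets)
root = levels[-1][0]
print(verify(b"pkt2", 2, proof(levels, 2), root))   # authentic packet
print(verify(b"evil", 2, proof(levels, 2), root))   # tampered packet
```

Only the root must be authenticated by an expensive public-key signature; each packet then needs just log2(n) hash comparisons, which is the efficiency/energy trade-off the abstract refers to.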
Sum of the Magnitude for Hard Decision Decoding Algorithm Based on Loop Update Detection
Meng, Jiahui; Zhao, Danfeng; Tian, Hai; Zhang, Liang
2018-01-01
In order to improve the performance of the hard decision decoding algorithm for non-binary low-density parity check (LDPC) codes and to reduce the complexity of decoding, a sum-of-the-magnitude hard decision decoding algorithm based on loop update detection is proposed. This will also ensure the reliability, stability and high transmission rate of 5G mobile communication. The algorithm is based on the hard decision decoding algorithm (HDA) and uses the soft information from the channel to calculate the reliability, while the sum of the variable nodes' (VN) magnitude is excluded for computing the reliability of the parity checks. At the same time, the reliability information of the variable node is considered and the loop update detection algorithm is introduced. The bit corresponding to the error code word is flipped multiple times, before this is searched in the order of most likely error probability to finally find the correct code word. Simulation results show that the performance of one of the improved schemes is better than the weighted symbol flipping (WSF) algorithm under different hexadecimal numbers by about 2.2 dB and 2.35 dB at the bit error rate (BER) of 10⁻⁵ over an additive white Gaussian noise (AWGN) channel, respectively. Furthermore, the average number of decoding iterations is significantly reduced. PMID:29342963
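As a much-simplified illustration of the flip-and-recheck idea behind symbol-flipping decoders (the paper's algorithm is non-binary and uses channel reliabilities, which this sketch omits), here is Gallager-style bit flipping for a toy binary code:

```python
# Hedged sketch of hard-decision bit-flipping decoding (binary special case;
# not the paper's non-binary algorithm): recompute the syndrome each
# iteration and flip the bit involved in the most failed parity checks.

import numpy as np

# Parity-check matrix of the (7,4) Hamming code, used here only as a stand-in.
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])

def bit_flip_decode(received, H, max_iters=20):
    word = received.copy()
    for _ in range(max_iters):
        syndrome = H @ word % 2
        if not syndrome.any():
            return word                      # all parity checks satisfied
        # count failed checks touching each bit; flip the worst offender
        failures = H.T @ syndrome
        word[np.argmax(failures)] ^= 1
    return word

codeword = np.array([0, 0, 0, 0, 0, 0, 0])  # the all-zero word is a codeword
received = codeword.copy()
received[0] ^= 1                             # inject a single channel error
print(bit_flip_decode(received, H))
```

This naive flip rule happens to correct the injected error here but is not a robust decoder; the WSF and loop-update-detection refinements in the paper exist precisely to weight flips by reliability and to escape the oscillations that plain flipping can fall into.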
Staying Active: Physical Activity and Exercise
2002 CNA Code of Ethics: some recommendations.
Kikuchi, June F
2004-07-01
The Canadian Nurses Association (CNA) recently revised its 1997 Code of Ethics for Registered Nurses to reflect the context within which nurses practise today. Given the unprecedented changes that have taken place within the profession, healthcare and society, it was timely for the CNA to review and revise its Code. But the revisions were relatively minor; important problematic, substantive aspects of the Code were essentially left untouched and persist in the updated 2002 Code. In this paper, three of those aspects are examined and discussed: the 2002 Code's (a) definition of health and well-being, (b) notion of respect and (c) conception of justice. Recommendations are made. It is hoped that these comments will encourage nurse leaders in Canada to initiate discussion of the Code now, in preparation for its next planned revision in 2007.
NEQAIRv14.0 Release Notes: Nonequilibrium and Equilibrium Radiative Transport Spectra Program
NASA Technical Reports Server (NTRS)
Brandis, Aaron Michael; Cruden, Brett A.
2014-01-01
NEQAIR v14.0 is the first parallelized version of NEQAIR. Starting from the last version of the code that went through the internal software release process at NASA Ames (NEQAIR 2008), there have been significant updates to the physics in the code and the computational efficiency. NEQAIR v14.0 supersedes NEQAIR v13.2, v13.1 and the suite of NEQAIR2009 versions. These updates have predominantly been performed by Brett Cruden and Aaron Brandis from ERC Inc at NASA Ames Research Center in 2013 and 2014. A new naming convention is being adopted with this current release. The current and future versions of the code will be named NEQAIR vY.X. The Y will refer to a major release increment. Minor revisions and update releases will involve incrementing X. This is to keep NEQAIR more in line with common software release practices. NEQAIR v14.0 is a standalone software tool for line-by-line spectral computation of radiative intensities and/or radiative heat flux, with one-dimensional transport of radiation. In order to accomplish this, NEQAIR v14.0, as in previous versions, requires the specification of distances (in cm), temperatures (in K) and number densities (in parts/cc) of constituent species along lines of sight. Therefore, it is assumed that flow quantities have been extracted from flow fields computed using other tools, such as CFD codes like DPLR or LAURA, and that lines of sight have been constructed and written out in the format required by NEQAIR v14.0. There are two principal modes for running NEQAIR v14.0. In the first mode NEQAIR v14.0 is used as a tool for creating synthetic spectra of any desired resolution (including convolution with a specified instrument/slit function). The first mode is typically exercised in simulating/interpreting spectroscopic measurements of different sources (e.g. shock tube data, plasma torches, etc.). In the second mode, NEQAIR v14.0 is used as a radiative heat flux prediction tool for flight projects. 
Correspondingly, NEQAIR has also been used to simulate the radiance measured on previous flight missions. This report summarizes the database updates, corrections that have been made to the code, changes to input files, parallelization, the current usage recommendations, including test cases, and an indication of the performance enhancements achieved.
Quality of head injury coding from autopsy reports with AIS © 2005 update 2008.
Schick, Sylvia; Humrich, Anton; Graw, Matthias
2018-02-28
Objective: Coding injuries from autopsy reports of traffic accident victims according to the Abbreviated Injury Scale AIS © 2005 update 2008 [1] is quite time consuming. The suspicion arose that many issues leading to discussion between coder and control reader were based on information required by the AIS that was not documented in the autopsy reports. To quantify this suspicion, we introduced an AIS-detail-indicator (AIS-DI). To each injury in the AIS Codebook, one letter from A to N was assigned, indicating the level of detail. Rules were formulated to ensure repeatable assignments. This scheme was applied to a selection of 149 multiply injured traffic fatalities. The frequencies of "not A" codes were calculated for each body region, and it was analysed why the most detailed level A had not been coded. As a first finding, the results for the head region are presented. 747 AIS head injury codes were found in 137 traffic fatalities, and 60% of these injuries were coded with an AIS-DI of level A. There are three different explanations for codes of AIS-DI "not A": Group 1, "Missing information in autopsy report" (5%); Group 2, "Clinical data required by AIS" (20%); and Group 3, "AIS system determined" (15%). Groups 1 and 2 have consequences for the ISS in 25 cases. Other body regions might perform differently. The AIS-DI can indicate the quality of the underlying data basis and, depending on the aims of different AIS users, it can be a helpful tool for quality checks.
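The AIS-DI bookkeeping described above amounts to tallying detail-level letters per body region and computing the share of "not A" codes. A minimal sketch of that tally, with invented injury codes (not the study's data):

```python
# Hypothetical illustration of the AIS-detail-indicator (AIS-DI) tally the
# abstract describes: each coded injury carries a detail letter A-N, and the
# share of "not A" codes is computed per body region. Data below are invented.

from collections import Counter

injuries = [
    ("head", "A"), ("head", "B"), ("head", "A"), ("head", "C"),
    ("thorax", "A"), ("thorax", "A"),
]

def not_a_share(injuries, region):
    letters = [detail for reg, detail in injuries if reg == region]
    counts = Counter(letters)
    return (len(letters) - counts["A"]) / len(letters)

print(not_a_share(injuries, "head"))  # 2 of 4 head codes are "not A" -> 0.5
```

Grouping the "not A" codes further (missing autopsy information, clinical data required by the AIS, AIS-system-determined) would then be a second tally over the same records.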
Vinje, Kristine Hansen; Phan, Linh Thi Hong; Nguyen, Tuan Thanh; Henjum, Sigrun; Ribe, Lovise Omoijuanfo; Mathisen, Roger
2017-06-01
To review regulations and to perform a media audit of promotion of products under the scope of the International Code of Marketing of Breast-milk Substitutes ('the Code') in South-East Asia. We reviewed national regulations relating to the Code and 800 clips of editorial content, 387 advertisements and 217 Facebook posts from January 2015 to January 2016. We explored the ecological association between regulations and market size, and between the number of advertisements and market size and growth of milk formula. Cambodia, Indonesia, Myanmar, Thailand and Vietnam. Regulations on the child's age for inappropriate marketing of products are all below the Code's updated recommendation of 36 months (i.e. 12 months in Thailand and Indonesia; 24 months in the other three countries) and are voluntary in Thailand. Although the advertisements complied with the national regulations on the age limit, they had content (e.g. stages of milk formula; messages about the benefit; pictures of a child) that confused audiences. Market size and growth of milk formula were positively associated with the number of newborns and the number of advertisements, and were not affected by the current level of implementation of breast-milk substitute laws and regulations. The present media audit reveals inappropriate promotion and insufficient national regulation of products under the scope of the Code in South-East Asia. Strengthened implementation of regulations aligned with the Code's updated recommendation should be part of comprehensive strategies to minimize the harmful effects of advertisements of breast-milk substitutes on maternal and child nutrition and health.
NONCODE v2.0: decoding the non-coding.
He, Shunmin; Liu, Changning; Skogerbø, Geir; Zhao, Haitao; Wang, Jie; Liu, Tao; Bai, Baoyan; Zhao, Yi; Chen, Runsheng
2008-01-01
The NONCODE database is an integrated knowledge database designed for the analysis of non-coding RNAs (ncRNAs). Since NONCODE was first released 3 years ago, the number of known ncRNAs has grown rapidly, and there is growing recognition that ncRNAs play important regulatory roles in most organisms. In the updated version of NONCODE (NONCODE v2.0), the number of collected ncRNAs has reached 206 226, including a wide range of microRNAs, Piwi-interacting RNAs and mRNA-like ncRNAs. The improvements brought to the database include not only new and updated ncRNA data sets, but also an incorporation of BLAST alignment search service and access through our custom UCSC Genome Browser. NONCODE can be found under http://www.noncode.org or http://noncode.bioinfo.org.cn.
Interactive Exploration for Continuously Expanding Neuron Databases.
Li, Zhongyu; Metaxas, Dimitris N; Lu, Aidong; Zhang, Shaoting
2017-02-15
This paper proposes a novel framework to help biologists explore and analyze neurons based on retrieval of data from neuron morphological databases. In recent years, continuously expanding neuron databases have provided a rich source of information for associating neuronal morphologies with their functional properties. We design a coarse-to-fine framework for efficient and effective data retrieval from large-scale neuron databases. At the coarse level, for efficiency at large scale, we employ a binary coding method to compress morphological features into binary codes of tens of bits. Short binary codes allow for real-time similarity searching in Hamming space. Because the neuron databases are continuously expanding, it is inefficient to re-train the binary coding model from scratch when adding new neurons. To solve this problem, we extend binary coding with online updating schemes, which consider only the newly added neurons and update the model on-the-fly, without accessing the whole neuron database. At the fine-grained level, we introduce domain experts/users into the framework, who can give relevance feedback on the binary-coding-based retrieval results. This interactive strategy improves retrieval performance by re-ranking the coarse results, using a new similarity measure that takes the feedback into account. Our framework is validated on more than 17,000 neuron cells, showing promising retrieval accuracy and efficiency. Moreover, we demonstrate its use case in assisting biologists to identify and explore unknown neurons.
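The coarse retrieval stage described above — short binary codes compared in Hamming space — can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the codes here are hand-made toy bit vectors rather than learned hashes of morphological features.

```python
import numpy as np

def hamming_search(query_code, db_codes, k=3):
    """Indices of the k database codes nearest to query_code in Hamming distance."""
    # Count differing bits per database row (equivalent to popcount of XOR).
    dists = np.count_nonzero(db_codes != query_code, axis=1)
    return np.argsort(dists, kind="stable")[:k]

# Toy 16-bit codes for four "neurons" (real codes would come from learned hashing).
db = np.array([[0] * 16,
               [1] * 16,
               [0, 1] * 8,
               [1, 0] * 8], dtype=np.uint8)
query = db[2].copy()
query[0] ^= 1            # query differs from code 2 by a single bit
print(hamming_search(query, db))  # → [2 1 0]
```

Because the distance is a bit count, searching stays fast even for millions of codes, which is what makes this a practical coarse filter before the interactive re-ranking stage.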
Gyrofluid Modeling of Turbulent, Kinetic Physics
NASA Astrophysics Data System (ADS)
Despain, Kate Marie
2011-12-01
Gyrofluid models to describe plasma turbulence combine the advantages of fluid models, such as lower dimensionality and well-developed intuition, with those of gyrokinetic models, such as finite Larmor radius (FLR) effects. This allows gyrofluid models to be more tractable computationally while still capturing much of the physics related to the FLR of the particles. We present a gyrofluid model derived to capture the behavior of slow solar wind turbulence and describe the computer code developed to implement the model. In addition, we describe the modifications we made to a gyrofluid model and code that simulate plasma turbulence in tokamak geometries. Specifically, we describe a nonlinear phase mixing phenomenon, part of the E × B term, that was previously missing from the model. An inherently FLR effect, it plays an important role in predicting turbulent heat flux and diffusivity levels for the plasma. We demonstrate this importance by comparing results from the updated code to studies done previously by gyrofluid and gyrokinetic codes. We further explain what would be necessary to couple the updated gyrofluid code, gryffin, to a turbulent transport code, thus allowing gryffin to play a role in predicting profiles for fusion devices such as ITER and to explore novel fusion configurations. Such a coupling would require the use of Graphical Processing Units (GPUs) to make the modeling process fast enough to be viable. Consequently, we also describe our experience with GPU computing and demonstrate that we are poised to complete a gryffin port to this innovative architecture.
Fiedler, Jan; Baker, Andrew H; Dimmeler, Stefanie; Heymans, Stephane; Mayr, Manuel; Thum, Thomas
2018-05-23
Non-coding RNAs are increasingly recognized not only as regulators of various biological functions but also as targets for a new generation of RNA therapeutics and biomarkers. We hereby review recent insights relating to non-coding RNAs including microRNAs (e.g. miR-126, miR-146a), long non-coding RNAs (e.g. MIR503HG, GATA6-AS, SMILR) and circular RNAs (e.g. cZNF292) and their role in vascular diseases. This includes identification and therapeutic use of hypoxia-regulated non-coding RNAs and endogenous non-coding RNAs that regulate intrinsic smooth muscle cell signalling, age-related non-coding RNAs and non-coding RNAs involved in the regulation of mitochondrial biology and metabolic control. Finally, we discuss non-coding RNA species with biomarker potential.
Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckers, Koenraad J; McCabe, Kevin
This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal energy for Production of Heat and electricity ('IR') Economically Simulated). GEOPHIRES combines engineering models of the reservoir, wellbores, and surface plant facilities of a geothermal plant with an economic model to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to import temperature data (e.g., measured or from a stand-alone reservoir simulator), updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. In this paper, we provide an overview of all the updates and two case studies to illustrate the tool's new capabilities.
Electron-cloud updated simulation results for the PSR, and recent results for the SNS
NASA Astrophysics Data System (ADS)
Pivi, M.; Furman, M. A.
2002-05-01
Recent simulation results for the main features of the electron cloud in the storage ring of the Spallation Neutron Source (SNS) at Oak Ridge, and updated results for the Proton Storage Ring (PSR) at Los Alamos, are presented in this paper. A refined model for the secondary emission process, including the so-called true secondary, rediffused, and backscattered electrons, has recently been included in the electron-cloud code.
76 FR 40844 - Changes to Move Update Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-12
... accuracy standard: a. For computerized lists, Coding Accuracy Support System (CASS)- certified address matching software and current USPS City State Product, within a mailer's computer systems or through an...
New "Risk-Targeted" Seismic Maps Introduced into Building Codes
Luco, Nicholas; Garrett, B.; Hayes, J.
2012-01-01
Throughout most municipalities of the United States, structural engineers design new buildings using the U.S.-focused International Building Code (IBC). Updated editions of the IBC are published every 3 years. The latest edition (2012) contains new "risk-targeted maximum considered earthquake" (MCER) ground motion maps, which are enabling engineers to incorporate a more consistent and better defined level of seismic safety into their building designs.
NASA Astrophysics Data System (ADS)
Ahmed, Ali; Hasan, Rafiq; Pekau, Oscar A.
2016-12-01
Two recent developments have come to the forefront with reference to updating the seismic design provisions of codes: (1) publication of new seismic hazard maps for Canada by the Geological Survey of Canada, and (2) emergence of the concept of a new spectral format, outdating the conventional standardized spectral format. The fourth-generation seismic hazard maps are based on enriched seismic data, enhanced knowledge of regional seismicity, and improved seismic hazard modeling techniques. The new maps are therefore more accurate and need to be incorporated into the Canadian Highway Bridge Design Code (CHBDC) for its next edition, as was done for its building counterpart, the National Building Code of Canada (NBCC). In fact, the code writers expressed similar intentions in the commentary of CHBDC 2006. During the process of updating their codes, the NBCC and the AASHTO Guide Specifications for LRFD Seismic Bridge Design (American Association of State Highway and Transportation Officials, Washington, 2009) lowered the probability level from 10% to 2% and from 10% to 5%, respectively. This study investigates five sets of hazard maps corresponding to 2%, 5% and 10% probability of exceedance in 50 years developed by the GSC. To have a sound statistical inference, 389 Canadian cities are selected. This study shows the implications of the new hazard maps for the design process (i.e., the extent of magnification or reduction of the design forces).
NASA Astrophysics Data System (ADS)
Frisoni, Manuela
2016-03-01
ANITA-2000 is a code package for the activation characterization of materials exposed to neutron irradiation released by ENEA to OECD-NEADB and ORNL-RSICC. The main component of the package is the activation code ANITA-4M that computes the radioactive inventory of a material exposed to neutron irradiation. The code requires the decay data library (file fl1) containing the quantities describing the decay properties of the unstable nuclides and the library (file fl2) containing the gamma ray spectra emitted by the radioactive nuclei. The fl1 and fl2 files of the ANITA-2000 code package, originally based on the evaluated nuclear data library FENDL/D-2.0, were recently updated on the basis of the JEFF-3.1.1 Radioactive Decay Data Library. This paper presents the results of the validation of the new fl1 decay data library through the comparison of the ANITA-4M calculated values with the measured electron and photon decay heats and activities of fusion material samples irradiated at the 14 MeV Frascati Neutron Generator (FNG) of the NEA-Frascati Research Centre. Twelve material samples were considered, namely: Mo, Cu, Hf, Mg, Ni, Cd, Sn, Re, Ti, W, Ag and Al. The ratios between calculated and experimental values (C/E) are shown and discussed in this paper.
10 CFR 602.19 - Records and data.
Code of Federal Regulations, 2011 CFR
2011-01-01
... software used to compile, manage, and analyze data; (2) Define all technical characteristics necessary for reading or processing the records; (3) Define file and record content and codes; (4) Describe update...
10 CFR 602.19 - Records and data.
Code of Federal Regulations, 2010 CFR
2010-01-01
... software used to compile, manage, and analyze data; (2) Define all technical characteristics necessary for reading or processing the records; (3) Define file and record content and codes; (4) Describe update...
10 CFR 602.19 - Records and data.
Code of Federal Regulations, 2012 CFR
2012-01-01
... software used to compile, manage, and analyze data; (2) Define all technical characteristics necessary for reading or processing the records; (3) Define file and record content and codes; (4) Describe update...
10 CFR 602.19 - Records and data.
Code of Federal Regulations, 2014 CFR
2014-01-01
... software used to compile, manage, and analyze data; (2) Define all technical characteristics necessary for reading or processing the records; (3) Define file and record content and codes; (4) Describe update...
10 CFR 602.19 - Records and data.
Code of Federal Regulations, 2013 CFR
2013-01-01
... software used to compile, manage, and analyze data; (2) Define all technical characteristics necessary for reading or processing the records; (3) Define file and record content and codes; (4) Describe update...
Computer program CDCID: an automated quality control program using CDC update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singer, G.L.; Aguilar, F.
1984-04-01
A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG and G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program.
GSE data management system programmers/users manual
NASA Technical Reports Server (NTRS)
Schlagheck, R. A.; Dolerhie, B. D., Jr.; Ghiglieri, F. J.
1974-01-01
The GSE data management system is a computerized program which provides for a central storage source for key data associated with the mechanical ground support equipment (MGSE). Eight major sort modes can be requested by the user. Attributes that are printed automatically with each sort include the GSE end item number, description, class code, functional code, fluid media, use location, design responsibility, weight, cost, quantity, dimensions, and applicable documents. Multiple subsorts are available for the class code, functional code, fluid media, use location, design responsibility, and applicable document categories. These sorts and how to use them are described. The program and GSE data bank may be easily updated and expanded.
Benchmarking GPU and CPU codes for Heisenberg spin glass over-relaxation
NASA Astrophysics Data System (ADS)
Bernaschi, M.; Parisi, G.; Parisi, L.
2011-06-01
We present a set of possible implementations for Graphics Processing Units (GPU) of the over-relaxation technique applied to the 3D Heisenberg spin glass model. The results show that a carefully tuned code can achieve more than 100 Gflops of sustained performance and update a single spin in about 0.6 nanoseconds. A multi-hit technique that exploits the GPU shared memory further reduces this time. Such results are compared with those obtained by means of a highly tuned vector-parallel code on latest-generation multi-core CPUs.
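The over-relaxation move itself can be sketched on a 1D Heisenberg chain with random couplings (a simplification of the 3D lattice in the paper, and not the authors' GPU code): each spin is reflected about its local field, which preserves both the spin length and the energy exactly, as the demo checks.

```python
import numpy as np

def overrelax_sweep(spins, J):
    """One sequential over-relaxation sweep on a periodic 1D Heisenberg chain.

    Each spin is reflected about its local field h; this conserves the spin
    length and the energy (the move is microcanonical) while decorrelating
    the configuration.
    """
    n = len(spins)
    for i in range(n):
        # Local field from the two chain neighbours (J[i] couples sites i and i+1).
        h = J[i - 1] * spins[i - 1] + J[i] * spins[(i + 1) % n]
        spins[i] = 2.0 * spins[i].dot(h) / h.dot(h) * h - spins[i]

def energy(spins, J):
    n = len(spins)
    return -sum(J[i] * spins[i].dot(spins[(i + 1) % n]) for i in range(n))

rng = np.random.default_rng(1)
n = 32
spins = rng.normal(size=(n, 3))
spins /= np.linalg.norm(spins, axis=1, keepdims=True)  # unit-length spins
J = rng.choice([-1.0, 1.0], size=n)                    # random +/-J bonds (spin glass)
e0 = energy(spins, J)
overrelax_sweep(spins, J)
print(abs(energy(spins, J) - e0) < 1e-9)  # energy conserved by the reflection
```

Because every site update is independent arithmetic on its neighbours, the move parallelizes naturally over sublattices, which is what makes it attractive for GPUs.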
NASA Astrophysics Data System (ADS)
Price-Whelan, Adrian M.
2016-01-01
Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
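In the spirit of the abstract, here is a minimal, self-contained example of Python tests that run either under pytest or as a plain script; the `rolling_mean` function is invented purely for illustration.

```python
# A tiny function and its tests, runnable with `pytest` or plain `python`.
def rolling_mean(xs, window):
    """Mean over each sliding window of length `window`."""
    if window <= 0 or window > len(xs):
        raise ValueError("window must be in 1..len(xs)")
    return [sum(xs[i:i + window]) / window for i in range(len(xs) - window + 1)]

def test_basic():
    assert rolling_mean([1, 2, 3, 4], 2) == [1.5, 2.5, 3.5]

def test_window_of_one_is_identity():
    assert rolling_mean([5, 7], 1) == [5.0, 7.0]

def test_bad_window_raises():
    try:
        rolling_mean([1, 2], 0)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError")

if __name__ == "__main__":
    test_basic()
    test_window_of_one_is_identity()
    test_bad_window_raises()
    print("all tests passed")
```

Tests like these double as executable documentation: when you later update or refactor `rolling_mean`, a single test run tells you whether it still behaves as specified.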
ACON: a multipurpose production controller for plasma physics codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snell, C.
1983-01-01
ACON is a BCON controller designed to run large production codes on the CTSS Cray-1 or the LTSS 7600 computers. ACON can also be operated interactively, with input from the user's terminal. The controller can run one code or a sequence of up to ten codes during the same job. Options are available to get and save Mass storage files, to perform Historian file updating operations, to compile and load source files, and to send out print and film files. Special features include ability to retry after Mass failures, backup options for saving files, startup messages for the various codes,more » and ability to reserve specified amounts of computer time after successive code runs. ACON's flexibility and power make it useful for running a number of different production codes.« less
JSPAM: A restricted three-body code for simulating interacting galaxies
NASA Astrophysics Data System (ADS)
Wallin, J. F.; Holincheck, A. J.; Harvey, A.
2016-07-01
Restricted three-body codes have a proven ability to recreate much of the disturbed morphology of actual interacting galaxies. As more sophisticated n-body models were developed and computer speed increased, restricted three-body codes fell out of favor. However, their supporting role for performing wide searches of parameter space when fitting orbits to real systems demonstrates a continuing need for their use. Here we present the model and algorithm used in the JSPAM code. A precursor of this code was originally described in 1990, and was called SPAM. We have recently updated the software with an alternate potential and a treatment of dynamical friction to more closely mimic the results from n-body tree codes. The code is released publicly for use under the terms of the Academic Free License ("AFL") v. 3.0 and has been added to the Astrophysics Source Code Library.
2014-09-11
This final rule introduces regulatory flexibilities and general improvements for certification to the 2014 Edition EHR certification criteria (2014 Edition). It also codifies a few revisions and updates to the ONC HIT Certification Program for certification to the 2014 Edition and future editions of certification criteria as well as makes administrative updates to the Code of Federal Regulations.
77 FR 546 - Adjustment of Nationwide Significant Risk Threshold
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-05
...In accordance with Appendix D to Title 49 Code of Federal Regulations (CFR) Part 222, Use of Locomotive Horns at Highway-Rail Grade Crossings, FRA is updating the Nationwide Significant Risk Threshold (NSRT). This action is needed to ensure that the public has the proper threshold of permissible risk for calculating quiet zones established in relationship to the NSRT. This is the fifth update to the NSRT, which has fallen from 14,007 to 13,722.
75 FR 82136 - Adjustment of Nationwide Significant Risk Threshold
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-29
...In accordance with Appendix D to Title 49 Code of Federal Regulations (CFR) Part 222, Use of Locomotive Horns at Highway-Rail Grade Crossings, FRA is updating the Nationwide Significant Risk Threshold (NSRT). This action is needed to ensure that the public has the proper threshold of permissible risk for calculating quiet zones established in relationship to the NSRT. This is the fourth update to the NSRT, which has fallen from 18,775 to 14,007.
78 FR 70623 - Adjustment of Nationwide Significant Risk Threshold
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
...In accordance with appendix D to title 49 Code of Federal Regulations (CFR) part 222, Use of Locomotive Horns at Public Highway-Rail Grade Crossings, FRA is updating the Nationwide Significant Risk Threshold (NSRT). This action is needed to ensure that the public has the proper threshold of permissible risk for calculating quiet zones established in relationship to the NSRT. This is the sixth update to the NSRT, which is increasing from 13,722 to 14,347.
Closure Plan for the Area 5 Radioactive Waste Management Site at the Nevada Test Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
NSTec Environmental Management
The Area 5 Radioactive Waste Management Site (RWMS) at the Nevada Test Site (NTS) is managed and operated by National Security Technologies, LLC (NSTec), for the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO). This document is the first update of the preliminary closure plan for the Area 5 RWMS at the NTS that was presented in the Integrated Closure and Monitoring Plan (DOE, 2005a). The major updates to the plan include a new closure schedule, updated closure inventory, updated site and facility characterization data, the Title II engineering cover design, and the closure process for the 92-Acre Area of the RWMS. The format and content of this site-specific plan follows the Format and Content Guide for U.S. Department of Energy Low-Level Waste Disposal Facility Closure Plans (DOE, 1999a). This interim closure plan meets closure and post-closure monitoring requirements of the order DOE O 435.1, manual DOE M 435.1-1, Title 40 Code of Federal Regulations (CFR) Part 191, 40 CFR 265, Nevada Administrative Code (NAC) 444.743, and Resource Conservation and Recovery Act (RCRA) requirements as incorporated into NAC 444.8632. The Area 5 RWMS accepts primarily packaged low-level waste (LLW), low-level mixed waste (LLMW), and asbestiform low-level waste (ALLW) for disposal in excavated disposal cells.
Tensor Dictionary Learning for Positive Definite Matrices.
Sivalingam, Ravishankar; Boley, Daniel; Morellas, Vassilios; Papanikolopoulos, Nikolaos
2015-11-01
Sparse models have proven to be extremely successful in image processing and computer vision. However, a majority of the effort has been focused on sparse representation of vectors and low-rank models for general matrices. The success of sparse modeling, along with popularity of region covariances, has inspired the development of sparse coding approaches for these positive definite descriptors. While in earlier work, the dictionary was formed from all, or a random subset of, the training signals, it is clearly advantageous to learn a concise dictionary from the entire training set. In this paper, we propose a novel approach for dictionary learning over positive definite matrices. The dictionary is learned by alternating minimization between sparse coding and dictionary update stages, and different atom update methods are described. A discriminative version of the dictionary learning approach is also proposed, which simultaneously learns dictionaries for different classes in classification or clustering. Experimental results demonstrate the advantage of learning dictionaries from data both from reconstruction and classification viewpoints. Finally, a software library is presented comprising C++ binaries for all the positive definite sparse coding and dictionary learning approaches presented here.
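The alternation between a sparse coding stage and a dictionary update stage that the abstract describes can be illustrated on ordinary vector data (the paper itself works with positive definite matrix descriptors, which this sketch does not attempt). Here greedy OMP-style coding alternates with a MOD-style least-squares dictionary update; all names and parameters are illustrative.

```python
import numpy as np

def sparse_code(D, x, k):
    """Greedy (OMP-style) coding: pick up to k atoms of D to reconstruct x."""
    resid, idx = x.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(D.T @ resid))))   # most correlated atom
        sub = D[:, idx]
        coef, *_ = np.linalg.lstsq(sub, x, rcond=None)    # refit on chosen atoms
        resid = x - sub @ coef
    c = np.zeros(D.shape[1])
    c[idx] = coef
    return c

def dictionary_learning(X, n_atoms, k=2, iters=20, seed=0):
    """Alternate a sparse-coding stage with a MOD dictionary update."""
    rng = np.random.default_rng(seed)
    D = rng.normal(size=(X.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(iters):
        C = np.column_stack([sparse_code(D, x, k) for x in X.T])  # coding stage
        D = X @ np.linalg.pinv(C)                                 # dictionary update (MOD)
        D /= np.linalg.norm(D, axis=0) + 1e-12                    # renormalize atoms
    C = np.column_stack([sparse_code(D, x, k) for x in X.T])      # final coding pass
    return D, C

# Synthetic signals built as sparse combinations of 5 hidden atoms.
rng = np.random.default_rng(3)
true_D = rng.normal(size=(8, 5))
true_D /= np.linalg.norm(true_D, axis=0)
C_true = np.where(rng.random((5, 100)) < 0.3, rng.normal(size=(5, 100)), 0.0)
X = true_D @ C_true
D, C = dictionary_learning(X, n_atoms=5, k=2)
err = np.linalg.norm(X - D @ C) / np.linalg.norm(X)
print(round(err, 3))
```

Each half-step solves a tractable subproblem (sparse coding with the dictionary fixed, then a least-squares dictionary fit with the codes fixed), which is the same alternating structure the paper extends to positive definite descriptors.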
Modification and benchmarking of MCNP for low-energy tungsten spectra.
Mercier, J R; Kopp, D T; McDavid, W D; Dove, S B; Lancaster, J L; Tucker, D M
2000-12-01
The MCNP Monte Carlo radiation transport code was modified for diagnostic medical physics applications. In particular, the modified code was thoroughly benchmarked for the production of polychromatic tungsten x-ray spectra in the 30-150 kV range. Validating the modified code for coupled electron-photon transport with benchmark spectra was supplemented with independent electron-only and photon-only transport benchmarks. Major revisions to the code included the proper treatment of characteristic K x-ray production and scoring, new impact ionization cross sections, and new bremsstrahlung cross sections. Minor revisions included updated photon cross sections, electron-electron bremsstrahlung production, and K x-ray yield. The modified MCNP code is benchmarked to electron backscatter factors, x-ray spectra production, and primary and scatter photon transport.
NASA Technical Reports Server (NTRS)
Sandor, Aniko; Moses, Haifa
2016-01-01
Speech alarms have been used extensively in aviation and are included in the International Building Code (IBC) and the National Fire Protection Association's (NFPA) Life Safety Code. However, they have not been implemented on space vehicles. Previous studies conducted at NASA JSC showed that speech alarms lead to faster identification and higher accuracy. This research evaluated updated speech and tone alerts in a laboratory environment and in the Human Exploration Research Analog (HERA) in a realistic setup.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, P. T.; Dickson, T. L.; Yin, S.
The current regulations to ensure that nuclear reactor pressure vessels (RPVs) maintain their structural integrity when subjected to transients such as pressurized thermal shock (PTS) events were derived from computational models developed in the early-to-mid 1980s. Since that time, advancements and refinements in relevant technologies that impact RPV integrity assessment have led to an effort by the NRC to re-evaluate its PTS regulations. Updated computational methodologies have been developed through interactions between experts in the relevant disciplines of thermal hydraulics, probabilistic risk assessment, materials embrittlement, fracture mechanics, and inspection (flaw characterization). Contributors to the development of these methodologies include the NRC staff, their contractors, and representatives from the nuclear industry. These updated methodologies have been integrated into the Fracture Analysis of Vessels -- Oak Ridge (FAVOR, v06.1) computer code developed for the NRC by the Heavy Section Steel Technology (HSST) program at Oak Ridge National Laboratory (ORNL). The FAVOR, v04.1, code represents the baseline NRC-selected applications tool for re-assessing the current PTS regulations. This report is intended to document the technical bases for the assumptions, algorithms, methods, and correlations employed in the development of the FAVOR, v06.1, code.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
...This final rule updates the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2014 (for discharges occurring on or after October 1, 2013 and on or before September 30, 2014) as required by the statute. This final rule also revised the list of diagnosis codes that may be counted toward an IRF's ``60 percent rule'' compliance calculation to determine ``presumptive compliance,'' update the IRF facility-level adjustment factors using an enhanced estimation methodology, revise sections of the Inpatient Rehabilitation Facility-Patient Assessment Instrument, revise requirements for acute care hospitals that have IRF units, clarify the IRF regulation text regarding limitation of review, update references to previously changed sections in the regulations text, and revise and update quality measures and reporting requirements under the IRF quality reporting program.
A recursive linear predictive vocoder
NASA Astrophysics Data System (ADS)
Janssen, W. A.
1983-12-01
A non-real-time, 10-pole recursive autocorrelation linear predictive coding vocoder was created for use in studying the effects of recursive autocorrelation on speech. The vocoder is composed of two interchangeable pitch detectors, a speech analyzer, and a speech synthesizer. The time between updates of the filter coefficients is allowed to vary from 0.125 msec to 20 msec. The best quality was found using 0.125 msec between each update. The greatest change in quality was noted when changing from 20 msec/update to 10 msec/update. Pitch period plots for the center-clipping autocorrelation pitch detector and the simplified inverse filtering technique are provided. Plots of speech into and out of the vocoder are given. Three-dimensional formant-versus-time plots are shown. Effects of noise on pitch detection and formants are shown. Noise affects the voiced/unvoiced decision process, causing voiced speech to be reconstructed as unvoiced.
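The autocorrelation-method LPC analysis underlying such a vocoder can be sketched with the standard Levinson-Durbin recursion (a generic textbook implementation, not the thesis code; the order-1 sanity check on a decaying exponential is an invented example).

```python
import numpy as np

def lpc(frame, order=10):
    """Autocorrelation-method LPC via the Levinson-Durbin recursion.

    Returns coefficients a of A(z) = 1 + a[1] z^-1 + ... + a[order] z^-order,
    so the one-step prediction is s_hat[n] = -sum_k a[k] * s[n-k],
    together with the final prediction-error power.
    """
    n = len(frame)
    # Biased autocorrelation lags r[0..order] of the (windowed) frame.
    r = np.correlate(frame, frame, mode="full")[n - 1:n + order]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        k = -(r[i] + a[1:i] @ r[i - 1:0:-1]) / err   # reflection coefficient
        a[1:i] = a[1:i] + k * a[i - 1:0:-1]          # a[j] += k * a[i-j]
        a[i] = k
        err *= 1.0 - k * k                           # error power shrinks each order
    return a, err

# Sanity check on a synthetic decaying exponential s[n] = 0.9**n,
# which an order-1 predictor should capture almost exactly.
frame = 0.9 ** np.arange(200)
a, err = lpc(frame, order=1)
print(abs(a[1] + 0.9) < 0.01)  # a[1] is close to -0.9
```

In a vocoder, this analysis runs on each short frame; the update interval the abstract varies (0.125 to 20 msec) is simply how often these coefficients are recomputed and handed to the synthesis filter.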
76 FR 22802 - Interim Enforcement Policy for Minimum Days Off Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-25
... NRC's fitness for duty regulations and will remain in place until the NRC publishes a revised rule... Code of Federal Regulations, Part 26, ``Fitness for Duty Programs.'' The Commission updated the...
Optimization Based Efficiencies in First Order Reliability Analysis
NASA Technical Reports Server (NTRS)
Peck, Jeffrey A.; Mahadevan, Sankaran
2003-01-01
This paper develops a method for updating the gradient vector of the limit state function in reliability analysis using Broyden's rank-one updating technique. In problems that use a commercial code as a black box, the gradient calculations are usually done using a finite difference approach, which becomes very expensive for large system models. The proposed method replaces the finite difference gradient calculations in a standard first-order reliability method (FORM) with Broyden's quasi-Newton technique. The resulting algorithm of Broyden updates within a FORM framework (BFORM) is used to run several example problems, and the results are compared to standard FORM results. It is found that BFORM typically requires fewer function evaluations than FORM to converge to the same answer.
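The rank-one (secant) update at the heart of BFORM can be sketched as follows. This is a generic illustration with a made-up limit-state function, not the paper's implementation: after each step, the stored gradient is corrected so that it satisfies the secant condition along the step, avoiding a fresh finite-difference evaluation.

```python
import numpy as np

def broyden_gradient_update(grad, x_old, x_new, g_old, g_new):
    """Broyden rank-one (secant) update of the limit-state gradient.

    Replaces a finite-difference re-evaluation: the updated gradient
    satisfies the secant condition grad_new . dx = dg exactly.
    """
    dx = x_new - x_old
    dg = g_new - g_old
    denom = dx @ dx
    if denom == 0.0:
        return grad
    return grad + ((dg - grad @ dx) / denom) * dx

# Toy limit-state function g(x) = x0^2 + 2*x1 (gradient known analytically)
g = lambda x: x[0]**2 + 2.0 * x[1]
x0 = np.array([1.0, 1.0])
x1 = np.array([1.1, 0.9])
grad0 = np.array([2.0, 2.0])          # exact gradient at x0: [2*x0, 2]
grad1 = broyden_gradient_update(grad0, x0, x1, g(x0), g(x1))

# The updated gradient reproduces the observed change in g along the step:
assert np.isclose(grad1 @ (x1 - x0), g(x1) - g(x0))
print(grad1)
```

In a full BFORM iteration this update would replace the finite-difference gradient call inside the standard FORM search for the most probable point.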
Persistence of opinion in the Sznajd consensus model: computer simulation
NASA Astrophysics Data System (ADS)
Stauffer, D.; de Oliveira, P. M. C.
2002-12-01
The density of never-changed opinions during the Sznajd consensus-finding process decays with time t as 1/t^θ. We find θ ≃ 3/8 for a chain, compatible with the exact Ising result of Derrida et al. In higher dimensions, however, the exponent differs from the Ising θ. With simultaneous updating of sublattices instead of the usual random sequential updating, the number of persistent opinions decays roughly exponentially. Some of the simulations used multi-spin coding.
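The random-sequential-updating case can be sketched as follows, assuming the basic one-dimensional rule in which an agreeing neighbour pair converts its two outer neighbours; the system size and sweep counts are illustrative, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sznajd_persistence(n=1000, sweeps=200):
    """1D Sznajd chain with random sequential updating.

    Returns the fraction of sites whose opinion has never changed
    after the given number of Monte Carlo sweeps.
    """
    s = rng.choice([-1, 1], size=n)
    never_changed = np.ones(n, dtype=bool)
    for _ in range(sweeps * n):
        i = rng.integers(0, n - 1)           # pick a neighbouring pair (i, i+1)
        if s[i] == s[i + 1]:                 # an agreeing pair convinces...
            for j in (i - 1, i + 2):         # ...its two outer neighbours
                if 0 <= j < n and s[j] != s[i]:
                    s[j] = s[i]
                    never_changed[j] = False
    return never_changed.mean()

# Persistence decays with time (roughly as t**(-3/8) per the paper's result)
print(sznajd_persistence())
```

Measuring the returned fraction at a sequence of sweep counts and fitting a power law would estimate the exponent θ.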
Hétu, Sébastien; Luo, Yi; D’Ardenne, Kimberlee; Lohrenz, Terry
2017-01-01
Abstract As models of shared expectations, social norms play an essential role in our societies. Since our social environment is changing constantly, our internal models of it also need to change. In humans, there is mounting evidence that neural structures such as the insula and the ventral striatum are involved in detecting norm violation and updating internal models. However, because of methodological challenges, little is known about the possible involvement of midbrain structures in detecting norm violation and updating internal models of our norms. Here, we used high-resolution cardiac-gated functional magnetic resonance imaging and a norm adaptation paradigm in healthy adults to investigate the role of the substantia nigra/ventral tegmental area (SN/VTA) complex in tracking signals related to norm violation that can be used to update internal norms. We show that the SN/VTA codes for the norm’s variance prediction error (PE) and norm PE with spatially distinct regions coding for negative and positive norm PE. These results point to a common role played by the SN/VTA complex in supporting both simple reward-based and social decision making. PMID:28981876
CoNNeCT Baseband Processor Module Boot Code SoftWare (BCSW)
NASA Technical Reports Server (NTRS)
Yamamoto, Clifford K.; Orozco, David S.; Byrne, D. J.; Allen, Steven J.; Sahasrabudhe, Adit; Lang, Minh
2012-01-01
This software provides essential startup and initialization routines for the CoNNeCT baseband processor module (BPM) hardware upon power-up. A command and data handling (C&DH) interface is provided via 1553 and diagnostic serial interfaces to invoke operational, reconfiguration, and test commands within the code. The BCSW has features unique to the hardware it is responsible for managing. In this case, the CoNNeCT BPM is configured with an updated CPU (Atmel AT697 SPARC processor) and a unique set of memory and I/O peripherals that require customized software to operate. These features include configuration of new AT697 registers, interfacing to a new HouseKeeper with a flash controller interface, a new dual Xilinx configuration/scrub interface, and an updated 1553 remote terminal (RT) core. The BCSW is intended to provide a "safe" mode for the BPM when initially powered on or when an unexpected trap occurs, causing the processor to reset. The BCSW allows the 1553 bus controller in the spacecraft or payload controller to operate the BPM over 1553 to upload code; upload Xilinx bit files; perform rudimentary tests; read, write, and copy the non-volatile flash memory; and configure the Xilinx interface. Commands also exist over 1553 to cause the CPU to jump or call a specified address to begin execution of user-supplied code. This may be in the form of a real-time operating system, test routine, or specific application code to run on the BPM.
The ANTARES Code: New Developments
NASA Astrophysics Data System (ADS)
Blies, P. M.; Kupka, F.; Muthsam, H. J.
2015-10-01
We give an update on the ANTARES code. It was presented by Muthsam et al. (2010) and has since seen various improvements and been extended with new features, which we mention in this paper. Two new features are presented in more detail: the parallel multigrid solver for the 2D non-linear, generalized Helmholtz equation by Happenhofer (2014) and the capability to use curvilinear grids by Grimm-Strele (2014).
NASA Astrophysics Data System (ADS)
Schneider, Barry I.; Segura, Javier; Gil, Amparo; Guan, Xiaoxu; Bartschat, Klaus
2018-04-01
This is a revised and updated version of a modern Fortran 90 code to compute the regular Plm(x) and irregular Qlm(x) associated Legendre functions for all x ∈ (−1, +1) (on the cut) and |x| > 1 and integer degree (l) and order (m). The necessity to revise the code comes as a consequence of comments from Prof. James Bremer of the UC Davis Mathematics Department, who discovered that there were errors in the code for large integer degree and order for the normalized regular Legendre functions on the cut.
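The paper's Fortran code is not reproduced here, but for the on-cut case the same functions are available in SciPy and can serve as a quick cross-check. Note that SciPy's lpmv covers only the unnormalized functions for |x| ≤ 1 and includes the Condon-Shortley phase:

```python
from scipy.special import lpmv

# lpmv(m, l, x) returns the associated Legendre function P_l^m(x)
# of the first kind on the cut -1 <= x <= 1.

p20 = lpmv(0, 2, 0.5)   # P_2(0.5) = (3*0.25 - 1)/2 = -0.125 analytically
p11 = lpmv(1, 1, 0.0)   # P_1^1(0) = -sqrt(1 - 0^2) with Condon-Shortley phase
print(p20, p11)
```

For |x| > 1 or for the normalized functions discussed in the paper, the published Fortran code (or an arbitrary-precision library) is needed; SciPy does not cover that range.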
Medendorp, W. P.
2015-01-01
It is known that the brain uses multiple reference frames to code spatial information, including eye-centered and body-centered frames. When we move our body in space, these internal representations are no longer in register with external space, unless they are actively updated. Whether the brain updates multiple spatial representations in parallel, or whether it restricts its updating mechanisms to a single reference frame from which other representations are constructed, remains an open question. We developed an optimal integration model to simulate the updating of visual space across body motion in multiple or single reference frames. To test this model, we designed an experiment in which participants had to remember the location of a briefly presented target while being translated sideways. The behavioral responses were in agreement with a model that uses a combination of eye- and body-centered representations, weighted according to the reliability with which the target location is stored and updated in each reference frame. Our findings suggest that the brain simultaneously updates multiple spatial representations across body motion. Because both representations are kept in sync, they can be optimally combined to provide a more precise estimate of visual locations in space than single-frame updating mechanisms would allow. PMID:26490289
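The reliability-weighted combination described above corresponds to standard inverse-variance (optimal) fusion of two estimates. The following minimal sketch uses made-up numbers and is not the authors' actual model:

```python
import numpy as np

def fuse_estimates(x_eye, var_eye, x_body, var_body):
    """Inverse-variance (reliability-weighted) fusion of two updated
    spatial estimates of the same target location.

    The fused estimate is pulled toward the more reliable frame, and
    its variance is smaller than either single-frame variance.
    """
    w_eye = 1.0 / var_eye
    w_body = 1.0 / var_body
    x_hat = (w_eye * x_eye + w_body * x_body) / (w_eye + w_body)
    var_hat = 1.0 / (w_eye + w_body)
    return x_hat, var_hat

# Eye-centered estimate: 10 deg with variance 4; body-centered: 14 deg, variance 1
x, v = fuse_estimates(10.0, 4.0, 14.0, 1.0)
print(x, v)  # 13.2 0.8
```

The fused variance (0.8) is below both single-frame variances, which is the quantitative sense in which keeping both representations in sync improves precision.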
Development of V/STOL methodology based on a higher order panel method
NASA Technical Reports Server (NTRS)
Bhateley, I. C.; Howell, G. A.; Mann, H. W.
1983-01-01
The development of a computational technique to predict the complex flowfields of V/STOL aircraft was initiated, in which a number of modules and a potential flow aerodynamic code were combined in a comprehensive computer program. The modules were developed in a building-block approach to assist the user in preparing the geometric input and to compute parameters needed to simulate certain flow phenomena that cannot be handled directly within a potential flow code. The PAN AIR aerodynamic code, which is a higher-order panel method, forms the nucleus of this program. PAN AIR's extensive capability for generalized boundary conditions allows the modules to interact with the aerodynamic code through the input and output files, thereby requiring no changes to the basic code and allowing easy replacement of updated modules.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnstad, H.
The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, database systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, database systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.
Regan, R. Steve; LaFontaine, Jacob H.
2017-10-05
This report documents seven enhancements to the U.S. Geological Survey (USGS) Precipitation-Runoff Modeling System (PRMS) hydrologic simulation code: two time-series input options, two new output options, and three updates of existing capabilities. The enhancements are (1) new dynamic parameter module, (2) new water-use module, (3) new Hydrologic Response Unit (HRU) summary output module, (4) new basin variables summary output module, (5) new stream and lake flow routing module, (6) update to surface-depression storage and flow simulation, and (7) update to the initial-conditions specification. This report relies heavily upon U.S. Geological Survey Techniques and Methods, book 6, chapter B7, which documents PRMS version 4 (PRMS-IV). A brief description of PRMS is included in this report.
Esophageal function testing: Billing and coding update.
Khan, A; Massey, B; Rao, S; Pandolfino, J
2018-01-01
Esophageal function testing is being increasingly utilized in the diagnosis and management of esophageal disorders. There have been several recent technological advances in the field that allow practitioners to more accurately assess and treat such conditions, but there has been a relative lack of education in the literature regarding the associated Current Procedural Terminology (CPT) codes and methods of reimbursement. This review, commissioned and supported by the American Neurogastroenterology and Motility Society Council, aims to summarize each of the CPT codes for esophageal function testing and show the trends of associated reimbursement, as well as recommend coding methods in a practical context. We also aim to encourage many of these codes to be reviewed at the gastrointestinal (GI) societal level, by providing evidence of both discrepancies in coding definitions and inadequate reimbursement in this new era of esophageal function testing. © 2017 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Kotler, R. S.
1983-01-01
File comparator program IFCOMP is a text-file comparator for IBM OS/VS-compatible systems. IFCOMP accepts as input two text files and produces a listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source-code level.
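A pseudo-update difference listing of this kind can be approximated in a few lines with Python's difflib. The directive format below is a simplification for illustration, not IFCOMP's actual output format:

```python
import difflib

old = ["      PROGRAM MAIN", "      X = 1", "      END"]
new = ["      PROGRAM MAIN", "      X = 2", "      Y = 3", "      END"]

# Emit, update-style, which original line ranges are replaced/deleted/
# inserted, followed by the replacement source lines.
for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(None, old, new).get_opcodes():
    if tag == "equal":
        continue
    print(f"*{tag.upper()} old {i1 + 1}-{i2} -> new {j1 + 1}-{j2}")
    for line in new[j1:j2]:
        print(line)
```

The opcodes identify matched regions first, so unchanged blocks never appear in the listing, which is the property that makes pseudo-update output compact for monitoring source changes.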
Roots of success: Marketing strategies for the 21st Century
Ian Doescher
2012-01-01
As the demographic of gardeners and farmers changes, retail nurseries should update their marketing approach. This paper reviews best marketing practices and discusses marketing technologies, including quick response (QR) codes, websites, online marketing, and social media.
Boundary layer simulator improvement
NASA Technical Reports Server (NTRS)
Praharaj, S. C.; Schmitz, C.; Frost, C.; Engel, C. D.; Fuller, C. E.; Bender, R. L.; Pond, J.
1984-01-01
High chamber pressure expander cycles proposed for orbit transfer vehicles depend primarily on the heat energy transmitted from the combustion products through the thrust chamber wall. The heat transfer to the nozzle wall is affected by such variables as wall roughness, relaminarization, and the presence of particles in the flow. Motor performance loss for these nozzles with thick boundary layers is predicted inaccurately by the existing procedure coded in BLIMPJ. Modifications and innovations to the code are examined. Updated routines are listed.
1990-03-27
coding of certain population characteristic data and thus delay the publication of these data. This is similar to what happened in the 1980 census...when, because of budget shortfalls, the Bureau reduced the number of staff who coded population characteristic data from questionnaires, contributing...Decennial Census: An Update, (GAO/T-GGD-89-15, Mar. 23, 1989). Missing population characteristic data would have been resolved either by telephone or a
AGU's Updated Scientific Integrity and Professional Ethics Policy
NASA Astrophysics Data System (ADS)
McPhaden, M. J.
2017-12-01
AGU'S mission is to promote discovery in Earth and space science for the benefit of humanity. This mission can only be accomplished if all those engaged in the scientific enterprise uphold the highest standards of scientific integrity and professional ethics. AGU's Scientific Integrity and Professional Ethics Policy provides a set of principles and guidelines for AGU members, staff, volunteers, contractors, and non-members participating in AGU sponsored programs and activities. The policy has recently been updated to include a new code of conduct that broadens the definition of scientific misconduct to include discrimination, harassment, and bullying. This presentation provides the context for what motivated the updated policy, an outline of the policy itself, and a discussion of how it is being communicated and applied.
Jiansen Li; Jianqi Sun; Ying Song; Yanran Xu; Jun Zhao
2014-01-01
An effective way to improve the data acquisition speed of magnetic resonance imaging (MRI) is to use under-sampled k-space data, and a dictionary learning method can be used to maintain reconstruction quality. A three-dimensional dictionary trains its atoms in the form of blocks, which can exploit the spatial correlation among slices. The dual-dictionary learning method includes a low-resolution dictionary and a high-resolution dictionary, for sparse coding and image updating respectively. However, the amount of data is huge for three-dimensional reconstruction, especially when the number of slices is large, so the procedure is time-consuming. In this paper, we first utilize NVIDIA's compute unified device architecture (CUDA) programming model to design parallel algorithms on the graphics processing unit (GPU) to accelerate the reconstruction procedure. The main optimizations are in the dictionary learning algorithm and the image updating part, such as the orthogonal matching pursuit (OMP) algorithm and the k-singular value decomposition (K-SVD) algorithm. Then we develop another version of the CUDA code with algorithmic optimization. Experimental results show that a speedup of more than 324× over the CPU-only code is achieved when the number of MRI slices is 24.
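As a CPU reference for the sparse-coding step that the paper offloads to the GPU, a minimal orthogonal matching pursuit can be sketched as follows. The orthonormal dictionary is a toy case chosen so that recovery is exact; none of this reflects the paper's CUDA implementation:

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: greedy k-sparse coding of y over
    dictionary D (columns are atoms)."""
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))   # best-matching atom
        support.append(j)
        # Re-fit all selected coefficients jointly (the "orthogonal" step)
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
    x[support] = coeffs
    return x

# Toy case: orthonormal dictionary, signal built from atoms 3 and 7
rng = np.random.default_rng(1)
D, _ = np.linalg.qr(rng.standard_normal((16, 16)))
y = 2.0 * D[:, 3] - 1.5 * D[:, 7]
x = omp(D, y, k=2)
print(np.flatnonzero(x).tolist())  # [3, 7]
```

In the paper's setting this inner loop runs over a large number of image blocks, which is what makes the per-block parallelism of the GPU effective.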
77 FR 37446 - Advisory Committee on the Medical Uses of Isotopes: Meeting Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-21
...; and (6) update on domestic production of molybdenum-99. The regular meeting agenda is subject to..., Advisory Committee Management Officer. [FR Doc. 2012-15173 Filed 6-20-12; 8:45 am] BILLING CODE 7590-01-P ...
75 FR 34004 - State Cemetery Grants
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-16
... architectural design codes that apply to grant applicants, we decided to update those references in a separate.... 39.63 Architectural design standards. Subpart C--Operation and Maintenance Projects Grant... acquisition, design and planning, earth moving, landscaping, construction, and provision of initial operating...
Effecting IT infrastructure culture change: management by processes and metrics
NASA Technical Reports Server (NTRS)
Miller, R. L.
2001-01-01
This talk describes the processes and metrics used by Jet Propulsion Laboratory to bring about the required IT infrastructure culture change to update and certify, as Y2K compliant, thousands of computers and millions of lines of code.
WDEC: A Code for Modeling White Dwarf Structure and Pulsations
NASA Astrophysics Data System (ADS)
Bischoff-Kim, Agnès; Montgomery, Michael H.
2018-05-01
The White Dwarf Evolution Code (WDEC), written in Fortran, makes models of white dwarf stars. It is fast, versatile, and includes the latest physics. The code evolves hot (∼100,000 K) input models down to a chosen effective temperature by relaxing the models to be solutions of the equations of stellar structure. The code can also be used to obtain g-mode oscillation modes for the models. WDEC has a long history going back to the late 1960s. Over the years, it has been updated and re-packaged for modern computer architectures and has specifically been used in computationally intensive asteroseismic fitting. Generations of white dwarf astronomers and dozens of publications have made use of the WDEC, although the last true instrument paper is the original one, published in 1975. This paper discusses the history of the code, necessary to understand why it works the way it does, details the physics and features in the code today, and points the reader to where to find the code and a user guide.
Resonance Parameter Adjustment Based on Integral Experiments
Sobes, Vladimir; Leal, Luiz; Arbanas, Goran; ...
2016-06-02
Our project seeks to allow coupling of differential and integral data evaluation in a continuous-energy framework and to use the generalized linear least-squares (GLLS) methodology in the TSURFER module of the SCALE code package to update the parameters of a resolved resonance region evaluation. Because the GLLS methodology in TSURFER is identical to the mathematical description of a Bayesian update in SAMMY, the SAMINT code was created to use the mathematical machinery of SAMMY to update resolved resonance parameters based on integral data. Traditionally, SAMMY used differential experimental data to adjust nuclear data parameters. Integral experimental data, such as in the International Criticality Safety Benchmark Experiments Project, remain a tool for validation of completed nuclear data evaluations. SAMINT extracts information from integral benchmarks to aid the nuclear data evaluation process. Later, integral data can be used to resolve any remaining ambiguity between differential data sets, highlight troublesome energy regions, determine key nuclear data parameters for integral benchmark calculations, and improve the nuclear data covariance matrix evaluation. Moreover, SAMINT is not intended to bias nuclear data toward specific integral experiments but should be used to supplement the evaluation of differential experimental data. Using GLLS ensures proper weight is given to the differential data.
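The GLLS/Bayesian update referred to above has the standard linear-Gaussian form. The sketch below uses generic notation and toy numbers, not SAMMY's or SAMINT's actual data structures:

```python
import numpy as np

def glls_update(p, Cp, G, y, Cy):
    """Generalized linear least-squares (Bayesian) parameter update.

    p  : prior parameter values       Cp : prior parameter covariance
    G  : sensitivity matrix dR/dp     y  : measured-minus-computed responses
    Cy : covariance of the integral measurements
    """
    S = G @ Cp @ G.T + Cy                  # innovation covariance
    K = Cp @ G.T @ np.linalg.inv(S)        # gain
    p_new = p + K @ y                      # adjusted parameters
    Cp_new = Cp - K @ G @ Cp               # reduced posterior covariance
    return p_new, Cp_new

# One scalar benchmark constraining two resonance-like parameters (toy numbers)
p = np.array([1.0, 2.0])
Cp = np.diag([0.1, 0.4])
G = np.array([[1.0, 1.0]])
y = np.array([0.5])        # benchmark discrepancy
Cy = np.array([[0.5]])
p_new, Cp_new = glls_update(p, Cp, G, y, Cy)
print(p_new)
```

Note how the adjustment is apportioned in proportion to each parameter's prior uncertainty, which is the mechanism by which GLLS gives "proper weight" to the differential data relative to the integral benchmark.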
Re-evaluation and updating of the seismic hazard of Lebanon
NASA Astrophysics Data System (ADS)
Huijer, Carla; Harajli, Mohamed; Sadek, Salah
2016-01-01
This paper presents the results of a study undertaken to evaluate the implications of the newly mapped offshore Mount Lebanon Thrust (MLT) fault system on the seismic hazard of Lebanon and the current seismic zoning and design parameters used by the local engineering community. This re-evaluation is critical, given that the MLT is located at close proximity to the major cities and economic centers of the country. The updated seismic hazard was assessed using probabilistic methods of analysis. The potential sources of seismic activities that affect Lebanon were integrated along with any/all newly established characteristics within an updated database which includes the newly mapped fault system. The earthquake recurrence relationships of these sources were developed from instrumental seismology data, historical records, and earlier studies undertaken to evaluate the seismic hazard of neighboring countries. Maps of peak ground acceleration contours, based on 10 % probability of exceedance in 50 years (as per Uniform Building Code (UBC) 1997), as well as 0.2 and 1 s peak spectral acceleration contours, based on 2 % probability of exceedance in 50 years (as per International Building Code (IBC) 2012), were also developed. Finally, spectral charts for the main coastal cities of Beirut, Tripoli, Jounieh, Byblos, Saida, and Tyre are provided for use by designers.
2017-08-03
This final rule updates the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2018 as required by the statute. As required by section 1886(j)(5) of the Social Security Act (the Act), this rule includes the classification and weighting factors for the IRF prospective payment system's (IRF PPS) case-mix groups and a description of the methodologies and data used in computing the prospective payment rates for FY 2018. This final rule also revises the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) diagnosis codes that are used to determine presumptive compliance under the "60 percent rule," removes the 25 percent payment penalty for inpatient rehabilitation facility patient assessment instrument (IRF-PAI) late transmissions, removes the voluntary swallowing status item (Item 27) from the IRF-PAI, summarizes comments regarding the criteria used to classify facilities for payment under the IRF PPS, provides for a subregulatory process for certain annual updates to the presumptive methodology diagnosis code lists, adopts the use of height/weight items on the IRF-PAI to determine patient body mass index (BMI) greater than 50 for cases of single-joint replacement under the presumptive methodology, and revises and updates measures and reporting requirements under the IRF quality reporting program (QRP).
FY15 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Shemon, E. R.; Smith, M. A.
2015-09-30
This report summarizes the current status of NEAMS activities in FY2015. The tasks this year are (1) to improve solution methods for steady-state and transient conditions, (2) to develop features and improve user friendliness to increase the usability and applicability of the code, (3) to improve and verify the multigroup cross section generation scheme, (4) to perform verification and validation tests of the code using SFRs and thermal reactor cores, and (5) to support early users of PROTEUS and update the user manuals.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-04
...This final rule sets forth updates to the home health prospective payment system (HH PPS) rates, including: the national standardized 60-day episode rates; the national per-visit rates; and the low utilization payment amount (LUPA) under the Medicare PPS for home health agencies effective January 1, 2012. This rule applies a 1.4 percent update factor to the episode rates, which reflects a 1 percent reduction applied to the 2.4 percent market basket update factor, as mandated by the Affordable Care Act. This rule also updates the wage index used under the HH PPS, and further reduces home health payments to account for continued nominal growth in case-mix which is unrelated to changes in patient health status. This rule removes two hypertension codes from the HH PPS case-mix system, thereby requiring recalibration of the case-mix weights. In addition, the rule implements two structural changes designed to decrease incentives to upcode and provide unneeded therapy services. Finally, this rule incorporates additional flexibility regarding face-to-face encounters with providers related to home health care.
Molecular taxonomy and phylogeny
USDA-ARS?s Scientific Manuscript database
The cyst nematodes comprise a group of sedentary endoparasitic nematodes that impact a wide range of crops in both tropical and temperate regions of the world. This chapter updates the taxonomy and phylogeny of this group and describes the nuclear protein coding, ribosomal, and mitochondrial sequenc...
76 FR 42688 - Updating State Residential Building Energy Efficiency Codes
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-19
... 19, 2013. ADDRESSES: Certification Statements must be addressed to the Buildings Technologies Program...-rise (greater than three stories) multifamily residential buildings and hotel, motel, and other..., townhouses, row houses, and low-rise multifamily buildings (not greater than three stories) such as...
UPEML Version 3.0: A machine-portable CDC update emulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mehlhorn, T.A.; Haill, T.A.
1992-04-01
UPEML is a machine-portable program that emulates a subset of the functions of the standard CDC Update. Machine portability has been achieved by conforming to ANSI standards for Fortran-77. UPEML is compact and fairly efficient; however, it allows only a restricted syntax as compared with the CDC Update. This program was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as VAX/VMS mainframes and UNIX workstations. UPEML has also been successfully used on the multiprocessor ELXSI, on CRAYs under both UNICOS and CTSS operating systems, and on Sun, HP, Stardent, and IBM workstations. UPEML was originally released with the ITS electron/photon Monte Carlo transport package, which was developed on a CDC-7600 and makes extensive use of conditional file structure to combine several problem geometry and machine options into a single program file. UPEML 3.0 is an enhanced version of the original code and is being independently released for use at any installation or with any code package. Version 3.0 includes enhanced error checking, full ASCII character support, a program library audit capability, and a partial update option in which only selected or modified decks are written to the complete file. Version 3.0 also checks for overlapping corrections, allows processing of nested calls to common decks, and allows the use of alternate files in READ and ADDFILE commands. Finally, UPEML Version 3.0 allows the assignment of input and output files at runtime on the control line.
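The core task of a CDC-Update-style emulator is applying DELETE/INSERT corrections against the original line numbers of a program library deck. The following toy sketch illustrates that idea with a simplified directive format; it is not UPEML's actual grammar or feature set:

```python
def apply_corrections(deck, directives):
    """Apply a minimal subset of CDC-Update-style corrections.

    deck       : list of source lines (1-indexed in the directives)
    directives : tuples ("DELETE", first, last) or ("INSERT", after, [lines])

    Directives reference ORIGINAL line numbers, as the real Update does,
    so they are applied from the bottom of the deck upward to keep the
    remaining original numbers valid.
    """
    out = list(deck)
    for d in sorted(directives, key=lambda d: d[1], reverse=True):
        if d[0] == "DELETE":
            _, first, last = d
            del out[first - 1:last]
        elif d[0] == "INSERT":
            _, after, lines = d
            out[after:after] = lines
    return out

deck = ["A", "B", "C", "D"]
new = apply_corrections(deck, [("DELETE", 2, 3), ("INSERT", 4, ["E"])])
print(new)  # ['A', 'D', 'E']
```

A real emulator additionally checks for overlapping corrections (as UPEML 3.0 does), which in this sketch would amount to verifying that directive line ranges are disjoint before applying them.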
Internal ballistics model update for ASRM dome
NASA Technical Reports Server (NTRS)
Bowden, Mark H.; Jenkins, Billy Z.
1991-01-01
A previous report (no. 5-32279, contract NAS8-36955, DO 51) describes the measures taken to adapt the NASA Complex Burning Region Model and code so that it was applicable to the Advanced Solid Rocket Motor as envisioned at that time. The code so modified was called CBRM-A. CBRM-A could calculate the port volume and burning area for the star, transition, and cylindrically perforated regions of the motor. Described here is a subsequent effort to add computation of port volume and burning area for the Advanced Solid Rocket Motor head dome. Sample output, input, and an overview of the models are included. The software was configured in two forms: a stand-alone head dome code and a code integrating the head dome solution with the CBRM-A.
Facts and updates about cardiovascular non-coding RNAs in heart failure.
Thum, Thomas
2015-09-01
About 11% of all deaths include heart failure as a contributing cause. The annual cost of heart failure amounts to US $34 billion in the United States alone. With the exception of heart transplantation, there is no curative therapy available. Only occasionally do new areas in science develop into completely new research fields; the topic of non-coding RNAs, including microRNAs, long non-coding RNAs, and circular RNAs, is such a field. In this short review, we discuss the latest developments regarding non-coding RNAs in cardiovascular disease. MicroRNAs are short regulatory non-coding endogenous RNA species that are involved in virtually all cellular processes. Long non-coding RNAs also regulate gene and protein levels, although by much more complicated and diverse mechanisms. In general, non-coding RNAs have been shown to be of great value as therapeutic targets in adverse cardiac remodelling and also as diagnostic and prognostic biomarkers for heart failure. In the future, non-coding RNA-based therapeutics are likely to enter clinical reality, offering a new treatment approach for heart failure.
ETF system code: composition and applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reid, R.L.; Wu, K.F.
1980-01-01
A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system.
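The modular structure this abstract describes, components behind a fixed input/output interface, runnable alone or chained into a system, can be sketched as follows. The module names, parameters, and scalings here are purely illustrative and are not taken from the ETF code:

```python
# Minimal sketch of the modular system-code pattern: each component sits
# behind a fixed input/output contract (a parameter dict), so a module can
# be swapped or run standalone without touching the rest of the system.
# Module names and scaling laws are illustrative, not from the ETF code.

def tf_coil_module(params):
    """Toy TF-coil model: crude structural mass proxy from field and radius."""
    b, r = params["field_T"], params["major_radius_m"]
    return {"coil_mass_t": 0.5 * b**2 * r**3}

def cost_module(params):
    """Toy costing model keyed only on the coil module's output."""
    return {"cost_musd": 2.0 * params["coil_mass_t"]}

def run_system(params, modules):
    """Chain modules: each consumes and extends the accumulated state dict."""
    state = dict(params)
    for module in modules:
        state.update(module(state))
    return state

# run the modules together as a system...
out = run_system({"field_T": 5.0, "major_radius_m": 3.0},
                 [tf_coil_module, cost_module])
# ...or run a single module independently for a specific design study
standalone = tf_coil_module({"field_T": 5.0, "major_radius_m": 3.0})
```

Because each module only reads and writes the shared parameter dict, replacing `tf_coil_module` with a refined model leaves `cost_module` and the driver untouched, which is the maintenance benefit the abstract attributes to modularization.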
[Towards a new Tunisian Medical Code of Deontology].
Aissaoui, Abir; Haj Salem, Nidhal; Chadly, Ali
2010-06-01
The Medical Code of Deontology is a legal text including the physician's duties towards his patients, colleagues, auxiliaries, and the community. Considering scientific, legal, and social changes, a deontology code should be revised periodically. The first Tunisian Medical Code of Deontology (TMCD) was promulgated in 1973 and abrogated in 1993 by the new Code. This version has never been reviewed and does not seem to fit the current conditions of medical practice. The TMCD does not contain texts referring to information given to the patient, pain control, palliative care, management of the end of life, or protection of medical data. Furthermore, the TMCD does not include rules related to tissue and organ transplantation or medically assisted human reproduction in accordance with Tunisian legal texts. In this paper, we analyze the insufficiencies of the TMCD and suggest modifications in order to update it.
NASA Lewis Steady-State Heat Pipe Code Architecture
NASA Technical Reports Server (NTRS)
Mi, Ye; Tower, Leonard K.
2013-01-01
NASA Glenn Research Center (GRC) has developed the LERCHP code. The PC-based LERCHP code can be used to predict the steady-state performance of heat pipes, including the determination of operating temperature and operating limits which might be encountered under specified conditions. The code contains a vapor flow algorithm which incorporates vapor compressibility and axially varying heat input. For the liquid flow in the wick, Darcy's formula is employed. Thermal boundary conditions and geometric structures can be defined through an interactive input interface. A variety of fluid and material options as well as user-defined options can be chosen for the working fluid, wick, and pipe materials. This report documents the current effort at GRC to update the LERCHP code for operation in a Microsoft Windows (Microsoft Corporation) environment. A detailed analysis of the model is presented. The programming architecture for the numerical calculations is explained, and flowcharts of the key subroutines are given.
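Darcy's formula, which the abstract names for liquid flow in the wick, relates the viscous pressure drop to the volumetric flow rate as dp = mu L Q / (K A), with K the wick permeability and A its cross-section. A minimal worked example, with all numerical values chosen for illustration rather than taken from LERCHP:

```python
# Darcy's formula for liquid flow in a porous wick:
#   dp = mu * L * Q / (K * A)
# where mu is dynamic viscosity, L the flow-path length, Q the volumetric
# flow rate, K the wick permeability, and A the wick cross-section.
# All numbers below are illustrative, not LERCHP inputs.

def darcy_pressure_drop(mu, length, q, permeability, area):
    """Viscous pressure drop (Pa) for volumetric flow q (m^3/s)."""
    return mu * length * q / (permeability * area)

dp = darcy_pressure_drop(mu=2.8e-4,          # Pa*s, water near 100 C
                         length=0.5,          # m, effective wick length
                         q=1.0e-7,            # m^3/s, condensate return flow
                         permeability=1e-10,  # m^2
                         area=1.0e-4)         # m^2, wick cross-section
```

In a heat-pipe operating-limit calculation of the kind the abstract mentions, this wick pressure drop is compared against the available capillary pumping head to decide whether the pipe can sustain the imposed heat load.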
Rhodes, Gillian; Burton, Nichola; Jeffery, Linda; Read, Ainsley; Taylor, Libby; Ewing, Louise
2018-05-01
Individuals with autism spectrum disorder (ASD) can have difficulty recognizing emotional expressions. Here, we asked whether the underlying perceptual coding of expression is disrupted. Typical individuals code expression relative to a perceptual (average) norm that is continuously updated by experience. This adaptability of face-coding mechanisms has been linked to performance on various face tasks. We used an adaptation aftereffect paradigm to characterize expression coding in children and adolescents with autism. We asked whether face expression coding is less adaptable in autism and whether there is any fundamental disruption of norm-based coding. If expression coding is norm-based, then the face aftereffects should increase with adaptor expression strength (distance from the average expression). We observed this pattern in both autistic and typically developing participants, suggesting that norm-based coding is fundamentally intact in autism. Critically, however, expression aftereffects were reduced in the autism group, indicating that expression-coding mechanisms are less readily tuned by experience. Reduced adaptability has also been reported for coding of face identity and gaze direction. Thus, there appears to be a pervasive lack of adaptability in face-coding mechanisms in autism, which could contribute to face processing and broader social difficulties in the disorder. © 2017 The British Psychological Society.
NSTX-U Control System Upgrades
Erickson, K. G.; Gates, D. A.; Gerhardt, S. P.; ...
2014-06-01
The National Spherical Torus Experiment (NSTX) is undergoing a wealth of upgrades (NSTX-U). These upgrades, especially including an elongated pulse length, require broad changes to the control system that has served NSTX well. A new fiber serial Front Panel Data Port input and output (I/O) stream will supersede the aging copper parallel version. Driver support for the new I/O and cyber security concerns require updating the operating system from Red Hat Enterprise Linux (RHEL) v4 to RedHawk (based on RHEL) v6. While the basic control system continues to use the General Atomics Plasma Control System (GA PCS), the effort to forward port the entire software package to run under 64-bit Linux instead of 32-bit Linux included PCS modifications subsequently shared with GA and other PCS users. Software updates focused on three key areas: (1) code modernization through coding standards (C99/C11), (2) code portability and maintainability through use of the GA PCS code generator, and (3) support of 64-bit platforms. Central to the control system upgrade is the use of a complete real-time (RT) Linux platform provided by Concurrent Computer Corporation, consisting of a computer (iHawk), an operating system and drivers (RedHawk), and RT tools (NightStar). Strong vendor support coupled with an extensive RT toolset influenced this decision. The new real-time Linux platform, I/O, and software engineering will foster enhanced capability and performance for NSTX-U plasma control.
2014-08-06
This final rule updates the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2015 as required by the statute. This final rule finalizes a policy to collect data on the amount and mode (that is, Individual, Concurrent, Group, and Co-Treatment) of therapy provided in the IRF setting according to therapy discipline, revises the list of diagnosis and impairment group codes that presumptively meet the "60 percent rule'' compliance criteria, provides a way for IRFs to indicate on the Inpatient Rehabilitation Facility-Patient Assessment Instrument (IRF-PAI) form whether the prior treatment and severity requirements have been met for arthritis cases to presumptively meet the "60 percent rule'' compliance criteria, and revises and updates quality measures and reporting requirements under the IRF quality reporting program (QRP). This rule also delays the effective date for the revisions to the list of diagnosis codes that are used to determine presumptive compliance under the "60 percent rule'' that were finalized in FY 2014 IRF PPS final rule and adopts the revisions to the list of diagnosis codes that are used to determine presumptive compliance under the "60 percent rule'' that are finalized in this rule. This final rule also addresses the implementation of the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM), for the IRF prospective payment system (PPS), which will be effective when ICD-10-CM becomes the required medical data code set for use on Medicare claims and IRF-PAI submissions.
NASA Technical Reports Server (NTRS)
Rojdev, Kristina; Koontz, Steve; Reddell, Brandon; Atwell, William; Boeder, Paul
2015-01-01
An accurate prediction of spacecraft avionics single event effect (SEE) radiation susceptibility is key to ensuring a safe and reliable vehicle. This is particularly important for long-duration deep space missions for human exploration, where there is little or no chance for a quick emergency return to Earth. Monte Carlo nuclear reaction and transport codes such as FLUKA can be used to generate very accurate models of the expected in-flight radiation environment for SEE analyses. A major downside to using a Monte Carlo-based code is that the run times can be very long (on the order of days). A more popular choice for SEE calculations is the CREME96 deterministic code, which offers significantly shorter run times (on the order of seconds). However, CREME96, though fast and easy to use, has not been updated in several years and underestimates secondary particle shower effects in spacecraft structural shielding mass. Another modeling option to consider is the deterministic code HZETRN 2010, which includes updates to address secondary particle shower effects more accurately. This paper builds on previous work by Rojdev et al. to compare the use of HZETRN 2010 against CREME96 as a tool to verify spacecraft avionics system reliability in a space flight SEE environment. This paper will discuss modifications made to HZETRN 2010 to improve its performance for calculating SEE rates and compare results with both in-flight SEE rates and other calculation methods.
Computer program for calculating thermodynamic and transport properties of fluids
NASA Technical Reports Server (NTRS)
Hendricks, R. C.; Braon, A. K.; Peller, I. C.
1975-01-01
A computer code has been developed to provide thermodynamic and transport properties of liquid argon, carbon dioxide, carbon monoxide, fluorine, helium, methane, neon, nitrogen, oxygen, and parahydrogen. The equation of state and transport coefficients are updated, and other fluids are added, as new material becomes available.
ERIC Educational Resources Information Center
IDRA Newsletter, 1995
1995-01-01
This theme issue focuses on the drastic revision of the Texas education code undertaken during the 1995 state legislative session. "Education Policy Reform: Key Points for Districts" (Albert Cortez, Mikki Symonds) outlines critical issues in the legislation that have an impact on educational quality: charter schools exempt from state…
Advances in pleural disease management including updated procedural coding.
Haas, Andrew R; Sterman, Daniel H
2014-08-01
Over 1.5 million pleural effusions occur in the United States every year as a consequence of a variety of inflammatory, infectious, and malignant conditions. Although rarely fatal in isolation, pleural effusions are often a marker of a serious underlying medical condition and contribute to significant patient morbidity, quality-of-life reduction, and mortality. Pleural effusion management centers on pleural fluid drainage to relieve symptoms and to investigate pleural fluid accumulation etiology. Many recent studies have demonstrated important advances in pleural disease management approaches for a variety of pleural fluid etiologies, including malignant pleural effusion, complicated parapneumonic effusion and empyema, and chest tube size. The last decade has seen greater implementation of real-time imaging assistance for pleural effusion management and increasing use of smaller bore percutaneous chest tubes. This article will briefly review recent pleural effusion management literature and update the latest changes in Current Procedural Terminology (CPT) billing codes as reflected in the changing landscape of imaging use and percutaneous approaches to pleural disease management.
NASA Astrophysics Data System (ADS)
Mielikainen, Jarno; Huang, Bormin; Huang, Allen H.-L.
2015-05-01
Intel Many Integrated Core (MIC) ushers in a new era of supercomputing speed, performance, and compatibility. It allows developers to run code at trillions of calculations per second using a familiar programming model. In this paper, we present our results of optimizing the updated Goddard shortwave radiation Weather Research and Forecasting (WRF) scheme on Intel Many Integrated Core (MIC) architecture hardware. The Intel Xeon Phi coprocessor is the first product based on the Intel MIC architecture, and it consists of up to 61 cores connected by a high-performance on-die bidirectional interconnect. The coprocessor supports all important Intel development tools, so the development environment is familiar to a vast number of CPU developers. However, getting maximum performance out of the Xeon Phi requires some novel optimization techniques, which are discussed in this paper. The results show that the optimizations improved the performance of the original code on the Xeon Phi 7120P by a factor of 1.3x.
Allocentrically implied target locations are updated in an eye-centred reference frame.
Thompson, Aidan A; Glover, Christopher V; Henriques, Denise Y P
2012-04-18
When reaching to remembered target locations following an intervening eye movement, a systematic pattern of error is found, indicating eye-centred updating of visuospatial memory. Here we investigated whether implicit targets, defined only by allocentric visual cues, are also updated in an eye-centred reference frame as explicit targets are. Participants viewed vertical bars separated by varying distances, and horizontal lines of equivalently varying lengths, implying a "target" location at the midpoint of the stimulus. After determining the implied "target" location from only the allocentric stimuli provided, participants saccaded to an eccentric location and reached to the remembered "target" location. Irrespective of the type of stimulus, reaching errors to these implicit targets are gaze-dependent and do not differ from those found when reaching to remembered explicit targets. Implicit target locations are coded and updated as a function of relative gaze direction with respect to those implied locations just as explicit targets are, even though no target is specifically represented. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Li, Jiansen; Song, Ying; Zhu, Zhen; Zhao, Jun
2017-05-01
Dual-dictionary learning (Dual-DL) utilizes both a low-resolution dictionary and a high-resolution dictionary, which are co-trained for sparse coding and image updating, respectively. It can effectively exploit a priori knowledge regarding the typical structures, specific features, and local details of training-set images. This prior knowledge helps to improve the reconstruction quality greatly. The method has been successfully applied in magnetic resonance (MR) image reconstruction. However, it relies heavily on the training sets, and the dictionaries are fixed and nonadaptive. In this research, we improve Dual-DL by using self-adaptive dictionaries. The low- and high-resolution dictionaries are updated along with the image updating stage to ensure their self-adaptivity. The updated dictionaries incorporate both the prior information of the training sets and the test image directly, and both feature improved adaptability. Experimental results demonstrate that the proposed method can efficiently and significantly improve the quality and robustness of MR image reconstruction.
Data Management for a Climate Data Record in an Evolving Technical Landscape
NASA Astrophysics Data System (ADS)
Moore, K. D.; Walter, J.; Gleason, J. L.
2017-12-01
For nearly twenty years, NASA Langley Research Center's Clouds and the Earth's Radiant Energy System (CERES) Science Team has been producing a suite of data products that forms a persistent climate data record of the Earth's radiant energy budget. Many of the team's physical scientists and key research contributors have been with the team since the launch of the first CERES instrument in 1997. This institutional knowledge is irreplaceable and its longevity and continuity are among the reasons that the team has been so productive. Such legacy involvement, however, can also be a limiting factor. Some CERES scientists-cum-coders might possess skills that were state-of-the-field when they were emerging scientists but may now be outdated with respect to developments in software development best practices and supporting technologies. Both programming languages and processing frameworks have evolved significantly in the past twenty years, and updating one of these factors warrants consideration of updating the other. With the imminent launch of a final CERES instrument and the good health of those in flight, the CERES data record stands to continue far into the future. The CERES Science Team is, therefore, undergoing a re-architecture of its codebase to maintain compatibility with newer data processing platforms and technologies and to leverage modern software development best practices. This necessitates training our staff and consequently presents several challenges, including: Development continues immediately on the next "edition" of research algorithms upon release of the previous edition. How can code be rewritten at the same time that the science algorithms are being updated and integrated? With limited time to devote to training, how can we update the staff's existing skillset without slowing progress or introducing new errors? The CERES Science Team is large and complex, much like the current state of its codebase. 
How can we identify, in a breadth-wise manner, areas for code improvement across multiple research groups that maintain code with varying semantics but common concepts? In this work, we discuss the successes and pitfalls of this major re-architecture effort and share how we will sustain improvement into the future.
Kim, Daehee; Kim, Dongwan; An, Sunshin
2016-07-09
Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption.
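The core problem the abstract identifies is that authentication tokens computed over fixed packets become useless once an intermediate hop re-packetizes the image to match its link quality. One simple way around this, sketched below under stated assumptions (a pre-shared network key, HMAC-based tags), is to authenticate a digest of the whole code image rather than individual packets, so verification survives any re-packetization. This is an illustrative workaround, not one of the paper's three proposed schemes:

```python
# Toy illustration: packet-size changes per hop do not break verification
# if the authentication tag covers the reassembled code image rather than
# individual packets. Simplified sketch with a pre-shared key (assumed);
# it is NOT one of the paper's three source-authentication schemes.
import hashlib
import hmac

KEY = b"shared-network-key"  # hypothetical pre-shared network key

def sign_image(image: bytes) -> bytes:
    """Base station: HMAC over the SHA-256 digest of the full image."""
    return hmac.new(KEY, hashlib.sha256(image).digest(), hashlib.sha256).digest()

def packetize(image: bytes, size: int):
    """Split the image into packets of the hop's chosen size."""
    return [image[i:i + size] for i in range(0, len(image), size)]

def verify(packets, tag: bytes) -> bool:
    """Receiver: reassemble and check the image-level tag."""
    image = b"".join(packets)
    return hmac.compare_digest(sign_image(image), tag)

image = bytes(range(100)) * 10      # stand-in for a new code image
tag = sign_image(image)
# hop 1 uses 64-byte packets, hop 2 re-packetizes to 32 bytes: both verify
ok1 = verify(packetize(image, 64), tag)
ok2 = verify(packetize(image, 32), tag)
```

The trade-off, and part of what motivates the paper's per-hop schemes, is that an image-level tag only authenticates after full reassembly, so a node cannot reject forged packets early during dissemination.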
The Athena Astrophysical MHD Code in Cylindrical Geometry
NASA Astrophysics Data System (ADS)
Skinner, M. A.; Ostriker, E. C.
2011-10-01
We have developed a method for implementing cylindrical coordinates in the Athena MHD code (Skinner & Ostriker 2010). The extension has been designed to alter the existing Cartesian-coordinates code (Stone et al. 2008) as minimally and transparently as possible. The numerical equations in cylindrical coordinates are formulated to maintain consistency with constrained transport, a central feature of the Athena algorithm, while making use of previously implemented code modules such as the eigensystems and Riemann solvers. Angular-momentum transport, which is critical in astrophysical disk systems dominated by rotation, is treated carefully. We describe modifications for cylindrical coordinates of the higher-order spatial reconstruction and characteristic evolution steps as well as the finite-volume and constrained transport updates. Finally, we have developed a test suite of standard and novel problems in one, two, and three dimensions designed to validate our algorithms and implementation and to be of use to other code developers. The code is suitable for use in a wide variety of astrophysical applications and is freely available for download on the web.
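The key geometric change in a cylindrical finite-volume update is that face fluxes are weighted by face radii and the cell volume carries a factor of r, so that conservation holds in the curvilinear metric. A minimal 1-D radial upwind-advection sketch of that update, far simpler than Athena's higher-order constrained-transport scheme, with an illustrative grid:

```python
# Minimal finite-volume update in cylindrical geometry: fluxes weighted by
# face radii, cell volume proportional to the cell-center radius. First-order
# upwind radial advection only; the real Athena scheme is higher-order MHD
# with constrained transport. Grid and parameters are illustrative.

def step(u, v, r_faces, dt):
    """One conservative update of cell averages u on uniform faces (v > 0)."""
    n = len(u)
    dr = r_faces[1] - r_faces[0]
    r_cell = [0.5 * (r_faces[i] + r_faces[i + 1]) for i in range(n)]
    # upwind fluxes at the n+1 faces; zero inflow at the inner boundary
    flux = [0.0] + [v * u[i] for i in range(n)]
    # u_i^{n+1} = u_i - dt/(r_i dr) * (r_{i+1/2} F_{i+1/2} - r_{i-1/2} F_{i-1/2})
    return [u[i] - dt / (r_cell[i] * dr) *
            (r_faces[i + 1] * flux[i + 1] - r_faces[i] * flux[i])
            for i in range(n)]

r_faces = [0.1 * i for i in range(11)]   # r in [0, 1], 10 cells
u0 = [1.0] * 10
u1 = step(u0, v=0.5, r_faces=r_faces, dt=0.05)
```

Because the discrete fluxes telescope, the r-weighted total of u changes only by the flux through the outer boundary, which is the conservation property the face-radius weighting exists to guarantee.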
A Tool for Longitudinal Beam Dynamics in Synchrotrons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ostiguy, J.-F.; Lebedev, V. A.
2017-05-01
A number of codes are available to simulate longitudinal dynamics in synchrotrons. Some established ones include TIBETAN, LONG1D, ESME, and ORBIT. While they embody a wealth of accumulated wisdom and experience, most of these codes were written decades ago, and to some extent they reflect the constraints of their time. As a result, there is interest in updated tools taking better advantage of modern software and hardware capabilities. At Fermilab, the PIP-II project has provided the impetus for development of such a tool. In this contribution, we discuss design decisions and code architecture. A selection of test cases based on an initial prototype is also presented.
NASA Astrophysics Data System (ADS)
Holmes, Mary Anne; Marin-Spiotta, Erika; Schneider, Blair
2017-04-01
Harassment, sexual and otherwise, including bullying and discrimination, remains an ongoing problem in the science workforce. In response to monthly revelations of harassment in academic science in the U.S. in 2016, the American Geophysical Union (AGU) convened a workshop to discuss strategies for professional societies to address this pernicious practice. Participants included researchers on this topic and members from professional science societies, academia, and U.S. federal government agencies. We agreed on the following principles: - Harassment, discrimination and bullying most often occur between a superior (e.g., an advisor, professor, supervisor) and a student or early career professional, representing a power difference that disadvantages the less-powerful scientist. - Harassment drives excellent potential as well as current scientists from the field who would otherwise contribute to the advancement of science, engineering and technology. - Harassment, therefore, represents a form of scientific misconduct, and should be treated as plagiarism, falsification, and other forms of scientific misconduct are treated, with meaningful consequences. To address harassment and to change the culture of science, professional societies can and should: ensure that their Code of Ethics and/or Code of Conduct addresses harassment with clear definitions of what constitutes this behavior, including in academic, professional, conference and field settings; provide a clear and well-disseminated mechanism for reporting violations to the society; have a response person or team in the society that can assist those who feel affected by harassment; and provide a mechanism to revisit and update Codes on a regular basis. The Code should be disseminated widely to members and apply to all members and staff. A revised Code of Ethics is now being constructed by AGU, and will be ready for adoption in 2017. See http://harassment.agu.org/ for information updates.
Automated JPSS VIIRS GEO code change testing by using Chain Run Scripts
NASA Astrophysics Data System (ADS)
Chen, W.; Wang, W.; Zhao, Q.; Das, B.; Mikles, V. J.; Sprietzer, K.; Tsidulko, M.; Zhao, Y.; Dharmawardane, V.; Wolf, W.
2015-12-01
The Joint Polar Satellite System (JPSS) is the next-generation polar-orbiting operational environmental satellite system. The first satellite in the JPSS series of satellites, J1, is scheduled to launch in early 2017. J1 will carry similar versions of the instruments that are on board the Suomi National Polar-orbiting Partnership (S-NPP) satellite, which was launched on October 28, 2011. The Center for Satellite Applications and Research Algorithm Integration Team (STAR AIT) uses the Algorithm Development Library (ADL) to run S-NPP and pre-J1 algorithms in a development and test mode. The ADL is an offline test system developed by Raytheon to mimic the operational system while enabling a development environment for plug-and-play algorithms. The Perl Chain Run Scripts have been developed by STAR AIT to automate the staging and processing of multiple JPSS Sensor Data Record (SDR) and Environmental Data Record (EDR) products. The JPSS J1 VIIRS Day Night Band (DNB) has an anomalous non-linear response at high scan angles based on prelaunch testing. The flight project has proposed multiple mitigation options through onboard aggregation, and Option 21 has been suggested by the VIIRS SDR team as the baseline aggregation mode. VIIRS GEOlocation (GEO) code analysis results show that the J1 DNB GEO product cannot be generated correctly without a software update. The modified code will support both Op21 and Op21/26 and is backward compatible with S-NPP. The J1 GEO code change version 0 delivery package is under development for the current change request. In this presentation, we will discuss how to use the Chain Run Scripts to verify the code change and Lookup Table (LUT) updates in ADL Block2.
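The chaining idea behind these scripts, each processing stage declares what it needs and what it produces, and a driver stages each output as input to the next, can be sketched in miniature. The stage names and toy transformations below are illustrative; the actual AIT scripts are written in Perl against the ADL:

```python
# Minimal sketch of a chain-run driver: stages declare the products they
# need, the driver runs them in order and stages each output for later
# stages. Stage names and toy transforms are illustrative, not the real
# SDR/EDR science algorithms.

def run_chain(stages, initial):
    """stages: list of (product_name, needed_products, function)."""
    available = dict(initial)            # product name -> data
    for name, needs, func in stages:
        missing = [n for n in needs if n not in available]
        if missing:
            raise RuntimeError(f"stage {name} missing inputs: {missing}")
        available[name] = func(*[available[n] for n in needs])
    return available

stages = [
    ("SDR", ["raw"], lambda raw: [x * 2 for x in raw]),     # toy calibration
    ("GEO", ["SDR"], lambda sdr: list(range(len(sdr)))),    # toy geolocation
    ("EDR", ["SDR", "GEO"], lambda s, g: list(zip(g, s))),  # toy retrieval
]
products = run_chain(stages, {"raw": [1, 2, 3]})
```

The value of such a driver for regression testing, as in the GEO code-change verification described above, is that one command reproduces the whole product chain, so a change in any upstream stage is automatically exercised by everything downstream.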
Ringdal, Kjetil G; Hestnes, Morten; Palmer, Cameron S
2012-02-02
The aim of this letter is to facilitate the standardisation of Abbreviated Injury Scale (AIS) codesets used to code injuries in trauma registries. We have compiled a definitive list of the changes which have been implemented between the AIS 2005 and Update 2008 versions. While the AIS 2008 codeset appears to have remained consistent since its release, we have identified discrepancies between the codesets in copies of AIS 2005 dictionaries. As a result, we recommend that use of the AIS 2005 should be discontinued in favour of the Update 2008 version.
CFD and Neutron codes coupling on a computational platform
NASA Astrophysics Data System (ADS)
Cerroni, D.; Da Vià, R.; Manservisi, S.; Menghini, F.; Scardovelli, R.
2017-01-01
In this work we investigate the thermal-hydraulic behavior of a PWR nuclear reactor core, evaluating the power generation distribution while taking into account the local temperature field. The temperature field, evaluated using a self-developed CFD module, is exchanged with a neutron code, DONJON-DRAGON, which updates the macroscopic cross sections and evaluates the new neutron flux. From the updated neutron flux the new peak factor is evaluated, and the new temperature field is computed. The exchange of data between the two codes is made possible by their inclusion in the computational platform SALOME, an open-source tool developed by the collaborative project NURESAFE. The MEDmem numerical libraries, included in the SALOME platform, are used in this work for the projection of computational fields from one problem to another. The two problems are driven by a common supervisor that can access the computational fields of both systems. In every time step, the temperature field is extracted from the CFD problem and set into the neutron problem; after this iteration, the new power peak factor is projected back into the CFD problem, and the new time step can be computed. Several computational examples, where both neutron and thermal-hydraulic quantities are parametrized, are finally reported in this work.
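The supervisor loop the abstract describes, thermal solver and neutronics solver exchanging fields until the power distribution is self-consistent, reduces to a fixed-point iteration. The sketch below replaces both codes with toy scalar models (a linear temperature rise with power, and a Doppler-like negative power feedback with temperature); all coefficients are invented for illustration:

```python
# Minimal sketch of the operator-coupling loop: the thermal and neutronics
# "solvers" exchange fields each iteration until the power stops changing.
# Both solvers are toy scalar models with invented coefficients, standing
# in for the CFD module and DONJON-DRAGON respectively.

def thermal_solver(power):
    """Toy thermal model: fuel temperature (K) rises linearly with power."""
    return 560.0 + 0.08 * power

def neutronics_solver(temperature):
    """Toy Doppler feedback: power falls as fuel temperature rises."""
    return 1000.0 * (1.0 - 2.0e-4 * (temperature - 560.0))

def couple(tol=1e-8, max_iter=100):
    """Fixed-point iteration driven by a common 'supervisor'."""
    power = 1000.0                       # initial power guess
    for _ in range(max_iter):
        temperature = thermal_solver(power)      # CFD step
        new_power = neutronics_solver(temperature)  # neutronics step
        if abs(new_power - power) < tol:
            return temperature, new_power
        power = new_power
    raise RuntimeError("coupling did not converge")

temperature, power = couple()
```

Because the toy feedback coefficient is small, this Picard-style iteration contracts rapidly; real coupled CFD-neutronics problems may need relaxation or field projection (the role MEDmem plays in SALOME) to converge.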
Groundwater flow simulation of the Savannah River Site general separations area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G.; Bagwell, L.; Bennett, P.
The most recent groundwater flow model of the General Separations Area, Savannah River Site, is referred to as the “GSA/PORFLOW” model. GSA/PORFLOW was developed in 2004 by porting an existing General Separations Area groundwater flow model from the FACT code to the PORFLOW code. The preceding “GSA/FACT” model was developed in 1997 using characterization and monitoring data through the mid-1990s. Both models were manually calibrated to field data. Significantly more field data have been acquired since the 1990s, and model calibration using mathematical optimization software has become routine and recommended practice. The current task involved updating the GSA/PORFLOW model using selected field data current through at least 2015, and use of the PEST code to calibrate the model and quantify parameter uncertainty. This new GSA groundwater flow model is named “GSA2016” in reference to the year in which most development occurred. The GSA2016 model update is intended to address issues raised by the DOE Low-Level Waste (LLW) Disposal Facility Federal Review Group (LFRG) in a 2008 review of the E-Area Performance Assessment, and by the Nuclear Regulatory Commission in reviews of tank closure and Saltstone Disposal Facility Performance Assessments.
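The shift from manual calibration to PEST-style automated calibration described above amounts to minimizing, over model parameters, the least-squares misfit between simulated and observed heads. PEST does this with a Levenberg-Marquardt search over many parameters; the toy below calibrates a single conductivity by a simple scan over candidates, with an invented one-parameter head model and synthetic "observations":

```python
# The calibration idea in miniature: adjust a model parameter until
# simulated heads best match observations in a least-squares sense. PEST
# uses Levenberg-Marquardt over many parameters; this toy scans one
# parameter. The head model and all numbers are invented for illustration.

def simulate_heads(conductivity, recharge=0.002, positions=(10, 20, 30)):
    """Toy 1-D aquifer: head profile whose curvature depends on K."""
    return [100.0 - recharge * x * x / (2.0 * conductivity) for x in positions]

observed = simulate_heads(5.0)     # synthetic "field data", true K = 5.0

def misfit(conductivity):
    """Sum-of-squares objective function that a calibrator minimizes."""
    sim = simulate_heads(conductivity)
    return sum((s - o) ** 2 for s, o in zip(sim, observed))

candidates = [k / 10.0 for k in range(10, 101)]   # K from 1.0 to 10.0
best_k = min(candidates, key=misfit)
```

Beyond the best-fit value, gradient-based calibrators like PEST also use the local sensitivity of `misfit` around the optimum to quantify parameter uncertainty, which is the second goal stated for the GSA2016 update.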
An X-ray look at the first head-trail nebula in an X-ray binary
NASA Astrophysics Data System (ADS)
Soleri, Paolo
2011-09-01
Head-tail trails are a common feature in active galactic nuclei and pulsar bow-shocks. Heinz et al. (2008) suggested that X-ray binaries too, being jet sources moving with high velocities through dense media, can leave trails of highly ionized plasma that should be detectable at radio frequencies. During observations of faint-persistent X-ray binaries, we discovered an optical nebula around the X-ray binary SAX J1712.6-3739, consisting of a bow-shock ring-like nebula in front of the binary and two trails originating close to it. This is the first detection of such a structure in an X-ray binary, and it opens a new sub-field in the study of these objects. Observations with XMM-Newton and Chandra are now needed to investigate the properties of the surrounding nebula.
An X-ray look at the first head-trail nebula in an X-ray binary
NASA Astrophysics Data System (ADS)
Soleri, Paolo
2010-10-01
Head-tail trails are a common feature in active galactic nuclei and pulsar bow-shocks. Heinz et al. (2008) suggested that X-ray binaries too, being jet sources moving with high velocities through dense media, can leave trails of highly ionized plasma that should be detectable at radio frequencies. During observations of faint-persistent X-ray binaries, we discovered an optical nebula around the X-ray binary SAX J1712.6-3739, consisting of a bow-shock ring-like nebula ``in front'' of the binary and two trails originating close to it. This is the first detection of such a structure in an X-ray binary, and it opens a new sub-field in the study of these objects. Observations with XMM-Newton and Chandra are now needed to investigate the properties of the surrounding nebula.
AN OVERVIEW OF EPANET VERSION 3.0
EPANET is a widely used public domain software package for modeling the hydraulic and water quality behavior of water distribution systems over an extended period of time. The last major update to the code was version 2.0 released in 2000 (Rossman, 2000). Since that time there ha...
State Requirements for Educational Facilities, 1999.
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee. Office of Educational Facilities.
This updated, two-volume document provides guidance for those involved in the educational facilities procurement process, and includes recent legislative changes affecting the state of Florida's building code. The first volume is organized by the sequence of steps required in the facilities procurement process and presents state requirements for…
Face Adaptation and Attractiveness Aftereffects in 8-Year-Olds and Adults
ERIC Educational Resources Information Center
Anzures, Gizelle; Mondloch, Catherine J.; Lackner, Christine
2009-01-01
A novel method was used to investigate developmental changes in face processing: attractiveness aftereffects. Consistent with the norm-based coding model, viewing consistently distorted faces shifts adults' attractiveness preferences toward the adapting stimuli. Thus, adults' attractiveness judgments are influenced by a continuously updated face…
Braiding by Majorana tracking and long-range CNOT gates with color codes
NASA Astrophysics Data System (ADS)
Litinski, Daniel; von Oppen, Felix
2017-11-01
Color-code quantum computation seamlessly combines Majorana-based hardware with topological error correction. Specifically, as Clifford gates are transversal in two-dimensional color codes, they enable the use of the Majoranas' non-Abelian statistics for gate operations at the code level. Here, we discuss the implementation of color codes in arrays of Majorana nanowires that avoid branched networks such as T junctions, thereby simplifying their realization. We show that, in such implementations, non-Abelian statistics can be exploited without ever performing physical braiding operations. Physical braiding operations are replaced by Majorana tracking, an entirely software-based protocol which appropriately updates the Majoranas involved in the color-code stabilizer measurements. This approach minimizes the required hardware operations for single-qubit Clifford gates. For Clifford completeness, we combine color codes with surface codes, and use color-to-surface-code lattice surgery for long-range multitarget CNOT gates which have a time overhead that grows only logarithmically with the physical distance separating control and target qubits. With the addition of magic state distillation, our architecture describes a fault-tolerant universal quantum computer in systems such as networks of tetrons, hexons, or Majorana box qubits, but can also be applied to nontopological qubit platforms.
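The software-only replacement of physical braiding can be sketched as pure bookkeeping. The representation below is a deliberately simplified illustration, not the paper's protocol: each Majorana label maps to an (index, sign) pair, and an exchange of Majoranas i and j updates the table as gamma_i -> gamma_j, gamma_j -> -gamma_i (one common sign convention; the opposite convention is equally valid):

```python
# Illustrative "Majorana tracking": instead of physically braiding, update a
# software frame that records which signed Majorana operator each label now
# denotes, for use in subsequent stabilizer measurements.

def braid(frame, i, j):
    """Update the tracking frame for an exchange of Majoranas i and j."""
    new_frame = dict(frame)
    new_frame[i] = (frame[j][0], frame[j][1])    # gamma_i <- gamma_j
    new_frame[j] = (frame[i][0], -frame[i][1])   # gamma_j <- -gamma_i
    return new_frame

# Identity frame for four Majoranas: each label maps to itself with sign +1.
frame = {k: (k, +1) for k in range(4)}

frame = braid(frame, 0, 1)   # single exchange: 0 -> +gamma_1, 1 -> -gamma_0
frame = braid(frame, 0, 1)   # exchanging twice: 0 -> -gamma_0, 1 -> -gamma_1
```

Note that two successive exchanges send each of the two Majoranas to minus itself, as expected for a full braid, while untouched labels (2 and 3) keep their identity mapping.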
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dornsife, William P.; Kirk, J. Scott; Shaw, Chris G.
2012-07-01
This Performance Assessment (PA) submittal is an update to the original PA that was developed to support the licensing of the Waste Control Specialists LLC Low-Level Radioactive Waste (LLRW) disposal facility. This update includes both the Compact Waste Facility (CWF) and the Federal Waste Facility (FWF), in accordance with Radioactive Material License (RML) No. R04100, License Condition (LC) 87. While many of the baseline assumptions supporting the initial license application PA were incorporated in this update, a new transport code, GoldSim, and new deterministic groundwater flow codes, including HYDRUS and MODFLOW-SURFACT{sup TM}, were employed to demonstrate compliance with the performance objectives codified in the regulations and RML No. R04100, LC 87. A revised source term, provided by the Texas Commission on Environmental Quality staff, was used to match the initial 15-year license term. In a comparison against the PA developed in support of the initial license, the updated PA clearly confirms the robustness of the site's geologic characteristics and the advanced engineering design of the disposal units. Based on simulations from fate and transport models, the radiation doses predicted in both the initial and updated PAs for members of the general public and site workers were a small fraction of the criterion doses of 0.25 mSv/yr (25 mrem/yr) and 50 mSv/yr, respectively, for tens-of-thousands of years into the future.
Draft Texas guidance on performance assessment (TCEQ, 2004) recommends a period of analysis equal to 1,000 years or until peak doses from the more mobile radionuclides occur. The EPA National Emissions Standards for Hazardous Air Pollutants limit radionuclide doses through the air pathway to 10 mrem/yr. Gaseous radionuclide doses from the CWF and the FWF, due to decomposition gases, are a small fraction of this dose limit. The radon flux from the CWF and FWF was compared to the flux limit of 20 pCi/m{sup 2}-s from 40 CFR 192. Because of the thick cover system, the calculated radon flux was a very small fraction of the limit. (authors)
[The Abbreviated Injury Scale (AIS). Options and problems in application].
Haasper, C; Junge, M; Ernstberger, A; Brehme, H; Hannawald, L; Langer, C; Nehmzow, J; Otte, D; Sander, U; Krettek, C; Zwipp, H
2010-05-01
The new AIS (Abbreviated Injury Scale) was released with an update by the AAAM (Association for the Advancement of Automotive Medicine) in 2008. It is a universal scoring system in the field of trauma applicable in clinic and research. In engineering it is used as a classification system for vehicle safety. The AIS can therefore be considered as an international, interdisciplinary and universal code of injury severity. This review focuses on a historical overview, potential applications and new coding options in the current version and also outlines the associated problems.
NASA Technical Reports Server (NTRS)
Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris
2008-01-01
The Sequence History Update Tool performs Web-based sequence statistics archiving for the Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a database, which is seamlessly formatted by PHP into a dynamic Web page. The tool replaces a previously tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable savings of time and effort. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the result is a more accurate archival record of the sequence commanding for MRO.
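The kind of automation described can be sketched simply: pull statistics out of report files with a regular expression and render the HTML table programmatically instead of editing it by hand. The file format, field names, and layout below are invented for illustration and are not MRO's actual sequencing conventions:

```python
# Hedged sketch: extract (sequence id, command count) pairs from a simple
# invented report format and render them as an HTML table.
import re

def extract_stats(text):
    """Pull (sequence_id, command_count) pairs out of a toy report format."""
    return re.findall(r"SEQ\s+(\S+)\s+commands=(\d+)", text)

def render_table(rows):
    cells = "\n".join(
        f"  <tr><td>{seq}</td><td>{count}</td></tr>" for seq, count in rows
    )
    return (
        "<table>\n  <tr><th>Sequence</th><th>Commands</th></tr>\n"
        f"{cells}\n</table>"
    )

report = """\
SEQ mro_0142 commands=318
SEQ mro_0143 commands=287
"""
html = render_table(extract_stats(report))
```

The point of the design is that the table is regenerated from the source files on every run, so the archive cannot drift out of sync with the data the way a hand-edited HTML page can.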
Update on Development of Mesh Generation Algorithms in MeshKit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jain, Rajeev; Vanderzee, Evan; Mahadevan, Vijay
2015-09-30
MeshKit uses a graph-based design for coding all of its meshing algorithms, including the Reactor Geometry (and mesh) Generation (RGG) algorithms. This report highlights the developmental updates of all the algorithms, results, and future work. Parallel versions of algorithms, documentation, and performance results are reported. The RGG GUI design was updated to incorporate new features requested by users; boundary layer generation and parallel RGG support were added to the GUI. Key contributions were made to the release, upgrade, and maintenance of other SIGMA libraries (CGM and MOAB). Several fundamental meshing algorithms for creating a robust parallel meshing pipeline in MeshKit are under development. Results and current status of automated, open-source, and high-quality nuclear reactor assembly mesh generation algorithms such as the trimesher, quadmesher, interval matching, and multi-sweeper are reported.
PCG: A prototype incremental compilation facility for the SAGA environment, appendix F
NASA Technical Reports Server (NTRS)
Kimball, Joseph John
1985-01-01
A programming environment supports the activity of developing and maintaining software. New environments provide language-oriented tools such as syntax-directed editors, whose usefulness is enhanced because they embody language-specific knowledge. When syntactic and semantic analysis occur early in the cycle of program production, that is, during editing, the use of a standard compiler is inefficient, for it must re-analyze the program before generating code. Likewise, it is inefficient to recompile an entire file, when the editor can determine that only portions of it need updating. The pcg, or Pascal code generation, facility described here generates code directly from the syntax trees produced by the SAGA syntax directed Pascal editor. By preserving the intermediate code used in the previous compilation, it can limit recompilation to the routines actually modified by editing.
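The idea behind pcg-style incremental code generation can be sketched with a dirty-set: keep the generated code for each routine, and regenerate only the routines the editor marked as modified. The routine representation and the "compile" step below are invented placeholders, not the actual SAGA/pcg interfaces:

```python
# Sketch of incremental recompilation: code is regenerated only for routines
# whose syntax subtree the editor reports as modified (the "dirty" set).

compiled = {}              # routine name -> generated code
compile_count = {"n": 0}   # how many per-routine code generations occurred

def generate_code(name, body):
    """Stand-in for code generation from a routine's syntax tree."""
    compile_count["n"] += 1
    return f"; code for {name}: {len(body)} nodes"

def recompile(routines, dirty):
    """Regenerate code only for routines that are dirty or never compiled."""
    for name, body in routines.items():
        if name in dirty or name not in compiled:
            compiled[name] = generate_code(name, body)
    return compiled

routines = {"main": ["stmt"] * 5, "helper": ["stmt"] * 3}
recompile(routines, dirty={"main", "helper"})   # initial build: 2 generations
recompile(routines, dirty={"helper"})           # edit touched helper: 1 more
```

A whole-file compiler would have performed four generations across the two builds; the dirty-set approach performs three, and the gap widens as the untouched portion of the file grows.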
Lessons Learned through the Development and Publication of AstroImageJ
NASA Astrophysics Data System (ADS)
Collins, Karen
2018-01-01
As lead author of the scientific image processing software package AstroImageJ (AIJ), I will discuss the reasoning behind why we decided to release AIJ to the public, and the lessons we learned related to the development, publication, distribution, and support of AIJ. I will also summarize the AIJ code language selection, code documentation and testing approaches, code distribution, update, and support facilities used, and the code citation and licensing decisions. Since AIJ was initially developed as part of my graduate research and was my first scientific open source software publication, many of my experiences and difficulties encountered may parallel those of others new to scientific software publication. Finally, I will discuss the benefits and disadvantages of releasing scientific software that I now recognize after having AIJ in the public domain for more than five years.
NOAA/DOE CWP structural analysis package. [CWPFLY, CWPEXT, COTEC, and XOTEC codes]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pompa, J.A.; Lunz, D.F.
1979-09-01
The theoretical development and computer code user's manual for analysis of the Ocean Thermal Energy Conversion (OTEC) plant cold water pipe (CWP) are presented. The analysis of the CWP includes coupled platform/CWP loadings and dynamic responses. This report, with the exception of the Introduction and Appendix F, was originally published as Hydronautics, Inc., Technical Report No. 7825-2 (by Barr, Chang, and Thasanatorn) in November 1978. A detailed theoretical development of the equations describing the coupled platform/CWP system and preliminary validation efforts are described. The appendices encompass a complete user's manual, describing the inputs, outputs, and operation of the four component programs, and detail changes and updates implemented since the original release of the code by Hydronautics. The code itself is available through NOAA's Office of Ocean Technology and Engineering Services.
NASA Astrophysics Data System (ADS)
Class, G.; Meyder, R.; Stratmanns, E.
1985-12-01
The large data base for validation and development of computer codes for two-phase flow, generated at the COSIMA facility, is reviewed. The aim of COSIMA is to simulate the hydraulic, thermal, and mechanical conditions in the subchannel and the cladding of fuel rods in pressurized water reactors during the blowout phase of a loss of coolant accident. In terms of fuel rod behavior, it is found that during blowout under realistic conditions only small strains are reached. For cladding rupture extremely high rod internal pressures are necessary. The behavior of fuel rod simulators and the effect of thermocouples attached to the cladding outer surface are clarified. Calculations performed with the codes RELAP and DRUFAN show satisfactory agreement with experiments. This can be improved by updating the phase separation models in the codes.
Qualitative Data Analysis: A Methods Sourcebook. Third Edition
ERIC Educational Resources Information Center
Miles, Matthew B.; Huberman, A. Michael; Saldana, Johnny
2014-01-01
The Third Edition of Miles & Huberman's classic research methods text is updated and streamlined by Johnny Saldaña, author of "The Coding Manual for Qualitative Researchers." Several of the data display strategies from previous editions are now presented in re-envisioned and reorganized formats to enhance reader accessibility and…
Using the Student Edition of Update on Law-Related Education.
ERIC Educational Resources Information Center
Banaszak, Ronald A.
1997-01-01
Provides accompanying learning activities for each of the articles in the same issue. The brief articles address a number of legal issues concerning young people including dress codes, teen smoking, curfews, restricted areas (such as the mall), and child labor. Includes a law term crossword puzzle. (MJP)
78 FR 75471 - Section 3504 Agent Employment Tax Liability
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... under section 3504 of the Internal Revenue Code to perform acts required of employers who are home care... home care services, which are subject to taxes under the Federal Unemployment Tax Act. The final... amendments to the existing regulatory language designed to update citations and be consistent with the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-20
... Relating to Amendments to the Discovery Guide Used in Customer Arbitration Proceedings, as Modified by... update the Discovery Guide (``Guide'') used in customer arbitration proceedings.\\1\\ According to FINRA, the Guide supplements the discovery rules contained in the FINRA Code of Arbitration Procedure for...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
...-AM78 Prevailing Rate Systems; North American Industry Classification System Based Federal Wage System... 2007 North American Industry Classification System (NAICS) codes currently used in Federal Wage System... (OPM) issued a final rule (73 FR 45853) to update the 2002 North American Industry Classification...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coddington, M.; Fox, K.; Stanfield, S.
Federal and state regulators are faced with the challenge of keeping interconnection procedures updated against a backdrop of evolving technology, new codes and standards, and considerably transformed market conditions. This report is intended to educate policymakers and stakeholders on beneficial reforms that will keep interconnection processes efficient and cost-effective while maintaining a safe and reliable power system.
75 FR 54131 - Updating State Residential Building Energy Efficiency Codes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-03
... and 95 degrees F for heating (for heat pumps), the 2000 IECC insulation requirement for supply ducts in unconditioned spaces is R-5 (minimum) for nearly all cases. Insulation required by the 2000 IECC... Duct Insulation Requirements Duct insulation requirements generally increased in the 2003 IECC. The...
State of the States, 2012: Arts Education State Policy Summary
ERIC Educational Resources Information Center
Arts Education Partnership (NJ1), 2012
2012-01-01
The "State of the States 2012" summarizes state policies for arts education identified in statute or code for all 50 states and the District of Columbia. Information is based primarily on results from the AEP Arts Education State Policy Survey conducted in 2010-11, and updated in April 2012.
76 FR 12367 - Proposed Information Collection; Visibility Valuation Survey Pilot Study
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
... Survey Pilot Study AGENCY: National Park Service, U.S. Department of the Interior. ACTION: Notice... Code of Federal Regulations). Updated estimates of visibility benefits are required because the studies... a pilot study to test the survey instrument and implementation procedures prior to the full survey...
78 FR 23194 - Federal Acquisition Regulation; Commercial and Government Entity Code
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-18
... Award Management Name Change, Phase 1 Implementation) which will make a global update to all of the... outside the United States; and Support supply chain traceability and integrity efforts. II. Discussion and.... For Contractors registered in the System for Award Management (SAM), the DLA Logistics Information...
77 FR 4808 - Conference on Air Quality Modeling
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-31
... update our available modeling tools with state-of-the-science techniques and for the public to offer new... C111, 109 T.W. Alexander Drive, Research Triangle Park, NC 27711. FOR FURTHER INFORMATION CONTACT... Quality Assessment Division, Mail Code C439-01, Research Triangle Park, NC 27711; telephone: (919) 541...
78 FR 46256 - Adoption of Updated EDGAR Filer Manual
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-31
... (202) 551-3600; in the Division of Trading and Markets for questions concerning Form 13H contact...://www.sec.gov/info/edgar.shtml . You can also inspect the document at the National Archives and Records...: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html . By the...
76 FR 51985 - ICD-9-CM Coordination and Maintenance Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-19
... and Public Health Data Standards Staff, announces the following meeting. Name: ICD-9-CM Coordination.... 2012 ICD-10-PCS GEM and Reimbursement Map Updates. ICD-10-PCS Official Coding Guidelines. ICD-10 MS... Pickett, Medical Systems Administrator, Classifications and Public Health Data Standards Staff, NCHS, 3311...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-23
... Promulgation of Air Quality Implementation Plans; Illinois; Air Quality Standards Revision AGENCY... Illinois state implementation plan (SIP) to reflect current National Ambient Air Quality Standards (NAAQS... Implementation Plan at 35 Illinois Administrative Code part 243, which updates National Ambient Air Quality...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-30
... [In alphabetical order] FDIC Ref. No. Bank name City State Date closed 10433 Fort Lee Federal Savings Fort Lee NJ 4/20/2012 Bank, FSB. [FR Doc. 2012-10330 Filed 4-27-12; 8:45 am] BILLING CODE 6714-01-P ...
78 FR 36549 - Sunshine Act; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-18
..., Office of External Affairs, (202) 942-1640. Dated: June 13, 2013. James B. Petri, Secretary, Federal Retirement Thrift Investment Board. [FR Doc. 2013-14524 Filed 6-14-13; 11:15 am] BILLING CODE 6760-01-P ... of Financial Management Report 5. FY 2013-2017 Strategic Plan Update [[Page 36550
Performance and Application of Parallel OVERFLOW Codes on Distributed and Shared Memory Platforms
NASA Technical Reports Server (NTRS)
Djomehri, M. Jahed; Rizk, Yehia M.
1999-01-01
The presentation discusses recent studies on the performance of the two parallel versions of the aerodynamics CFD code, OVERFLOW_MPI and _MLP. Developed at NASA Ames, the serial version, OVERFLOW, is a multidimensional Navier-Stokes flow solver based on overset (Chimera) grid technology. The code has recently been parallelized in two ways. One is based on the explicit message-passing interface (MPI) across processors and uses the _MPI communication package. This approach is primarily suited for distributed memory systems and workstation clusters. The second, termed the multi-level parallel (MLP) method, is simple and uses shared memory for all communications. The _MLP code is suitable on distributed-shared memory systems. For both methods, the message passing takes place across the processors or processes at the advancement of each time step. This procedure is, in effect, the Chimera boundary conditions update, which is done in an explicit "Jacobi" style. In contrast, the update in the serial code is done in more of a "Gauss-Seidel" fashion. The programming effort for the _MPI code is more complicated than for the _MLP code; the former requires modification of the outer and some inner shells of the serial code, whereas the latter focuses only on the outer shell of the code. The _MPI version offers a great deal of flexibility in distributing grid zones across a specified number of processors in order to achieve load balancing. The approach is capable of partitioning zones across multiple processors or sending each zone and/or a cluster of several zones to a single processor. The message passing across the processors consists of Chimera boundary and/or an overlap of "halo" boundary points for each partitioned zone. The MLP version is a new coarse-grain parallel concept at the zonal and intra-zonal levels. A grouping strategy is used to distribute zones into several groups forming sub-processes which will run in parallel.
The total volume of grid points in each group is approximately balanced. A proper number of threads is initially allocated to each group, and in subsequent iterations during run time the number of threads is adjusted to achieve load balancing across the processes. Each process exploits the multitasking directives already established in OVERFLOW.
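The contrast drawn above between the explicit "Jacobi"-style update of the parallel codes and the "Gauss-Seidel"-style update of the serial code can be illustrated on a one-dimensional smoothing pass, a toy stand-in for the Chimera boundary update:

```python
# Jacobi vs Gauss-Seidel on one averaging sweep of a 1-D array: Jacobi reads
# only old values (so all points can be updated in parallel, as in the MPI
# halo exchange), while Gauss-Seidel reads already-updated neighbors
# (inherently sequential, as in the serial code).
import numpy as np

def jacobi_step(u):
    new = u.copy()
    new[1:-1] = 0.5 * (u[:-2] + u[2:])       # every point sees old data only
    return new

def gauss_seidel_step(u):
    u = u.copy()
    for i in range(1, len(u) - 1):
        u[i] = 0.5 * (u[i - 1] + u[i + 1])   # reads freshly updated left value
    return u

u0 = np.array([0.0, 4.0, 0.0, 0.0, 1.0])
uj = jacobi_step(u0)
ug = gauss_seidel_step(u0)
```

After one sweep the two results already differ: the Jacobi update of point 2 still sees the old value 4.0 at point 1, while the Gauss-Seidel update sees the new value there. This is exactly why the parallel boundary exchange lags the serial update by one sweep but decouples the processors from one another.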
2011-01-01
Introduction: Many trauma registries have used the Abbreviated Injury Scale 1990 Revision Update 98 (AIS98) to classify injuries. In the current AIS version (Abbreviated Injury Scale 2005 Update 2008 - AIS08), injury classification and specificity differ substantially from AIS98, and the mapping tools provided in the AIS08 dictionary are incomplete. As a result, data from different AIS versions cannot currently be compared. The aim of this study was to develop an additional AIS98 to AIS08 mapping tool to complement the current AIS dictionary map, and then to evaluate the completed map (produced by combining these two maps) using double-coded data. The value of additional information provided by free text descriptions accompanying assigned codes was also assessed. Methods: Using a modified Delphi process, a panel of expert AIS coders established plausible AIS08 equivalents for the 153 AIS98 codes which currently have no AIS08 map. A series of major trauma patients whose injuries had been double-coded in AIS98 and AIS08 was used to assess the maps; both of the AIS datasets had already been mapped to another AIS version using the AIS dictionary maps. Following application of the completed (enhanced) map with or without free text evaluation, up to six AIS codes were available for each injury. Datasets were assessed for agreement in injury severity measures, and the relative performances of the maps in accurately describing the trauma population were evaluated. Results: The double-coded injuries sustained by 109 patients were used to assess the maps. For data conversion from AIS98, both the enhanced map and the enhanced map with free text description resulted in higher levels of accuracy and agreement with directly coded AIS08 data than the currently available dictionary map. Paired comparisons demonstrated significant differences between direct coding and the dictionary maps, but not with either of the enhanced maps.
Conclusions: The newly-developed AIS98 to AIS08 complementary map enabled transformation of the trauma population description given by AIS98 into an AIS08 estimate which was statistically indistinguishable from directly coded AIS08 data. It is recommended that the enhanced map should be adopted for dataset conversion, using free text descriptions if available. PMID:21548991
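The two-tier mapping strategy evaluated in this study can be sketched as a dictionary lookup with an expert-map fallback for the codes the dictionary leaves unmapped. The codes below are placeholders, not real AIS entries:

```python
# Sketch of dictionary-map-plus-enhanced-map conversion: try the AIS dictionary
# map first, fall back to the expert-panel ("enhanced") map, and flag anything
# still unmapped. All code strings are invented placeholders.

dictionary_map = {"123456.2": "654321.2"}   # AIS98 -> AIS08 (dictionary map)
enhanced_map = {"999999.3": "888888.3"}     # expert-panel additions

def map_ais98_to_ais08(code):
    """Return (mapped code or None, which map supplied it)."""
    if code in dictionary_map:
        return dictionary_map[code], "dictionary"
    if code in enhanced_map:
        return enhanced_map[code], "enhanced"
    return None, "unmapped"

results = [map_ais98_to_ais08(c) for c in ["123456.2", "999999.3", "000000.1"]]
```

The provenance tag mirrors how the study could report which injuries were converted by the dictionary map versus the enhanced map when comparing against directly coded AIS08 data.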
Palmer, Cameron S; Franklyn, Melanie; Read-Allsopp, Christine; McLellan, Susan; Niggemeyer, Louise E
2011-05-08
Many trauma registries have used the Abbreviated Injury Scale 1990 Revision Update 98 (AIS98) to classify injuries. In the current AIS version (Abbreviated Injury Scale 2005 Update 2008 - AIS08), injury classification and specificity differ substantially from AIS98, and the mapping tools provided in the AIS08 dictionary are incomplete. As a result, data from different AIS versions cannot currently be compared. The aim of this study was to develop an additional AIS98 to AIS08 mapping tool to complement the current AIS dictionary map, and then to evaluate the completed map (produced by combining these two maps) using double-coded data. The value of additional information provided by free text descriptions accompanying assigned codes was also assessed. Using a modified Delphi process, a panel of expert AIS coders established plausible AIS08 equivalents for the 153 AIS98 codes which currently have no AIS08 map. A series of major trauma patients whose injuries had been double-coded in AIS98 and AIS08 was used to assess the maps; both of the AIS datasets had already been mapped to another AIS version using the AIS dictionary maps. Following application of the completed (enhanced) map with or without free text evaluation, up to six AIS codes were available for each injury. Datasets were assessed for agreement in injury severity measures, and the relative performances of the maps in accurately describing the trauma population were evaluated. The double-coded injuries sustained by 109 patients were used to assess the maps. For data conversion from AIS98, both the enhanced map and the enhanced map with free text description resulted in higher levels of accuracy and agreement with directly coded AIS08 data than the currently available dictionary map. Paired comparisons demonstrated significant differences between direct coding and the dictionary maps, but not with either of the enhanced maps. 
The newly-developed AIS98 to AIS08 complementary map enabled transformation of the trauma population description given by AIS98 into an AIS08 estimate which was statistically indistinguishable from directly coded AIS08 data. It is recommended that the enhanced map should be adopted for dataset conversion, using free text descriptions if available.
Rhodes, Gillian; Ewing, Louise; Jeffery, Linda; Avard, Eleni; Taylor, Libby
2014-09-01
Faces are adaptively coded relative to visual norms that are updated by experience. This coding is compromised in autism and the broader autism phenotype, suggesting that atypical adaptive coding of faces may be an endophenotype for autism. Here we investigate the nature of this atypicality, asking whether adaptive face-coding mechanisms are fundamentally altered, or simply less responsive to experience, in autism. We measured adaptive coding, using face identity aftereffects, in cognitively able children and adolescents with autism and neurotypical age- and ability-matched participants. We asked whether these aftereffects increase with adaptor identity strength as in neurotypical populations, or whether they show a different pattern indicating a more fundamental alteration in face-coding mechanisms. As expected, face identity aftereffects were reduced in the autism group, but they nevertheless increased with adaptor strength, like those of our neurotypical participants, consistent with norm-based coding of face identity. Moreover, their aftereffects correlated positively with face recognition ability, consistent with an intact functional role for adaptive coding in face recognition ability. We conclude that adaptive norm-based face-coding mechanisms are basically intact in autism, but are less readily calibrated by experience. Copyright © 2014 Elsevier Ltd. All rights reserved.
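Norm-based coding with an updatable norm can be illustrated with a one-dimensional toy model: a face is encoded as its offset from the current norm, adaptation shifts the norm toward the adapting face, and the resulting identity aftereffect grows with adaptor strength, the signature the study tested for. The numbers and update rule are illustrative only, not a model fitted to the data:

```python
# Toy norm-based coding model: adaptation shifts the norm toward the adaptor,
# so a test face's perceived identity strength changes, and the size of that
# change (the aftereffect) scales with adaptor strength.

def adapt(norm, adaptor, rate=0.3):
    """Shift the stored norm a fraction of the way toward the adapting face."""
    return norm + rate * (adaptor - norm)

def perceived_identity(face, norm):
    """Norm-based code: the identity signal is the deviation from the norm."""
    return face - norm

norm = 0.0                           # average face along one identity axis
weak_adaptor, strong_adaptor = -1.0, -3.0
test_face = 1.0

aftereffect_weak = perceived_identity(test_face, adapt(norm, weak_adaptor)) - test_face
aftereffect_strong = perceived_identity(test_face, adapt(norm, strong_adaptor)) - test_face
```

In this framing, the study's autism-group result corresponds to a smaller `rate` (the norm is calibrated less readily by experience) rather than to a different functional form of the aftereffect.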
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arai, Kenji; Ebata, Shigeo
1997-07-01
This paper summarizes the current and anticipated use of thermal-hydraulic and neutronic codes for BWR transient and accident analyses in Japan. The codes may be categorized into licensing codes and best estimate codes. Most of the licensing codes were originally developed by General Electric. Some codes have been updated based on the technical knowledge obtained in thermal-hydraulic studies in Japan, and according to BWR design changes. The best estimate codes have been used to support the licensing calculations and to obtain a phenomenological understanding of the thermal-hydraulic phenomena during a BWR transient or accident. The best estimate codes can also be applied to a design study for a next-generation BWR, to which the current licensing model may not be directly applicable. In order to rationalize the margin included in the current BWR design and develop a next-generation reactor with an appropriate design margin, it will be necessary to improve the accuracy of the thermal-hydraulic and neutronic models. In addition, regarding the current best estimate codes, improvements in the user interface and the numerics will be needed.
The Astrophysics Source Code Library: An Update
NASA Astrophysics Data System (ADS)
Allen, Alice; Nemiroff, R. J.; Shamir, L.; Teuben, P. J.
2012-01-01
The Astrophysics Source Code Library (ASCL), founded in 1999, takes an active approach to sharing astrophysical source code. The ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments involving the development or use of source code, and adds entries for the codes found to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL moved to a new location in 2010; it now holds over 300 codes and continues to grow. In 2011, the ASCL (http://asterisk.apod.com/viewforum.php?f=35) has added on average 19 new codes per month; we encourage scientists to submit their codes for inclusion. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This presentation covers the history of the ASCL, examines its current state and benefits, describes the means of and requirements for including codes, and outlines future plans.
Semi Automated Land Cover Layer Updating Process Utilizing Spectral Analysis and GIS Data Fusion
NASA Astrophysics Data System (ADS)
Cohen, L.; Keinan, E.; Yaniv, M.; Tal, Y.; Felus, A.; Regev, R.
2018-04-01
Technological improvements in mass data gathering and analysis made in recent years have influenced the traditional methods of updating and maintaining the national topographic database, bringing a significant increase in the number of use cases and in the demand for detailed geo-information. Processes intended to replace traditional data collection methods have been developed in many National Mapping and Cadastre Agencies, and there has been significant progress in semi-automated methodologies for updating a national topographic geodatabase. Implementing these is expected to allow a considerable reduction in updating costs and operation times. Our previous activity focused on automatic feature extraction (Keinan, Zilberstein et al., 2015). Before semi-automatic updating methods, it was common for interpreter identification to be as detailed as possible in order to maintain the most reliable database. With semi-automatic updating methodologies, the ability to insert knowledge based on human insight is limited. Our motivation was therefore to reduce this gap by allowing end users to add their own data inputs to the basic geometric database. In this article, we present a simple land cover database updating method that combines insights extracted from the analyzed image with given spatial data from vector layers. The main stages of the method are multispectral image segmentation and supervised classification, together with geometric fusion of the given vector data, while keeping the required manual shape editing low. All coding was done using open source software components.
NASA Technical Reports Server (NTRS)
Mcbride, Bonnie J.; Reno, Martin A.; Gordon, Sanford
1994-01-01
The NASA Lewis chemical equilibrium program with applications continues to be improved and updated. The latest version is CET93. This code, with smaller arrays, has been compiled for use on an IBM or IBM-compatible personal computer and is called CETPC. This report is intended primarily as a user's manual for CET93 and CETPC. It does not repeat the more complete documentation of earlier reports on the equilibrium program. Most of the discussion covers input and output files, two new options (ONLY and comments), example problems, and implementation of CETPC.
GAME: GAlaxy Machine learning for Emission lines
NASA Astrophysics Data System (ADS)
Ucci, G.; Ferrara, A.; Pallottini, A.; Gallerani, S.
2018-06-01
We present an updated, optimized version of GAME (GAlaxy Machine learning for Emission lines), a code designed to infer key interstellar medium physical properties from emission line intensities of ultraviolet/optical/far-infrared galaxy spectra. The improvements concern (a) an enlarged spectral library including Pop III stars, (b) the inclusion of spectral noise in the training procedure, and (c) an accurate evaluation of uncertainties. We extensively validate the optimized code and compare its performance against empirical methods and other available emission line codes (PYQZ and HII-CHI-MISTRY) on a sample of 62 SDSS stacked galaxy spectra and 75 observed HII regions. Very good agreement is found for metallicity. However, ionization parameters derived by GAME tend to be higher. We show that this is due to the use of too limited libraries in the other codes. The main advantages of GAME are the simultaneous use of all the measured spectral lines and the extremely short computational times. We finally discuss the code potential and limitations.
The new Italian code of medical ethics.
Fineschi, V; Turillazzi, E; Cateni, C
1997-01-01
In June 1995, the Italian code of medical ethics was revised in order that its principles should reflect the ever-changing relationship between the medical profession and society and between physicians and patients. The updated code is also a response to new ethical problems created by scientific progress; the discussion of such problems often shows up a need for better understanding on the part of the medical profession itself. Medical deontology is defined as the discipline for the study of norms of conduct for the health care professions, including moral and legal norms as well as those pertaining more strictly to professional performance. The aim of deontology is, therefore, the in-depth investigation and revision of the code of medical ethics. It is in the light of this conceptual definition that one should interpret a review of the different codes which have attempted, throughout the various periods of Italy's recent history, to adapt ethical norms to particular social and health care climates. PMID:9279746
Matrix-Product-State Algorithm for Finite Fractional Quantum Hall Systems
NASA Astrophysics Data System (ADS)
Liu, Zhao; Bhatt, R. N.
2015-09-01
Exact diagonalization is a powerful tool for studying fractional quantum Hall (FQH) systems. However, its capability is limited by the exponentially increasing computational cost. To overcome this difficulty, density-matrix-renormalization-group (DMRG) algorithms were developed to reach much larger system sizes. Very recently, it was realized that some model FQH states have exact matrix-product-state (MPS) representations. Motivated by this, here we report an MPS code, which is closely related to, but different from, the traditional DMRG language, for finite FQH systems on the cylinder geometry. By representing the many-body Hamiltonian as a matrix-product operator (MPO) and using single-site updates and density matrix correction, we show that our code can efficiently search for the ground state of various FQH systems. We also compare the performance of our code with traditional DMRG. The possible generalization of our code to infinite FQH systems and other physical systems is also discussed.
Scoring the Strengths and Weaknesses of Underage Drinking Laws in the United States
Fell, James C.; Thomas, Sue; Scherer, Michael; Fisher, Deborah A.; Romano, Eduardo
2015-01-01
Several studies have examined the impact of a number of minimum legal drinking age 21 (MLDA-21) laws on underage alcohol consumption and alcohol-related crashes in the United States. These studies have contributed to our understanding of how alcohol control laws affect drinking and driving among those who are under age 21. However, much of the extant literature examining underage drinking laws uses a “Law/No law” coding, which may obscure the variability inherent in each law. Previous literature has demonstrated that including law strength may affect outcomes and overall data fit when compared to “Law/No law” coding. In an effort to assess the relative strength of states’ underage drinking legislation, a coding system was developed in 2006 and applied to 16 MLDA-21 laws. The current article updates the previous endeavor and outlines a detailed strength coding mechanism for the current 20 MLDA-21 laws. PMID:26097775
STELLTRANS: A Transport Analysis Suite for Stellarators
NASA Astrophysics Data System (ADS)
Mittelstaedt, Joseph; Lazerson, Samuel; Pablant, Novimir; Weir, Gavin; W7-X Team
2016-10-01
The stellarator transport code STELLTRANS allows us to better analyze the power balance in W7-X. Although profiles of temperature and density are measured experimentally, geometrical factors are needed in conjunction with these measurements to properly analyze heat flux densities in stellarators. The STELLTRANS code interfaces with VMEC to find an equilibrium flux surface configuration and with TRAVIS to determine the RF heating and current drive in the plasma. Stationary transport equations are then solved using a boundary value differential equation solver. The equations and quantities considered are averaged over flux surfaces to reduce the system to an essentially one-dimensional problem. We have applied this code to data from W7-X and were able to calculate the heat flux coefficients. We will also present extensions of the code to a predictive capability that would use DKES to find neoclassical transport coefficients to update the temperature and density profiles.
Calculations of Helium Bubble Evolution in the PISCES Experiments with Cluster Dynamics
NASA Astrophysics Data System (ADS)
Blondel, Sophie; Younkin, Timothy; Wirth, Brian; Lasa, Ane; Green, David; Canik, John; Drobny, Jon; Curreli, Davide
2017-10-01
Plasma surface interactions in fusion tokamak reactors involve an inherently multiscale, highly non-equilibrium set of phenomena, for which current models are inadequate to predict the divertor response to and feedback on the plasma. In this presentation, we describe the latest code developments of Xolotl, a spatially-dependent reaction diffusion cluster dynamics code to simulate the divertor surface response to fusion-relevant plasma exposure. Xolotl is part of a code-coupling effort to model both plasma and material simultaneously; the first benchmark for this effort is the series of PISCES linear device experiments. We will discuss the processes leading to surface morphology changes, which further affect erosion, as well as how Xolotl has been updated in order to communicate with other codes. Furthermore, we will show results of the sub-surface evolution of helium bubbles in tungsten as well as the material surface displacement under these conditions.
Status Report on NEAMS PROTEUS/ORIGEN Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A
2016-02-18
The US Department of Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program has contributed significantly to the development of the PROTEUS neutron transport code at Argonne National Laboratory and to the Oak Ridge Isotope Generation and Depletion Code (ORIGEN) depletion/decay code at Oak Ridge National Laboratory. PROTEUS’s key capability is the efficient and scalable (up to hundreds of thousands of cores) neutron transport solver on general, unstructured, three-dimensional finite-element-type meshes. The scalability and mesh generality enable the transfer of neutron and power distributions to other codes in the NEAMS toolkit for advanced multiphysics analysis. Recently, ORIGEN has received considerable modernization to provide the high-performance depletion/decay capability within the NEAMS toolkit. This work presents a description of the initial integration of ORIGEN in PROTEUS, mainly performed during FY 2015, with minor updates in FY 2016.
CFL3D Version 6.4-General Usage and Aeroelastic Analysis
NASA Technical Reports Server (NTRS)
Bartels, Robert E.; Rumsey, Christopher L.; Biedron, Robert T.
2006-01-01
This document contains the course notes for the computational fluid dynamics code CFL3D version 6.4. It is intended to provide users, from basic to advanced, the information necessary to successfully use the code for a broad range of cases. Much of the course covers capability that has been a part of previous versions of the code, with material compiled from the CFL3D v5.0 manual and from the CFL3D v6 web site prior to the current release; this part of the material is aimed at users of the code not familiar with computational fluid dynamics. New capability in CFL3D version 6.4 that has not previously been published is presented here, and outdated features no longer used or recommended in recent releases are identified. The information offered here supersedes earlier manuals and updates outdated usage; where current usage supersedes older versions, that is noted. These course notes also provide hints for usage, code installation, and examples not found elsewhere.
Rodrigues, J M; Trombert-Paviot, B; Baud, R; Wagner, J; Meusnier-Carriot, F
1998-01-01
GALEN has developed a language-independent common reference model based on a medically oriented ontology, together with practical tools and techniques for managing healthcare terminology, including natural language processing. GALEN-IN-USE is the current phase, which applies the modelling and the tools to the development or updating of coding systems for surgical procedures in different national coding centres co-operating within the European Federation of Coding Centres (EFCC), to create a language-independent knowledge repository for multicultural Europe. We used an integrated set of artificial intelligence terminology tools, the CLAssification Manager workbench, to process French professional medical language rubrics into intermediate dissections and then into the Grail reference ontology model representation. From this language-independent concept model representation we generate controlled French natural language. The French national coding centre is then able to retrieve the initial professional rubrics with different categories of concepts, to compare the professional language proposed by expert clinicians with the French generated controlled vocabulary, and to finalize the linguistic labels of the coding system in relation to the meanings of the conceptual system structure.
Comparison of injury severity between AIS 2005 and AIS 1990 in a large injury database
Barnes, J; Hassan, A; Cuerden, R; Cookson, R; Kohlhofer, J
2009-01-01
The aim of this study is to investigate the differences in car occupant injury severity recorded in AIS 2005 compared to AIS 1990 and to outline the likely effects on future data analysis findings. Occupant injury data in the UK Cooperative Crash Injury Study (CCIS) database were coded for the period February 2006 to November 2007 using both AIS 1990 and AIS 2005. Data for 1,994 occupants with over 6,000 coded injuries were reviewed at the AIS and MAIS severity levels and by body region to determine changes between the two coding methodologies. Overall, there was an apparent general trend for fewer injuries to be coded at the AIS 4+ severity and more injuries to be coded at the AIS 2 severity. When these injury trends were reviewed in more detail, it was found that the body regions contributing most to these changes in severity were the head, thorax, and extremities. This is one of the first studies to examine the implications for large databases of changing to an updated method for coding injuries. PMID:20184835
Hasselmo, Michael E.
2008-01-01
The spiking activity of hippocampal neurons during REM sleep exhibits temporally structured replay of spiking occurring during previously experienced trajectories (Louie and Wilson, 2001). Here, temporally structured replay of place cell activity during REM sleep is modeled in a large-scale network simulation of grid cells, place cells and head direction cells. During simulated waking behavior, the movement of the simulated rat drives activity of a population of head direction cells that updates the activity of a population of entorhinal grid cells. The population of grid cells drives the activity of place cells coding individual locations. Associations between location and movement direction are encoded by modification of excitatory synaptic connections from place cells to speed modulated head direction cells. During simulated REM sleep, the population of place cells coding an experienced location activates the head direction cells coding the associated movement direction. Spiking of head direction cells then causes frequency shifts within the population of entorhinal grid cells to update a phase representation of location. Spiking grid cells then activate new place cells that drive new head direction activity. In contrast to models that perform temporally compressed sequence retrieval similar to sharp wave activity, this model can simulate data on temporally structured replay of hippocampal place cell activity during REM sleep at time scales similar to those observed during waking. These mechanisms could be important for episodic memory of trajectories. PMID:18973557
Sajjad, Muhammad; Mehmood, Irfan; Baik, Sung Wook
2015-01-01
Image super-resolution (SR) plays a vital role in medical imaging, allowing a more efficient and effective diagnosis process. Diagnosis is usually difficult and inaccurate from low-resolution (LR) and noisy images, and resolution enhancement through conventional interpolation methods strongly affects the precision of subsequent processing steps, such as segmentation and registration. We therefore propose an efficient sparse-coded image SR reconstruction technique using a trained dictionary. We apply a simple and efficient regularized version of orthogonal matching pursuit (ROMP) to seek the coefficients of the sparse representation. ROMP has the transparency and greediness of OMP and the robustness of L1-minimization, which enhance the dictionary learning process to capture feature descriptors such as oriented edges and contours from complex images like brain MRIs. The sparse coding part of the K-SVD dictionary training procedure is modified by substituting OMP with ROMP. The dictionary update stage allows simultaneously updating an arbitrary number of atoms and vectors of sparse coefficients. In SR reconstruction, ROMP is used to determine the vector of sparse coefficients for the underlying patch. The recovered representations are then applied to the trained dictionary, and finally, an optimization leads to high-resolution output of high quality. Experimental results demonstrate that the super-resolution reconstruction quality of the proposed scheme is comparatively better than that of other state-of-the-art schemes.
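The greedy atom-selection step that underlies this family of pursuit algorithms can be sketched as follows. This is not the authors' code: it is a minimal plain matching-pursuit loop over a tiny hand-made dictionary, shown only to illustrate the sparse-coding stage that the paper replaces with ROMP (which differs by selecting and regularizing groups of atoms per iteration). The dictionary and signal are hypothetical, and pure-Python vectors are used to keep the sketch self-contained.

```python
# Minimal matching-pursuit sketch (illustrative only; ROMP in the paper
# selects regularized groups of atoms instead of one atom per iteration).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, dictionary, n_atoms):
    """Greedily pick n_atoms unit-norm dictionary atoms; return (coeffs, residual)."""
    residual = list(signal)
    coeffs = {}
    for _ in range(n_atoms):
        # Select the atom most correlated with the current residual.
        k = max(range(len(dictionary)),
                key=lambda j: abs(dot(residual, dictionary[j])))
        c = dot(residual, dictionary[k])  # atoms assumed unit-norm
        coeffs[k] = coeffs.get(k, 0.0) + c
        # Subtract the contribution of the chosen atom.
        residual = [r - c * d for r, d in zip(residual, dictionary[k])]
    return coeffs, residual

# Toy example: two orthonormal atoms; the signal is 2x the first atom.
atoms = [[1.0, 0.0], [0.0, 1.0]]
coeffs, res = matching_pursuit([2.0, 0.0], atoms, 1)
```

One pass recovers the single active coefficient and drives the residual to zero, which is the behavior the K-SVD sparse-coding stage relies on patch by patch.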
DATMAN: A reliability data analysis program using Bayesian updating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, M.; Feltus, M.A.
1996-12-31
Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
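The kind of Bayesian updating DATMAN automates can be illustrated with the standard conjugate Gamma-Poisson model for a component failure rate. This is a generic sketch, not DATMAN's implementation; the prior parameters and the observed failure data below are hypothetical.

```python
# Hedged sketch: conjugate Gamma-Poisson update of a failure rate lambda.
# Prior:      lambda ~ Gamma(alpha, beta)   (rate parameterization)
# Likelihood: n_failures ~ Poisson(lambda * exposure_hours)
# Posterior:  lambda ~ Gamma(alpha + n_failures, beta + exposure_hours)

def update_failure_rate(alpha_prior, beta_prior, n_failures, exposure_hours):
    """Return posterior Gamma parameters and the posterior mean failure rate."""
    alpha_post = alpha_prior + n_failures
    beta_post = beta_prior + exposure_hours
    return alpha_post, beta_post, alpha_post / beta_post

# Hypothetical example: a vague prior centered on 1 failure per 10,000 h,
# then 3 observed failures over 20,000 h of additional operation.
a, b, mean_rate = update_failure_rate(1.0, 10000.0, 3, 20000.0)
```

Each new batch of failure counts and operating hours simply feeds back in as the next prior, which is the update cycle the PM/RCM workflow requires.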
NCIP has migrated 132 repositories from the NCI subversion repository to our public NCIP GitHub channel with the goal of facilitating third party contributions to the existing code base. Within the GitHub environment, we are advocating use of the GitHub “fork and pull” model.
DOT National Transportation Integrated Search
2002-01-01
In 1986, 33.1-23.5:1 of the Code of Virginia established new rates for payments to Henrico and Arlington counties to maintain their secondary roads and specified how the rates were to be adjusted annually. The rates specified for 1986 maintenance pay...
76 FR 50815 - TSCA Inventory Update Reporting Modifications; Chemical Data Reporting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-16
... longer accept paper submissions or electronic media (i.e., as a file on a CD- ROM) for any CDR submission...-mail address: [email protected] . SUPPLEMENTARY INFORMATION: I. Does this action apply to me? You... byproduct chemical substance (NAICS codes 22, 322, 331, and 3344; e.g., utilities, paper manufacturing...
76 FR 73506 - Adoption of Updated EDGAR Filer Manual
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-29
... Mackintosh, Office of Information Technology, at (202) 551-3600; in the Division of Trading and Markets for... is http://www.sec.gov/info/edgar.shtml . You can also inspect the document at the National Archives... (202) 741-6030, or go to: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr...
75 FR 17853 - Adoption of Updated EDGAR Filer Manual
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-08
... Trading and Markets for questions regarding OMB expiration dates for Forms TA-1 and TA-2 contact Catherine... Archives and Records Administration (NARA). For information on the availability of this material at NARA, call 202-741- 6030, or go to: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr...
75 FR 30855 - Meeting of the California Desert District Advisory Council
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-02
..., California. Agenda topics will include updates by Council members and reports from the BLM District Manager and five field office managers. Final agenda items, including details of the field tour, will be... Manager. [FR Doc. 2010-13229 Filed 6-1-10; 8:45 am] BILLING CODE 4310-40-P ...
Engineering for Sustainable Development and the Common Good
ERIC Educational Resources Information Center
Kelly, William E.
2006-01-01
In 1994, the American Society of Civil Engineers (ASCE) updated its Code of Ethics to include specific statements on sustainable development and at about the same time, 1994, ASCE adopted its Policy 418 on sustainable development. Sustainable development as defined by ASCE "is the challenge of meeting human needs for natural resources, industrial…
Using Program Theory-Driven Evaluation Science to Crack the Da Vinci Code
ERIC Educational Resources Information Center
Donaldson, Stewart I.
2005-01-01
Program theory-driven evaluation science uses substantive knowledge, as opposed to method proclivities, to guide program evaluations. It aspires to update, clarify, simplify, and make more accessible the evolving theory of evaluation practice commonly referred to as theory-driven or theory-based evaluation. The evaluator in this chapter provides a…
Turbulent Mixing Chemistry in Disks
NASA Astrophysics Data System (ADS)
Semenov, D.; Wiebe, D.
2006-11-01
A gas-grain chemical model with surface reactions and 1D/2D turbulent mixing is available for protoplanetary disks and molecular clouds. The current version is based on the updated UMIST'95 database with gas-grain interactions (accretion, desorption, photoevaporation, etc.) and a modified rate equation approach to surface chemistry (see also the abstract for the static chemistry code).
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-20
... and Building Codes, U.S. Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy... posted at http://www1.eere.energy.gov/buildings/appliance_standards/asrac.html : Update on Commercial... Energy, Building Technologies Program, Mailstop EE-2J, 1000 Independence Avenue SW., Washington, DC 20585...
77 FR 15053 - Manual for Courts-Martial; Proposed Amendments
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-14
... the M.R.E. and a Word document using color-coded text and comments to explain amendments. Updated... evidence. f. Commenter recommended using the words ``pursuant to statutory authority'' in M.R.E. 807. JSC... the rule to findings. i. Commenter recommended removing the word ``allegedly'' from proposed M.R.E...
78 FR 48727 - Proposed Revisions to Design of Structures, Components, Equipment and Systems
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
... Analysis Reports for Nuclear Power Plants: LWR Edition,'' Section 3.9.3 ``ASME Code Class 1, 2, and 3...'s Agencywide Documents Access and Management System (ADAMS): You may access publicly available... operational readiness of snubbers (ADAMS Accession No. ML070720041), and review interfaces have been updated...
Particle-in-Cell laser-plasma simulation on Xeon Phi coprocessors
NASA Astrophysics Data System (ADS)
Surmin, I. A.; Bastrakov, S. I.; Efimenko, E. S.; Gonoskov, A. A.; Korzhimanov, A. V.; Meyerov, I. B.
2016-05-01
This paper concerns the development of a high-performance implementation of the Particle-in-Cell method for plasma simulation on Intel Xeon Phi coprocessors. We discuss the suitability of the method for Xeon Phi architecture and present our experience in the porting and optimization of the existing parallel Particle-in-Cell code PICADOR. Direct porting without code modification gives performance on Xeon Phi close to that of an 8-core CPU on a benchmark problem with 50 particles per cell. We demonstrate step-by-step optimization techniques, such as improving data locality, enhancing parallelization efficiency and vectorization leading to an overall 4.2× speedup on CPU and 7.5× on Xeon Phi compared to the baseline version. The optimized version achieves 16.9 ns per particle update on an Intel Xeon E5-2660 CPU and 9.3 ns per particle update on an Intel Xeon Phi 5110P. For a real problem of laser ion acceleration in targets with surface grating, where a large number of macroparticles per cell is required, the speedup of Xeon Phi compared to CPU is 1.6×.
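The "particle update" whose cost is quoted above (16.9 ns on CPU vs. 9.3 ns on Xeon Phi) is the innermost kernel of any PIC code: gather the local field, accelerate, and drift. The sketch below is a generic 1D leapfrog push, not PICADOR's implementation; the field value and constants are illustrative.

```python
# Hedged sketch: the per-particle leapfrog update at the heart of a PIC loop.
# In a real code the electric field is gathered from a grid and the loop over
# particles is what gets vectorized and parallelized.

def push_particle(x, v, e_field, q_over_m, dt):
    """One leapfrog step: kick the velocity by the local field, then drift."""
    v_new = v + q_over_m * e_field * dt  # acceleration half of the step
    x_new = x + v_new * dt               # drift with the updated velocity
    return x_new, v_new

# Illustrative numbers (normalized units): start at x=0, v=1, uniform field 2.
x, v = push_particle(0.0, 1.0, 2.0, 1.0, 0.5)
```

Because this kernel is executed once per particle per time step, shaving nanoseconds off it (via data locality and vectorization, as in the paper) dominates total runtime.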
NASA Technical Reports Server (NTRS)
Brown, James L.
2014-01-01
We examine the sensitivity of separation extent, wall pressure, and heating to variation of primary input flow parameters, such as Mach number, Reynolds number, and shock strength, for 2D and axisymmetric hypersonic shock-wave/turbulent boundary layer interactions obtained by Navier-Stokes methods using the SST turbulence model. Baseline parametric sensitivity response is provided in part by comparison with vetted experiments, and in part through updated correlations based on free-interaction theory concepts. A recent database compilation of hypersonic 2D shock-wave/turbulent boundary layer experiments, extensively used in a prior related uncertainty analysis, provides the foundation for this updated correlation approach, as well as for more conventional validation. The primary CFD method for this work is DPLR, one of NASA's real-gas aerothermodynamic production RANS codes. Comparisons are also made with CFL3D, one of NASA's mature perfect-gas RANS codes. Deficiencies in the predicted separation response of RANS/SST solutions to parametric variations of test conditions are summarized, along with recommendations for future turbulence modeling approaches.
APPRIS 2017: principal isoforms for multiple gene sets
Rodriguez-Rivas, Juan; Di Domenico, Tomás; Vázquez, Jesús; Valencia, Alfonso
2018-01-01
Abstract The APPRIS database (http://appris-tools.org) uses protein structural and functional features and information from cross-species conservation to annotate splice isoforms in protein-coding genes. APPRIS selects a single protein isoform, the ‘principal’ isoform, as the reference for each gene based on these annotations. A single main splice isoform reflects the biological reality for most protein-coding genes, and APPRIS principal isoforms are the best predictors of these main protein isoforms. Here, we present the updates to the database, new developments that include the addition of three new species (chimpanzee, Drosophila melanogaster and Caenorhabditis elegans), the expansion of APPRIS to cover the RefSeq gene set and the UniProtKB proteome for six species, and refinements in the core methods that make up the annotation pipeline. In addition, APPRIS now provides a measure of reliability for individual principal isoforms and updates with each release of the GENCODE/Ensembl and RefSeq reference sets. The individual GENCODE/Ensembl, RefSeq and UniProtKB reference gene sets for six organisms have been merged to produce common sets of splice variants. PMID:29069475
NASA Astrophysics Data System (ADS)
Hanasaki, Itsuo; Kawano, Satoyuki
2013-11-01
Motility of bacteria is usually recognized in the trajectory data and compared with Brownian motion, but the diffusion coefficient is insufficient to evaluate it. In this paper, we propose a method based on the large deviation principle. We show that it can be used to evaluate the non-Gaussian characteristics of model Escherichia coli motions and to distinguish combinations of the mean running duration and running speed that lead to the same diffusion coefficient. Our proposed method does not require chemical stimuli to induce the chemotaxis in a specific direction, and it is applicable to various types of self-propelling motions for which no a priori information of, for example, threshold parameters for run and tumble or head/tail direction is available. We also address the issue of the finite-sample effect on the large deviation quantities, but we propose to make use of it to characterize the nature of motility.
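A large-deviation analysis of this kind typically starts from the scaled cumulant-generating function (SCGF) of trajectory displacements, estimated empirically. The sketch below shows that estimator in its simplest form; it is a generic illustration under assumed synthetic data, not the authors' code or their E. coli tracks.

```python
# Hedged sketch: empirical scaled cumulant-generating function of
# displacements x_t observed over a time window t,
#   lambda(k) ~ (1/t) * log < exp(k * x_t) >,
# where <.> averages over trajectory samples. For Gaussian (purely diffusive)
# motion lambda(k) is a parabola; non-Gaussian motility such as run-and-tumble
# shows up as a systematic departure from that parabola.
import math

def scgf(displacements, k, t):
    """Estimate lambda(k) from a list of displacement samples over time t."""
    m = sum(math.exp(k * x) for x in displacements) / len(displacements)
    return math.log(m) / t

# Synthetic, symmetric two-point displacement sample (hypothetical data).
val = scgf([1.0, -1.0], 0.0, 1.0)
```

Note the finite-sample caveat raised in the abstract: with few trajectories the average of `exp(k*x)` is dominated by rare large displacements, so the estimate degrades as |k| grows.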
Dust-wall and dust-plasma interaction in the MIGRAINe code
NASA Astrophysics Data System (ADS)
Vignitchouk, L.; Tolias, P.; Ratynskaia, S.
2014-09-01
The physical models implemented in the recently developed dust dynamics code MIGRAINe are described. A major update of the treatment of secondary electron emission, stemming from models adapted to typical scrape-off layer temperatures, is reported. Sputtering and plasma species backscattering are introduced from fits of available experimental data and their relative importance to dust charging and heating is assessed in fusion-relevant scenarios. Moreover, the description of collisions between dust particles and plasma-facing components, based on the approximation of elastic-perfectly plastic adhesive spheres, has been upgraded to take into account the effects of particle size and temperature.
Predicting Cavitation on Marine and Hydrokinetic Turbine Blades with AeroDyn V15.04
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murray, Robynne
Cavitation is an important consideration in the design of marine and hydrokinetic (MHK) turbines. The National Renewable Energy Laboratory's AeroDyn performance code was originally developed for horizontal-axis wind turbines and did not have the capability to predict cavitation inception. Therefore, AeroDyn has been updated to include the ability to predict cavitation on MHK turbines based on user-specified vapor pressure and submerged depth. This report outlines a verification of the AeroDyn V15.04 performance code for MHK turbines through a comparison to publicly available performance data.
Multitasking TORT under UNICOS: Parallel performance models and measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnett, A.; Azmy, Y.Y.
1999-09-27
The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified, and a new algorithm was developed; a parallel overhead model was also derived for the new algorithm. The performance models were then compared against measurements from applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead.
Non-coding RNAs in cardiac fibrosis: emerging biomarkers and therapeutic targets.
Chen, Zhongxiu; Li, Chen; Lin, Ke; Cai, Huawei; Ruan, Weiqiang; Han, Junyang; Rao, Li
2017-12-14
Non-coding RNAs (ncRNAs) are a class of RNA molecules that do not encode proteins. ncRNAs are involved in cell proliferation, apoptosis, differentiation, metabolism, and other physiological processes as well as the pathogenesis of diseases. Cardiac fibrosis is increasingly recognized as a common final pathway in advanced heart diseases. Many studies have shown that the occurrence and development of cardiac fibrosis is closely related to the regulation of ncRNAs. This review will highlight recent updates regarding the involvement of ncRNAs in cardiac fibrosis, and their potential as emerging biomarkers and therapeutic targets.
An address geocoding method for improving rural spatial information infrastructure
NASA Astrophysics Data System (ADS)
Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, YanBing
2010-11-01
The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by individual departments to satisfy their own needs, with little consideration of wider potential uses. This has led to great differences in data format, semantics, and precision even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, this paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard which consists of an absolute position code, a relative position code, and an extended code. It designs a rural geocoding database model and an address collection and update model. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resources management. The results show that address coding based on the postal code is stable and easy to memorize, the two-dimensional coding based on direction and distance is easy to locate and memorize, and the extended code enhances the extensibility and flexibility of address geocoding.
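As a rough illustration of the three-part code structure the abstract describes (absolute position code, relative position code, extended code), the following sketch composes and parses such codes. The field names, widths, and separators are hypothetical and are not the paper's standard:

```python
def build_address_code(postal_code, direction, distance, extension=""):
    """Compose a hierarchical address code from a postal-zone based
    absolute part, a direction/distance relative part, and an optional
    extended part. Field widths and the '-' separator are assumptions."""
    absolute = postal_code                   # stable, easy to memorize
    relative = f"{direction}{distance:03d}"  # e.g. 'NE' + '042'
    code = absolute + "-" + relative
    if extension:
        code += "-" + extension
    return code

def parse_address_code(code):
    """Split a composed code back into its components."""
    parts = code.split("-")
    result = {"absolute": parts[0],
              "direction": parts[1][:2],
              "distance": int(parts[1][2:])}
    if len(parts) > 2:
        result["extension"] = parts[2]
    return result
```

The point of the layered layout is that the absolute part stays stable over time while the relative and extended parts can be refined without re-issuing the whole code.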
Upgrades of Two Computer Codes for Analysis of Turbomachinery
NASA Technical Reports Server (NTRS)
Chima, Rodrick V.; Liou, Meng-Sing
2005-01-01
Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery". The affected programs are: Swift -- a code for three-dimensional (3D) multiblock analysis; and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.
Locations of serial reach targets are coded in multiple reference frames.
Thompson, Aidan A; Henriques, Denise Y P
2010-12-01
Previous work from our lab, and elsewhere, has demonstrated that remembered target locations are stored and updated in an eye-fixed reference frame. That is, reach errors systematically vary as a function of gaze direction relative to a remembered target location, not only when the target is viewed in the periphery (Bock, 1986, known as the retinal magnification effect), but also when the target has been foveated, and the eyes subsequently move after the target has disappeared but prior to reaching (e.g., Henriques, Klier, Smith, Lowy, & Crawford, 1998; Sorrento & Henriques, 2008; Thompson & Henriques, 2008). These gaze-dependent errors, following intervening eye movements, cannot be explained by representations whose frame is fixed to the head, body or even the world. However, it is unknown whether targets presented sequentially would all be coded relative to gaze (i.e., egocentrically/absolutely), or if they would be coded relative to the previous target (i.e., allocentrically/relatively). It might be expected that the reaching movements to two targets separated by 5° would differ by that distance. But, if gaze were to shift between the first and second reaches, would the movement amplitude between the targets differ? If the target locations are coded allocentrically (i.e., the location of the second target coded relative to the first) then the movement amplitude should be about 5°. But, if the second target is coded egocentrically (i.e., relative to current gaze direction), then the reaches to this target and the distances between the subsequent movements should vary systematically with gaze as described above. 
We found that requiring an intervening saccade to the opposite side of 2 briefly presented targets between reaches to them resulted in a pattern of reaching error that systematically varied as a function of the distance between current gaze and target, and led to a systematic change in the distance between the sequential reach endpoints as predicted by an egocentric frame anchored to the eye. However, the amount of change in this distance was smaller than predicted by a pure eye-fixed representation, suggesting that relative positions of the targets or allocentric coding was also used in sequential reach planning. The spatial coding and updating of sequential reach target locations seems to rely on a combined weighting of multiple reference frames, with one of them centered on the eye. Copyright © 2010 Elsevier Ltd. All rights reserved.
Impact of GNSS orbit modeling on LEO orbit and gravity field determination
NASA Astrophysics Data System (ADS)
Arnold, Daniel; Meyer, Ulrich; Sušnik, Andreja; Dach, Rolf; Jäggi, Adrian
2017-04-01
On January 4, 2015 the Center for Orbit Determination in Europe (CODE) changed the solar radiation pressure modeling for GNSS satellites to an updated version of the empirical CODE orbit model (ECOM). Furthermore, since September 2012 CODE operationally computes satellite clock corrections not only for the 3-day long-arc solutions, but also for the non-overlapping 1-day GNSS orbits. This provides different sets of GNSS products for Precise Point Positioning, as employed, e.g., in the GNSS-based precise orbit determination of low Earth orbiters (LEOs) and the subsequent Earth gravity field recovery from kinematic LEO orbits. While the impact of the mentioned changes in orbit modeling and solution strategy on the GNSS orbits and geophysical parameters was studied in detail, their implications on the LEO orbits were not yet analyzed. We discuss the impact of the update of the ECOM and the influence of 1-day and 3-day GNSS orbit solutions on zero-difference LEO orbit and gravity field determination, where the GNSS orbits and clock corrections, as well as the Earth rotation parameters are introduced as fixed external products. Several years of kinematic and reduced-dynamic orbits for the two GRACE LEOs are computed with GNSS products based on both the old and the updated ECOM, as well as with 1- and 3-day GNSS products. The GRACE orbits are compared by means of standard validation measures. Furthermore, monthly and long-term GPS-only and combined GPS/K-band gravity field solutions are derived from the different sets of kinematic LEO orbits. GPS-only fields are validated by comparison to combined GPS/K-band solutions, while the combined solutions are validated by analysis of the formal errors, as well as by comparing them to the combined GRACE solutions of the European Gravity Service for Improved Emergency Management (EGSIEM) project.
NASA Astrophysics Data System (ADS)
Schumacher, Florian; Friederich, Wolfgang
Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth's interior remains of high interest in Earth sciences. Here, we give a description, from a user's and programmer's perspective, of the highly modular, flexible and extendable software package ASKI (Analysis of Sensitivity and Kernel Inversion), recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are solved by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates, respectively, are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low cost, applying different kinds of model regularization or re-selecting/weighting the inverted dataset, without the need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows the user to compose customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python, is well documented and is freely available under the terms of the GNU General Public License (http://www.rub.de/aski).
Tang, G.; Yuan, F.; Bisht, G.; ...
2015-12-17
We explore coupling to a configurable subsurface reactive transport code as a flexible and extensible approach to biogeochemistry in land surface models; our goal is to facilitate testing of alternative models and incorporation of new understanding. A reaction network with the CLM-CN decomposition, nitrification, denitrification, and plant uptake is used as an example. We implement the reactions in the open-source PFLOTRAN code, coupled with the Community Land Model (CLM), and test at Arctic, temperate, and tropical sites. To make the reaction network designed for use in explicit time stepping in CLM compatible with the implicit time stepping used in PFLOTRAN, the Monod substrate rate-limiting function with a residual concentration is used to represent the limitation of nitrogen availability on plant uptake and immobilization. To achieve accurate, efficient, and robust numerical solutions, care needs to be taken to use scaling, clipping, or log transformation to avoid negative concentrations during the Newton iterations. With a tight relative update tolerance to avoid false convergence, an accurate solution can be achieved with about 50% more computing time than CLM in point-mode site simulations using either the scaling or clipping methods. The log transformation method takes 60-100% more computing time than CLM. The computing time increases slightly for clipping and scaling; it increases substantially for log transformation as the half saturation decreases from 10⁻³ to 10⁻⁹ mol m⁻³, which normally results in decreasing nitrogen concentrations. The frequent occurrence of very low concentrations (e.g., below nanomolar) can increase the computing time for clipping or scaling by about 20%; computing time can be doubled for log transformation.
Caution needs to be taken in choosing the appropriate scaling factor because a small value caused by a negative update to a small concentration may diminish the update and result in false convergence even with a very tight relative update tolerance. As some biogeochemical processes (e.g., methane and nitrous oxide production and consumption) involve very low half saturation and threshold concentrations, this work provides insights for addressing nonphysical negativity issues and facilitates the representation of a mechanistic biogeochemical description in earth system models to reduce climate prediction uncertainty.
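The three safeguards discussed in the abstract can be illustrated on a single scalar concentration. This is a minimal sketch of the general idea, not the CLM-PFLOTRAN implementation; the function names and the `fraction` parameter are assumptions:

```python
import math

def newton_step_clipped(c, delta, floor=0.0):
    """Clipping: apply the Newton update, then clamp negative values."""
    return max(c + delta, floor)

def newton_step_scaled(c, delta, fraction=0.99):
    """Scaling: shrink a negative update so the concentration lands at
    (1 - fraction) of its current value instead of going negative."""
    if delta < 0 and c + delta <= 0:
        delta *= fraction * c / (-delta)
    return c + delta

def newton_step_log(c, delta_log):
    """Log transformation: iterate on ln(c), so exp(ln c + d) > 0 always."""
    return math.exp(math.log(c) + delta_log)

def monod(c, k_half, c_res=0.0):
    """Monod rate-limiting factor with a residual concentration, as used
    to represent nitrogen limitation on uptake and immobilization."""
    avail = max(c - c_res, 0.0)
    return avail / (avail + k_half)
```

Clipping and scaling keep the update cheap but can stall convergence (hence the warning about small scaling factors above), while the log transform guarantees positivity at the price of extra Newton iterations.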
New spectroscopy in the HITRAN2016 database and its impact on atmospheric retrievals
NASA Astrophysics Data System (ADS)
Gordon, I.; Rothman, L. S.; Kochanov, R. V.; Tan, Y.; Toon, G. C.
2017-12-01
The HITRAN spectroscopic database is a backbone of the interpretation of atmospheric spectral retrievals and an important input to radiative transfer codes. The database has served the atmospheric community for nearly half a century, with a new edition released every four years. The most recent release is HITRAN2016 [1]. It consists of line-by-line lists, experimental absorption cross-sections, collision-induced absorption data, and aerosol indices of refraction. This presentation will stress the importance of using the most recent edition of the database in radiative transfer codes. The line-by-line lists for most of the HITRAN molecules were updated (and two new molecules added) in comparison with the previous compilation, HITRAN2012 [2], which has been in use, along with some intermediate updates, since 2012. The extent of the updates ranges from revising a few lines of certain molecules to complete replacement of the lists and introduction of additional isotopologues. In addition, the number of molecules in the cross-sectional part of the database has increased dramatically from nearly 50 to over 300. The molecules covered by the HITRAN database are important in planetary remote sensing, environmental monitoring (in particular, biomass-burning detection), climate applications, industrial pollution tracking, astrophysics, and more. Taking advantage of the new structure and interface available at www.hitran.org [3] and the HITRAN Application Programming Interface [4], the number of parameters has also been significantly increased, now incorporating, for instance, non-Voigt line profiles [5]; broadening by gases other than air and "self" [6]; and other phenomena, including line mixing. This is a very important novelty that needs to be properly introduced into radiative transfer codes in order to advance accurate interpretation of remote sensing retrievals.
This work is supported by the NASA PDART (NNX16AG51G) and AURA (NNX 17AI78G) programs. References[1] I.E. Gordon et al, JQSRT in press (2017) http://doi.org/10.1016/j.jqsrt.2017.06.038. [2] L.S. Rothman et al, JQSRT 130, 4 (2013). [3] C. Hill et al, JQSRT 177, 4 (2016). [4] R.V. Kochanov et al, JQSRT 177, 15 (2016). [5] P. Wcisło et al., JQSRT 177, 75 (2016). [6] J. S. Wilzewski et al., JQSRT 168, 193 (2016).
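As a small illustration of how radiative transfer codes consume HITRAN line parameters, the sketch below evaluates a normalized Lorentzian line shape and scales an air-broadened half-width with pressure and temperature. These are standard textbook formulas, simplified here to air broadening only; this is not HITRAN's own software, and the parameter values in the test are invented:

```python
import math

def lorentz_profile(nu, nu0, gamma):
    """Normalized Lorentzian line shape (units of cm): gamma is the
    half-width at half maximum (cm^-1), nu0 the line-center wavenumber."""
    return (gamma / math.pi) / ((nu - nu0) ** 2 + gamma ** 2)

def air_broadened_hwhm(gamma_air_ref, p_atm, t_k, n_air, t_ref=296.0):
    """Scale a reference air-broadened half-width (at 296 K, 1 atm) to
    pressure p and temperature T via the temperature exponent n_air.
    Self-broadening is neglected in this simplified form."""
    return gamma_air_ref * p_atm * (t_ref / t_k) ** n_air
```

The non-Voigt profiles and foreign-broadening parameters added in HITRAN2016 refine exactly these ingredients, which is why codes still hard-wired to an older edition can misestimate absorption.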
Enhancing the Remote Variable Operations in NPSS/CCDK
NASA Technical Reports Server (NTRS)
Sang, Janche; Follen, Gregory; Kim, Chan; Lopez, Isaac; Townsend, Scott
2001-01-01
Many scientific applications in aerodynamics and solid mechanics are written in Fortran. Retrofitting these legacy Fortran codes with distributed objects can increase code reusability. The remote variable scheme provided in NPSS/CCDK helps programmers easily migrate Fortran codes toward a client-server platform. This scheme gives the client the capability of accessing variables at the server site. In this paper, we review and enhance the remote variable scheme by using the operator overloading features of C++. The enhancement enables NPSS programmers to use remote variables in much the same way as traditional variables. The remote variable scheme adopts a lazy update approach and a prefetch method. The design strategies and implementation techniques are described in detail. Preliminary performance evaluation shows that communication overhead can be greatly reduced.
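The lazy-update idea described above can be sketched in a few lines. This is a language-neutral illustration in Python with hypothetical names (RemoteVariable, FakeServer); it is not the NPSS/CCDK C++ API, where the same effect is achieved with overloaded assignment and conversion operators:

```python
class RemoteVariable:
    """Sketch of a lazy-update remote variable: reads are served from a
    local cache, and writes are buffered until flush() sends them to the
    server in one batch, saving round trips."""
    def __init__(self, server, name):
        self._server = server      # stand-in for the client-server link
        self._name = name
        self._cache = None
        self._dirty = False

    def get(self):
        if self._cache is None:    # first access: fetch from the server
            self._cache = self._server.fetch(self._name)
        return self._cache

    def set(self, value):
        self._cache = value        # update locally only
        self._dirty = True

    def flush(self):
        if self._dirty:            # one round trip instead of many
            self._server.store(self._name, self._cache)
            self._dirty = False

class FakeServer:
    """Counts round trips so the saving is observable."""
    def __init__(self):
        self.data = {}
        self.round_trips = 0

    def fetch(self, name):
        self.round_trips += 1
        return self.data.get(name)

    def store(self, name, value):
        self.round_trips += 1
        self.data[name] = value
```

With operator overloading, `get`/`set` become transparent reads and assignments, which is how the scheme lets remote variables be used "in much the same way as traditional variables."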
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fasso, A.; Ferrari, A.; Ferrari, A.
In 1974, Nelson, Kase and Svensson published an experimental investigation on muon shielding around SLAC high-energy electron accelerators [1]. They measured muon fluence and absorbed dose induced by 14 and 18 GeV electron beams hitting a copper/water beam dump and attenuated in a thick steel shielding. In their paper, they compared the results with the theoretical models available at that time. In order to compare their experimental results with present model calculations, we use the modern Monte Carlo transport codes MARS15, FLUKA2011 and GEANT4 to model the experimental setup and run simulations. The results are then compared between the codes and with the SLAC data.
Making your code citable with the Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.
2016-01-01
The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feeder for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.
Donkuru, McDonald; Chitanda, Jackson M; Verrall, Ronald E; El-Aneed, Anas
2014-04-15
This study aimed to evaluate the collision-induced dissociation tandem mass spectrometric (CID-MS/MS) fragmentation patterns of novel β-cyclodextrin-substituted and bis-pyridinium gemini surfactants currently being explored as nanomaterial drug delivery agents. In the β-cyclodextrin-substituted gemini surfactants, a β-cyclodextrin ring is grafted onto an N,N-bis(dimethylalkyl)-α,ω-aminoalkane-diammonium moiety using variable succinyl linkers. In contrast, the bis-pyridinium gemini surfactants are based on a 1,1'-(1,1'-(ethane-1,2-diylbis(sulfanediyl))bis(alkane-2,1-diyl))dipyridinium template, defined by two symmetrical N-alkylpyridinium parts connected through a fixed ethane dithiol spacer. Detection of the precursor ion [M](2+) species of the synthesized compounds and the determination of mass accuracies were conducted using a QqTOF-MS instrument. A multi-stage tandem MS analysis of the detected [M](2+) species was conducted using the QqQ-LIT-MS instrument. Both instruments were equipped with an electrospray ionization (ESI) source. Abundant precursor ion [M](2+) species were detected for all compounds at sub-1 ppm mass accuracies. The β-cyclodextrin-substituted compounds fragmented via two main pathways. Pathway 1: the loss of one head-tail region produces a [M-(N(Me)2-R)](2+) ion, from which sugar moieties (Glc) are sequentially cleaved. Pathway 2: both head-tail regions are lost to give [M-2(N(Me)2-R)](+), followed by consecutive loss of Glc units. Alternatively, the cleavage of the Glc units could also have occurred simultaneously. Nevertheless, the fragmentation evolved around the quaternary ammonium cations, with characteristic cleavage of Glc moieties. The bis-pyridinium gemini compounds either lost neutral pyridine(s) to give doubly charged ions (Pathway A) or formed complementary pyridinium ions alongside other singly charged ions (Pathway B).
Similar to β-cyclodextrin-substituted compounds, the fragmentation was centered on the pyridinium functional groups. The MS(n) analyses of these novel gemini surfactants, reported here for the first time, revealed diagnostic ions for each compound, with a universal fragmentation pattern for each compound series. The diagnostic ions will be employed within liquid chromatography (LC)/MS/MS methods for screening, identification, and quantification of these compounds within biological samples. Copyright © 2014 John Wiley & Sons, Ltd.
GRASP/Ada 95: Reverse Engineering Tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1996-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped an algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD), and a new visualization for a fine-grained complexity metric called the Complexity Profile Graph (CPG). By synchronizing the CSD and the CPG, the CSD view of control structure, nesting, and source code is directly linked to the corresponding visualization of statement-level complexity in the CPG. GRASP has been integrated with GNAT, the GNU Ada 95 Translator, to provide a comprehensive graphical user interface and development environment for Ada 95. The user may view, edit, print, and compile source code as a CSD with no discernible addition to storage or computational overhead. The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada 95 source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. The current update has focused on the design and implementation of a new Motif-compliant user interface, and a new CSD generator consisting of a tagger and renderer. The Complexity Profile Graph (CPG) is based on a set of functions that describe the context, content, and scaling for complexity on a statement-by-statement basis. When combined graphically, the result is a composite profile of complexity for the program unit. Ongoing research includes the development and refinement of the associated functions, and the development of the CPG generator prototype.
The current Version 5.0 prototype provides the capability for the user to generate CSDs and CPGs from Ada 95 source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical application. This report provides an overview of the GRASP/Ada project with an emphasis on the current update.
Update of GRASP/Ada reverse engineering tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1992-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation of Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11/780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, the prototype was evaluated by software engineering students at Auburn University and then updated with significant enhancements to the user interface, including editing capabilities. Version 3.2 of the prototype was prepared for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical application.
Faculty Observables and Self-Reported Responsiveness to Academic Dishonesty
ERIC Educational Resources Information Center
Burrus, Robert T., Jr.; Jones, Adam T.; Sackley, William H.; Walker, Michael
2015-01-01
Prior to 2009, a mid-sized public institution in the southeast had a faculty-driven honor policy characterized by little education about the policy and no tracking of repeat offenders. An updated code, implemented in August of 2009, required that students sign an honor pledge, created a formal student honor board, and developed a process to track…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-15
... (also called water quality criteria) for human health and aquatic life for toxic pollutants in the... Commission in 1996 adopted water quality criteria for human health and aquatic life for Water Quality Zones 2... Objectives for Toxic Pollutants for the Protection of Aquatic Life'', Table 6, ``Stream Quality Objectives...
NREL: International Activities - U.S.-China Renewable Energy Partnership
Current projects include recommendations for photovoltaic (PV) and wind grid code updates, collaboration on innovative business models and financing solutions for solar PV deployment, and Solar PV and TC88 Wind working groups. These projects enhance renewable energy technology policies.
77 FR 48524 - Board of Scientific Counselors, National Center for Health Statistics; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-14
... Regulations, Title 41, Code of Federal Regulation, Subpart 101-20.301, all persons entering in or on Federal...; update on the National survey of Family Growth; the initiation of the review of the Office of Research... the presenter. Written comments should not exceed five single-spaced typed pages in length and must be...
The Role of Categories and Spatial Cuing in Global-Scale Location Estimates
ERIC Educational Resources Information Center
Friedman, Alinda
2009-01-01
Seven independent groups estimated the location of North American cities using both spatial and numeric response modes and a variety of perceptual and memory supports. These supports included having location markers for each city color coded by nation and identified by name, giving participants the opportunity to see and update all their estimates…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-03
... ARS provisions listed in table 1 with the exception of ARS 41- 2121, paragraph (5), which defines the... paragraph (4) of ARS section 41-2121) in connection with our approval of the carbon monoxide redesignation... ADWM (codified in the Arizona Administrative Code). Arizona Revised Statutes (ARS) section 41-2132...
2010 College Course Map. Technical Report. NCES 2012-162
ERIC Educational Resources Information Center
Bryan, Michael; Simone, Sean
2012-01-01
The College Course Map (CCM) is a taxonomy system for coding postsecondary education courses in NCES research studies. Originally developed in 1988 in support of the postsecondary transcript study in the National Longitudinal Study of the High School Class of 1972 (NLS-72), the taxonomy was updated in 1993 for the High School and Beyond Study…
A data collection and processing procedure for evaluating a research program
Giuseppe Rensi; H. Dean Claxton
1972-01-01
A set of computer programs compiled for the information processing requirements of a model for evaluating research proposals are described. The programs serve to assemble and store information, periodically update it, and convert it to a form usable for decision-making. Guides for collecting and coding data are explained. The data-processing options available and...
ERIC Educational Resources Information Center
Ku, H.; Fulcher, R.
2007-01-01
The aim of the current paper is to share the processes in revising the courseware of the course of "Engineering Management Science" coded as ENG4004, in the Bachelor of Engineering (Mechanical, Mechatronics, Electrical and Electronic, Computer Systems, Instrumentation and Control), Bachelor of Engineering Technology (Mechanical, Building…
Development of an expert based ICD-9-CM and ICD-10-CM map to AIS 2005 update 2008.
Loftis, Kathryn L; Price, Janet P; Gillich, Patrick J; Cookman, Kathy J; Brammer, Amy L; St Germain, Trish; Barnes, Jo; Graymire, Vickie; Nayduch, Donna A; Read-Allsopp, Christine; Baus, Katherine; Stanley, Patsye A; Brennan, Maureen
2016-09-01
This article describes how maps were developed from the clinical modifications of the 9th and 10th revisions of the International Classification of Diseases (ICD) to the Abbreviated Injury Scale 2005 Update 2008 (AIS08). The development of the mapping methodology is described, with discussion of the major assumptions used in the process to map ICD codes to AIS severities. There were many intricacies to developing the maps, because the 2 coding systems, ICD and AIS, were developed for different purposes and contain unique classification structures to meet these purposes. Experts in ICD and AIS analyzed the rules and coding guidelines of both injury coding schemes to develop rules for mapping ICD injury codes to the AIS08. This involved subject-matter expertise, detailed knowledge of anatomy, and an in-depth understanding of injury terms and definitions as applied in both taxonomies. The official ICD-9-CM and ICD-10-CM versions (injury sections) were mapped to the AIS08 codes and severities, following the rules outlined in each coding manual. The panel of experts was composed of coders certified in ICD and/or AIS from around the world. In the process of developing the map from ICD to AIS, the experts created rules to address issues with the differences in coding guidelines between the 2 schemas and assure a consistent approach to all codes. Over 19,000 ICD codes were analyzed and maps were generated for each code to AIS08 chapters, AIS08 severities, and Injury Severity Score (ISS) body regions. After completion of the maps, 14,101 (74%) of the eligible 19,012 injury-related ICD-9-CM and ICD-10-CM codes were assigned valid AIS08 severity scores between 1 and 6. The remaining 4,911 codes were assigned an AIS08 of 9 (unknown) or were determined to be nonmappable because the ICD description lacked sufficient qualifying information for determining severity according to AIS rules. 
There were also 15,214 (80%) ICD codes mapped to AIS08 chapter and ISS body region, which allow for ISS calculations for patient data sets. This mapping between ICD and AIS provides a comprehensive, expert-designed solution for analysts to bridge the data gap between the injury descriptions provided in hospital codes (ICD-9-CM, ICD-10-CM) and injury severity codes (AIS08). By applying consistent rules from both the ICD and AIS taxonomies, the expert panel created these definitive maps, which are the only ones endorsed by the Association for the Advancement of Automotive Medicine (AAAM). Initial validation upheld the quality of these maps for the estimation of AIS severity, but future work should include verification of these maps for MAIS and ISS estimations with large data sets. These ICD-AIS maps will support data analysis from databases with injury information classified in these 2 different systems and open new doors for the investigation of injury from traumatic events using large injury data sets.
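A map of this kind reduces, for analysis purposes, to a lookup table from ICD codes to AIS chapter, severity, and ISS body region, with unmappable codes assigned severity 9. The sketch below illustrates the idea with invented entries (the ICD codes and severities shown are not from the published map); the ISS computation follows the standard definition:

```python
# Hypothetical fragment of an ICD -> AIS map; entries are illustrative
# only, not values from the AAAM-endorsed map described in the article.
ICD_TO_AIS = {
    "S06.0X0A": {"ais_chapter": "Head", "severity": 2, "iss_region": 1},
    "S72.001A": {"ais_chapter": "Lower Extremity", "severity": 3, "iss_region": 5},
}

def map_icd_to_ais(icd_code):
    """Look up a code; codes lacking sufficient qualifying information
    are assigned severity 9 (unknown), mirroring the mapping rules."""
    entry = ICD_TO_AIS.get(icd_code)
    if entry is None:
        return {"severity": 9}
    return entry

def iss_from_severities(severity_by_region):
    """Injury Severity Score: sum of squares of the three highest AIS
    severities in distinct ISS body regions (ISS = 75 if any AIS = 6)."""
    if any(s == 6 for s in severity_by_region.values()):
        return 75
    top_three = sorted(severity_by_region.values(), reverse=True)[:3]
    return sum(s * s for s in top_three)
```

This is why the chapter and ISS body-region maps matter as much as the severity map: without the region assignment, ISS cannot be computed from hospital-coded data sets.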
Implementation of Energy Code Controls Requirements in New Commercial Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael I.; Hart, Philip R.; Hatten, Mike
Most state energy codes in the United States are based on one of two national model codes: ANSI/ASHRAE/IES 90.1 (Standard 90.1) or the International Code Council (ICC) International Energy Conservation Code (IECC). Since 2004, covering the last four cycles of Standard 90.1 updates, about 30% of all new requirements have been related to building controls. These requirements can be difficult to implement, and verification is beyond the expertise of most building code officials, yet studies that measure the savings from energy codes assume that they are implemented and working correctly. The objective of the current research is to evaluate the degree to which high-impact controls requirements included in commercial energy codes are properly designed, commissioned, and implemented in new buildings. This study also evaluates the degree to which these control requirements are realizing their savings potential. This was done using a three-step process. The first step involved interviewing commissioning agents to get a better understanding of their activities as they relate to energy-code-required controls measures. The second involved field audits of a sample of commercial buildings to determine whether the code-required control measures are being designed, commissioned, and correctly implemented and functioning in new buildings. The third step involved compilation and analysis of the information gathered during the first two steps. Information gathered during these activities could be valuable to code developers, energy planners, designers, building owners, and building officials.

[The evolution of the Italian Code of Medical Deontology: a historical-epistemological perspective].
Conti, A A
The Italian Code of Medical Deontology is a set of self-discipline rules established by the medical profession itself; they are mandatory for members of the medical registers, who must conform to them. The history of the Italian Code of Medical Deontology dates back to the beginning of the twentieth century. In 1903 it appeared in the form of a "Code of Ethics and Deontology" prepared by the Board of the Medical Register of Sassari (Sardinia). This Board organized the principles inspiring the correct practice of the medical profession into an articulated, self-normative system, also providing for disciplinary measures. About ten years later, in 1912, the Medical Register of Turin (Piedmont) elaborated a Code which constituted the basis for a subsequent elaboration leading to a Unified Code of Medical Ethics (1924). After World War II the idea prevailed in Italy that the codes of medical deontology should undergo periodical review, updating and dissemination, and the new 1947 text (Turin) was for the first time widely diffused among Italian physicians. The next national code dates back to 1958, and twenty years later a revision was published. In the 1989 Code new topics appeared, including organ transplantation, artificial in vitro insemination and the role of police doctors; these and other issues were later developed in the 1995, 1998 and 2006 versions of the Code. The last available edition of the Italian Code of Medical Deontology is that of May 2014.
Computation of Thermally Perfect Properties of Oblique Shock Waves
NASA Technical Reports Server (NTRS)
Tatum, Kenneth E.
1996-01-01
A set of compressible flow relations describing flow properties across oblique shock waves, derived for a thermally perfect, calorically imperfect gas, is applied within the existing thermally perfect gas (TPG) computer code. The relations are based upon a value of cp expressed as a polynomial function of temperature. The updated code produces tables of compressible flow properties of oblique shock waves, as well as the original properties of normal shock waves and basic isentropic flow, in a format similar to the tables for normal shock waves found in NACA Rep. 1135. The code results are validated in both the calorically perfect and the calorically imperfect, thermally perfect temperature regimes through comparisons with the theoretical methods of NACA Rep. 1135, and with a state-of-the-art computational fluid dynamics code. The advantages of the TPG code for oblique shock wave calculations, as well as for the properties of isentropic flow and normal shock waves, are its ease of use, and its applicability to any type of gas (monatomic, diatomic, triatomic, polyatomic, or any specified mixture thereof).
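The central modeling idea above, cp expressed as a polynomial function of temperature, can be sketched in a few lines; the coefficients shown are a constant-cp placeholder for air, not the polynomial fits actually used in the TPG code:

```python
# Thermally perfect gas sketch: cp varies with temperature as a polynomial,
# and the ratio of specific heats follows as gamma(T) = cp(T) / (cp(T) - R).
# The coefficient list below is a placeholder (constant cp for air), NOT the
# fits used in the TPG code.
R_AIR = 287.05  # specific gas constant of air, J/(kg K)

def cp_poly(T, coeffs):
    """Evaluate cp(T) = c0 + c1*T + c2*T**2 + ... by Horner's rule."""
    cp = 0.0
    for c in reversed(coeffs):
        cp = cp * T + c
    return cp

def gamma(T, coeffs, R=R_AIR):
    """Temperature-dependent ratio of specific heats for a thermally
    perfect gas; reduces to a constant in the calorically perfect limit."""
    cp = cp_poly(T, coeffs)
    return cp / (cp - R)
```

With a constant cp of 1004.5 J/(kg K), gamma comes out very close to the familiar calorically perfect value of 1.4; a genuine cp(T) polynomial makes gamma, and hence the shock relations, vary with temperature.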
Coupled Kinetic-MHD Simulations of Divertor Heat Load with ELM Perturbations
NASA Astrophysics Data System (ADS)
Cummings, Julian; Chang, C. S.; Park, Gunyoung; Sugiyama, Linda; Pankin, Alexei; Klasky, Scott; Podhorszki, Norbert; Docan, Ciprian; Parashar, Manish
2010-11-01
The effect of Type-I ELM activity on divertor plate heat load is a key component of the DOE OFES Joint Research Target milestones for this year. In this talk, we present simulations of kinetic edge physics, ELM activity, and the associated divertor heat loads in which we couple the discrete guiding-center neoclassical transport code XGC0 with the nonlinear extended MHD code M3D using the End-to-end Framework for Fusion Integrated Simulations, or EFFIS. In these coupled simulations, the kinetic code and the MHD code run concurrently on the same massively parallel platform and periodic data exchanges are performed using a memory-to-memory coupling technology provided by EFFIS. The M3D code models the fast ELM event and sends frequent updates of the magnetic field perturbations and electrostatic potential to XGC0, which in turn tracks particle dynamics under the influence of these perturbations and collects divertor particle and energy flux statistics. We describe here how EFFIS technologies facilitate these coupled simulations and discuss results for DIII-D, NSTX and Alcator C-Mod tokamak discharges.
A User's Guide to the Zwikker-Kosten Transmission Line Code (ZKTL)
NASA Technical Reports Server (NTRS)
Kelly, J. J.; Abu-Khajeel, H.
1997-01-01
This user's guide documents updates to the Zwikker-Kosten Transmission Line Code (ZKTL). This code was developed for analyzing new liner concepts developed to provide increased sound absorption. Contiguous arrays of multi-degree-of-freedom (MDOF) liner elements serve as the model for these liner configurations, and Zwikker and Kosten's theory of sound propagation in channels is used to predict the surface impedance. Transmission matrices for the various liner elements incorporate both analytical and semi-empirical methods. This allows standard matrix techniques to be employed in the code to systematically calculate the composite impedance due to the individual liner elements. The ZKTL code consists of four independent subroutines:
1. Single channel impedance calculation - linear version (SCIC)
2. Single channel impedance calculation - nonlinear version (SCICNL)
3. Multi-channel, multi-segment, multi-layer impedance calculation - linear version (MCMSML)
4. Multi-channel, multi-segment, multi-layer impedance calculation - nonlinear version (MCMSMLNL)
Detailed examples, comments, and explanations for each liner impedance computation module are included. Also contained in the guide are depictions of the interactive execution, input files and output files.
Using the Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, Alice; Teuben, P. J.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Wallin, J. F.
2013-01-01
The Astrophysics Source Code Library (ASCL) is a free on-line registry of source codes that are of interest to astrophysicists; with over 500 codes, it is the largest collection of scientist-written astrophysics programs in existence. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. An advisory committee formed in 2011 provides input and guides the development and expansion of the ASCL, and since January 2012, all accepted ASCL entries are indexed by ADS. Though software is increasingly important for the advancement of science in astrophysics, these methods are still often hidden from view or difficult to find. The ASCL (ascl.net/) seeks to improve the transparency and reproducibility of research by making these vital methods discoverable, and to provide recognition and incentive to those who write and release programs useful for astrophysics research. This poster provides a description of the ASCL, an update on recent additions, and the changes in the astrophysics community we are starting to see because of the ASCL.
Jardine, Bartholomew; Raymond, Gary M; Bassingthwaighte, James B
2015-01-01
The Modular Program Constructor (MPC) is an open-source Java based modeling utility, built upon JSim's Mathematical Modeling Language (MML) ( http://www.physiome.org/jsim/) that uses directives embedded in model code to construct larger, more complicated models quickly and with less error than manually combining models. A major obstacle in writing complex models for physiological processes is the large amount of time it takes to model the myriad processes taking place simultaneously in cells, tissues, and organs. MPC replaces this task with code-generating algorithms that take model code from several different existing models and produce model code for a new JSim model. This is particularly useful during multi-scale model development where many variants are to be configured and tested against data. MPC encodes and preserves information about how a model is built from its simpler model modules, allowing the researcher to quickly substitute or update modules for hypothesis testing. MPC is implemented in Java and requires JSim to use its output. MPC source code and documentation are available at http://www.physiome.org/software/MPC/.
Premzl, Marko
2015-01-01
Using eutherian comparative genomic analysis protocol and public genomic sequence data sets, the present work attempted to update and revise two gene data sets. The most comprehensive third party annotation gene data sets of eutherian adenohypophysis cystine-knot genes (128 complete coding sequences), and d-dopachrome tautomerases and macrophage migration inhibitory factor genes (30 complete coding sequences) were annotated. For example, the present study first described primate-specific cystine-knot Prometheus genes, as well as differential gene expansions of D-dopachrome tautomerase genes. Furthermore, new frameworks of future experiments of two eutherian gene data sets were proposed. PMID:25941635
MODEST: A Tool for Geodesy and Astronomy
NASA Technical Reports Server (NTRS)
Sovers, Ojars J.; Jacobs, Christopher S.; Lanyi, Gabor E.
2004-01-01
Features of the JPL VLBI modeling and estimation software "MODEST" are reviewed. Its main advantages include thoroughly documented model physics, portability, and detailed error modeling. Two unique models are included: modeling of source structure and modeling of both spatial and temporal correlations in tropospheric delay noise. History of the code parallels the development of the astrometric and geodetic VLBI technique and the software retains many of the models implemented during its advancement. The code has been traceably maintained since the early 1980s, and will continue to be updated with recent IERS standards. Scripts are being developed to facilitate user-friendly data processing in the era of e-VLBI.
Parallel performance of TORT on the CRAY J90: Model and measurement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnett, A.; Azmy, Y.Y.
1997-10-01
A limitation on the parallel performance of TORT on the CRAY J90 is the amount of extra work introduced by the multitasking algorithm itself. The extra work beyond that of the serial version of the code, called overhead, arises from the synchronization of the parallel tasks and the accumulation of results by the master task. The goal of recent updates to TORT was to reduce the time consumed by these activities. To help understand which components of the multitasking algorithm contribute significantly to the overhead, a parallel performance model was constructed and compared to measurements of actual timings of the code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-08-04
This code is an enhancement to the existing FLORIS code, SWR 14-20. In particular, this enhancement computes overall thrust and turbulence intensity throughout a wind plant. This information is used to form a description of the fatigue loads experienced throughout the wind plant. FLORIS has been updated to include an optimization routine that minimizes thrust and turbulence intensity (and therefore loads) across the wind plant. Previously, FLORIS had been designed to optimize power output of a wind plant. However, as turbines age, more wind plant owner/operators are looking for ways to reduce their fatigue loads without sacrificing too much power.
DYNA3D Code Practices and Developments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, L.; Zywicz, E.; Raboin, P.
2000-04-21
DYNA3D is an explicit, finite element code developed to solve high rate dynamic simulations for problems of interest to the engineering mechanics community. The DYNA3D code has been under continuous development since 1976[1] by the Methods Development Group in the Mechanical Engineering Department of Lawrence Livermore National Laboratory. The pace of code development activities has substantially increased in the past five years, growing from one to between four and six code developers. This has necessitated the use of software tools such as CVS (Concurrent Versions System) to help manage multiple version updates. While on-line documentation with an Adobe PDF manual helps to communicate software developments, periodically a summary document describing recent changes and improvements in DYNA3D software is needed. The first part of this report describes issues surrounding software versions and source control. The remainder of this report details the major capability improvements since the last publicly released version of DYNA3D in 1996. Not included here are the many hundreds of bug corrections and minor enhancements, nor the development in DYNA3D between the manual release in 1993[2] and the public code release in 1996.
UNIPIC code for simulations of high power microwave devices
NASA Astrophysics Data System (ADS)
Wang, Jianguo; Zhang, Dianhui; Liu, Chunliang; Li, Yongdong; Wang, Yue; Wang, Hongguang; Qiao, Hailiang; Li, Xiaoze
2009-03-01
In this paper, UNIPIC code, a new member in the family of fully electromagnetic particle-in-cell (PIC) codes for simulations of high power microwave (HPM) generation, is introduced. In the UNIPIC code, the electromagnetic fields are updated using the second-order, finite-difference time-domain (FDTD) method, and the particles are moved using the relativistic Newton-Lorentz force equation. The convolutional perfectly matched layer method is used to truncate the open boundaries of HPM devices. To model curved surfaces and avoid the time step reduction in the conformal-path (CP) FDTD method, a CP weakly conditionally stable FDTD (WCS FDTD) method, which combines the WCS FDTD and CP-FDTD methods, is implemented. UNIPIC is two-and-a-half dimensional, is written in the object-oriented C++ language, and can be run on a variety of platforms including Windows, Linux, and UNIX. Users can use the graphical user interface to create the geometric structures of the simulated HPM devices, or load previously created structures. Numerical experiments on some typical HPM devices using the UNIPIC code are given. The results are compared to those obtained from some well-known PIC codes, and agree well with each other.
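For readers unfamiliar with the particle push mentioned above, a minimal sketch of a relativistic Newton-Lorentz momentum update follows. It uses normalized units and a forward-Euler step for brevity; a production PIC code such as UNIPIC would use a centered (Boris-type) scheme, so this shows only the force law, not the actual integrator:

```python
import math

# Illustrative relativistic Newton-Lorentz momentum update in normalized
# units (c = 1 here by assumption). Forward-Euler is used only for brevity;
# real PIC codes use a centered, Boris-type scheme for stability.
C = 1.0  # speed of light, normalized

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def push_momentum(p, E, B, q, m, dt):
    """One step of dp/dt = q (E + v x B), with v = p / (gamma m) and
    gamma = sqrt(1 + |p / (m c)|^2)."""
    gamma = math.sqrt(1.0 + sum((pi / (m * C)) ** 2 for pi in p))
    v = tuple(pi / (gamma * m) for pi in p)
    vxB = cross(v, B)
    return tuple(p[i] + q * (E[i] + vxB[i]) * dt for i in range(3))
```

In a full PIC loop this update alternates with the FDTD field update, with the perfectly matched layer absorbing outgoing waves at the open boundaries.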
X-Ray, EUV, UV and Optical Emissivities of Astrophysical Plasmas
NASA Technical Reports Server (NTRS)
Raymond, John C.; West, Donald (Technical Monitor)
2000-01-01
This grant primarily covered the development of the thermal X-ray emission model code called APEC, which is meant to replace the Raymond and Smith (1977) code. The new code contains far more spectral lines and a great deal of updated atomic data. The code is now available (http://hea-www.harvard.edu/APEC), though new atomic data is still being added, particularly at longer wavelengths. While initial development of the code was funded by this grant, current work is carried on by N. Brickhouse, R. Smith and D. Liedahl under separate funding. Over the last five years, the grant has provided salary support for N. Brickhouse, R. Smith, a summer student (L. McAllister), an SAO predoctoral fellow (A. Vasquez), and visits by T. Kallman, D. Liedahl, P. Ghavamian, J.M. Laming, J. Li, P. Okeke, and M. Martos. In addition to the code development, the grant supported investigations into X-ray and UV spectral diagnostics as applied to shock waves in the ISM, accreting black holes and white dwarfs, and stellar coronae. Many of these efforts are continuing. Closely related work on the shock waves and coronal mass ejections in the solar corona has grown out of the efforts supported by the grant.
General Flow-Solver Code for Turbomachinery Applications
NASA Technical Reports Server (NTRS)
Dorney, Daniel; Sondak, Douglas
2006-01-01
Phantom is a computer code intended primarily for real-fluid turbomachinery problems. It is based on Corsair, an ideal-gas turbomachinery code, developed by the same authors, which evolved from the ROTOR codes from NASA Ames. Phantom is applicable to real and ideal fluids, both compressible and incompressible, flowing at subsonic, transonic, and supersonic speeds. It utilizes structured, overset, O- and H-type zonal grids to discretize flow fields and represent relative motions of components. Values on grid boundaries are updated at each time step by bilinear interpolation from adjacent grids. Inviscid fluxes are calculated to third-order spatial accuracy using Roe's scheme. Viscous fluxes are calculated using second-order-accurate central differences. The code is second-order accurate in time. Turbulence is represented by a modified Baldwin-Lomax algebraic model. The code offers two options for determining properties of fluids: One is based on equations of state, thermodynamic departure functions, and corresponding state principles. The other, which is more efficient, is based on splines generated from tables of properties of real fluids. Phantom currently contains fluid-property routines for water, hydrogen, oxygen, nitrogen, kerosene, methane, and carbon monoxide as well as ideal gases.
Psychological Distress and Emotional Expression on Facebook.
Bazarova, Natalya N; Choi, Yoon Hyung; Whitlock, Janis; Cosley, Dan; Sosik, Victoria
2017-03-01
Social network sites (SNS) are a novel social environment for college students with psychological distress to connect with their peers, but the nature and effects of these interactions are not well understood. This study reports findings from a Facebook study among 238 college students reporting nonspecific psychological distress using the K-6 scale. Behavioral data included Facebook status updates containing affect words written by participants within the past 60 days and the number of responses (comments and likes) each update received. The updates were also coded for depression symptoms. Self-report data included participants' self-presentational concerns, the affective valence of each post, effects of responses on mood, and satisfaction with the responses to and outcome of each status update. Higher psychological distress was associated with displaying depression language on Facebook, with higher self-presentational concerns, and with less satisfaction with audiences' responses and less overall satisfaction with the outcome of the interaction. These results offer a unique glimpse into the social world of college students with psychological distress through their everyday use of Facebook, and how the interplay of this novel environment and students' mental health impacts their social behaviors and interaction meaning-making on Facebook.
NASA Astrophysics Data System (ADS)
Mense, Mario; Schindelhauer, Christian
We introduce the Read-Write-Coding-System (RWC) - a very flexible class of linear block codes that generate efficient and flexible erasure codes for storage networks. In particular, given a message x of k symbols and a codeword y of n symbols, an RW code defines additional parameters k ≤ r, w ≤ n that offer enhanced possibilities to adjust the fault-tolerance capability of the code. More precisely, an RWC provides linear (n, k, d)-codes that have (a) minimum distance d = n - r + 1 for any two codewords, and (b) for each codeword there exists a codeword for each other message within distance at most w. Furthermore, depending on the values of r, w and the code alphabet, different block codes such as parity codes (e.g. RAID 4/5) or Reed-Solomon (RS) codes (if r = k and thus w = n) can be generated. In storage networks in which I/O accesses are very costly and redundancy is crucial, this flexibility has considerable advantages, as r and w can be optimally adapted to read- or write-intensive applications; only w symbols must be updated even if the message x changes completely, unlike other codes, which always need to rewrite y completely as x changes. In this paper, we first state a tight lower bound and basic conditions for all RW codes. Furthermore, we introduce special RW codes in which all mentioned parameters are adjustable even online, that is, those RW codes are adaptive to changing demands. Finally, we point out some useful properties regarding safety and security of the stored data.
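The partial-update property is easiest to see in the simplest special case the abstract names, a RAID-4 style parity code. A minimal sketch, assuming bitwise-XOR symbols (this is only the degenerate family member, not the general RW construction):

```python
from functools import reduce

# Minimal illustration of the partial-update idea discussed above, using
# the simplest code the abstract mentions: a RAID-4 style parity code over
# integers XORed bitwise. Changing one message symbol touches only two
# codeword symbols (the data symbol and the parity), rather than rewriting
# the whole codeword.
def encode(x):
    """Codeword = message symbols followed by their XOR parity."""
    return x + [reduce(lambda a, b: a ^ b, x, 0)]

def update(y, i, new_symbol):
    """Replace message symbol i, patching the parity incrementally."""
    y = y[:]  # keep the caller's codeword intact
    old = y[i]
    y[i] = new_symbol
    y[-1] ^= old ^ new_symbol
    return y
```

A general RW code extends this idea so that the number of symbols read (r) and written (w) can be tuned independently between the parity-code and Reed-Solomon extremes.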
Improving accuracy of clinical coding in surgery: collaboration is key.
Heywood, Nick A; Gill, Michael D; Charlwood, Natasha; Brindle, Rachel; Kirwan, Cliona C
2016-08-01
Clinical coding data provide the basis for Hospital Episode Statistics and Healthcare Resource Group codes. High accuracy of this information is required for payment by results, allocation of health and research resources, and public health data and planning. We sought to identify the level of accuracy of clinical coding in general surgical admissions across hospitals in the Northwest of England. Clinical coding departments identified a total of 208 emergency general surgical patients discharged between 1st March and 15th August 2013 from seven hospital trusts (median = 20, range = 16-60). Blinded re-coding was performed by a senior clinical coder and clinician, with results compared with the original coding outcome. Recorded codes were generated from OPCS-4 & ICD-10. Of all cases, 194 of 208 (93.3%) had at least one coding error and 9 of 208 (4.3%) had errors in both primary diagnosis and primary procedure. Errors were found in 64 of 208 (30.8%) of primary diagnoses and 30 of 137 (21.9%) of primary procedure codes. Median tariff using original codes was £1411.50 (range, £409-9138). Re-calculation using updated clinical codes showed a median tariff of £1387.50, P = 0.997 (range, £406-10,102). The most frequent reasons for incorrect coding were "coder error" and a requirement for "clinical interpretation of notes". Errors in clinical coding are multifactorial and have significant impact on primary diagnosis, potentially affecting the accuracy of Hospital Episode Statistics data and in turn the allocation of health care resources and public health planning. As we move toward surgeon specific outcomes, surgeons should increase collaboration with coding departments to ensure the system is robust. Copyright © 2016 Elsevier Inc. All rights reserved.
A user's guide to Sandia's latin hypercube sampling software : LHS UNIX library/standalone version.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swiler, Laura Painton; Wyss, Gregory Dane
2004-07-01
This document is a reference guide for the UNIX Library/Standalone version of the Latin Hypercube Sampling Software. This software has been developed to generate Latin hypercube multivariate samples. This version runs on Linux or UNIX platforms. This manual covers the use of the LHS code in a UNIX environment, run either as a standalone program or as a callable library. The underlying code in the UNIX Library/Standalone version of LHS is almost identical to the updated Windows version of LHS released in 1998 (SAND98-0210). However, some modifications were made to customize it for a UNIX environment and as a library that is called from the DAKOTA environment. This manual covers the use of the LHS code as a library and in the standalone mode under UNIX.
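The sampling scheme such software implements can be sketched in a few lines. This is the textbook Latin hypercube construction, not the LHS code's actual implementation:

```python
import random

# Sketch of classical Latin hypercube sampling: each of the d dimensions is
# cut into n equal strata, exactly one sample falls in each stratum, and the
# stratum order is shuffled independently per dimension, giving stratified
# one-dimensional marginals with far fewer points than a full grid.
def latin_hypercube(n, d, rng=None):
    rng = rng or random.Random(0)
    samples = [[0.0] * d for _ in range(n)]
    for j in range(d):
        strata = list(range(n))
        rng.shuffle(strata)  # decouple the dimensions
        for i in range(n):
            # one uniform draw inside stratum strata[i] of width 1/n
            samples[i][j] = (strata[i] + rng.random()) / n
    return samples
```

Each column of the result visits every stratum exactly once, which is the defining property a caller (standalone or via a library interface such as DAKOTA's) relies on.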
Neutron Angular Scatter Effects in 3DHZETRN: Quasi-Elastic
NASA Technical Reports Server (NTRS)
Wilson, John W.; Werneth, Charles M.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2017-01-01
The current 3DHZETRN code has a detailed three dimensional (3D) treatment of neutron transport based on a forward/isotropic assumption and has been compared to Monte Carlo (MC) simulation codes in various geometries. In most cases, it has been found that 3DHZETRN agrees with the MC codes to the extent they agree with each other. However, a recent study of neutron leakage from finite geometries revealed that further improvements to the 3DHZETRN formalism are needed. In the present report, angular scattering corrections to the neutron fluence are provided in an attempt to improve fluence estimates from a uniform sphere. It is found that further developments in the nuclear production models are required to fully evaluate the impact of transport model updates. A model for the quasi-elastic neutron production spectra is therefore developed and implemented into 3DHZETRN.
VICTORIA: A mechanistic model for radionuclide behavior in the reactor coolant system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaperow, J.H.; Bixler, N.E.
1996-12-31
VICTORIA is the U.S. Nuclear Regulatory Commission's (NRC's) mechanistic, best-estimate code for analysis of fission product release from the core and subsequent transport in the reactor vessel and reactor coolant system. VICTORIA requires thermal-hydraulic data (i.e., temperatures, pressures, and velocities) as input. In the past, these data have been taken from the results of calculations from thermal-hydraulic codes such as SCDAP/RELAP5, MELCOR, and MAAP. Validation and assessment of VICTORIA 1.0 have been completed. An independent peer review of VICTORIA, directed by Brookhaven National Laboratory and supported by experts in the areas of fuel release, fission product chemistry, and aerosol physics, has been undertaken. This peer review, which will independently assess the code's capabilities, is nearing completion, with the peer review committee's final report expected in December 1996. A limited amount of additional development is expected as a result of the peer review. Following this additional development, the NRC plans to release VICTORIA 1.1 and an updated and improved code manual. Future plans mainly involve use of the code for plant calculations to investigate specific safety issues as they arise. Also, the code will continue to be used in support of the Phebus experiments.
NASA Astrophysics Data System (ADS)
Konnik, Mikhail V.; Welsh, James
2012-09-01
Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing code base of a numerical simulator makes the code itself difficult to maintain. Adequate documentation of astronomical software for adaptive optics simulators can complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
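The key step of the framework, translating MATLAB comments into a form Doxygen understands, can be illustrated with a toy sketch. The real framework uses a Perl script; the Python function below and its name are purely illustrative:

```python
# Toy version of the comment-translation step described above: rewrite
# leading MATLAB '%' comment lines as C-style '///' lines so that Doxygen
# can parse them. The actual framework uses a Perl script; this function
# is an illustrative stand-in only.
def mfile_comments_to_doxygen(lines):
    out = []
    for line in lines:
        stripped = line.lstrip()
        if stripped.startswith('%'):
            indent = line[:len(line) - len(stripped)]
            out.append(indent + '///' + stripped[1:])
        else:
            out.append(line)
    return out
```

Running such a filter over the M-files before invoking Doxygen lets the generated pages pick up the simulator's in-source descriptions, including any embedded formulas.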
Object-Oriented/Data-Oriented Design of a Direct Simulation Monte Carlo Algorithm
NASA Technical Reports Server (NTRS)
Liechty, Derek S.
2014-01-01
Over the past decade, there has been much progress towards improved phenomenological modeling and algorithmic updates for the direct simulation Monte Carlo (DSMC) method, which provides a probabilistic physical simulation of gas flows. These improvements have largely been based on the work of the originator of the DSMC method, Graeme Bird. Of primary importance are improved chemistry, internal energy, and physics modeling and a reduction in time to solution. These allow for an expanded range of possible solutions in altitude and velocity space. NASA's current production code, the DSMC Analysis Code (DAC), is well-established and based on Bird's 1994 algorithms written in Fortran 77, and has proven difficult to upgrade. A new DSMC code is being developed in the C++ programming language using object-oriented and data-oriented design paradigms to facilitate the inclusion of the recent improvements and future development activities. The development efforts on the new code, the Multiphysics Algorithm with Particles (MAP), are described, and performance comparisons are made with DAC.
A Low-Storage-Consumption XML Labeling Method for Efficient Structural Information Extraction
NASA Astrophysics Data System (ADS)
Liang, Wenxin; Takahashi, Akihiro; Yokota, Haruo
Recently, labeling methods to extract and reconstruct the structural information of XML data, which are important for many applications such as XPath query and keyword search, are becoming more attractive. To achieve efficient structural information extraction, in this paper we propose the C-DO-VLEI code, a novel update-friendly bit-vector encoding scheme, based on register-length bit operations combined with the properties of Dewey Order numbers, which cannot be implemented in other relevant existing schemes such as ORDPATH. The proposed method also achieves lower storage consumption because it requires neither a prefix schema nor any reserved codes for node insertion. We performed experiments to evaluate and compare the performance and storage consumption of the proposed method with those of the ORDPATH method. Experimental results show that the execution times for extracting depth information and parent node labels using the C-DO-VLEI code are about 25% and 15% lower, respectively, and the average label size is about 24% smaller, compared with ORDPATH.
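The Dewey Order foundation of such labels can be illustrated with a toy string-based sketch. This shows why depth and parent information are recoverable from a label alone, with no index lookups; it deliberately omits the paper's actual bit-vector encoding:

```python
# Toy Dewey Order labels ('1.3.2' = second child of third child of node 1).
# This is only the underlying idea the C-DO-VLEI code builds on, not the
# paper's register-length bit-vector representation.
def depth(label):
    """Depth of a Dewey label: '1.3.2' -> 3."""
    return label.count('.') + 1

def parent(label):
    """Parent label: '1.3.2' -> '1.3'; the root has no parent."""
    head, sep, _ = label.rpartition('.')
    return head if sep else None

def is_ancestor(a, b):
    """True if node a is a proper ancestor of node b."""
    return b.startswith(a + '.')
```

Because both queries are pure label arithmetic, an encoding that makes them cheap register operations (as the paper proposes) directly speeds up structural extraction.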
Employing multi-GPU power for molecular dynamics simulation: an extension of GALAMOST
NASA Astrophysics Data System (ADS)
Zhu, You-Liang; Pan, Deng; Li, Zhan-Wei; Liu, Hong; Qian, Hu-Jun; Zhao, Yang; Lu, Zhong-Yuan; Sun, Zhao-Yan
2018-04-01
We describe the algorithm for employing multi-GPU power on the basis of Message Passing Interface (MPI) domain decomposition in a molecular dynamics code, GALAMOST, which is designed for the coarse-grained simulation of soft matter. The multi-GPU version is developed from our previous single-GPU version. In multi-GPU runs, each GPU takes charge of one domain and runs the single-GPU code path. The communication between neighbouring domains follows an algorithm similar to that of the CPU-based code LAMMPS, but is optimised specifically for GPUs. We employ a memory-saving design which can enlarge the maximum system size for the same device conditions. An optimisation algorithm is employed to prolong the update period of the neighbour list. We demonstrate good performance of multi-GPU runs on simulations of Lennard-Jones liquid, dissipative particle dynamics liquid, polymer and nanoparticle composites, and two-patch particles on a workstation. Good scaling across many cluster nodes is presented for two-patch particles.
A taxonomy update for the family Polyomaviridae.
Calvignac-Spencer, Sébastien; Feltkamp, Mariet C W; Daugherty, Matthew D; Moens, Ugo; Ramqvist, Torbjörn; Johne, Reimar; Ehlers, Bernhard
2016-06-01
Many distinct polyomaviruses infecting a variety of vertebrate hosts have recently been discovered, and their complete genome sequences could often be determined. To accommodate this fast-growing diversity, the International Committee on Taxonomy of Viruses (ICTV) Polyomaviridae Study Group designed a host- and sequence-based rationale for an updated taxonomy of the family Polyomaviridae. Applying this rationale resulted in numerous recommendations of taxonomic revisions, which were accepted by the Executive Committee of the ICTV in December 2015. New criteria for the definition and creation of polyomavirus species were established, based on the observed distance between large T antigen coding sequences. Four genera (Alpha-, Beta-, Gamma- and Deltapolyomavirus) were delineated that together include 73 species. Species naming was made as systematic as possible - most species names now consist of the binomial name of the host species followed by polyomavirus and a number reflecting the order of discovery. It is hoped that this important update of the family taxonomy will serve as a stable basis for future taxonomic developments.
USDA-ARS?s Scientific Manuscript database
In the course of updating the scientific names of plant-associated fungi in the U.S. National Fungus Collections Fungal Databases to conform with one scientific name for fungi as required by the International Code of Nomenclature for algae, fungi and plants (ICN, McNeill & al. in Regnum Vegetabile 1...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-29
.... Now TORS(OCL/DSC, update ED 154/Doxxx, * * *). Outcome FRAC/consultation DO306/ED 122 and Publication... TORs and Work Plan. Review of Position Papers and Contributions. 13:30-17:00: Plenary Session.... Robert L. Bostiga, RTCA Advisory Committee. [FR Doc. 2010-27260 Filed 10-28-10; 8:45 am] BILLING CODE...
ERIC Educational Resources Information Center
Mason, Janet
The booklet explains requirements for reporting abuse or neglect of children and disabled adults contained in the North Carolina Juvenile Code and the Protection of the Abused, Neglected or Exploited Disabled Adult Act. Following a brief historical review, the text discusses who must report abuse and neglect, what acts or conditions must be…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-14
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-64284; File No. SR-Phlx-2011-48] Self... Change To Update Provisions Regarding the Dress Code and Trade Verification April 8, 2011. Pursuant to.... 78s(b)(1). \\2\\ 17 CFR 240.19b-4. I. Self-Regulatory Organization's Statement of the Terms of Substance...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-01
... gas resources regulations to update some fees that cover BSEE's cost of processing and filing certain... natural gas on the OCS and to reflect advancements in technology and new information. The BSEE also..., Crude Petroleum and Natural Gas Extraction, and 213111, Drilling Oil and Gas Wells. For these NAICS code...
Veterans Benefits: Federal Employment Assistance
2008-01-14
Lordeman. 2 This paper does not provide information on VA education benefits for veterans. For more information on education benefits for veterans...see CRS Report RL33281, Montgomery GI Bill Education Benefits : Analysis of College Prices and Federal Student Aid Under the Higher (continued...) Order...Code RS22666 Updated January 14, 2008 Veterans Benefits : Federal Employment Assistance Christine Scott Specialist in Social Policy Domestic Social
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-10
... Andalusia, AL, as the Air Traffic Control Tower at South Alabama Regional Airport at Bill Benton Field has... Alabama Regional Airport at Bill Benton Field. This action also would update the geographic coordinates of... 1, Code of Federal Regulations, part 51, subject to the annual revision of FAA, Order 7400.9 and...
An efficient dictionary learning algorithm and its application to 3-D medical image denoising.
Li, Shutao; Fang, Leyuan; Yin, Haitao
2012-02-01
In this paper, we propose an efficient dictionary learning algorithm for sparse representation of given data and suggest a way to apply this algorithm to 3-D medical image denoising. Our learning approach is composed of two main parts: sparse coding and dictionary updating. In the sparse coding stage, an efficient algorithm named multiple clusters pursuit (MCP) is proposed. The MCP first applies a dictionary structuring strategy to cluster the atoms with high coherence together, and then employs a multiple-selection strategy to select several competitive atoms at each iteration. These two strategies greatly reduce the computational complexity of the MCP and help it obtain better sparse solutions. In the dictionary updating stage, an alternating optimization that efficiently approximates the singular value decomposition is introduced. Furthermore, in the 3-D medical image denoising application, a joint 3-D operation is proposed to exploit the learning capability of the presented algorithm, simultaneously capturing the correlations within each slice and the correlations across nearby slices, thereby obtaining better denoising results. Experiments on both synthetically generated data and real 3-D medical images demonstrate that the proposed approach has superior performance compared to some well-known methods. © 2011 IEEE
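The two alternating parts can be sketched as follows, with plain matching pursuit standing in for MCP and a K-SVD-style rank-1 SVD update standing in for the paper's approximate SVD step. Both substitutions are deliberate simplifications, not the authors' algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_code(D, X, k):
    """Greedy matching pursuit, k atoms per signal (a simple stand-in
    for MCP, without its clustering and multiple-selection strategies)."""
    A = np.zeros((D.shape[1], X.shape[1]))
    for j in range(X.shape[1]):
        r = X[:, j].copy()
        for _ in range(k):
            i = int(np.argmax(np.abs(D.T @ r)))  # most correlated atom
            c = D[:, i] @ r
            A[i, j] += c
            r -= c * D[:, i]
    return A

def update_dict(D, X, A):
    """Atom-by-atom rank-1 SVD update on the residual, with the sparse
    supports kept fixed (the alternating dictionary-updating step)."""
    for i in range(D.shape[1]):
        users = np.nonzero(A[i])[0]
        if users.size == 0:
            continue
        E = X[:, users] - D @ A[:, users] + np.outer(D[:, i], A[i, users])
        U, s, Vt = np.linalg.svd(E, full_matrices=False)
        D[:, i] = U[:, 0]
        A[i, users] = s[0] * Vt[0]
    return D, A

# toy data: signals sparse in a random unit-norm dictionary
D = rng.standard_normal((16, 32)); D /= np.linalg.norm(D, axis=0)
X = D @ (rng.standard_normal((32, 50)) * (rng.random((32, 50)) < 0.1))

A = sparse_code(D, X, 3)
err_before = np.linalg.norm(X - D @ A)
D, A = update_dict(D, X, A)
err_after = np.linalg.norm(X - D @ A)
```

By SVD optimality, each rank-1 atom update cannot increase the residual on the fixed supports, so the fit error is non-increasing across the dictionary-updating stage.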
The New USGS Volcano Hazards Program Web Site
NASA Astrophysics Data System (ADS)
Venezky, D. Y.; Graham, S. E.; Parker, T. J.; Snedigar, S. F.
2008-12-01
The U.S. Geological Survey's (USGS) Volcano Hazards Program (VHP) has launched a revised web site that uses a map-based interface to display hazards information for U.S. volcanoes. The web site is focused on better communication of hazards and background volcano information to our varied user groups by reorganizing content based on user needs and improving data display. The Home Page provides a synoptic view of the activity level of all volcanoes for which updates are written using a custom Google® Map. Updates are accessible by clicking on one of the map icons or clicking on the volcano of interest in the adjacent color-coded list of updates. The new navigation provides rapid access to volcanic activity information, background volcano information, images and publications, volcanic hazards, information about VHP, and the USGS volcano observatories. The Volcanic Activity section was tailored for emergency managers but provides information for all our user groups. It includes a Google® Map of the volcanoes we monitor, an Elevated Activity Page, a general status page, information about our Volcano Alert Levels and Aviation Color Codes, monitoring information, and links to monitoring data from VHP's volcano observatories: Alaska Volcano Observatory (AVO), Cascades Volcano Observatory (CVO), Long Valley Observatory (LVO), Hawaiian Volcano Observatory (HVO), and Yellowstone Volcano Observatory (YVO). The YVO web site was the first to move to the new navigation system and we are working on integrating the Long Valley Observatory web site next. We are excited to continue to implement new geospatial technologies to better display our hazards and supporting volcano information.
Morton, Lindsay M.; Linet, Martha S.; Clarke, Christina A.; Kadin, Marshall E.; Vajdic, Claire M.; Monnereau, Alain; Maynadié, Marc; Chiu, Brian C.-H.; Marcos-Gragera, Rafael; Costantini, Adele Seniori; Cerhan, James R.; Weisenburger, Dennis D.
2010-01-01
After publication of the updated World Health Organization (WHO) classification of tumors of hematopoietic and lymphoid tissues in 2008, the Pathology Working Group of the International Lymphoma Epidemiology Consortium (InterLymph) now presents an update of the hierarchical classification of lymphoid neoplasms for epidemiologic research based on the 2001 WHO classification, which we published in 2007. The updated hierarchical classification incorporates all of the major and provisional entities in the 2008 WHO classification, including newly defined entities based on age, site, certain infections, and molecular characteristics, as well as borderline categories, early and “in situ” lesions, disorders with limited capacity for clinical progression, lesions without current International Classification of Diseases for Oncology, 3rd Edition codes, and immunodeficiency-associated lymphoproliferative disorders. WHO subtypes are defined in hierarchical groupings, with newly defined groups for small B-cell lymphomas with plasmacytic differentiation and for primary cutaneous T-cell lymphomas. We suggest approaches for applying the hierarchical classification in various epidemiologic settings, including strategies for dealing with multiple coexisting lymphoma subtypes in one patient, and cases with incomplete pathologic information. The pathology materials useful for state-of-the-art epidemiology studies are also discussed. We encourage epidemiologists to adopt the updated InterLymph hierarchical classification, which incorporates the most recent WHO entities while demonstrating their relationship to older classifications. PMID:20699439
Turner, Jennifer J; Morton, Lindsay M; Linet, Martha S; Clarke, Christina A; Kadin, Marshall E; Vajdic, Claire M; Monnereau, Alain; Maynadié, Marc; Chiu, Brian C-H; Marcos-Gragera, Rafael; Costantini, Adele Seniori; Cerhan, James R; Weisenburger, Dennis D
2010-11-18
After publication of the updated World Health Organization (WHO) classification of tumors of hematopoietic and lymphoid tissues in 2008, the Pathology Working Group of the International Lymphoma Epidemiology Consortium (InterLymph) now presents an update of the hierarchical classification of lymphoid neoplasms for epidemiologic research based on the 2001 WHO classification, which we published in 2007. The updated hierarchical classification incorporates all of the major and provisional entities in the 2008 WHO classification, including newly defined entities based on age, site, certain infections, and molecular characteristics, as well as borderline categories, early and "in situ" lesions, disorders with limited capacity for clinical progression, lesions without current International Classification of Diseases for Oncology, 3rd Edition codes, and immunodeficiency-associated lymphoproliferative disorders. WHO subtypes are defined in hierarchical groupings, with newly defined groups for small B-cell lymphomas with plasmacytic differentiation and for primary cutaneous T-cell lymphomas. We suggest approaches for applying the hierarchical classification in various epidemiologic settings, including strategies for dealing with multiple coexisting lymphoma subtypes in one patient, and cases with incomplete pathologic information. The pathology materials useful for state-of-the-art epidemiology studies are also discussed. We encourage epidemiologists to adopt the updated InterLymph hierarchical classification, which incorporates the most recent WHO entities while demonstrating their relationship to older classifications.
Sollie, Annet; Sijmons, Rolf H; Lindhout, Dick; van der Ploeg, Ans T; Rubio Gozalbo, M Estela; Smit, G Peter A; Verheijen, Frans; Waterham, Hans R; van Weely, Sonja; Wijburg, Frits A; Wijburg, Rudolph; Visser, Gepke
2013-07-01
Data sharing is essential for a better understanding of genetic disorders. Good phenotype coding plays a key role in this process. Unfortunately, the two most widely used coding systems in medicine, ICD-10 and SNOMED-CT, lack the information necessary for the detailed classification and annotation of rare and genetic disorders. This prevents the optimal registration of such patients in databases and thus hampers data-sharing efforts. To improve care and to facilitate research for patients with metabolic disorders, we developed a new coding system for metabolic diseases together with a dedicated group of clinical specialists. Next, we compared the resulting codes with those in ICD-10 and SNOMED-CT. No matches were found in 76% of cases in ICD-10 and in 54% in SNOMED-CT. We conclude that there are sizable gaps in the SNOMED-CT and ICD coding systems for metabolic disorders. There may be similar gaps for other classes of rare and genetic disorders. We have demonstrated that expert groups can help in addressing such coding issues. Our coding system has been made available to the ICD and SNOMED-CT organizations as well as to the Orphanet and HPO organizations for further public application, and updates will be published online (www.ddrmd.nl and www.cineas.org). © 2013 WILEY PERIODICALS, INC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Charley; Kamboj, Sunita; Wang, Cheng
2015-09-01
This handbook is an update of the 1993 version of the Data Collection Handbook and the Radionuclide Transfer Factors Report to support modeling the impact of radioactive material in soil. Many new parameters have been added to the RESRAD Family of Codes, and new measurement methodologies are available. A detailed review of available parameter databases was conducted in preparation of this new handbook. This handbook is a companion document to the user manuals for the RESRAD (onsite) and RESRAD-OFFSITE codes. It can also be used with the RESRAD-BUILD code because some of the building-related parameters are included in this handbook. RESRAD (onsite) has been developed for implementing U.S. Department of Energy Residual Radioactive Material Guidelines. Hydrogeological, meteorological, geochemical, geometrical (size, area, depth), crops and livestock, human intake, source characteristic, and building characteristic parameters are used in the RESRAD (onsite) code. The RESRAD-OFFSITE code is an extension of the RESRAD (onsite) code and can also model the transport of radionuclides to locations outside the footprint of the primary contamination. This handbook discusses parameter definitions, typical ranges, variations, and measurement methodologies. It also provides references for sources of additional information. Although this handbook was developed primarily to support the application of the RESRAD Family of Codes, the discussions and values are valid for use with other pathway analysis models and codes.
Fortran code for SU(3) lattice gauge theory with and without MPI checkerboard parallelization
NASA Astrophysics Data System (ADS)
Berg, Bernd A.; Wu, Hao
2012-10-01
We document plain Fortran and Fortran MPI checkerboard code for Markov chain Monte Carlo simulations of pure SU(3) lattice gauge theory with the Wilson action in D dimensions. The Fortran code uses periodic boundary conditions and is suitable for pedagogical purposes and small scale simulations. For the Fortran MPI code two geometries are covered: the usual torus with periodic boundary conditions and the double-layered torus as defined in the paper. Parallel computing is performed on checkerboards of sublattices, which partition the full lattice in one, two, and so on, up to D directions (depending on the parameters set). For updating, the Cabibbo-Marinari heatbath algorithm is used. We present validations and test runs of the code. Performance is reported for a number of currently used Fortran compilers and, when applicable, MPI versions. For the parallelized code, performance is studied as a function of the number of processors. Program summary Program title: STMC2LSU3MPI Catalogue identifier: AEMJ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMJ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 26666 No. of bytes in distributed program, including test data, etc.: 233126 Distribution format: tar.gz Programming language: Fortran 77 compatible with the use of Fortran 90/95 compilers, in part with MPI extensions. Computer: Any capable of compiling and executing Fortran 77 or Fortran 90/95, when needed with MPI extensions. Operating system: Red Hat Enterprise Linux Server 6.1 with OpenMPI + pgf77 11.8-0, Centos 5.3 with OpenMPI + gfortran 4.1.2, Cray XT4 with MPICH2 + pgf90 11.2-0. Has the code been vectorised or parallelized?: Yes, parallelized using MPI extensions. Number of processors used: 2 to 11664 RAM: 200 megabytes per process.
Classification: 11.5. Nature of problem: Physics of pure SU(3) Quantum Field Theory (QFT). This is relevant for our understanding of Quantum Chromodynamics (QCD). It includes the glueball spectrum, topological properties and the deconfining phase transition of pure SU(3) QFT. For instance, Relativistic Heavy Ion Collision (RHIC) experiments at the Brookhaven National Laboratory provide evidence that quarks confined in hadrons undergo at high enough temperature and pressure a transition into a Quark-Gluon Plasma (QGP). Investigations of its thermodynamics in pure SU(3) QFT are of interest. Solution method: Markov Chain Monte Carlo (MCMC) simulations of SU(3) Lattice Gauge Theory (LGT) with the Wilson action. This is a regularization of pure SU(3) QFT on a hypercubic lattice, which allows approaching the continuum SU(3) QFT by means of Finite Size Scaling (FSS) studies. Specifically, we provide updating routines for the Cabibbo-Marinari heatbath with and without checkerboard parallelization. While the first is suitable for pedagogical purposes and small scale projects, the latter allows for efficient parallel processing. Targeting the geometry of RHIC experiments, we have implemented a Double-Layered Torus (DLT) lattice geometry, which has previously not been used in LGT MCMC simulations and enables inside and outside layers at distinct temperatures, the lower-temperature layer acting as the outside boundary for the higher-temperature layer, where the deconfinement transition takes place. Restrictions: The checkerboard partition of the lattice makes the development of measurement programs more tedious than is the case for an unpartitioned lattice. Presently, only one measurement routine for Polyakov loops is provided. Unusual features: We provide three different versions for the send/receive function of the MPI library, which work for different operating system + compiler + MPI combinations.
This involves activating the correct row in the last three rows of our latmpi.par parameter file. The underlying reason is distinct buffer conventions. Running time: For a typical run using an Intel i7 processor, it takes (1.8-6)E-06 seconds to update one link of the lattice, depending on the compiler used. For example, if we do a simulation on a small (4 × 8³) DLT lattice with a statistics of 221 sweeps (i.e., update the two lattice layers of 4 × (4 × 8³) links each 221 times), the total CPU time needed can be 2 × 4 × (4 × 8³) × 221 × 3E-06 seconds = 1.7 minutes, where 2 = two layers of the lattice, 4 = four dimensions, 4 × 8³ = lattice size, 221 = sweeps of updating, and 3E-06 s = average time to update one link variable. If we divide the job into 8 parallel processes, then the real time is (for negligible communication overhead) 1.7 mins / 8 = 0.2 mins.
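The checkerboard idea itself is compact: colouring lattice sites by coordinate parity guarantees that nearest neighbours have opposite colours (for even lattice extents), so all sites of one colour can be heatbath-updated simultaneously. A minimal sketch of the partition, independent of the Fortran implementation:

```python
import itertools

def checkerboard(shape):
    """Split lattice sites into even/odd sublattices by coordinate parity.
    With even extents, every nearest neighbour of an even site is odd,
    so each sublattice can be updated in parallel."""
    even, odd = [], []
    for site in itertools.product(*(range(n) for n in shape)):
        (even if sum(site) % 2 == 0 else odd).append(site)
    return even, odd

# a 4 x 8^3 lattice as in the paper's running-time example
even, odd = checkerboard((4, 8, 8, 8))
```

Because updating a link variable only touches neighbouring sites of the opposite colour, the two sublattices can be swept alternately with no update-order conflicts, which is what the MPI version distributes over processes.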
JWL equation of state coefficients for high explosives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, E.; Finger, M.; Collins, W.
1973-01-16
The compilation of equations of state for high explosives now includes some 38 entries. Additions and revisions have recently been introduced; previous lists should be discarded. To avoid transcribing errors, we have computerized the list and will issue computer updates periodically. If you are maintaining equation of state files for hydrodynamic codes and would like IBM card records of our lists, we will be happy to send you a copy of our card deck. We have noted those entries where changes or corrections have been made. Of special note for this update are the corrections to PBX-9404 and LX-04 from the most recent memo, dated August 23, 1972.
NASA Astrophysics Data System (ADS)
Schumacher, Florian; Friederich, Wolfgang; Lamara, Samir; Gutt, Phillip; Paffrath, Marcel
2015-04-01
We present a seismic full waveform inversion concept for applications ranging from seismological to engineering contexts, based on sensitivity kernels for full waveforms. The kernels are derived from Born scattering theory as the Fréchet derivatives of linearized frequency-domain full waveform data functionals, quantifying the influence of elastic earth model parameters and density on the data values. For a specific source-receiver combination, the kernel is computed from the displacement and strain field spectrum originating from the source evaluated throughout the inversion domain, as well as the Green function spectrum and its strains originating from the receiver. By storing the wavefield spectra of specific sources/receivers, they can be re-used for kernel computation for different specific source-receiver combinations, optimizing the total number of required forward simulations. In the iterative inversion procedure, the solution of the forward problem, the computation of sensitivity kernels and the derivation of a model update are held completely separate. In particular, the model description for the forward problem and the description of the inverted model update are kept independent. Hence, the resolution of the inverted model as well as the complexity of solving the forward problem can be iteratively increased (with increasing frequency content of the inverted data subset). This may regularize the overall inverse problem and optimizes the computational effort of both solving the forward problem and computing the model update. The required interconnection of arbitrary unstructured volume and point grids is realized by generalized high-order integration rules and 3D-unstructured interpolation methods. The model update is inferred by solving a minimization problem in a least-squares sense, resulting in Gauss-Newton convergence of the overall inversion process.
The inversion method was implemented in the modularized software package ASKI (Analysis of Sensitivity and Kernel Inversion), which provides a generalized interface to arbitrary external forward modelling codes. So far, the 3D spectral-element code SPECFEM3D (Tromp, Komatitsch and Liu, 2008) and the 1D semi-analytical code GEMINI (Friederich and Dalkolmo, 1995), in both Cartesian and spherical frameworks, are supported. The creation of interfaces to further forward codes is planned in the near future. ASKI is freely available under the terms of the GPL at www.rub.de/aski. Since the independent modules of ASKI must communicate via file output/input, large storage capacities need to be accessible conveniently. Storing the complete sensitivity matrix to file, however, permits the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full waveform inversion. In the presentation, we will show some aspects of the theory behind the full waveform inversion method and its practical realization by the software package ASKI, as well as synthetic and real-data applications at different scales and geometries.
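The least-squares model update described above can be sketched as a single damped Gauss-Newton step on a sensitivity matrix. This is a hypothetical minimal version for illustration; ASKI's actual update involves regularisation and smoothing choices not shown here.

```python
import numpy as np

def gauss_newton_step(K, d_obs, d_syn, damping=1e-3):
    """One damped Gauss-Newton model update dm: solve
    (K^T K + damping * I) dm = K^T (d_obs - d_syn),
    where K holds the sensitivity kernels (Frechet derivatives)
    and d_obs - d_syn is the data residual."""
    r = d_obs - d_syn
    H = K.T @ K + damping * np.eye(K.shape[1])
    return np.linalg.solve(H, K.T @ r)

# linear toy problem: one step nearly recovers the true model
K = np.array([[2.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
m_true = np.array([1.0, -2.0])
dm = gauss_newton_step(K, K @ m_true, np.zeros(3))
```

For a genuinely linear problem a single step recovers the model up to the damping perturbation; in the nonlinear waveform case the step is iterated, with K recomputed from fresh forward simulations.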
User's Guide for RESRAD-OFFSITE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gnanapragasam, E.; Yu, C.
2015-04-01
The RESRAD-OFFSITE code can be used to model the radiological dose or risk to an offsite receptor. This User’s Guide for RESRAD-OFFSITE Version 3.1 is an update of the User’s Guide for RESRAD-OFFSITE Version 2 contained in Appendix A of the User’s Manual for RESRAD-OFFSITE Version 2 (ANL/EVS/TM/07-1, DOE/HS-0005, NUREG/CR-6937). This user’s guide presents the basic information necessary to use Version 3.1 of the code. It also points to the help file and other documents that provide more detailed information about the inputs, the input forms and the features/tools in the code; two of the features (overriding the source term and computing area factors) are discussed in the appendices to this guide. Section 2 describes how to download and install the code and then verify the installation of the code. Section 3 shows ways to navigate through the input screens to simulate various exposure scenarios and to view the results in graphics and text reports. Section 4 has screen shots of each input form in the code and provides basic information about each parameter to increase the user’s understanding of the code. Section 5 outlines the contents of all the text reports and the graphical output. It also describes the commands in the two output viewers. Section 6 deals with the probabilistic and sensitivity analysis tools available in the code. Section 7 details the various ways of obtaining help in the code.
Cosmic-ray propagation with DRAGON2: I. numerical solver and astrophysical ingredients
NASA Astrophysics Data System (ADS)
Evoli, Carmelo; Gaggero, Daniele; Vittino, Andrea; Di Bernardo, Giuseppe; Di Mauro, Mattia; Ligorini, Arianna; Ullio, Piero; Grasso, Dario
2017-02-01
We present version 2 of the DRAGON code, designed for computing realistic predictions of the CR densities in the Galaxy. The code numerically solves the interstellar CR transport equation (including inhomogeneous and anisotropic diffusion, in both space and momentum, advective transport and energy losses) under realistic conditions. The new version includes an updated numerical solver and several models for the astrophysical ingredients involved in the transport equation. Improvements in the accuracy of the numerical solution are proved against analytical solutions and in reference diffusion scenarios. The novel features implemented in the code allow the simulation of the diverse scenarios proposed to reproduce the most recent measurements of local and diffuse CR fluxes, going beyond the limitations of the homogeneous galactic transport paradigm. To this end, several applications using DRAGON2 are presented as well. This new version allows users to include their own physical models by means of a modular C++ structure.
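As a minimal illustration of the kind of transport equation being solved, the sketch below relaxes a 1D diffusive-halo model with constant diffusion coefficient and source to its steady state. The constant coefficients and free-escape boundaries are simplifying assumptions for illustration, far from DRAGON2's anisotropic, inhomogeneous 3D solver with losses and advection.

```python
import numpy as np

# dn/dt = D d2n/dz2 + Q on z in [-H, H], free escape n(+-H) = 0
H, D, Q = 1.0, 1.0, 1.0
z = np.linspace(-H, H, 41)
dz = z[1] - z[0]
dt = 0.2 * dz**2 / D           # well inside the explicit stability limit dz^2/(2D)
n = np.zeros_like(z)
for _ in range(8000):          # relax towards the steady state
    n[1:-1] += dt * (D * (n[2:] - 2 * n[1:-1] + n[:-2]) / dz**2 + Q)
# analytic steady state: n(z) = Q (H^2 - z^2) / (2 D), so n(0) -> 0.5
```

The midplane density approaches the analytic value Q H² / (2D); production codes replace this explicit scheme with implicit solvers so the timestep is not tied to the grid spacing.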
Valdes-Abellan, Javier; Pachepsky, Yakov; Martinez, Gonzalo
2018-01-01
Data assimilation is becoming a promising technique in hydrologic modelling, used to update not only model states but also to infer model parameters, specifically soil hydraulic properties in Richards-equation-based soil water models. The Ensemble Kalman Filter (EnKF) is one of the most widely employed of the data assimilation alternatives. In this study the complete Matlab© code used to study soil data assimilation efficiency under different soil and climatic conditions is shown. The code shows how data assimilation through the EnKF was implemented. The Richards equation was solved with the Hydrus-1D software, which was run from Matlab.
• MATLAB routines are released to be used/modified without restrictions by other researchers.
• Data assimilation Ensemble Kalman Filter method code.
• Soil water Richards equation flow solved by Hydrus-1D.
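The EnKF analysis step itself is compact. The following is a generic textbook sketch of the stochastic (perturbed-observations) variant, not the MATLAB routines released with the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_update(ens, obs, obs_err, H):
    """Stochastic EnKF analysis step for an (n_state, n_members) ensemble.
    H maps state space to observation space; obs_err is the observation
    standard deviation."""
    N = ens.shape[1]
    X = ens - ens.mean(axis=1, keepdims=True)     # state anomalies
    HX = H @ ens
    HA = HX - HX.mean(axis=1, keepdims=True)      # observed anomalies
    S = HA @ HA.T / (N - 1) + obs_err**2 * np.eye(H.shape[0])
    K = (X @ HA.T / (N - 1)) @ np.linalg.inv(S)   # ensemble Kalman gain
    pert_obs = obs[:, None] + obs_err * rng.standard_normal((H.shape[0], N))
    return ens + K @ (pert_obs - HX)

# one scalar state observed directly: prior mean ~0, observation 5
prior = rng.standard_normal((1, 200))
post = enkf_update(prior, np.array([5.0]), 0.1, np.eye(1))
```

With a precise observation the posterior ensemble mean moves close to the observed value; in the parameter-estimation setting of the paper, soil hydraulic parameters are appended to the state vector so the same update corrects them too.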
The TORSED method for construction of TORT boundary sources from external DORT flux files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhoades, W.A.
1993-08-01
The TORSED method provides a means of coupling cylindrical two-dimensional DORT fluxes or fluences to a three-dimensional TORT calculation in Cartesian geometry through construction of external boundary sources for TORT. This can be important for several reasons. The two-dimensional environment may be too large for TORT simulation. The two-dimensional environment may be truly cylindrical in nature, and thus better treated in that geometry. It may be desired to use a single environment calculation to study numerous local perturbations. In Section I the TORSED code is described in detail and the diverse demonstration problems that accompany the code distribution are discussed. In Section II, an updated discussion of the VISA code is given. VISA is required to preprocess the DORT files for use in TORSED. In Section III, the references are listed.
Detecting and Characterizing Semantic Inconsistencies in Ported Code
NASA Technical Reports Server (NTRS)
Ray, Baishakhi; Kim, Miryung; Person, Suzette J.; Rungta, Neha
2013-01-01
Adding similar features and bug fixes often requires porting program patches from reference implementations and adapting them to target implementations. Porting errors may result from faulty adaptations or inconsistent updates. This paper investigates (1) the types of porting errors found in practice, and (2) how to detect and characterize potential porting errors. Analyzing version histories, we define five categories of porting errors, including incorrect control- and data-flow, code redundancy, inconsistent identifier renamings, etc. Leveraging this categorization, we design a static control- and data-dependence analysis technique, SPA, to detect and characterize porting inconsistencies. Our evaluation on code from four open-source projects shows that SPA can detect porting inconsistencies with 65% to 73% precision and 90% recall, and identify inconsistency types with 58% to 63% precision and 92% to 100% recall. In a comparison with two existing error detection tools, SPA improves precision by 14 to 17 percentage points.
FY16 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Shemon, E. R.; Smith, M. A.
2016-09-30
The goal of the NEAMS neutronics effort is to develop a neutronics toolkit for use on sodium-cooled fast reactors (SFRs) which can be extended to other reactor types. The neutronics toolkit includes the high-fidelity deterministic neutron transport code PROTEUS and many supporting tools, such as the cross section generation code MC2-3, a cross section library generation code, alternative cross section generation tools, mesh generation and conversion utilities, and an automated regression test tool. The FY16 effort for NEAMS neutronics focused on supporting the release of the SHARP toolkit and existing and new users, continuing to develop PROTEUS functions necessary for performance improvement as well as the SHARP release, verifying PROTEUS against available existing benchmark problems, and developing new benchmark problems as needed. The FY16 research effort was focused on further updates of PROTEUS-SN and PROTEUS-MOCEX and cross section generation capabilities as needed.
USDA-ARS?s Scientific Manuscript database
In the course of updating the scientific names of plant-associated fungi in the USDA-ARS U.S. National Fungus Collections Fungal Databases to conform with one scientific name for fungi as required by the International Code of Nomenclature for algae, fungi and plants (ICN, McNeill & al. in Regnum Veg...
Macromolecular Calculations for the XTAL-System of Crystallographic Programs
1989-06-01
Sponsoring organization: Office of Naval Research (ONR), contract N00014-88-K-0323. ...of prior, difference, and updated maps, in addition to the usual BDF handling, is simple but a fruitful source of confusion. For the usual iterative...
High Productivity Computing Systems Analysis and Performance
2005-07-01
Discrete math benchmarks: Global Updates per second (GUP/S), RandomAccess, Paper & Pencil; contact: Bob Lucas (ISI). ...can be found at the web site. One of the HPCchallenge codes, RandomAccess, is derived from the HPCS discrete math benchmarks that we released... Kernels: discrete math, graph analysis, linear solvers, signal processing; execution bounds and execution indicators.
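The fragment above refers to the HPC Challenge RandomAccess benchmark, which measures Giga-Updates Per Second (GUP/S): read-modify-write (XOR) updates at pseudo-random locations scattered across a large table. A simplified sketch of the update loop follows; the official benchmark prescribes a specific 64-bit random stream and table sizing, so the generator and defaults below are illustrative stand-ins:

```python
import time

def random_access(table_bits=20, n_updates=100_000, seed=1):
    """Simplified RandomAccess-style kernel: XOR updates at pseudo-random table indices."""
    size = 1 << table_bits
    table = list(range(size))           # initialize T[i] = i, as in the benchmark
    ran = seed
    mask = (1 << 64) - 1
    start = time.perf_counter()
    for _ in range(n_updates):
        # LFSR-style 64-bit random stream (stand-in for the official HPCC generator)
        ran = ((ran << 1) ^ (7 if ran >> 63 else 0)) & mask
        table[ran & (size - 1)] ^= ran  # the "global update"
    gups = n_updates / (time.perf_counter() - start) / 1e9
    return table, gups
```

The benchmark is deliberately hostile to caches and prefetchers: because consecutive indices are uncorrelated, sustained GUP/S is dominated by memory-system latency rather than arithmetic throughput.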
77 FR 29322 - Updating State Residential Building Energy Efficiency Codes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-17
... supporting the change to the SHGC requirements in climate zone 4. Specifically, RECA supported the... to change Climate Zone 3 from R13 to either R20 or R13+5 ci.'' (CFEC, No. 2 at p. 2) In response, DOE... difference of 50 Pascals (5 ACH50) in climate zone 1 and climate zone 2; and 3 air changes/hour (3 ACH50) in...
NASA Technical Reports Server (NTRS)
Wey, Changju Thomas; Liu, Nan-Suey
2014-01-01
This paper summarizes the procedures for inserting a thin-layer mesh into an existing inviscid polyhedral mesh, either with or without hanging-node elements, and presents sample results from its application to the numerical solution of a single-element LDI combustor using a releasable edition of the National Combustion Code (NCC).
Sen. Sanders, Bernard [I-VT
2013-10-28
Senate - 10/30/2013: Committee on Veterans' Affairs. Hearings held. Hearings printed: S.Hrg. 113-280. Tracker: This bill has the status Introduced.
Munitions Classification Library Update and Expansion Blossom Point Data Collection Report
2015-02-10
Report date: 10-02-2015; report type: Technical; dates covered: July 2014 – February 2015. Performing/sponsoring organizations include an office at Denver, CO 80203 and the Naval Research Laboratory, Code 6110, 4555 Overlook Avenue SW, Washington, DC 20375-5320. ...Figure 2-1 shows the TX/RX coil combination that forms the basis for both the...
Programmer's reference manual for the VAX-Gerber link software package. Revision 1. 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isobe, G.W.
1985-10-01
This guide provides the information necessary to edit, modify, and run the VAX-Gerber software link. Since the project is in the testing stage and still being modified, this guide discusses the final desired stage along with the current stage. The current stage is set up so as to allow the programmer to easily modify and update codes as necessary.
Update On the Status of the FLUKA Monte Carlo Transport Code*
NASA Technical Reports Server (NTRS)
Ferrari, A.; Lorenzo-Sentis, M.; Roesler, S.; Smirnov, G.; Sommerer, F.; Theis, C.; Vlachoudis, V.; Carboni, M.; Mostacci, A.; Pelliccioni, M.
2006-01-01
The FLUKA Monte Carlo transport code is a well-known simulation tool in High Energy Physics. FLUKA is a dynamic tool in the sense that it is being continually updated and improved by the authors. We review the progress achieved since the last CHEP Conference on the physics models, some technical improvements to the code and some recent applications. From the point of view of the physics, improvements have been made with the extension of PEANUT to higher energies for p, n, pi, pbar/nbar and for nbars down to the lowest energies, the addition of the online capability to evolve radioactive products and get subsequent dose rates, and upgrading of the treatment of EM interactions with the elimination of the need to separately prepare preprocessed files. A new coherent photon scattering model, an updated treatment of the photo-electric effect, an improved pair production model, and new photon cross sections from the LLNL Cullen database have been implemented. In the field of nucleus-nucleus interactions the electromagnetic dissociation of heavy ions has been added along with the extension of the interaction models for some nuclide pairs to energies below 100 MeV/A using the BME approach, as well as the development of an improved QMD model for intermediate energies. Both DPMJET 2.53 and 3 remain available along with rQMD 2.4 for heavy ion interactions above 100 MeV/A. Technical improvements include the ability to use parentheses in setting up the combinatorial geometry, the introduction of pre-processor directives in the input stream, a new random number generator with full 64 bit randomness, and new routines for mathematical special functions (adapted from SLATEC). Finally, work is progressing on the deployment of a user-friendly GUI input interface as well as a CAD-like geometry creation and visualization tool.
On the application front, FLUKA has been used to extensively evaluate the potential space radiation effects on astronauts for future deep space missions, the activation dose for beam target areas, dose calculations for radiation therapy as well as being adapted for use in the simulation of events in the ALICE detector at the LHC.
Bowden, Deborah L; Vargas-Caro, Carolina; Ovenden, Jennifer R; Bennett, Michael B; Bustamante, Carlos
2016-11-01
The complete mitochondrial genome of the grey nurse shark Carcharias taurus is described from 25 963 828 sequences obtained using Illumina NGS technology. Total length of the mitogenome is 16 715 bp, consisting of 2 rRNAs, 13 protein-coding regions, 22 tRNAs and 2 non-coding regions, thus updating the previously published mitogenome for this species. The phylogenomic reconstruction inferred from the mitogenomes of 15 species of Lamniform and Carcharhiniform sharks supports the inclusion of C. taurus in a clade with the Lamnidae and Cetorhinidae. This complete mitogenome contributes to ongoing investigation into the monophyly of the Family Odontaspididae.
Temporal parameters and time course of perceptual latency priming.
Scharlau, Ingrid; Neumann, Odmar
2003-06-01
Visual stimuli (primes) reduce the perceptual latency of a target appearing at the same location (perceptual latency priming, PLP). Three experiments assessed the time course of PLP by masked and, in Experiment 3, unmasked primes. Experiments 1 and 2 investigated the temporal parameters that determine the size of priming. Stimulus onset asynchrony was found to exert the main influence, accompanied by a small effect of prime duration. Experiment 3 used a large range of priming onset asynchronies. We suggest explaining PLP by the Asynchronous Updating Model, which relates it to the asynchrony of 2 central coding processes: preattentive coding of basic visual features and attentional orienting as a prerequisite for perceptual judgments and conscious perception.
Pretest prediction of Semiscale Test S-07-10B [PWR]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dobbe, C A
A best estimate prediction of Semiscale Test S-07-10B was performed at INEL by EG and G Idaho as part of the RELAP4/MOD6 code assessment effort and as the Nuclear Regulatory Commission pretest calculation for the Small Break Experiment. The RELAP4/MOD6 Update 4 and the RELAP4/MOD7 computer codes were used to analyze Semiscale Test S-07-10B, a 10% communicative cold leg break experiment. The Semiscale Mod-3 system utilized an electrically heated simulated core operating at a power level of 1.94 MW. The initial system pressure and temperature in the upper plenum were 2276 psia and 604°F, respectively.
Computational Accelerator Physics. Proceedings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bisognano, J.J.; Mondelli, A.A.
1997-04-01
The sixty-two papers appearing in this volume were presented at CAP96, the Computational Accelerator Physics Conference held in Williamsburg, Virginia, September 24-27, 1996. Science Applications International Corporation (SAIC) and the Thomas Jefferson National Accelerator Facility (Jefferson Lab) jointly hosted CAP96, with financial support from the U.S. Department of Energy's Office of Energy Research and the Office of Naval Research. Topics ranged from descriptions of specific codes to advanced computing techniques and numerical methods. Update talks were presented on nearly all of the accelerator community's major electromagnetic and particle tracking codes. Among all papers, thirty are abstracted for the Energy Science and Technology database. (AIP)
Jovian Plasma Modeling for Mission Design
NASA Technical Reports Server (NTRS)
Garrett, Henry B.; Kim, Wousik; Belland, Brent; Evans, Robin
2015-01-01
The purpose of this report is to address uncertainties in the plasma models at Jupiter responsible for surface charging and to update the jovian plasma models using the most recent data available. The updated plasma environment models were then used to evaluate two proposed Europa mission designs for spacecraft charging effects using the Nascap-2k code. The original Divine/Garrett jovian plasma model (or "DG1", T. N. Divine and H. B. Garrett, "Charged particle distributions in Jupiter's magnetosphere," J. Geophys. Res., vol. 88, pp. 6889-6903, 1983) has not been updated in 30 years, and there are known errors in the model. As an example, the cold ion plasma temperatures between approximately 5 and 10 Jupiter radii (Rj) were found by the experimenters who originally published the data to have been underestimated by a factor of approximately 2 shortly after publication of the original DG1 model. As knowledge of the plasma environment is critical to any evaluation of the surface charging at Jupiter, the original DG1 model needed to be updated to correct for this and other changes in our interpretation of the data so that charging levels could be properly estimated using the Nascap-2k charging code. As an additional task, the Nascap-2k spacecraft charging tool has been adapted to incorporate the so-called Kappa plasma distribution function--an important component of the plasma model necessary to compute the particle fluxes between approximately 5 keV and 100 keV (at the outset of this study, Nascap-2k did not directly incorporate this common representation of the plasma, thus limiting the accuracy of our charging estimates). The updating of the DG1 model and its integration into the Nascap-2k design tool means that charging concerns can now be more efficiently evaluated and mitigated. (We note that, given the subsequent decision by the Europa project to utilize solar arrays for its baseline design, surface charging effects have become even more of an issue for its mission design.)
The modifications to the DG1 model that produce the new DG2 model presented here, the results of those modifications, and the steps taken to integrate the DG2 predictions into Nascap-2k are described in this report.
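The Kappa distribution mentioned above is a standard Maxwellian-like velocity distribution with a power-law high-energy tail, recovering the Maxwellian in the limit kappa → ∞. A sketch of the common isotropic textbook form follows (a generic definition for illustration, not the Nascap-2k implementation; log-gamma is used so the normalization survives large kappa):

```python
import math

def kappa_distribution(v, n=1.0, theta=1.0, kappa=3.0):
    """Isotropic kappa velocity distribution f(v) with density n and thermal speed theta.

    f(v) = n (pi kappa theta^2)^(-3/2) * Gamma(kappa+1)/Gamma(kappa-1/2)
           * (1 + v^2/(kappa theta^2))^(-(kappa+1))
    Tends to a Maxwellian n (pi theta^2)^(-3/2) exp(-v^2/theta^2) as kappa -> infinity.
    """
    # gamma-function ratio via lgamma to avoid overflow at large kappa
    ratio = math.exp(math.lgamma(kappa + 1.0) - math.lgamma(kappa - 0.5))
    norm = n * ratio / (math.pi * kappa * theta**2) ** 1.5
    return norm * (1.0 + v**2 / (kappa * theta**2)) ** (-(kappa + 1.0))
```

The practical consequence for charging analyses is the enhanced suprathermal tail: at the same density and thermal speed, a low-kappa population supplies far more flux in the ~5-100 keV range than a Maxwellian.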
Clinical code set engineering for reusing EHR data for research: A review.
Williams, Richard; Kontopantelis, Evangelos; Buchan, Iain; Peek, Niels
2017-06-01
The construction of reliable, reusable clinical code sets is essential when re-using Electronic Health Record (EHR) data for research. Yet code set definitions are rarely transparent and their sharing is almost non-existent. There is a lack of methodological standards for the management (construction, sharing, revision and reuse) of clinical code sets which needs to be addressed to ensure the reliability and credibility of studies which use code sets. To review methodological literature on the management of sets of clinical codes used in research on clinical databases and to provide a list of best practice recommendations for future studies and software tools. We performed an exhaustive search for methodological papers about clinical code set engineering for re-using EHR data in research. This was supplemented with papers identified by snowball sampling. In addition, a list of e-phenotyping systems was constructed by merging references from several systematic reviews on this topic, and the processes adopted by those systems for code set management were reviewed. Thirty methodological papers were reviewed. Common approaches included: creating an initial list of synonyms for the condition of interest (n=20); making use of the hierarchical nature of coding terminologies during searching (n=23); reviewing sets with clinician input (n=20); and reusing and updating an existing code set (n=20). Several open source software tools (n=3) were discovered. There is a need for software tools that enable users to easily and quickly create, revise, extend, review and share code sets, and we provide a list of recommendations for their design and implementation. Research re-using EHR data could be improved through the further development, more widespread use and routine reporting of the methods by which clinical codes were selected. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
Zhang, Shuting; Cui, Yan; Li, Lingxi; Li, Yuanyuan; Zhou, Peiyu; Luo, Lanxin; Sun, Baoshan
2015-12-01
Polymeric proanthocyanidins isolated from a grape seed phenolic extract were hydrolysed in the presence of phloroglucinol into monomer catechins and their nucleophile derivatives. Each of the phloroglucinolysis products was successfully separated and isolated in large amount by a semi-preparative HSCCC technique under optimized conditions based on selection of a suitable solvent system. The optimized solvent system consisted of n-hexane-ethyl acetate-water (1:80:80, v/v/v) with a combination of head-tail and tail-head elution modes. By only one-step HSCCC separation, the purity of each obtained phloroglucinolysis product, including monomer catechins and their nucleophile derivatives, was above 76%, verified by UPLC. The structures of these products were tentatively identified by UPLC based on their retention times and further confirmed by MS and (1)H NMR analysis. Furthermore, by DPPH, ABTS and FRAP assays, it was verified that all these phloroglucinolysis products possessed strong antioxidant activities, with the catechin-nucleophile derivatives being more powerful than the free catechins. Copyright © 2015 Elsevier Ltd. All rights reserved.
Beam-beam interaction study of medium energy eRHIC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hao,Y.; Litvinenko, V. N.; Ptitsyn, V.
Medium Energy eRHIC (MeRHIC), the first stage design of eRHIC, includes a multi-pass ERL that provides a 4 GeV high quality electron beam to collide with the ion beam of RHIC. It delivers a minimum luminosity of 10^32 cm^-2 s^-1. Beam-beam effects present one of the major factors limiting the luminosity of colliders. In this paper, both beam-beam effects on the electron beam and the proton beam in MeRHIC are investigated. The beam-beam interaction can induce a head-tail type instability of the proton beam referred to as the kink instability. Thus, beam stability conditions should be established to avoid proton beam loss. Also, the electron beam transverse disruption by collisions has to be evaluated to ensure that the beam quality is good enough for the energy recovery pass. The relations among proton beam stability, electron disruption, and the consequent luminosity are worked out after thorough discussion.
Coherent Beam-Beam Instability in Collisions with a Large Crossing Angle
NASA Astrophysics Data System (ADS)
Ohmi, K.; Kuroo, N.; Oide, K.; Zhou, D.; Zimmermann, F.
2017-09-01
In recent years the "crab-waist collision" scheme [P. Raimondi, Proceedings of 2nd SuperB Workshop, Frascati, 2006.; M. Zobov et al., Phys. Rev. Lett. 104, 174801 (2010), 10.1103/PhysRevLett.104.174801] has become popular for circular e+ e- colliders. The designs of several future colliders are based on this scheme. So far the beam-beam effects for collisions under a large crossing angle with or without crab waist were mostly studied using weak-strong simulations. We present here strong-strong simulations showing a novel strong coherent head-tail instability, which can limit the performance of proposed future colliders. We explain the underlying instability mechanism starting from the "cross-wake force" induced by the beam-beam interaction. Using this beam-beam wake, the beam-beam head-tail modes are studied by an eigenmode analysis. The instability may affect all collider designs based on the crab-waist scheme. We suggest an experimental verification at SuperKEKB during its commissioning phase II.
Self-assembling electroactive hydrogels for flexible display technology
NASA Astrophysics Data System (ADS)
Jones, Scott L.; Hou Wong, Kok; Thordarson, Pall; Ladouceur, François
2010-12-01
We have assessed the potential of self-assembling hydrogels for use in conformal displays. The self-assembling process can be used to alter the transparency of the material to all visible light due to scattering by fibres. The reversible transition is shown to be of low energy by differential scanning calorimetry. For use in technology it is imperative that this transition is controlled electrically. We have thus synthesized novel self-assembling hydrogelator molecules which contain an electroactive group. The well-known redox couple of anthraquinone/anthrahydroquinone has been used as the hydrophobic component for a series of small molecule gelators. They are further functionalized with peptide combinations of L-phenylalanine and glycine to provide the hydrophilic group to complete 'head-tail' models of self-assembling gels. The gelation and electroactive characteristics of the series were assessed. Cyclic voltammetry shows the reversible redox cycle to be only superficially altered by functionalization. Additionally, spectroelectrochemical measurements show a reversible transparency and colour change induced by the redox process.
Wang, Shaoying; Ji, Zhouxiang; Yan, Erfu; Haque, Farzin; Guo, Peixuan
2016-01-01
The DNA packaging motor of dsDNA bacterial viruses contains a head-tail connector with a channel for the genome to enter during assembly and to exit during host infection. The DNA packaging motor of bacterial virus phi29 was recently reported to use the "One-way Revolution" mechanism for DNA packaging. This raises the question of how dsDNA is ejected during infection if the channel acts as a one-way inward valve. Here we report a three-step conformational change of the portal channel that is common among DNA translocation motors of bacterial viruses T3, T4, SPP1, and phi29. The channels of these motors exercise three discrete steps of gating, as revealed by electrophysiological assays. It is proposed that the three-step channel conformational changes occur during the DNA entry process, resulting in a structural transition in preparation for DNA movement in the reverse direction during ejection. PMID:27181501
Design Analysis of SNS Target Station Biological Shielding Monolith with Proton Power Uprate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bekar, Kursat B.; Ibrahim, Ahmad M.
2017-05-01
This report documents the analysis of the dose rate in the experiment area outside the Spallation Neutron Source (SNS) target station shielding monolith with a proton beam energy of 1.3 GeV. The analysis implemented a coupled three dimensional (3D)/two dimensional (2D) approach that used both the Monte Carlo N-Particle Extended (MCNPX) 3D Monte Carlo code and the Discrete Ordinates Transport (DORT) two dimensional deterministic code. The analysis with proton beam energy of 1.3 GeV showed that the dose rate in continuously occupied areas on the lateral surface outside the SNS target station shielding monolith is less than 0.25 mrem/h, which complies with the SNS facility design objective. However, the methods and codes used in this analysis are out of date and unsupported, and the 2D approximation of the target shielding monolith does not accurately represent the geometry. We recommend that this analysis be updated with modern codes and libraries such as ADVANTG or SHIFT. These codes have demonstrated very high efficiency in performing full 3D radiation shielding analyses of similar and even more difficult problems.
Creation and utilization of a World Wide Web based space radiation effects code: SIREST
NASA Technical Reports Server (NTRS)
Singleterry, R. C. Jr; Wilson, J. W.; Shinn, J. L.; Tripathi, R. K.; Thibeault, S. A.; Noor, A. K.; Cucinotta, F. A.; Badavi, F. F.; Chang, C. K.; Qualls, G. D.;
2001-01-01
In order for humans and electronics to fully and safely operate in the space environment, codes like HZETRN (High Charge and Energy Transport) must be included in any designer's toolbox for design evaluation with respect to radiation damage. Currently, spacecraft designers do not have easy access to accurate radiation codes like HZETRN to evaluate their designs for radiation effects on humans and electronics. Today, the World Wide Web is sophisticated enough to support the entire HZETRN code and all of the associated pre- and post-processing tools. This package is called SIREST (Space Ionizing Radiation Effects and Shielding Tools). There are many advantages to SIREST. The most important advantage is the instant update capability of the web. Another major advantage is the modularity that the web imposes on the code. At present, the major disadvantage of SIREST is its modularity inside the designer's system. This mostly comes from the fact that a consistent interface between the designer and the computer system used to evaluate the design is incomplete. This, however, is to be solved in the Intelligent Synthesis Environment (ISE) program currently being funded by NASA.
Energy Storage System Safety: Plan Review and Inspection Checklist
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Pam C.; Conover, David R.
Codes, standards, and regulations (CSR) governing the design, construction, installation, commissioning, and operation of the built environment are intended to protect the public health, safety, and welfare. While these documents change over time to address new technology and new safety challenges, there is generally some lag time between the introduction of a technology into the market and the time it is specifically covered in model codes and standards developed in the voluntary sector. After their development, there is also a timeframe of at least a year or two until the codes and standards are adopted. Until existing model codes and standards are updated or new ones are developed and then adopted, one seeking to deploy energy storage technologies or needing to verify the safety of an installation may be challenged in trying to apply currently implemented CSRs to an energy storage system (ESS). The Energy Storage System Guide for Compliance with Safety Codes and Standards (CG), developed in June 2016, is intended to help address the acceptability of the design and construction of stationary ESSs, their component parts, and the siting, installation, commissioning, operations, maintenance, and repair/renovation of ESS within the built environment.
User Manual for the NASA Glenn Ice Accretion Code LEWICE: Version 2.0
NASA Technical Reports Server (NTRS)
Wright, William B.
1999-01-01
A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 2.0 of this code, which is called LEWICE. This version differs from previous releases due to its robustness and its ability to reproduce results accurately for different spacing and time step criteria across computing platforms. It also differs in the extensive effort undertaken to compare the results against the database of ice shapes which have been generated in the NASA Glenn Icing Research Tunnel (IRT). This report will only describe the features of the code related to the use of the program. The report will not describe the inner workings of the code or the physical models used. This information is available in the form of several unpublished documents which will be collectively referred to as a Programmer's Manual for LEWICE in this report. These reports are intended as an update/replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.
Investigation of HZETRN 2010 as a Tool for Single Event Effect Qualification of Avionics Systems
NASA Technical Reports Server (NTRS)
Rojdev, Kristina; Koontz, Steve; Atwell, William; Boeder, Paul
2014-01-01
NASA's future missions are focused on long-duration deep space missions for human exploration which offer no options for a quick emergency return to Earth. The combination of long mission duration with no quick emergency return option leads to unprecedented spacecraft system safety and reliability requirements. It is important that spacecraft avionics systems for human deep space missions are not susceptible to Single Event Effect (SEE) failures caused by space radiation (primarily the continuous galactic cosmic ray background and the occasional solar particle event) interactions with electronic components and systems. SEE effects are typically managed during the design, development, and test (DD&T) phase of spacecraft development by using heritage hardware (if possible) and through extensive component level testing, followed by system level failure analysis tasks that are both time consuming and costly. The ultimate product of the SEE DD&T program is a prediction of spacecraft avionics reliability in the flight environment, produced using various nuclear reaction and transport codes in combination with the component and subsystem level radiation test data. Previous work by Koontz et al. utilized FLUKA, a Monte Carlo nuclear reaction and transport code, to calculate SEE and single event upset (SEU) rates. This code was then validated against in-flight data for a variety of spacecraft and space flight environments. However, FLUKA has a long run-time (on the order of days). CREME96, an easy-to-use deterministic code offering short run times, was also compared with FLUKA predictions and in-flight data. CREME96, though fast and easy to use, has not been updated in several years and underestimates secondary particle shower effects in spacecraft structural shielding mass.
Thus, this paper will investigate the use of HZETRN 2010, a fast and easy-to-use deterministic transport code, similar to CREME96, that was developed at NASA Langley Research Center primarily for flight crew ionizing radiation dose assessments. HZETRN 2010 includes updates to address secondary particle shower effects more accurately, and might be used as another tool to verify spacecraft avionics system reliability in space flight SEE environments.
miRTarBase update 2018: a resource for experimentally validated microRNA-target interactions.
Chou, Chih-Hung; Shrestha, Sirjana; Yang, Chi-Dung; Chang, Nai-Wen; Lin, Yu-Ling; Liao, Kuang-Wen; Huang, Wei-Chi; Sun, Ting-Hsuan; Tu, Siang-Jyun; Lee, Wei-Hsiang; Chiew, Men-Yee; Tai, Chun-San; Wei, Ting-Yen; Tsai, Tzi-Ren; Huang, Hsin-Tzu; Wang, Chung-Yu; Wu, Hsin-Yi; Ho, Shu-Yi; Chen, Pin-Rong; Chuang, Cheng-Hsun; Hsieh, Pei-Jung; Wu, Yi-Shin; Chen, Wen-Liang; Li, Meng-Ju; Wu, Yu-Chun; Huang, Xin-Yi; Ng, Fung Ling; Buddhakosai, Waradee; Huang, Pei-Chun; Lan, Kuan-Chun; Huang, Chia-Yen; Weng, Shun-Long; Cheng, Yeong-Nan; Liang, Chao; Hsu, Wen-Lian; Huang, Hsien-Da
2018-01-04
MicroRNAs (miRNAs) are small non-coding RNAs of ∼22 nucleotides that are involved in negative regulation of mRNA at the post-transcriptional level. Previously, we developed miRTarBase, which provides information about experimentally validated miRNA-target interactions (MTIs). Here, we describe an updated database containing 422 517 curated MTIs from 4076 miRNAs and 23 054 target genes collected from over 8500 articles. The number of MTIs curated by strong evidence has increased ∼1.4-fold since the last update in 2016. In this updated version, target sites validated by reporter assay that are available in the literature can be downloaded. New features for analysis can be extracted from the target site sequences via a machine learning approach, which can help to evaluate the performance of miRNA-target prediction tools. Furthermore, different browsing options make it easier for users to locate specific MTIs. With these improvements, miRTarBase serves as a more comprehensively annotated database of experimentally validated miRNA-target interactions in the field of miRNA-related research. miRTarBase is available at http://miRTarBase.mbc.nctu.edu.tw/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Huang, Jinhong; Guo, Li; Feng, Qianjin; Chen, Wufan; Feng, Yanqiu
2015-07-21
Image reconstruction from undersampled k-space data accelerates magnetic resonance imaging (MRI) by exploiting image sparseness in certain transform domains. Employing image patch representation over a learned dictionary has the advantage of being adaptive to local image structures and thus can better sparsify images than using fixed transforms (e.g. wavelets and total variations). Dictionary learning methods have recently been introduced to MRI reconstruction, and these methods demonstrate significantly reduced reconstruction errors compared to sparse MRI reconstruction using fixed transforms. However, the synthesis sparse coding problem in dictionary learning is NP-hard and computationally expensive. In this paper, we present a novel sparsity-promoting orthogonal dictionary updating method for efficient image reconstruction from highly undersampled MRI data. The orthogonality imposed on the learned dictionary enables the minimization problem in the reconstruction to be solved by an efficient optimization algorithm which alternately updates representation coefficients, orthogonal dictionary, and missing k-space data. Moreover, both sparsity level and sparse representation contribution using updated dictionaries gradually increase during iterations to recover more details, assuming the progressively improved quality of the dictionary. Simulation and real data experimental results both demonstrate that the proposed method is approximately 10 to 100 times faster than the K-SVD-based dictionary learning MRI method and simultaneously improves reconstruction accuracy.
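The alternating scheme the abstract describes cycles through representation coefficients, the orthogonal dictionary, and the missing k-space data. The first two steps can be illustrated in isolation: with an orthogonal dictionary, sparse coding reduces to hard-thresholding the analysis coefficients (no NP-hard synthesis problem), and the dictionary update becomes an orthogonal Procrustes problem solved by one SVD. This is a generic sketch of that idea, not the authors' implementation:

```python
import numpy as np

def orthogonal_dictionary_learning(X, sparsity=4, n_iter=20, seed=0):
    """Alternate (i) hard-threshold sparse coding and (ii) orthogonal Procrustes
    dictionary update.  X: (d, n) matrix whose columns are vectorized patches."""
    d, n = X.shape
    rng = np.random.default_rng(seed)
    D = np.linalg.qr(rng.standard_normal((d, d)))[0]    # random orthogonal init
    for _ in range(n_iter):
        S = D.T @ X                                     # analysis coefficients (D orthogonal)
        drop = np.argsort(np.abs(S), axis=0)[:-sparsity, :]
        np.put_along_axis(S, drop, 0.0, axis=0)         # keep `sparsity` largest per column
        U, _, Vt = np.linalg.svd(X @ S.T)               # min ||X - D S||_F over orthogonal D
        D = U @ Vt
    return D, S
```

In the full reconstruction, a third step would re-estimate the unsampled k-space entries from the current image estimate D S before the next iteration; gradually raising `sparsity` across iterations mirrors the paper's strategy of recovering more detail as the dictionary improves.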
Siregar, S; Pouw, M E; Moons, K G M; Versteegh, M I M; Bots, M L; van der Graaf, Y; Kalkman, C J; van Herwerden, L A; Groenwold, R H H
2014-01-01
Objective To compare the accuracy of data from hospital administration databases and a national clinical cardiac surgery database and to compare the performance of the Dutch hospital standardised mortality ratio (HSMR) method and the logistic European System for Cardiac Operative Risk Evaluation, for the purpose of benchmarking of mortality across hospitals. Methods Information on all patients undergoing cardiac surgery between 1 January 2007 and 31 December 2010 in 10 centres was extracted from The Netherlands Association for Cardio-Thoracic Surgery database and the Hospital Discharge Registry. The number of cardiac surgery interventions was compared between both databases. The European System for Cardiac Operative Risk Evaluation and hospital standardised mortality ratio models were updated in the study population and compared using the C-statistic, calibration plots and the Brier-score. Results The number of cardiac surgery interventions performed could not be assessed using the administrative database as the intervention code was incorrect in 1.4–26.3%, depending on the type of intervention. In 7.3% no intervention code was registered. The updated administrative model was inferior to the updated clinical model with respect to discrimination (c-statistic of 0.77 vs 0.85, p<0.001) and calibration (Brier Score of 2.8% vs 2.6%, p<0.001, maximum score 3.0%). Two average performing hospitals according to the clinical model became outliers when benchmarking was performed using the administrative model. Conclusions In cardiac surgery, administrative data are less suitable than clinical data for the purpose of benchmarking. The use of either administrative or clinical risk-adjustment models can affect the outlier status of hospitals. Risk-adjustment models including procedure-specific clinical risk factors are recommended. PMID:24334377
Ex-Vessel Core Melt Modeling Comparison between MELTSPREAD-CORQUENCH and MELCOR 2.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robb, Kevin R.; Farmer, Mitchell; Francis, Matthew W.
System-level code analyses by both United States and international researchers predict major core melting, bottom head failure, and corium-concrete interaction for Fukushima Daiichi Unit 1 (1F1). Although system codes such as MELCOR and MAAP are capable of capturing a wide range of accident phenomena, they currently do not contain detailed models for evaluating some ex-vessel core melt behavior. However, specialized codes containing more detailed modeling are available for melt spreading, such as MELTSPREAD, as well as for long-term molten corium-concrete interaction (MCCI) and debris coolability, such as CORQUENCH. In a preceding study, Enhanced Ex-Vessel Analysis for Fukushima Daiichi Unit 1: Melt Spreading and Core-Concrete Interaction Analyses with MELTSPREAD and CORQUENCH, the MELTSPREAD-CORQUENCH codes predicted that the 1F1 core melt readily cooled, in contrast to predictions by MELCOR. The user community has taken notice and is in the process of updating its system codes, specifically MAAP and MELCOR, to improve and reduce conservatism in their ex-vessel core melt models. This report investigates why the MELCOR v2.1 code, on the one hand, and the MELTSPREAD and CORQUENCH 3.03 codes, on the other, yield differing predictions of ex-vessel melt progression. To accomplish this, the differences in the treatment of the ex-vessel melt with respect to melt spreading and long-term coolability are examined. The differences in modeling approaches are summarized, and a comparison of example code predictions is provided.
Validation of the WIMSD4M cross-section generation code with benchmark results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leal, L.C.; Deen, J.R.; Woodruff, W.L.
1995-02-01
The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment for Research and Test (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the procedure to generate cross-section libraries for reactor analyses and calculations utilizing the WIMSD4M code. To do so, the results of calculations performed with group cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected critical spheres, the TRX critical experiments, and calculations of a modified Los Alamos highly-enriched heavy-water moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.
Overcoming Challenges in Kinetic Modeling of Magnetized Plasmas and Vacuum Electronic Devices
NASA Astrophysics Data System (ADS)
Omelchenko, Yuri; Na, Dong-Yeop; Teixeira, Fernando
2017-10-01
We transform the state of the art of plasma modeling by taking advantage of novel computational techniques for fast and robust integration of multiscale hybrid (full particle ions, fluid electrons, no displacement current) and full-PIC models. These models are implemented in the 3D HYPERS and axisymmetric full-PIC CONPIC codes. HYPERS is a massively parallel, asynchronous code. The HYPERS solver does not step fields and particles synchronously in time but instead executes local variable updates (events) at their self-adaptive rates while preserving fundamental conservation laws. The charge-conserving CONPIC code has a matrix-free explicit finite-element (FE) solver based on a sparse-approximate-inverse (SPAI) algorithm. This explicit solver approximates the inverse FE system matrix ("mass" matrix) using successive sparsity pattern orders of the original matrix. It does not reduce the set of Maxwell's equations to a vector-wave (curl-curl) equation of second order but instead utilizes the standard coupled first-order Maxwell's system. We discuss the ability of our codes to accurately and efficiently account for multiscale physical phenomena in 3D magnetized space and laboratory plasmas and axisymmetric vacuum electronic devices.
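The idea of an explicit approximate inverse for the mass matrix can be illustrated with a simpler relative of SPAI: a Jacobi-scaled truncated Neumann series. This is a sketch of the general concept only, not the CONPIC algorithm, and it assumes the matrix is diagonally dominant enough for the series to converge:

```python
def approx_inverse_neumann(M, terms):
    """Approximate M^{-1} explicitly via a truncated Neumann series:
    with D = diag(M) and N = I - D^{-1} M,
    M^{-1} = (I - N)^{-1} D^{-1} ~ (I + N + N^2 + ... + N^{terms-1}) D^{-1}.
    Converges when the spectral radius of N is below 1 (e.g. M diagonally
    dominant), which lets a time step proceed matrix-free and explicitly."""
    n = len(M)
    Dinv = [1.0 / M[i][i] for i in range(n)]
    N = [[(1.0 if i == j else 0.0) - Dinv[i] * M[i][j] for j in range(n)]
         for i in range(n)]
    # Horner form: S <- I + N * S accumulates I + N + ... + N^{terms-1}
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(terms - 1):
        S = [[(1.0 if i == j else 0.0) + sum(N[i][k] * S[k][j] for k in range(n))
              for j in range(n)] for i in range(n)]
    # right-multiply by D^{-1} (scales column j by Dinv[j])
    return [[S[i][j] * Dinv[j] for j in range(n)] for i in range(n)]

Minv = approx_inverse_neumann([[4.0, 1.0], [1.0, 3.0]], terms=25)
```

SPAI proper instead minimizes ||I - M*P|| over matrices P with a prescribed sparsity pattern; the common thread is replacing a linear solve per step with a precomputed sparse multiply.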
NASA Technical Reports Server (NTRS)
Radhakrishnan, K.
1984-01-01
The efficiency and accuracy of several algorithms recently developed for the numerical integration of stiff ordinary differential equations are compared. The methods examined include two general-purpose codes, EPISODE and LSODE, and three codes (CHEMEQ, CREK1D, and GCKP84) developed specifically to integrate chemical kinetic rate equations. The codes are applied to two test problems drawn from combustion kinetics. The comparisons show that LSODE is the fastest code currently available for the integration of combustion kinetic rate equations. An important finding is that an iterative solution of the algebraic energy conservation equation to compute the temperature does not result in significant errors. In addition, this method is more efficient than evaluating the temperature by integrating its time derivative. Significant reductions in computational work are realized by updating the rate constants (k = AT^N exp(-E/RT)) only when the temperature change exceeds an amount delta T that is problem dependent. An approximate expression for the automatic evaluation of delta T is derived and is shown to result in increased efficiency.
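The lazy rate-constant update can be sketched as follows. Here delta_T is a fixed user-supplied threshold, whereas the paper derives an approximate expression for it automatically, and all numeric values are arbitrary illustrations:

```python
import math

R = 8.314  # gas constant, J/(mol K)

class LazyRate:
    """Re-evaluate k = A * T**N * exp(-E/(R*T)) only when the temperature
    has drifted by more than delta_T since the last evaluation; otherwise
    reuse the cached value, saving one exp() per reaction per step."""
    def __init__(self, A, N, E, delta_T):
        self.A, self.N, self.E, self.delta_T = A, N, E, delta_T
        self.T_last = None      # temperature at last evaluation
        self.k = None           # cached rate constant
        self.evaluations = 0    # bookkeeping for the demo

    def rate(self, T):
        if self.T_last is None or abs(T - self.T_last) > self.delta_T:
            self.k = self.A * T**self.N * math.exp(-self.E / (R * T))
            self.T_last = T
            self.evaluations += 1
        return self.k

rc = LazyRate(A=1e10, N=0.5, E=5e4, delta_T=2.0)
for T in (1000.0, 1000.5, 1001.0, 1004.0):  # only the first and last exceed delta_T
    rc.rate(T)
```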
Robust Pedestrian Tracking and Recognition from FLIR Video: A Unified Approach via Sparse Coding
Li, Xin; Guo, Rui; Chen, Chao
2014-01-01
Sparse coding is an emerging method that has been successfully applied to both robust object tracking and recognition in the vision literature. In this paper, we propose to explore a sparse coding-based approach toward joint object tracking-and-recognition and explore its potential in the analysis of forward-looking infrared (FLIR) video to support nighttime machine vision systems. A key technical contribution of this work is to unify existing sparse coding-based approaches toward tracking and recognition under the same framework, so that they can benefit from each other in a closed loop. On the one hand, tracking the same object through temporal frames allows us to achieve improved recognition performance through dynamic updating of the template/dictionary and combining of multiple recognition results; on the other hand, the recognition of individual objects facilitates the tracking of multiple objects (i.e., walking pedestrians), especially in the presence of occlusion within a crowded environment. We report experimental results on both the CASIA Pedestrian Database and our own collected FLIR video database to demonstrate the effectiveness of the proposed joint tracking-and-recognition approach. PMID:24961216
Evaluation of candidate working fluid formulations for the electrothermal-chemical wind tunnel
NASA Technical Reports Server (NTRS)
Akyurtlu, Jale F.; Akyurtlu, Ates
1993-01-01
A new hypersonic test facility which can simulate conditions typical of atmospheric flight at Mach numbers up to 20 is currently under study at the NASA/LaRC Hypersonic Propulsion Branch. In the proposed research, it was suggested that a combustion-augmented electrothermal wind tunnel concept may be applied to the planned hypersonic testing facility. The purpose of the current investigation is to evaluate some candidate working fluid formulations which may be used in the electrothermal-chemical wind tunnel. The efforts in the initial phase of this research were concentrated on acquiring the code used by GASL to model the electrothermal wind tunnel and testing it using the conditions of the GASL simulation. The early version of the general chemical kinetics code (GCKP84) was obtained from NASA, and the latest updated version of the code (LSENS) was obtained from its author, Dr. Bittker. Both codes are installed on a personal computer with a 25 MHz 486 processor and 16 Mbytes of RAM. Since the available memory was not sufficient to debug LSENS, GCKP84 was used for the current work.
2007-10-02
still widely used — or Dalits.161 Although these categories are understood throughout India, they describe reality only in the most general terms...extent of sexual violence against Dalit women. That U.N. committee itself issued a March 2007 report which criticized the “frequent failure” of Indian law...Order Code RL33529 India-U.S. Relations Updated October 2, 2007 K. Alan Kronstadt Specialist in South Asian Affairs Foreign Affairs, Defense, and
Stata Hybrids: Updates and Ideas
NASA Technical Reports Server (NTRS)
Fiedler, James
2014-01-01
At last year's Stata conference I presented two projects for using Python with Stata: a plugin that embeds the Python programming language within Stata and code for using Stata data sets in Python. In this talk I will describe some small improvements being made to these projects, and I will present other ideas for combining tools with Stata. Some of these ideas use Python, some use JavaScript and a web browser.
Focused Logistics, Joint Vision 2010: A Joint Logistics Roadmap
2010-01-01
AIS). AIT devices include bar codes for individual items, optical memory cards for multipacks and containers, radio frequency tags for containers and...Fortezza Card and Firewall technologies are being developed to prevent unauthorized access. As for infrastructure, DISA has already made significant in...radio frequency tags and optical memory cards, to continuously update the JTAV database. By September 1998, DSS will be deployed in all wholesale
NASA Technical Reports Server (NTRS)
Wey, Thomas; Liu, Nan-Suey
2015-01-01
This paper summarizes the procedures of (1) generating control volumes anchored at the nodes of a mesh; and (2) generating staggered control volumes via mesh reconstructions, in terms of either mesh realignment or mesh refinement, as well as presents sample results from their applications to the numerical solution of a single-element LDI combustor using a releasable edition of the National Combustion Code (NCC).
Globalization, Worker Insecurity, and Policy Approaches
2007-07-24
Order Code RL34091 Globalization, Worker Insecurity, and Policy Approaches. Updated July 24, 2007. Raymond J. Ahearn, Specialist in International Trade...Summary: Today’s global economy
Reprint Filing: A Profile-Based Solution
Gass, David A.; Putnam, R. Wayne
1983-01-01
A reprint filing system based on practice profiles can give family physicians easy access to relevant medical information. The use of the ICHPPC classification and some supplemental categories provides a more practical coding mechanism than organ systems, textbook chapter titles or even Index Medicus subject headings. The system can be simply maintained, updated and improved, but users must regularly weed out unused information, and read widely to keep the reprints current. PMID:21283301
The Republic of the Philippines: Background and U.S. Relations
2007-08-10
cost of living.21 The HDI ranks countries according to human development indicators of life expectancy, education, literacy, and gross domestic...sovereignty over Mischief Reef, which is one of approximately 100 reefs and islands disputed by five Southeast Asian countries. A Visiting Forces...Order Code RL33233 The Republic of the Philippines: Background and U.S. Relations Updated August 10, 2007 Thomas Lum Specialist in Asian Affairs
Improving Earth Science Metadata: Modernizing ncISO
NASA Astrophysics Data System (ADS)
O'Brien, K.; Schweitzer, R.; Neufeld, D.; Burger, E. F.; Signell, R. P.; Arms, S. C.; Wilcox, K.
2016-12-01
ncISO is a package of tools developed at NOAA's National Center for Environmental Information (NCEI) that facilitates the generation of ISO 19115-2 metadata from NetCDF data sources. The tool currently exists in two iterations: a command line utility and a web-accessible service within the THREDDS Data Server (TDS). Several projects, including NOAA's Unified Access Framework (UAF), depend upon ncISO to generate the ISO-compliant metadata from their data holdings and use the resulting information to populate discovery tools such as NCEI's ESRI Geoportal and NOAA's data.noaa.gov CKAN system. In addition to generating ISO 19115-2 metadata, the tool calculates a rubric score based on how well the dataset follows the Attribute Conventions for Dataset Discovery (ACDD). The result of this rubric calculation, along with information about what has been included and what is missing, is displayed in an HTML document generated by the ncISO software package. Recently ncISO has fallen behind in terms of supporting updates to conventions such as the ACDD. With the blessing of the original programmer, NOAA's UAF has been working to modernize the ncISO software base. In addition to upgrading ncISO to utilize version 1.3 of the ACDD, we have been working with partners at Unidata and IOOS to unify the tool's code base. In essence, we are merging the command line capabilities into the same software that will now be used by the TDS service, allowing easier updates when conventions such as the ACDD are updated in the future. In this presentation, we will discuss the work the UAF project has done to support updated conventions within ncISO, as well as describe how the updated tool is helping to improve metadata throughout the earth and ocean sciences.
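The rubric idea — score attribute completeness against a convention and report what is missing — can be sketched as below. The attribute lists and weights here are illustrative placeholders, not the actual ACDD rubric ncISO computes:

```python
# Hypothetical subset of ACDD global attributes; the real rubric in ncISO
# checks many more and weighs them differently.
REQUIRED = ["title", "summary", "keywords"]
RECOMMENDED = ["creator_name", "license", "time_coverage_start"]

def rubric_score(global_attrs):
    """Score a dataset's global attributes: 2 points for each required
    attribute present, 1 for each recommended one; also report which
    attributes are missing so the data provider can fix them."""
    score, missing = 0, []
    for name in REQUIRED:
        if global_attrs.get(name):
            score += 2
        else:
            missing.append(name)
    for name in RECOMMENDED:
        if global_attrs.get(name):
            score += 1
        else:
            missing.append(name)
    return score, missing

score, missing = rubric_score({"title": "SST analysis", "summary": "Daily SST",
                               "creator_name": "NCEI"})
```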
Updated Chemical Kinetics and Sensitivity Analysis Code
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan
2005-01-01
An updated version of the General Chemical Kinetics and Sensitivity Analysis (LSENS) computer code has become available. A prior version of LSENS was described in "Program Helps to Determine Chemical-Reaction Mechanisms" (LEW-15758), NASA Tech Briefs, Vol. 19, No. 5 (May 1995), page 66. To recapitulate: LSENS solves complex, homogeneous, gas-phase, chemical-kinetics problems (e.g., combustion of fuels) that are represented by sets of many coupled, nonlinear, first-order ordinary differential equations. LSENS has been designed for flexibility, convenience, and computational efficiency. The present version of LSENS incorporates mathematical models for (1) a static system; (2) steady, one-dimensional inviscid flow; (3) reaction behind an incident shock wave, including boundary layer correction; (4) a perfectly stirred reactor; and (5) a perfectly stirred reactor followed by a plug-flow reactor. In addition, LSENS can compute equilibrium properties for the following assigned states: enthalpy and pressure, temperature and pressure, internal energy and volume, and temperature and volume. For static and one-dimensional-flow problems, including those behind an incident shock wave and following a perfectly stirred reactor calculation, LSENS can compute sensitivity coefficients of dependent variables and their derivatives, with respect to the initial values of dependent variables and/or the rate-coefficient parameters of the chemical reactions.
Baker, Nancy T.; Stone, Wesley W.
2013-01-01
This report provides preliminary estimates of annual agricultural use of 374 pesticide compounds in counties of the conterminous United States in 2010 and 2011, compiled by means of methods described in Thelin and Stone (2013). U.S. Department of Agriculture (USDA) county-level data for harvested-crop acreage were used in conjunction with proprietary Crop Reporting District (CRD)-level pesticide-use data to estimate county-level pesticide use. Estimated pesticide use (EPest) values were calculated with both the EPest-high and EPest-low methods. The distinction between the EPest-high method and the EPest-low method is that there are more counties with estimated pesticide use for EPest-high compared to EPest-low, owing to differing assumptions about missing survey data (Thelin and Stone, 2013). Preliminary estimates in this report will be revised upon availability of updated crop acreages in the 2012 Agricultural Census, to be published by the USDA in 2014. In addition, estimates for 2008 and 2009 previously published by Stone (2013) will be updated subsequent to the 2012 Agricultural Census release. Estimates of annual agricultural pesticide use are provided as downloadable, tab-delimited files, which are organized by compound, year, state Federal Information Processing Standard (FIPS) code, county FIPS code, and kg (amount in kilograms).
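The EPest-high versus EPest-low distinction can be sketched as follows, with hypothetical CRD identifiers and rates; the real method's treatment of missing survey data (Thelin and Stone, 2013) is more involved than this single fallback:

```python
def epest_estimates(crd_rates, county_acres, fallback_rates):
    """County-level use (kg) = CRD-level rate (kg per harvested acre)
    x county harvested acres.  EPest-low drops counties whose CRD
    reported no rate for the compound; EPest-high fills the gap from a
    fallback rate (e.g. a neighboring CRD), so it covers more counties.
    A toy illustration of the high/low distinction only."""
    low, high = {}, {}
    for county, (crd, acres) in county_acres.items():
        rate = crd_rates.get(crd)
        if rate is not None:
            low[county] = high[county] = rate * acres
        elif fallback_rates.get(crd) is not None:
            high[county] = fallback_rates[crd] * acres  # EPest-high only
    return low, high

low, high = epest_estimates(
    crd_rates={"CRD-10": 0.02},                      # CRD-20 missing from survey
    county_acres={"A": ("CRD-10", 5000), "B": ("CRD-20", 8000)},
    fallback_rates={"CRD-20": 0.015})
```

This reproduces the property stated in the abstract: EPest-high yields estimates for more counties than EPest-low, purely because of how missing survey data are handled.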
Huang, Dandan; Yi, Xianfu; Zhang, Shijie; Zheng, Zhanye; Wang, Panwen; Xuan, Chenghao; Sham, Pak Chung; Wang, Junwen; Li, Mulin Jun
2018-05-16
Genome-wide association studies have generated over thousands of susceptibility loci for many human complex traits, and yet for most of these associations the true causal variants remain unknown. Tissue/cell type-specific prediction and prioritization of non-coding regulatory variants will facilitate the identification of causal variants and underlying pathogenic mechanisms for particular complex diseases and traits. By leveraging recent large-scale functional genomics/epigenomics data, we develop an intuitive web server, GWAS4D (http://mulinlab.tmu.edu.cn/gwas4d or http://mulinlab.org/gwas4d), that systematically evaluates GWAS signals and identifies context-specific regulatory variants. The updated web server includes six major features: (i) updates the regulatory variant prioritization method with our new algorithm; (ii) incorporates 127 tissue/cell type-specific epigenomes data; (iii) integrates motifs of 1480 transcriptional regulators from 13 public resources; (iv) uniformly processes Hi-C data and generates significant interactions at 5 kb resolution across 60 tissues/cell types; (v) adds comprehensive non-coding variant functional annotations; (vi) equips a highly interactive visualization function for SNP-target interaction. Using a GWAS fine-mapped set for 161 coronary artery disease risk loci, we demonstrate that GWAS4D is able to efficiently prioritize disease-causal regulatory variants.
NASA Astrophysics Data System (ADS)
Huhn, William Paul; Lange, Björn; Yu, Victor; Blum, Volker; Lee, Seyong; Yoon, Mina
Density-functional theory has been well established as the dominant quantum-mechanical computational method in the materials community. Large accurate simulations become very challenging on small to mid-scale computers and require high-performance compute platforms to succeed. GPU acceleration is one promising approach. In this talk, we present a first implementation of all-electron density-functional theory in the FHI-aims code for massively parallel GPU-based platforms. Special attention is paid to the update of the density and to the integration of the Hamiltonian and overlap matrices, realized in a domain decomposition scheme on non-uniform grids. The initial implementation scales well across nodes on ORNL's Titan Cray XK7 supercomputer (8 to 64 nodes, 16 MPI ranks/node) and shows an overall 1.4x speedup in runtime from utilizing the K20X Tesla GPU on each Titan node, with the charge density update showing a 2x speedup. Further acceleration opportunities will be discussed. Work supported by the LDRD Program of ORNL managed by UT-Battelle, LLC, for the U.S. DOE and by the Oak Ridge Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.
Construction of FuzzyFind Dictionary using Golay Coding Transformation for Searching Applications
NASA Astrophysics Data System (ADS)
Kowsari, Kamran
2015-03-01
Searching through a large volume of data is critical for companies, scientists, and search-engine applications because of time and memory complexity. In this paper, a new technique for generating a FuzzyFind Dictionary for text mining is introduced. We map words into 23-bit segments of a FuzzyFind Dictionary (or into more than 23 bits by using additional FuzzyFind Dictionaries), reflecting the presence or absence of particular letters. This representation preserves closeness of word distortions in terms of closeness of the created binary vectors, within a Hamming distance of 2 deviations. The paper describes the Golay Coding Transformation Hash Table and how it can be used with a FuzzyFind Dictionary as a new technique for searching through big data. The method offers linear time complexity for generating the dictionary and constant time complexity for accessing the data; updating with new data sets takes time linear in the number of new data points. The technique operates on English letters in 23-bit segments and can also work with more segments as a reference table.
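The core idea — words map to bit vectors so that small distortions stay within a small Hamming distance — can be sketched with a simple letter-presence mask. This illustrates only the Hamming-ball lookup, not the actual Golay coding transformation or its 23-bit segmentation:

```python
from itertools import combinations

def letter_mask(word):
    """Map a word to a bit vector reflecting the presence or absence of
    each letter (26 bits here; the paper packs 23-bit segments for Golay
    hashing -- this sketch only illustrates the Hamming-distance idea)."""
    m = 0
    for ch in word.lower():
        if ch.isalpha():
            m |= 1 << (ord(ch) - ord("a"))
    return m

def build_dictionary(words):
    index = {}
    for w in words:
        index.setdefault(letter_mask(w), set()).add(w)
    return index

def fuzzy_find(index, word, radius=2):
    """Return all stored words whose letter mask lies within the given
    Hamming distance of the query mask, by enumerating bit flips."""
    m = letter_mask(word)
    masks = {m}
    for r in range(1, radius + 1):
        for bits in combinations(range(26), r):
            flipped = m
            for b in bits:
                flipped ^= 1 << b
            masks.add(flipped)
    hits = set()
    for mm in masks:
        hits |= index.get(mm, set())
    return hits

index = build_dictionary(["update", "updates", "coding", "code"])
```

A transposition like "updaet" has the same letter mask as "update" (distance 0), and "updates" differs only in the bit for "s" (distance 1), so both are retrieved; unrelated words fall outside the radius.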
Risk Assessment Update: Russian Segment
NASA Technical Reports Server (NTRS)
Christiansen, Eric; Lear, Dana; Hyde, James; Bjorkman, Michael; Hoffman, Kevin
2012-01-01
BUMPER-II version 1.95j source code was provided to RSC-E and Khrunichev at the January 2012 MMOD TIM in Moscow. The MEMCxP and ORDEM 3.0 environments are implemented as external data files. NASA provided a sample ORDEM 3.0 ".key" & ".daf" environment file set for demonstrating and benchmarking the BUMPER-II v1.95j installation at the Jan-12 TIM. ORDEM 3.0 has been completed and is currently in beta testing. NASA will provide a preliminary set of ORDEM 3.0 ".key" & ".daf" environment files for the years 2012 through 2028. Bumper output files produced using the new ORDEM 3.0 data files are intended for internal use only, not for requirements verification. Output files will contain the words "ORDEM FILE DESCRIPTION = PRELIMINARY VERSION: not for production". The projectile density term in many BUMPER-II ballistic limit equations will need to be updated. Cube demo scripts and output files delivered at the Jan-12 TIM have been updated for the new ORDEM 3.0 data files. Risk assessment results based on ORDEM 3.0 and MEM will be presented for the Russian Segment (RS) of ISS.
Howard, James D.
2017-01-01
Goal-directed behavior is sensitive to the current value of expected outcomes. This requires independent representations of specific rewards, which have been linked to orbitofrontal cortex (OFC) function. However, the mechanisms by which the human brain updates specific goals on the fly, and translates those updates into choices, have remained unknown. Here we implemented selective devaluation of appetizing food odors in combination with pattern-based neuroimaging and a decision-making task. We found that in a hungry state, participants chose to smell high-intensity versions of two value-matched food odor rewards. After eating a meal corresponding to one of the two odors, participants switched choices toward the low intensity of the sated odor but continued to choose the high intensity of the nonsated odor. This sensory-specific behavioral effect was mirrored by pattern-based changes in fMRI signal in lateral posterior OFC, where specific reward identity representations were altered after the meal for the sated food odor but retained for the nonsated counterpart. In addition, changes in functional connectivity between the OFC and general value coding in ventromedial prefrontal cortex (vmPFC) predicted individual differences in satiety-related choice behavior. These findings demonstrate how flexible representations of specific rewards in the OFC are updated by devaluation, and how functional connections to vmPFC reflect the current value of outcomes and guide goal-directed behavior. SIGNIFICANCE STATEMENT The orbitofrontal cortex (OFC) is critical for goal-directed behavior. A recent proposal is that OFC fulfills this function by representing a variety of state and task variables (“cognitive maps”), including a conjunction of expected reward identity and value. Here we tested how identity-specific representations of food odor reward are updated by satiety. 
We found that fMRI pattern-based signatures of reward identity in lateral posterior OFC were modulated after selective devaluation, and that connectivity between this region and general value coding ventromedial prefrontal cortex (vmPFC) predicted choice behavior. These results provide evidence for a mechanism by which devaluation modulates a cognitive map of expected reward in OFC and thereby alters general value signals in vmPFC to guide goal-directed behavior. PMID:28159906
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-01
The Coast Guard proposes to amend the existing regulations that implement the International Convention on Standards of Training, Certification and Watchkeeping for Seafarers, 1978, as amended (STCW Convention), as well as the Seafarer's Training, Certification and Watchkeeping Code (STCW Code). The changes proposed in this Supplemental Notice of Proposed Rulemaking (SNPRM) address the comments received from the public response to the Notice of Proposed Rulemaking (NPRM), in most cases through revisions based on those comments, and propose to incorporate the 2010 amendments to the STCW Convention that will come into force on January 1, 2012. In addition, this SNPRM proposes to make other non-STCW changes necessary to reorganize, clarify, and update these regulations.
A secure RFID authentication protocol adopting error correction code.
Chen, Chien-Ming; Chen, Shuai-Min; Zheng, Xinying; Chen, Pei-Yu; Sun, Hung-Min
2014-01-01
RFID technology has become popular in many applications; however, most RFID products lack security-related functionality due to the hardware limitations of low-cost RFID tags. In this paper, we propose a lightweight mutual authentication protocol adopting error correction code for RFID. We also propose an advanced version of our protocol to provide key updating. Based on the secrecy of shared keys, the reader and the tag can establish a mutual authenticity relationship. Further analysis of the protocol shows that it also satisfies integrity, forward secrecy, anonymity, and untraceability. Compared with other lightweight protocols, the proposed protocol provides stronger resistance to tracing attacks, compromising attacks and replay attacks. We also compare our protocol with previous works in terms of performance.
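The key-updating aspect can be sketched with a hash-based challenge-response in which both sides evolve the shared key one-way after a successful run, giving forward secrecy. This omits the paper's error-correction coding entirely and is not its actual protocol:

```python
import hashlib
import hmac
import os

def h(*parts):
    """Hash a concatenation of byte strings (stand-in for the protocol's PRF)."""
    x = hashlib.sha256()
    for p in parts:
        x.update(p)
    return x.digest()

class Party:
    """Reader or tag holding a shared secret key.  After each successful
    authentication both sides derive the next key with a one-way hash, so
    compromising the current key does not reveal past session keys."""
    def __init__(self, key):
        self.key = key

    def prove(self, nonce):
        mac = h(self.key, nonce)          # response to the challenge
        self.key = h(b"update", self.key)  # one-way key evolution
        return mac

    def verify(self, nonce, mac):
        ok = hmac.compare_digest(mac, h(self.key, nonce))
        if ok:
            self.key = h(b"update", self.key)  # stay synchronized with the prover
        return ok

reader, tag = Party(b"shared-secret"), Party(b"shared-secret")
nonce = os.urandom(16)                    # reader's fresh challenge
ok = reader.verify(nonce, tag.prove(nonce))
```

A real tag-side protocol must also handle desynchronization (a lost message leaving one side a key ahead), which the paper addresses and this sketch does not.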
King, Andrew M Q; Lefkowitz, Elliot J; Mushegian, Arcady R; Adams, Michael J; Dutilh, Bas E; Gorbalenya, Alexander E; Harrach, Balázs; Harrison, Robert L; Junglen, Sandra; Knowles, Nick J; Kropinski, Andrew M; Krupovic, Mart; Kuhn, Jens H; Nibert, Max L; Rubino, Luisa; Sabanadzovic, Sead; Sanfaçon, Hélène; Siddell, Stuart G; Simmonds, Peter; Varsani, Arvind; Zerbini, Francisco Murilo; Davison, Andrew J
2018-05-12
This article lists the changes to virus taxonomy approved and ratified by the International Committee on Taxonomy of Viruses in February 2018. A total of 451 species, 69 genera, 11 subfamilies, 9 families and one new order were added to the taxonomy. The current totals at each taxonomic level now stand at 9 orders, 131 families, 46 subfamilies, 803 genera and 4853 species. A change was made to the International Code of Virus Classification and Nomenclature to allow the use of the names of people in taxon names under appropriate circumstances. An updated Master Species List incorporating the approved changes was released in March 2018 ( https://talk.ictvonline.org/taxonomy/ ).
International code of nomenclature of prokaryotes
Garrity, George M.; Parker, Charles T.; Tindall, Brian J.
2015-11-20
Here, this volume contains the edition of the International Code of Nomenclature of Prokaryotes that was presented in draft form and available for comment at the Plenary Session of the Fourteenth International Congress of Bacteriology and Applied Microbiology (BAM), Montréal, 2014, together with updated lists of conserved and rejected bacterial names and of Opinions issued by the Judicial Commission. As in the past it brings together those changes accepted, published and documented by the ICSP and the Judicial Commission since the last revision was published. Several new appendices have been added to this edition. Appendix 11 addresses the appropriate application of the Candidatus concept, Appendix 12 contains the history of the van Niel Prize, and Appendix 13 contains the summaries of Congresses.
Numerical simulation of ion charge breeding in electron beam ion source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, L., E-mail: zhao@far-tech.com; Kim, Jin-Soo
2014-02-15
The Electron Beam Ion Source particle-in-cell code (EBIS-PIC) tracks ions in an EBIS electron beam while updating electric potential self-consistently and atomic processes by the Monte Carlo method. Recent improvements to the code are reported in this paper. The ionization module has been improved by using experimental ionization energies and shell effects. The acceptance of injected ions and the emittance of extracted ion beam are calculated by extending EBIS-PIC to the beam line transport region. An EBIS-PIC simulation is performed for a Cs charge-breeding experiment at BNL. The charge state distribution agrees well with experiments, and additional simulation results of radial profiles and velocity space distributions of the trapped ions are presented.
The Environment-Power System Analysis Tool development program. [for spacecraft power supplies
NASA Technical Reports Server (NTRS)
Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.
1989-01-01
The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.
A Secure RFID Authentication Protocol Adopting Error Correction Code
Zheng, Xinying; Chen, Pei-Yu
2014-01-01
RFID technology has become popular in many applications; however, most RFID products lack security-related functionality due to the hardware limitations of low-cost RFID tags. In this paper, we propose a lightweight mutual authentication protocol adopting an error correction code for RFID. We also propose an advanced version of our protocol that provides key updating. Based on the secrecy of shared keys, the reader and the tag can establish a mutual authenticity relationship. Further analysis showed that the protocol also satisfies integrity, forward secrecy, anonymity, and untraceability. Compared with other lightweight protocols, the proposed protocol provides stronger resistance to tracing attacks, compromising attacks and replay attacks. We also compare our protocol with previous works in terms of performance. PMID:24959619
Women deans' perceptions of the gender gap in American medical deanships.
Humberstone, Elizabeth
2017-01-01
Women account for 16% of deans of American medical schools. To investigate this gender gap, female deans were interviewed about the barriers facing women advancing toward deanships. The author conducted semi-structured interviews with eight women deans. Interviews were analyzed using provisional coding and subcoding techniques. Four main themes emerged during the interviews: (1) the role of relationships in personal and career development, (2) leadership challenges, (3) barriers between women and leadership advancement, and (4) recommendations for improvement. Recommendations included allocating resources, mentorship, career flexibility, faculty development, updating the criteria for deanships, and restructuring search committees. The barriers identified by the deans are similar to those found in previous studies of female faculty and department chairs, suggesting limited progress toward gender equity.
Enhancement of the Earth Science and Remote Sensing Group's Website and Related Projects
NASA Technical Reports Server (NTRS)
Coffin, Ashley; Vanderbloemen, Lisa
2014-01-01
The major problem addressed throughout the term was the need to update the group's current website, as it was outdated and required streamlining and modernization. The old Gateway to Astronaut Photography of the Earth website had multiple components, many of which involved searches through expansive databases. The amount of work required to update the website was large and due to a desired release date, assistance was needed to help build new pages and to transfer old information. Additionally, one of the tools listed on the website called Image Detective had been underutilized in the past. It was important to address why the public was not using the tool and how it could potentially become more of a resource for the team. In order to help with updating the website, it was necessary to first learn HTML. After assisting with small edits, I began creating new pages. I utilized the "view page source" and "developer" tools in the internet browser to observe how other websites created their features and to test changes without editing the code. I then edited the code to create an interactive feature on the new page. For the Image Detective Page I began an evaluation of the current page. I also asked my fellow interns and friends at my University to offer their input. I took all of the opinions into account and wrote up a document regarding my recommendations. The recommendations will be considered as I help to improve the Image Detective page for the updated website. In addition to the website, other projects included the need for additional, and updated image collections, along with various project requests. The image collections have been used by educators in the classroom and the impact crater collection was highly requested. The glaciers collection focused mostly on South American glaciers and needed to include more of the earth's many glaciers. The collections had not been updated or created due to the fact that related imagery had not been catalogued. 
The process of cataloging involves identifying the center-point location of the image and identifying its features. Other project needs included collecting night images of India for publishing. Again, many of the images were not catalogued and the database was lacking in night time imagery for that region. The last project was to calculate the size of mega fans in South Africa. Calculating the fan sizes involved several steps. To expedite the study, calculations needed to be made after the base maps had been created. Using data files that included an outline of the mega fans on a topographic map, I opened the file in Photoshop, determined the number of pixels within the outlined area, created a one-degree-squared box, determined the pixels within the box, converted the pixels within the box to square kilometers, and then calculated the fan size using this information. Overall, the internship has been a learning experience for me. I have learned how to use new programs and I developed new skills. These skills can help me as I enter into the next phase of my career. Learning Photoshop and HTML in addition to coding in Dreamweaver are highly sought after skills that are used in a variety of fields. Additionally, the exposure to different aspects of the team and working with different people helped me to gain a broader set of skills and allowed me to work with people with different experiences. The various projects I have worked on this summer have directly benefitted the team, whether it was completing projects they did not have the time to do, or by helping the team reach deadlines sooner. The new website will be the best place to see all of my work as it will include the newly designed pages and will feature my updates to collections.
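The pixel-counting procedure described in this abstract reduces to a unit conversion: the one-degree reference box supplies a square-kilometers-per-pixel scale, which is then multiplied by the pixel count inside the fan outline. A minimal sketch in Python (the function names and the kilometers-per-degree constants are illustrative assumptions, not taken from the report):

```python
import math

def one_degree_box_km2(lat_deg: float) -> float:
    # Approximate area of a 1-degree x 1-degree box at the given latitude:
    # ~111.32 km per degree of longitude at the equator, scaled by cos(latitude),
    # and ~110.57 km per degree of latitude.
    return 111.32 * math.cos(math.radians(lat_deg)) * 110.57

def fan_area_km2(fan_pixels: int, box_pixels: int, lat_deg: float) -> float:
    # The reference box converts a pixel count into an area:
    # km^2 per pixel times the number of pixels inside the fan outline.
    km2_per_pixel = one_degree_box_km2(lat_deg) / box_pixels
    return fan_pixels * km2_per_pixel
```

A fan whose outline contains half as many pixels as the reference box would thus be assigned half the box's area at that latitude.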
Validation of the WIMSD4M cross-section generation code with benchmark results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deen, J.R.; Woodruff, W.L.; Leal, L.E.
1995-01-01
The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D2O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.
LOINC, a universal standard for identifying laboratory observations: a 5-year update.
McDonald, Clement J; Huff, Stanley M; Suico, Jeffrey G; Hill, Gilbert; Leavelle, Dennis; Aller, Raymond; Forrey, Arden; Mercer, Kathy; DeMoor, Georges; Hook, John; Williams, Warren; Case, James; Maloney, Pat
2003-04-01
The Logical Observation Identifier Names and Codes (LOINC) database provides a universal code system for reporting laboratory and other clinical observations. Its purpose is to identify observations in electronic messages such as Health Level Seven (HL7) observation messages, so that when hospitals, health maintenance organizations, pharmaceutical manufacturers, researchers, and public health departments receive such messages from multiple sources, they can automatically file the results in the right slots of their medical records, research, and/or public health systems. For each observation, the database includes a code (of which 25 000 are laboratory test observations), a long formal name, a "short" 30-character name, and synonyms. The database comes with a mapping program called Regenstrief LOINC Mapping Assistant (RELMA(TM)) to assist the mapping of local test codes to LOINC codes and to facilitate browsing of the LOINC results. Both LOINC and RELMA are available at no cost from http://www.regenstrief.org/loinc/. The LOINC medical database carries records for >30 000 different observations. LOINC codes are being used by large reference laboratories and federal agencies, e.g., the CDC and the Department of Veterans Affairs, and are part of the Health Insurance Portability and Accountability Act (HIPAA) attachment proposal. Internationally, they have been adopted in Switzerland, Hong Kong, Australia, and Canada, and by the German national standards organization, the Deutsches Institut für Normung. Laboratories should include LOINC codes in their outbound HL7 messages so that clinical and research clients can easily integrate these results into their clinical and research repositories. Laboratories should also encourage instrument vendors to deliver LOINC codes in their instrument outputs and demand LOINC codes in HL7 messages they get from reference laboratories to avoid the need to lump so many referral tests under the "send out lab" code.
Standardized verification of fuel cycle modeling
Feng, B.; Dixon, B.; Sunny, E.; ...
2016-04-05
A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwon, Kyung; Fan, Liang-Shih; Zhou, Qiang
A new and efficient direct numerical method with second-order convergence accuracy was developed for fully resolved simulations of incompressible viscous flows laden with rigid particles. The method combines the state-of-the-art immersed boundary method (IBM), the multi-direct forcing method, and the lattice Boltzmann method (LBM). First, the multi-direct forcing method is adopted in the improved IBM to better approximate the no-slip/no-penetration (ns/np) condition on the surface of particles. Second, a slight retraction of the Lagrangian grid from the surface towards the interior of particles with a fraction of the Eulerian grid spacing helps increase the convergence accuracy of the method. An over-relaxation technique in the procedure of the multi-direct forcing method and the classical fourth order Runge-Kutta scheme in the coupled fluid-particle interaction were applied. The use of the classical fourth order Runge-Kutta scheme helps the overall IB-LBM achieve second order accuracy and provides more accurate predictions of the translational and rotational motion of particles. The preexistent code with the first-order convergence rate was updated so that the new code can resolve the translational and rotational motion of particles with a second-order convergence rate. The updated code has been validated with several benchmark applications. The efficiency of the IBM, and thus of the IB-LBM, was improved by reducing the number of Lagrangian markers on particles using a new formula for the number of Lagrangian markers on particle surfaces. The immersed boundary-lattice Boltzmann method (IB-LBM) has been shown to predict correctly the angular velocity of a particle. Prior to examining drag force exerted on a cluster of particles, the updated IB-LBM code along with the new formula for the number of Lagrangian markers has been further validated by solving several theoretical problems.
Moreover, the unsteadiness of the drag force is examined when a fluid is accelerated from rest by a constant average pressure gradient toward a steady Stokes flow. The simulation results agree well with the theories for the short- and long-time behavior of the drag force. Flows through non-rotational and rotational spheres in simple cubic arrays and random arrays are simulated over the entire range of packing fractions, and both low and moderate particle Reynolds numbers to compare the simulated results with the literature results and develop a new drag force formula, a new lift force formula, and a new torque formula. Random arrays of solid particles in fluids are generated with Monte Carlo procedure and Zinchenko's method to avoid crystallization of solid particles over high solid volume fractions. A new drag force formula was developed with extensive simulated results to be closely applicable to real processes over the entire range of packing fractions and both low and moderate particle Reynolds numbers. The simulation results indicate that the drag force is barely affected by rotational Reynolds numbers. Drag force is basically unchanged as the angle of the rotating axis varies.
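The multi-direct forcing step that this abstract credits with improving the ns/np condition can be illustrated in one dimension: interpolate the fluid velocity to a Lagrangian marker, compute the corrective force toward the target (no-slip) velocity, spread it back to the Eulerian grid, and repeat. The sketch below is a toy 1D periodic-grid version with linear interpolation, not the authors' IB-LBM code; all names and the grid setup are illustrative:

```python
import numpy as np

def multi_direct_forcing(u, marker_x, u_target, dx, dt, n_iter=5):
    # Toy 1D sketch of multi-direct forcing: iterate the
    # interpolate-force-spread cycle so the velocity interpolated at the
    # Lagrangian marker approaches the no-slip target velocity.
    n = u.size
    for _ in range(n_iter):
        # Linear interpolation weights for the marker position.
        i = int(marker_x // dx) % n
        frac = marker_x / dx - int(marker_x // dx)
        u_marker = (1.0 - frac) * u[i] + frac * u[(i + 1) % n]
        # Corrective force density toward the target velocity.
        f = (u_target - u_marker) / dt
        # Spread the force back to the grid with the same weights.
        u[i] += dt * f * (1.0 - frac)
        u[(i + 1) % n] += dt * f * frac
    return u
```

Each pass reduces the mismatch at the marker by a fixed factor (since the interpolate-then-spread weights satisfy (1-frac)² + frac² ≥ 1/2), which is why a handful of iterations enforce the boundary condition far better than a single direct-forcing step.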
Users manual for updated computer code for axial-flow compressor conceptual design
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1992-01-01
An existing computer code that determines the flow path for an axial-flow compressor either for a given number of stages or for a given overall pressure ratio was modified for use in air-breathing engine conceptual design studies. This code uses a rapid approximate design methodology that is based on isentropic simple radial equilibrium. Calculations are performed at constant-span-fraction locations from tip to hub. Energy addition per stage is controlled by specifying the maximum allowable values for several aerodynamic design parameters. New modeling was introduced to the code to overcome perceived limitations. Specific changes included variable rather than constant tip radius, flow path inclination added to the continuity equation, input of mass flow rate directly rather than indirectly as inlet axial velocity, solution for the exact value of overall pressure ratio rather than for any value that met or exceeded it, and internal computation of efficiency rather than the use of input values. The modified code was shown to be capable of computing efficiencies that are compatible with those of five multistage compressors and one fan that were tested experimentally. This report serves as a users manual for the revised code, Compressor Spanline Analysis (CSPAN). The modeling modifications, including two internal loss correlations, are presented. Program input and output are described. A sample case for a multistage compressor is included.
Uppal, Shitanshu; Shahin, Mark S; Rathbun, Jill A; Goff, Barbara A
2017-02-01
In 2015, there was an 18% reduction in the Relative Value Units (RVUs) that the Center for Medicare and Medicaid Services (CMS) assigned to the Current Procedural Terminology (CPT) code 58571 (Laparoscopy, surgical, with total hysterectomy, for uterus 250g or less; with removal of tube(s) and/or ovary(s)→TLH+BSO). The other CPT codes for laparoscopic hysterectomy and laparoscopic supracervical hysterectomy (58541-58544 and 58570-58573) lost between 12 and 23% of their assigned RVUs. In 2016, the laparoscopic lymph node dissection codes 38570 (Laparoscopy, surgical; with retroperitoneal lymph node sampling (biopsy), single or multiple), 38571 (Laparoscopy, surgical; with bilateral total pelvic lymphadenectomy), and 38572 (Laparoscopy, surgical; with bilateral total pelvic lymphadenectomy and para-aortic lymph node sampling (biopsy), single or multiple) lost between 5.5 and 16.3% of their RVUs. The goals of this article from the Society of Gynecologic Oncology (SGO) Task Force on Coding and Reimbursement are 1) to inform the SGO members on why CMS identified these codes as a part of their misvalued services screening program and then finalized a reduction in their payment levels; and 2) to outline the role individual providers have in CMS' methodology used to determine the reimbursement of a surgical procedure. Copyright © 2016 Elsevier Inc. All rights reserved.
Author Correction: Intergenerational equity can help to prevent climate change and extinction.
Treves, Adrian; Artelle, Kyle A; Darimont, Chris T; Lynn, William S; Paquet, Paul; Santiago-Ávila, Francisco J; Shaw, Rance; Wood, Mary C
2018-05-01
The original Article mistakenly coded the constitutional rights of Australia as containing a governmental duty to protect the environment (blue in the figures); this has been corrected to containing no explicit mention of environmental protection (orange in the figures). The original Article also neglected to code the constitutional rights of the Cayman Islands (no data; yellow in the figures); this has been corrected to containing a governmental duty to protect the environment (blue in the figures).Although no inferences changed as a result of these errors, many values changed slightly and have been corrected. The proportion of the world's nations having constitutional rights to a healthy environment changed from 75% to 74%. The proportions of nations in different categories given in the Fig. 1 caption all changed except purple countries (3.1%): green countries changed from 47.2% to 46.9%; blue countries changed from 24.4% to 24.2%; and orange countries changed from 25.3% to 25.8%. The proportion of the global atmospheric CO 2 emitted by the 144 nations changed from 72.6% to 74.4%; the proportion of the world's population represented by the 144 nations changed from 84.9% to 85%. The values of annual average CO 2 emissions for blue countries changed from 363,000 Gg to 353,000 Gg and for orange countries from 195,000 Gg to 201,000 Gg. The proportion of threatened mammals endemic to a single country represented by the 144 countries changed from 91% to 84%. Figures 1-3 have been updated to show the correct values and map colours and the Supplementary Information has been updated to give the correct country codes.
Knowledge Data Base for Amorphous Metals
2007-07-26
not programmatic, updates. Over 100 custom SQL statements that maintain the domain specific data are attached to the workflow entries in a generic...for the form by populating the SQL and run generation tables. Application data may be prepared in different ways for two steps that invoke the same form...run generation mode). There is a single table of SQL commands. Each record has a user-definable ID, the SQL code, and a comment. The run generation
26TH AFOSR Chemical & Atmospheric Sciences Program Review FY81.
1982-03-01
AFOSR-80-0020, 2310/A2 N. Larsen Department of Electrical Engineering Cornell University Ithaca, New York 14853 Light Scattering and Absorption Kuo-Nan...0011; University of Florida 80-0015 (To MRO Contract DAAM 1816 NW G Street 29-78-G-0024), 2310/Al Gainesville, FL 32601 Atmospheric Absorption of...parameters for use in the theoretical spectroscopy, for updating the transmission/emission codes, and for computing molecular absorption /emission line
MODIS Cloud Microphysics Product (MOD_PR06OD) Data Collection 6 Updates
NASA Technical Reports Server (NTRS)
Wind, Gala; Platnick, Steven; King, Michael D.
2014-01-01
The MODIS Cloud Optical and Microphysical Product (MOD_PR06OD) for Data Collection 6 has entered full scale production. Aqua reprocessing is almost complete and Terra reprocessing will begin shortly. Unlike previous collections, the CHIMAERA code base allows for simultaneous processing of multiple sensors, and the operational CHIMAERA 6.0.76 stream is also available for the VIIRS and SEVIRI sensors and for our E-MAS airborne platform.
Whales and Sonar: Environmental Exemptions for the Navy’s Mid-Frequency Active Sonar Training
2008-11-14
Balaenoptera musculus E Finback whale Balaenoptera physalus E Humpback whale Megaptera novaeangliae E Killer whale Southern Resident DPS Orcinus orca...Salmo) mykiss T Steelhead south central CA coast Oncorhynchus (=Salmo) mykiss E Steelhead southern CA coast Oncorhynchus (=Salmo) mykiss E Blue whale ...Order Code RL34403 Whales and Sonar: Environmental Exemptions for the Navy’s Mid-Frequency Active Sonar Training Updated November 14, 2008 Kristina
Integrated Nuclear Communications Assessment (INCA). Circuit Restoral Assessment Module
1979-09-07
This work sponsored by the Defense Nuclear Agency under RDT&E RMSS Code B363078464 O909QAXCAlO6O6 H2590D. Contents include: Update Theater Connectivity Matrix; Subroutine 1.2 - Set CCSD Status; Subroutine 5.1.3 - Compute Effectiveness; Subroutine 5.2 - Assign Restoral Times.
ERIC Educational Resources Information Center
Zamani, A. Rahman, Ed.; Evinger, Sara, Ed.
2007-01-01
This curriculum was first published in June 1998 to be used by qualified health and safety trainers to fulfill part of the learning needs and licensing requirements of child care providers (Health and Safety Code, Section 1596.866) in California. This second and updated edition of Module 2, Prevention of Injuries, covers the content of the…
The Uniformed and Overseas Citizens Absentee Voting Act: Background and Issues
2007-03-08
Order Code RS20764 Updated March 8, 2007 The Uniformed and Overseas Citizens Absentee Voting Act: Background and Issues Kevin J. Coleman Analyst in...register and vote absentee in federal elections under the provisions of the Uniformed and Overseas Citizens Absentee Voting Act (UOCAVA) of 1986. The law was...enacted to improve absentee registration and voting for this group of voters and to consolidate existing laws. Since 1942, several federal laws have
An integrated knowledge system for wind tunnel testing - Project Engineers' Intelligent Assistant
NASA Technical Reports Server (NTRS)
Lo, Ching F.; Shi, George Z.; Hoyt, W. A.; Steinle, Frank W., Jr.
1993-01-01
The Project Engineers' Intelligent Assistant (PEIA) is an integrated knowledge system developed using artificial intelligence technology, including hypertext, expert systems, and dynamic user interfaces. This system integrates documents, engineering codes, databases, and knowledge from domain experts into an enriched hypermedia environment and was designed to assist project engineers in planning and conducting wind tunnel tests. PEIA is a modular system which consists of an intelligent user-interface, seven modules and an integrated tool facility. Hypermedia technology is discussed and the seven PEIA modules are described. System maintenance and updating is very easy due to the modular structure and the integrated tool facility provides user access to commercial software shells for documentation, reporting, or database updating. PEIA is expected to provide project engineers with technical information, increase efficiency and productivity, and provide a realistic tool for personnel training.
NASA Astrophysics Data System (ADS)
Green, Richard F.; Diaz Castro, Javier; Allen, Lori; Alvarez del Castillo, Elizabeth; Corbally, Christopher J.; Davis, Donald; Falco, Emilio; Gabor, Paul; Hall, Jeffrey C.; Monrad, Christian Karl; Williams, G. Grant
2015-08-01
Some of the world's largest telescopes and largest concentrations of telescopes are on sites in Arizona and the Canary Islands. Active site protection efforts are underway in both regions; the common challenge is getting out ahead of the LED revolution in outdoor lighting. We review the work with local, regional, and national government bodies, with many successful updates of outdoor lighting codes. A successful statewide conference was held in Arizona to raise awareness of public officials about issues of light pollution for astronomy, safety, wildlife, and public health. We also highlight interactions with key entities near critical sites, including mines and prisons, leading to upgrades of their lighting to more astronomy-friendly form. We describe ongoing and planned sky monitoring efforts, noting their importance in quantifying the "impact on astronomy" increasingly requested by regulators.
2013-08-06
This final rule updates the payment rates used under the prospective payment system for skilled nursing facilities (SNFs) for fiscal year (FY) 2014. In addition, it revises and rebases the SNF market basket, revises and updates the labor-related share, and makes certain technical and conforming revisions in the regulations text. This final rule also includes a policy for reporting the SNF market basket forecast error in certain limited circumstances and adds a new item to the Minimum Data Set (MDS), Version 3.0 for reporting the number of distinct therapy days. Finally, this final rule adopts a change to the diagnosis code used to determine which residents will receive the AIDS add-on payment, effective for services provided on or after the October 1, 2014 implementation date for conversion to ICD-10-CM.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael I.; Hart, Philip R.
2016-02-16
Appendix G, the Performance Rating Method in ASHRAE Standard 90.1, has been updated to make two significant changes for the 2016 edition, to be published in October of 2016. First, it allows Appendix G to be used as a third path for compliance with the standard in addition to rating beyond-code building performance. This prevents modelers from having to develop separate building models for code compliance and beyond-code programs. Using this new version of Appendix G to show compliance with the 2016 edition of the standard, the proposed building design needs to have a performance cost index (PCI) less than targets shown in a new table based on building type and climate zone. The second change is that the baseline design is now fixed at a stable level of performance set approximately equal to the 2004 code. Rather than changing the stringency of the baseline with each subsequent edition of the standard, compliance with new editions will simply require a reduced PCI (a PCI of zero is a net-zero building). Using this approach, buildings of any era can be rated using the same method. The intent is that any building energy code or beyond-code program can use this methodology and merely set the appropriate PCI target for their needs. This report discusses the process used to set performance criteria for compliance with ASHRAE Standard 90.1-2016 and suggests a method for demonstrating compliance with other codes and beyond-code programs.
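The compliance logic this abstract describes is a simple comparison: compute the proposed design's PCI against the fixed 2004-level baseline, then check it against the tabulated target for the building type and climate zone. The sketch below treats PCI simply as the ratio of proposed to baseline annual energy cost; the standard's actual definition distinguishes regulated and unregulated loads, so this is illustrative only, and all names and numbers are assumptions:

```python
def performance_cost_index(proposed_cost: float, baseline_cost: float) -> float:
    # Illustrative simplification: PCI as the ratio of proposed to baseline
    # annual energy cost. A PCI of 0 corresponds to a net-zero building;
    # a PCI of 1 matches the fixed 2004-level baseline.
    return proposed_cost / baseline_cost

def complies(proposed_cost: float, baseline_cost: float,
             pci_target: float) -> bool:
    # Compliance check described in the abstract: the proposed design must
    # have a PCI below the target for its building type and climate zone.
    return performance_cost_index(proposed_cost, baseline_cost) < pci_target
```

Because the baseline is frozen, tightening a future edition of the code means only lowering the PCI target in the table, not rebuilding the baseline model.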
ISPOR Code of Ethics 2017 (4th Edition).
Santos, Jessica; Palumbo, Francis; Molsen-David, Elizabeth; Willke, Richard J; Binder, Louise; Drummond, Michael; Ho, Anita; Marder, William D; Parmenter, Louise; Sandhu, Gurmit; Shafie, Asrul A; Thompson, David
2017-12-01
As the leading health economics and outcomes research (HEOR) professional society, ISPOR has a responsibility to establish a uniform, harmonized international code for ethical conduct. ISPOR has updated its 2008 Code of Ethics to reflect the current research environment. This code addresses what is acceptable and unacceptable in research, from inception to the dissemination of its results. There are nine chapters: 1 - Introduction; 2 - Ethical Principles (respect, beneficence, and justice, with reference to a non-exhaustive compilation of international, regional, and country-specific guidelines and standards); 3 - Scope (HEOR definitions and how HEOR and the Code relate to other research fields); 4 - Research Design Considerations (primary and secondary data related issues, e.g., participant recruitment, population and research setting, sample size/site selection, incentive/honorarium, administrative databases, registration of retrospective observational studies and modeling studies); 5 - Data Considerations (privacy and data protection; combining, verification, and transparency of research data; scientific misconduct; etc.); 6 - Sponsorship and Relationships with Others (roles of researchers, sponsors, key opinion leaders and advisory board members, research participants, and institutional review boards (IRBs)/independent ethics committees (IECs) approval and responsibilities); 7 - Patient Centricity and Patient Engagement (a new addition, with explanation and guidance); 8 - Publication and Dissemination; and 9 - Conclusion and Limitations. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Pujar, Shashikant; O’Leary, Nuala A; Farrell, Catherine M; Mudge, Jonathan M; Wallin, Craig; Diekhans, Mark; Barnes, If; Bennett, Ruth; Berry, Andrew E; Cox, Eric; Davidson, Claire; Goldfarb, Tamara; Gonzalez, Jose M; Hunt, Toby; Jackson, John; Joardar, Vinita; Kay, Mike P; Kodali, Vamsi K; McAndrews, Monica; McGarvey, Kelly M; Murphy, Michael; Rajput, Bhanu; Rangwala, Sanjida H; Riddick, Lillian D; Seal, Ruth L; Webb, David; Zhu, Sophia; Aken, Bronwen L; Bult, Carol J; Frankish, Adam; Pruitt, Kim D
2018-01-01
Abstract The Consensus Coding Sequence (CCDS) project provides a dataset of protein-coding regions that are identically annotated on the human and mouse reference genome assembly in genome annotations produced independently by NCBI and the Ensembl group at EMBL-EBI. This dataset is the product of an international collaboration that includes NCBI, Ensembl, HUGO Gene Nomenclature Committee, Mouse Genome Informatics and University of California, Santa Cruz. Identically annotated coding regions, which are generated using an automated pipeline and pass multiple quality assurance checks, are assigned a stable and tracked identifier (CCDS ID). Additionally, coordinated manual review by expert curators from the CCDS collaboration helps in maintaining the integrity and high quality of the dataset. The CCDS data are available through an interactive web page (https://www.ncbi.nlm.nih.gov/CCDS/CcdsBrowse.cgi) and an FTP site (ftp://ftp.ncbi.nlm.nih.gov/pub/CCDS/). In this paper, we outline the ongoing work, growth and stability of the CCDS dataset and provide updates on new collaboration members and new features added to the CCDS user interface. We also present expert curation scenarios, with specific examples highlighting the importance of an accurate reference genome assembly and the crucial role played by input from the research community. PMID:29126148
MultitaskProtDB-II: an update of a database of multitasking/moonlighting proteins
Franco-Serrano, Luís; Hernández, Sergio; Calvo, Alejandra; Severi, María A; Ferragut, Gabriela; Pérez-Pons, Josep Antoni; Piñol, Jaume; Pich, Òscar; Mozo-Villarias, Ángel; Amela, Isaac
2018-01-01
Abstract Multitasking, or moonlighting, is the capability of some proteins to execute two or more biological functions. MultitaskProtDB-II is a database of multifunctional proteins that has been updated. In the previous version, the information contained was: NCBI and UniProt accession numbers, canonical and additional biological functions, organism, monomeric/oligomeric states, PDB codes and bibliographic references. In the present update, the number of entries has been increased from 288 to 694 moonlighting proteins. MultitaskProtDB-II is continually being curated and updated. The new database also contains the following information: GO descriptors for the canonical and moonlighting functions, three-dimensional structure (for those proteins lacking a PDB structure, a model was made using I-TASSER and Phyre), the involvement of the proteins in human diseases (78% of human moonlighting proteins) and whether the protein is a target of a current drug (48% of human moonlighting proteins). These numbers highlight the importance of these proteins for the analysis and explanation of human diseases and target-directed drug design. Moreover, 25% of the proteins of the database are involved in virulence of pathogenic microorganisms, largely in the mechanism of adhesion to the host. This highlights their importance for the mechanism of microorganism infection and vaccine design. MultitaskProtDB-II is available at http://wallace.uab.es/multitaskII. PMID:29136215
NASA Astrophysics Data System (ADS)
Suleiman, R. M.; Chance, K.; Liu, X.; Kurosu, T. P.; Gonzalez Abad, G.
2014-12-01
We present and discuss a detailed description of the retrieval algorithms for the OMI BrO product. The BrO algorithm is based on direct fitting of radiances from 319.0-347.5 nm. Radiances are modeled from the solar irradiance, attenuated and adjusted by contributions from the target gas and interfering gases, rotational Raman scattering, undersampling, additive and multiplicative closure polynomials, and a common mode spectrum. The version of the algorithm used for BrO includes relevant changes with respect to the operational code: fitting of the O2-O2 collisional complex, an updated high-resolution solar reference spectrum, updated spectroscopy, an updated Air Mass Factor (AMF) calculation scheme, and the inclusion of scattering weights and vertical profiles in the level 2 products. We optimize retrieval parameters and the fitting window to reduce the interference from O3, HCHO, O2-O2, and SO2, improve fitting accuracy and uncertainty, reduce striping, and improve long-term stability. We validate OMI BrO with ground-based measurements from Harestua and with chemical transport model simulations. We analyze the global distribution and seasonal variation of BrO and investigate BrO emissions from volcanoes and salt lakes.
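The radiance model sketched in this abstract follows the usual direct-fitting form; schematically, and not as the exact operational expression:

```latex
I(\lambda) \simeq P_{\mathrm{mult}}(\lambda)\, I_0(\lambda)\,
  \exp\!\Big[-\sum_{g} \sigma_g(\lambda)\, \Omega_g\Big] + P_{\mathrm{add}}(\lambda)
```

where \(I_0\) is the solar irradiance, \(\sigma_g\) and \(\Omega_g\) are the cross sections and slant columns of the target and interfering gases, and the multiplicative and additive polynomials provide spectral closure; rotational Raman scattering, undersampling, and common-mode corrections enter as further additive terms.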
Enhanced attention amplifies face adaptation.
Rhodes, Gillian; Jeffery, Linda; Evangelista, Emma; Ewing, Louise; Peters, Marianne; Taylor, Libby
2011-08-15
Perceptual adaptation not only produces striking perceptual aftereffects, but also enhances coding efficiency and discrimination by calibrating coding mechanisms to prevailing inputs. Attention to simple stimuli increases adaptation, potentially enhancing its functional benefits. Here we show that attention also increases adaptation to faces. In Experiment 1, face identity aftereffects increased when attention to adapting faces was increased using a change detection task. In Experiment 2, figural (distortion) face aftereffects increased when attention was increased using a snap game (detecting immediate repeats) during adaptation. Both were large effects. Contributions of low-level adaptation were reduced using free viewing (both experiments) and a size change between adapt and test faces (Experiment 2). We suggest that attention may enhance adaptation throughout the entire cortical visual pathway, with functional benefits well beyond the immediate advantages of selective processing of potentially important stimuli. These results highlight the potential to facilitate adaptive updating of face-coding mechanisms by strategic deployment of attentional resources. Copyright © 2011 Elsevier Ltd. All rights reserved.
A dynamic code for economic object valuation in prefrontal cortex neurons
Tsutsui, Ken-Ichiro; Grabenhorst, Fabian; Kobayashi, Shunsuke; Schultz, Wolfram
2016-01-01
Neuronal reward valuations provide the physiological basis for economic behaviour. Yet, how such valuations are converted to economic decisions remains unclear. Here we show that the dorsolateral prefrontal cortex (DLPFC) implements a flexible value code based on object-specific valuations by single neurons. As monkeys perform a reward-based foraging task, individual DLPFC neurons signal the value of specific choice objects derived from recent experience. These neuronal object values satisfy principles of competitive choice mechanisms, track performance fluctuations and follow predictions of a classical behavioural model (Herrnstein’s matching law). Individual neurons dynamically encode both the updating of object values from recently experienced rewards and their subsequent conversion to object choices during decision-making. Decoding from unselected populations enables a read-out of motivational and decision variables not emphasized by individual neurons. These findings suggest a dynamic single-neuron and population value code in DLPFC that advances from reward experiences to economic object values and future choices. PMID:27618960
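The classical behavioural model cited above, Herrnstein's matching law, can be stated for two choice options as (standard textbook form, not an equation transcribed from this paper):

```latex
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
```

where \(B_i\) is the rate of responding on option \(i\) and \(R_i\) the rate of reward obtained from it: the fraction of behaviour allocated to an option matches the fraction of reward it delivers.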
Reference results for time-like evolution up to next-to-next-to-leading order
NASA Astrophysics Data System (ADS)
Bertone, Valerio; Carrazza, Stefano; Nocera, Emanuele R.
2015-03-01
We present high-precision numerical results for time-like Dokshitzer-Gribov-Lipatov-Altarelli-Parisi evolution in the MS-bar factorisation scheme, for the first time up to next-to-next-to-leading order accuracy in quantum chromodynamics. First, we scrutinise the analytical expressions of the splitting functions available in the literature, in both x and N space, and check their mutual consistency. Second, we implement time-like evolution in two publicly available, entirely independent and conceptually different numerical codes, in x and N space respectively: the already existing APFEL code, which has been updated with time-like evolution, and the new MELA code, which has been specifically developed to perform the study in this work. Third, by means of a model for fragmentation functions, we provide results for the evolution in different factorisation schemes, for different ratios between renormalisation and factorisation scales and at different final scales. Our results are collected in the format of benchmark tables, which could be used as a reference for global determinations of fragmentation functions in the future.
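For reference, the time-like DGLAP evolution being benchmarked has the generic form (standard notation; \(P^T_{ji}\) are the time-like splitting functions and \(D_i\) the fragmentation functions, so this is the textbook equation rather than one transcribed from the paper):

```latex
\frac{\partial D_i(x,\mu^2)}{\partial \ln \mu^2}
  = \sum_j \int_x^1 \frac{dz}{z}\,
    P^{T}_{ji}\!\left(z,\alpha_s(\mu^2)\right)
    D_j\!\left(\frac{x}{z},\mu^2\right)
```

The splitting functions admit a perturbative expansion in \(\alpha_s\); next-to-next-to-leading order accuracy corresponds to retaining the first three terms of that expansion.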
Modernizing the ATLAS simulation infrastructure
NASA Astrophysics Data System (ADS)
Di Simone, A. (ATLAS Collaboration; Physikalisches Institut, Albert-Ludwigs-Universität Freiburg, 79104 Freiburg i. Br., Germany)
2017-10-01
The ATLAS Simulation infrastructure has been used to produce upwards of 50 billion proton-proton collision events for analyses ranging from detailed Standard Model measurements to searches for exotic new phenomena. In the last several years, the infrastructure has been heavily revised to allow intuitive multithreading and significantly improved maintainability. Such a massive update of a legacy code base requires careful choices about what pieces of code to completely rewrite and what to wrap or revise. The initialization of the complex geometry was generalized to allow new tools and geometry description languages, popular in some detector groups. The addition of multithreading requires Geant4-MT and GaudiHive, two frameworks with fundamentally different approaches to multithreading, to work together. It also required enforcing thread safety throughout a large code base, which required the redesign of several aspects of the simulation, including truth, the record of particle interactions with the detector during the simulation. These advances were possible thanks to close interactions with the Geant4 developers.
Cosmic-ray propagation with DRAGON2: I. numerical solver and astrophysical ingredients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evoli, Carmelo; Gaggero, Daniele; Vittino, Andrea
2017-02-01
We present version 2 of the DRAGON code, designed for computing realistic predictions of the CR densities in the Galaxy. The code numerically solves the interstellar CR transport equation (including inhomogeneous and anisotropic diffusion, in both space and momentum, advective transport, and energy losses) under realistic conditions. The new version includes an updated numerical solver and several models for the astrophysical ingredients involved in the transport equation. Improvements in the accuracy of the numerical solution are proved against analytical solutions and in reference diffusion scenarios. The novel features implemented in the code make it possible to simulate the diverse scenarios proposed to reproduce the most recent measurements of local and diffuse CR fluxes, going beyond the limitations of the homogeneous galactic transport paradigm. To this end, several applications using DRAGON2 are presented as well. This new version makes it easy for users to include their own physical models by means of a modular C++ structure.
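Schematically, the interstellar CR transport equation referred to above has the standard form (generic notation, not transcribed from the paper):

```latex
\frac{\partial N_i}{\partial t} =
  \vec{\nabla}\cdot\!\left( \mathbf{D}\,\vec{\nabla} N_i - \vec{v}_w N_i \right)
  + \frac{\partial}{\partial p}\!\left[ p^2 D_{pp}
      \frac{\partial}{\partial p}\frac{N_i}{p^2} \right]
  - \frac{\partial}{\partial p}\!\left[ \dot{p}\, N_i
      - \frac{p}{3}\left(\vec{\nabla}\cdot\vec{v}_w\right) N_i \right]
  + Q_i - \frac{N_i}{\tau_i} + \sum_{j} \frac{N_j}{\tau_{j\to i}}
```

Here \(N_i\) is the density of species \(i\), \(\mathbf{D}\) the (possibly anisotropic, inhomogeneous) spatial diffusion tensor, \(\vec{v}_w\) the advection velocity, \(D_{pp}\) the momentum-space (reacceleration) diffusion coefficient, \(\dot{p}\) the energy-loss rate, \(Q_i\) the source term, and the last two terms couple species through fragmentation and decay.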
Kotakis, Christos
2015-01-01
Ars longa, vita brevis (Hippocrates). Chloroplasts and mitochondria are genetically semi-autonomous organelles inside the plant cell. These organelles formed through endosymbiosis and have kept evolving throughout the history of life. Experimental evidence is provided for active non-coding RNAs (ncRNAs) in these prokaryote-like structures, and a possible functional imprinting on cellular electrophysiology by these RNA entities is described. Furthermore, updated knowledge on the RNA metabolism of organellar genomes uncovers novel inter-communication bridges with the nucleus. This class of RNA molecules is considered a unique ontogeny which transforms their biological role as a genetic rheostat into a synchronous biochemical one that can affect the energetic charge and redox homeostasis inside cells. A hypothesis is proposed in which such modulation by non-coding RNAs is integrated with genetic signals regulating gene transfer. The implications of this working hypothesis are discussed, with particular reference to ncRNA involvement in the evolution of the organellar and nuclear genomes, since their integrity is functionally coupled with redox signals in photosynthetic organisms.
RELAP5-3D Resolution of Known Restart/Backup Issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mesina, George L.; Anderson, Nolan A.
2014-12-01
The state-of-the-art nuclear reactor system safety analysis computer program developed at the Idaho National Laboratory (INL), RELAP5-3D, continues to adapt to changes in computer hardware and software and to develop to meet the ever-expanding needs of the nuclear industry. To continue at the forefront, code testing must evolve with both code and industry developments, and it must work correctly. To best ensure this, the processes of Software Verification and Validation (V&V) are applied. Verification compares coding against its documented algorithms and equations, and compares its calculations against analytical solutions and the method of manufactured solutions. A form of this, sequential verification, checks code specifications against coding only when originally written, then applies regression testing, which compares code calculations between consecutive updates or versions on a set of test cases to check that the performance does not change. A sequential verification testing system was specially constructed for RELAP5-3D to both detect errors with extreme accuracy and cover all nuclear-plant-relevant code features. Detection is provided through a “verification file” that records double-precision sums of key variables. Coverage is provided by a test suite of input decks that exercise code features and capabilities necessary to model a nuclear power plant. A matrix of test features and short-running cases that exercise them is presented. This testing system is used to test base cases (called null testing) as well as restart and backup cases. It can test RELAP5-3D performance in both standalone and coupled (through PVM to other codes) runs. Application of verification testing revealed numerous restart and backup issues in both standalone and coupled modes. This document reports the resolution of these issues.
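The verification-file idea described above, recording double-precision sums of key variables and comparing them across consecutive code versions, can be sketched in a few lines. The function and field names here are hypothetical, not RELAP5-3D's actual file format:

```python
# Minimal sketch of sequential-verification regression testing: checksum
# each run, then diff checksums between code versions (names hypothetical).
def checksum_record(variables):
    """Map each key variable name to the double-precision sum of its values,
    mimicking the entries written to a verification file."""
    return {name: sum(values) for name, values in variables.items()}

def regression_compare(old_record, new_record):
    """Return the variables whose checksums changed between versions.
    An empty result means the update did not alter the calculation."""
    return sorted(
        name for name in old_record
        if old_record[name] != new_record.get(name)
    )

base = checksum_record({"pressure": [1.0e5, 1.01e5], "void_frac": [0.0, 0.25]})
patched = checksum_record({"pressure": [1.0e5, 1.01e5], "void_frac": [0.0, 0.26]})
print(regression_compare(base, patched))  # ['void_frac']
```

Summing in double precision makes the comparison extremely sensitive: virtually any change in the solution trajectory perturbs at least one checksum.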
Rhodes, Gillian; Jeffery, Linda; Taylor, Libby; Ewing, Louise
2013-11-01
Our ability to discriminate and recognize thousands of faces despite their similarity as visual patterns relies on adaptive, norm-based, coding mechanisms that are continuously updated by experience. Reduced adaptive coding of face identity has been proposed as a neurocognitive endophenotype for autism, because it is found in autism and in relatives of individuals with autism. Autistic traits can also extend continuously into the general population, raising the possibility that reduced adaptive coding of face identity may be more generally associated with autistic traits. In the present study, we investigated whether adaptive coding of face identity decreases as autistic traits increase in an undergraduate population. Adaptive coding was measured using face identity aftereffects, and autistic traits were measured using the Autism-Spectrum Quotient (AQ) and its subscales. We also measured face and car recognition ability to determine whether autistic traits are selectively related to face recognition difficulties. We found that men who scored higher on levels of autistic traits related to social interaction had reduced adaptive coding of face identity. This result is consistent with the idea that atypical adaptive face-coding mechanisms are an endophenotype for autism. Autistic traits were also linked with face-selective recognition difficulties in men. However, there were some unexpected sex differences. In women, autistic traits were linked positively, rather than negatively, with adaptive coding of identity, and were unrelated to face-selective recognition difficulties. These sex differences indicate that autistic traits can have different neurocognitive correlates in men and women and raise the intriguing possibility that endophenotypes of autism can differ in males and females. © 2013 Elsevier Ltd. All rights reserved.
Improvements to the construction of binary black hole initial data
NASA Astrophysics Data System (ADS)
Ossokine, Serguei; Foucart, Francois; Pfeiffer, Harald P.; Boyle, Michael; Szilágyi, Béla
2015-12-01
Construction of binary black hole initial data is a prerequisite for numerical evolutions of binary black holes. This paper reports improvements to the binary black hole initial data solver in the spectral Einstein code, to allow robust construction of initial data for mass ratios above 10:1 and for dimensionless black hole spins above 0.9, while improving efficiency for lower mass ratios and spins. We implement a more flexible domain decomposition, adaptive mesh refinement and an updated method for choosing free parameters. We also introduce a new method to control and eliminate residual linear momentum in initial data for precessing systems, and demonstrate that it eliminates gravitational mode mixing during the evolution. Finally, the new code is applied to construct initial data for hyperbolic scattering and for binaries with very small separation.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-18
...The United States Patent and Trademark Office (Office or USPTO) proposes to align the USPTO's professional responsibility rules with those of most other U.S. jurisdictions by replacing the current Patent and Trademark Office Code of Professional Responsibility, adopted in 1985, based on the 1980 version of the Model Code of Professional Responsibility of the American Bar Association (``ABA''), with new USPTO Rules of Professional Conduct, which are based on the Model Rules of Professional Conduct of the ABA, which were published in 1983, substantially revised in 2003 and updated through 2011. Changes approved by the ABA House of Delegates in August 2012 have not been incorporated in these proposed rules. The Office also proposes to revise the existing procedural rules governing disciplinary investigations and proceedings.
PROMIS (Procurement Management Information System)
NASA Technical Reports Server (NTRS)
1987-01-01
The PROcurement Management Information System (PROMIS) provides both detailed and summary level information on all procurement actions performed within NASA's procurement offices at Marshall Space Flight Center (MSFC). It provides not only on-line access, but also schedules procurement actions, monitors their progress, and updates Forecast Award Dates. Except for a few computational routines coded in FORTRAN, the majority of the system is coded in a high-level language called NATURAL. A relational Data Base Management System called ADABAS is utilized. Certain fields, called descriptors, are set up on each file to allow the selection of records based on a specified value or range of values. The use of like descriptors on different files serves as the link between the files, thus producing a relational data base. Twenty related files are currently being maintained on PROMIS.
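The descriptor mechanism described above, like-named indexed fields linking records across flat files, is essentially a relational join. A minimal sketch, with illustrative field names rather than PROMIS's actual schema:

```python
# Sketch of linking two files through a shared descriptor field
# (field names are illustrative, not taken from PROMIS).
def select(records, descriptor, value):
    """Return the records whose descriptor field matches the given value."""
    return [r for r in records if r.get(descriptor) == value]

def link(file_a, file_b, descriptor):
    """Join two files on a like-named descriptor, merging matched records."""
    return [
        {**a, **b}
        for a in file_a
        for b in select(file_b, descriptor, a.get(descriptor))
    ]

actions = [{"pr_number": "PR-101", "office": "MSFC"}]
awards = [{"pr_number": "PR-101", "forecast_award": "1987-06-01"}]
print(link(actions, awards, "pr_number"))
```

Because the descriptor is indexed in ADABAS, such selections avoid full-file scans; the sketch above shows only the logical linkage, not the indexing.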
Analysis of a Hovering Rotor in Icing Conditions
NASA Technical Reports Server (NTRS)
Narducci, Robert; Kreeger, Richard E.
2012-01-01
A high fidelity analysis method is proposed to evaluate the ice accumulation and the ensuing rotor performance degradation for a helicopter flying through an icing cloud. The process uses computational fluid dynamics (CFD) coupled to a rotorcraft comprehensive code to establish the aerodynamic environment of a trimmed rotor prior to icing. Based on local aerodynamic conditions along the rotor span and accounting for the azimuthal variation, an ice accumulation analysis using NASA's Lewice3D code is made to establish the ice geometry. Degraded rotor performance is quantified by repeating the high fidelity rotor analysis with updates which account for ice shape and mass. The process is applied on a full-scale UH-1H helicopter in hover using data recorded during the Helicopter Icing Flight Test Program.
Cosmic Rays and Their Radiative Processes in Numerical Cosmology
NASA Technical Reports Server (NTRS)
Ryu, Dongsu; Miniati, Francesco; Jones, Tom W.; Kang, Hyesung
2000-01-01
A cosmological hydrodynamic code is described, which includes a routine to compute cosmic ray acceleration and transport in a simplified way. The routine was designed to explicitly follow diffusive acceleration at shocks, and second-order Fermi acceleration and adiabatic losses in smooth flows. Synchrotron cooling of the electron population can also be followed. The updated code is intended to be used to study the properties of nonthermal synchrotron emission and inverse Compton scattering from electron cosmic rays in clusters of galaxies, in addition to the properties of thermal bremsstrahlung emission from hot gas. The results of a test simulation using a grid of 128^3 cells are presented, in which cosmic rays and magnetic field have been treated passively and synchrotron cooling of cosmic ray electrons has not been included.
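For context, the synchrotron cooling that such a routine follows for the electron population has the standard single-particle loss rate (textbook form, not taken from this paper):

```latex
-\left(\frac{dE}{dt}\right)_{\mathrm{sync}}
  = \frac{4}{3}\,\sigma_T\, c\,\beta^2\gamma^2\, U_B,
\qquad U_B = \frac{B^2}{8\pi}
```

where \(\sigma_T\) is the Thomson cross section, \(\gamma\) the electron Lorentz factor and \(U_B\) the magnetic energy density; the quadratic dependence on \(\gamma\) is why cooling preferentially depletes the highest-energy electrons in the simulated population.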