Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-26
... SECURITIES AND EXCHANGE COMMISSION [File No. 500-1] In the Matter of Digital Video Systems, Inc., Geocom Resources, Inc., and GoldMountain Exploration Corp., and Real Data, Inc. (a/k/a Galtech... securities of Digital Video Systems, Inc. because it has not filed any periodic reports since the period...
1984-08-01
Collective Particle Accelerator via Numerical Modeling with the MAGIC Code. Robert J. Barker. August 1984. Final Report for Period 1 April 1984 - 30 September 1984. Prepared for: Scientific... Performing Org. Report Number: MRC/WDC-R
Development of code evaluation criteria for assessing predictive capability and performance
NASA Technical Reports Server (NTRS)
Lin, Shyi-Jang; Barson, S. L.; Sindir, M. M.; Prueger, G. H.
1993-01-01
Computational Fluid Dynamics (CFD), because of its unique ability to predict complex three-dimensional flows, is being applied with increasing frequency in the aerospace industry. Currently, no consistent code validation procedure is applied within the industry. Such a procedure is needed to increase confidence in CFD and reduce risk in the use of these codes as a design and analysis tool. This final contract report defines classifications for three levels of code validation, directly relating the use of CFD codes to the engineering design cycle. Evaluation criteria by which codes are measured and classified are recommended and discussed. Criteria for selecting experimental data against which CFD results can be compared are outlined. A four phase CFD code validation procedure is described in detail. Finally, the code validation procedure is demonstrated through application of the REACT CFD code to a series of cases culminating in a code to data comparison on the Space Shuttle Main Engine High Pressure Fuel Turbopump Impeller.
HZETRN: A heavy ion/nucleon transport code for space radiations
NASA Technical Reports Server (NTRS)
Wilson, John W.; Chun, Sang Y.; Badavi, Forooz F.; Townsend, Lawrence W.; Lamkin, Stanley L.
1991-01-01
The galactic heavy ion transport code (GCRTRN) and the nucleon transport code (BRYNTRN) are integrated into a code package (HZETRN). The code package is computer efficient and capable of operating in an engineering design environment for manned deep space mission studies. The nuclear data set used by the code is discussed including current limitations. Although the heavy ion nuclear cross sections are assumed constant, the nucleon-nuclear cross sections of BRYNTRN with full energy dependence are used. The relation of the final code to the Boltzmann equation is discussed in the context of simplifying assumptions. Error generation and propagation is discussed, and comparison is made with simplified analytic solutions to test numerical accuracy of the final results. A brief discussion of biological issues and their impact on fundamental developments in shielding technology is given.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1995-01-01
This report focuses on the results obtained during the PI's recent sabbatical leave at the Swiss Federal Institute of Technology (ETH) in Zurich, Switzerland, from January 1, 1995 through June 30, 1995. Two projects investigated various properties of TURBO codes, a new form of concatenated coding that achieves near channel capacity performance at moderate bit error rates. The performance of TURBO codes is explained in terms of the code's distance spectrum. These results explain both the near capacity performance of the TURBO codes and the observed 'error floor' for moderate and high signal-to-noise ratios (SNRs). A semester project, entitled 'The Realization of the Turbo-Coding System,' involved a thorough simulation study of the performance of TURBO codes and verified the results claimed by previous authors. A copy of the final report for this project is included as Appendix A. A diploma project, entitled 'On the Free Distance of Turbo Codes and Related Product Codes,' includes an analysis of TURBO codes and an explanation of their remarkable performance. A copy of the final report for this project is included as Appendix B.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-10-01
Huffman codes, comma-free codes, and block codes with shift indicators are important candidate message-compression codes for improving the efficiency of communications systems. This study was undertaken to determine whether these codes could be used to increase the throughput of the fixed very-low-frequency (FVLF) communication system. This application involves the use of compression codes in a channel with errors.
2016-05-04
This final rule will amend the fire safety standards for Medicare and Medicaid participating hospitals, critical access hospitals (CAHs), long-term care facilities, intermediate care facilities for individuals with intellectual disabilities (ICF-IID), ambulatory surgery centers (ASCs), hospices which provide inpatient services, religious non-medical health care institutions (RNHCIs), and programs of all-inclusive care for the elderly (PACE) facilities. Further, this final rule will adopt the 2012 edition of the Life Safety Code (LSC) and eliminate references in our regulations to all earlier editions of the Life Safety Code. It will also adopt the 2012 edition of the Health Care Facilities Code, with some exceptions.
2014-08-06
This final rule updates the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2015 as required by the statute. This final rule finalizes a policy to collect data on the amount and mode (that is, Individual, Concurrent, Group, and Co-Treatment) of therapy provided in the IRF setting according to therapy discipline, revises the list of diagnosis and impairment group codes that presumptively meet the "60 percent rule" compliance criteria, provides a way for IRFs to indicate on the Inpatient Rehabilitation Facility-Patient Assessment Instrument (IRF-PAI) form whether the prior treatment and severity requirements have been met for arthritis cases to presumptively meet the "60 percent rule" compliance criteria, and revises and updates quality measures and reporting requirements under the IRF quality reporting program (QRP). This rule also delays the effective date for the revisions to the list of diagnosis codes that are used to determine presumptive compliance under the "60 percent rule" that were finalized in the FY 2014 IRF PPS final rule and adopts the revisions to the list of diagnosis codes that are used to determine presumptive compliance under the "60 percent rule" that are finalized in this rule. This final rule also addresses the implementation of the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM), for the IRF prospective payment system (PPS), which will be effective when ICD-10-CM becomes the required medical data code set for use on Medicare claims and IRF-PAI submissions.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-05
...This final rule adopts the standard for a national unique health plan identifier (HPID) and establishes requirements for the implementation of the HPID. In addition, it adopts a data element that will serve as an other entity identifier (OEID), or an identifier for entities that are not health plans, health care providers, or individuals, but that need to be identified in standard transactions. This final rule also specifies the circumstances under which an organization covered health care provider must require certain noncovered individual health care providers who are prescribers to obtain and disclose a National Provider Identifier (NPI). Lastly, this final rule changes the compliance date for the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) for diagnosis coding, including the Official ICD-10-CM Guidelines for Coding and Reporting, and the International Classification of Diseases, 10th Revision, Procedure Coding System (ICD-10-PCS) for inpatient hospital procedure coding, including the Official ICD-10-PCS Guidelines for Coding and Reporting, from October 1, 2013 to October 1, 2014.
System Design for FEC in Aeronautical Telemetry
2012-03-12
rate punctured convolutional codes for soft decision Viterbi...below follows that given in [8]. The final coding rate of exactly 2/3 is achieved by puncturing the rate-1/2 code as follows. We begin with the buffer c1...concatenated convolutional code (SCCC). The contributions of this paper are at the system-design level. One major contribution is the design of an SCCC code
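The puncturing step the snippet describes (deleting coded bits from a rate-1/2 encoder's two output streams to reach rate 2/3) can be sketched as follows; the puncturing pattern here is illustrative, not the one specified in the paper:

```python
def puncture(c1, c2, pattern=((1, 1), (1, 0))):
    """Puncture the two output streams of a rate-1/2 convolutional
    encoder down to rate 2/3: per period of 2 input bits the encoder
    emits 4 coded bits, and the pattern keeps only 3 of them.
    This particular pattern is illustrative, not the one in the paper."""
    out = []
    period = len(pattern[0])
    for t in range(len(c1)):
        if pattern[0][t % period]:
            out.append(c1[t])   # keep/drop stream-1 bit at time t
        if pattern[1][t % period]:
            out.append(c2[t])   # keep/drop stream-2 bit at time t
    return out

# 4 input bits -> 8 coded bits from the rate-1/2 encoder -> 6 sent bits,
# so the overall rate is 4/6 = 2/3
sent = puncture([0, 1, 1, 0], [1, 1, 0, 1])
assert sent == [0, 1, 1, 1, 0, 0]
```

At the receiver, the deleted positions are re-inserted as zero-confidence (erased) soft values before Viterbi decoding, which is why puncturing pairs naturally with soft-decision decoders.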
Final Evaluation of MIPS M/500
1987-11-01
recognizing common subexpressions by changing the code to read: acker(n, m) { if (n == 0) return m+1; return acker(n-1, m == 0 ? 1 : acker(n, m-1)); } ...the total code... Software Engineering Institute, Pittsburgh, PA 15213. Title: Final Evaluation of MIPS M/500. Personal author(s): Daniel V
A Survey of Progress in Coding Theory in the Soviet Union. Final Report.
ERIC Educational Resources Information Center
Kautz, William H.; Levitt, Karl N.
The results of a comprehensive technical survey of all published Soviet literature in coding theory and its applications--over 400 papers and books appearing before March 1967--are described in this report. Noteworthy Soviet contributions are discussed, including codes for the noiseless channel, codes that correct asymmetric errors, decoding for…
Current and anticipated uses of thermal-hydraulic codes in NFI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsuda, K.; Takayasu, M.
1997-07-01
This paper presents the thermal-hydraulic codes currently used in NFI for LWR fuel development and licensing applications, including transient and design-basis accident analyses of LWR plants. The current status of the codes is described in terms of code capability, modeling features, and experience of code application related to fuel development and licensing. Finally, the anticipated use of future thermal-hydraulic codes in NFI is briefly discussed.
Amendments to excepted benefits. Final rules.
2014-10-01
This document contains final regulations that amend the regulations regarding excepted benefits under the Employee Retirement Income Security Act of 1974, the Internal Revenue Code (the Code), and the Public Health Service Act. Excepted benefits are generally exempt from the health reform requirements that were added to those laws by the Health Insurance Portability and Accountability Act and the Patient Protection and Affordable Care Act. In addition, eligibility for excepted benefits does not preclude an individual from eligibility for a premium tax credit under section 36B of the Code if an individual chooses to enroll in coverage under a Qualified Health Plan through an Affordable Insurance Exchange. These regulations finalize some but not all of the proposed rules with minor modifications; additional guidance on limited wraparound coverage is forthcoming.
Enhancement of the Probabilistic CEramic Matrix Composite ANalyzer (PCEMCAN) Computer Code
NASA Technical Reports Server (NTRS)
Shah, Ashwin
2000-01-01
This report is the final technical report for Order No. C-78019-J, "Enhancement of the Probabilistic Ceramic Matrix Composite Analyzer (PCEMCAN) Computer Code." The scope of the enhancement relates to including the probabilistic evaluation of the D-Matrix terms in the MAT2 and MAT9 material property cards (available in the CEMCAN code) for MSC/NASTRAN. Technical activities performed during the period June 1, 1999 through September 3, 1999 are summarized, and the final version of the enhanced PCEMCAN code and revisions to the User's Manual are delivered with this report. The activities performed were discussed with the NASA Project Manager during the performance period. The enhanced capabilities are demonstrated using sample problems.
77 FR 12202 - Public Inspection of Material Relating to Tax-Exempt Organizations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-29
...This document contains final regulations pertaining to the public inspection of material relating to tax-exempt organizations and final regulations pertaining to the public inspection of written determinations and background file documents. These regulations are necessary to clarify rules relating to information and materials made available by the IRS for public inspection under the Internal Revenue Code (Code). The final regulations affect certain organizations exempt from Federal income tax, organizations that were exempt but are no longer exempt from Federal income tax, and organizations that were denied tax-exempt status.
Code of Federal Regulations, 2013 CFR
2013-07-01
... ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.601... means a State law or local building code or similar ordinance, or part thereof, that establishes... designee. Certification of equivalency means a final certification that a code meets or exceeds the minimum...
Code of Federal Regulations, 2012 CFR
2012-07-01
... ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.601... means a State law or local building code or similar ordinance, or part thereof, that establishes... designee. Certification of equivalency means a final certification that a code meets or exceeds the minimum...
Code of Federal Regulations, 2014 CFR
2014-07-01
... ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.601... means a State law or local building code or similar ordinance, or part thereof, that establishes... designee. Certification of equivalency means a final certification that a code meets or exceeds the minimum...
Analysis of Phenix end-of-life natural convection test with the MARS-LMR code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeong, H. Y.; Ha, K. S.; Lee, K. L.
The end-of-life test of the Phenix reactor performed by the CEA provided an opportunity to obtain reliable and valuable test data for the validation and verification of an SFR system analysis code. KAERI joined this international program for the analysis of the Phenix end-of-life natural circulation test, coordinated by the IAEA, in 2008. The main objectives of this study were to evaluate the capability of the existing SFR system analysis code MARS-LMR and to identify any limitations of the code. The analysis was performed in three stages: pre-test analysis, blind post-test analysis, and final post-test analysis. In the pre-test analysis, the design conditions provided by the CEA were used to obtain a prediction of the test. The blind post-test analysis was based on the test conditions measured during the tests, but the test results were not provided by the CEA. The final post-test analysis was performed to predict the test results as accurately as possible by improving the previous modeling of the test. Based on the pre-test and blind-test analyses, the modeling of heat structures in the hot pool and cold pool, steel structures in the core, heat loss from the roof and vessel, and the flow path at the core outlet was reinforced in the final analysis. The results of the final post-test analysis can be characterized in three phases. In the early phase, MARS-LMR simulated the heat-up process correctly owing to the enhanced heat structure modeling. In the mid phase, before the opening of the SG casing, the code successfully reproduced the decrease of core outlet temperature. Finally, in the later phase, the increase of heat removal from the opening of the SG casing was well predicted with the MARS-LMR code. (authors)
Recent advances in lossless coding techniques
NASA Astrophysics Data System (ADS)
Yovanof, Gregory S.
Current lossless techniques are reviewed with reference to both sequential data files and still images. Two major groups of sequential algorithms, dictionary and statistical techniques, are discussed. In particular, attention is given to Lempel-Ziv coding, Huffman coding, and arithmetic coding. The subject of lossless compression of imagery is briefly discussed. Finally, examples of practical implementations of lossless algorithms and some simulation results are given.
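Where the survey contrasts dictionary and statistical techniques, a statistical coder such as Huffman's can be sketched in a few lines; this is a generic textbook construction, not code from the survey, and the names are our own:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a prefix-free (Huffman) code from symbol frequencies.
    Generic textbook construction; not code from the survey."""
    # Heap entries are (frequency, tiebreak, tree); a tree is either
    # a symbol or a pair of subtrees, merged lowest-frequency first.
    heap = [(f, i, s) for i, (s, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next_id, (t1, t2)))
        next_id += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):          # internal node: recurse
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                                # leaf: assign codeword
            codes[tree] = prefix or "0"      # one-symbol alphabet case
    walk(heap[0][2], "")
    return codes

codes = huffman_code("abracadabra")
# the most frequent symbol ('a', 5 of 11) gets a shortest codeword
assert all(len(codes["a"]) <= len(codes[s]) for s in codes)
```

Because the codewords are leaves of a binary tree, no codeword is a prefix of another, so the encoded bitstream can be decoded unambiguously without separators.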
NR-code: Nonlinear reconstruction code
NASA Astrophysics Data System (ADS)
Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming
2018-04-01
NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.
ERIC Educational Resources Information Center
Ogihara, Saeko
2010-01-01
This dissertation is a typological study of verb-final languages, the purpose of which is to examine various grammatical phenomena in verb-final languages to discover whether there are correlations between the final position of the verb and other aspects of grammar. It examines how finality of the verb interacts with argument coding in simple…
Long distance quantum communication with quantum Reed-Solomon codes
NASA Astrophysics Data System (ADS)
Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang; Jianggroup Team
We study the construction of quantum Reed-Solomon codes from classical Reed-Solomon codes and show that they achieve the capacity of the quantum erasure channel for multi-level quantum systems. We extend the application of quantum Reed-Solomon codes to long distance quantum communication, investigate the local resource overhead needed for the functioning of one-way quantum repeaters with these codes, and numerically identify the parameter regime where these codes perform better than the known quantum polynomial codes and quantum parity codes. Finally, we discuss the implementation of these codes into time-bin photonic states of qubits and qudits, respectively, and optimize the performance for one-way quantum repeaters.
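The classical Reed-Solomon erasure property the authors build on (any k of n evaluations of a degree-(k-1) polynomial determine it) can be illustrated over a small prime field. This sketch is purely classical and illustrative, with the field size and evaluation points chosen for readability; it does not capture the quantum constructions in the abstract:

```python
P = 257  # small prime field GF(257); an illustrative choice

def rs_encode(msg, n):
    """Encode k message symbols as evaluations of the degree-(k-1)
    message polynomial at the points x = 1..n (a classical RS codeword)."""
    return [sum(m * pow(x, i, P) for i, m in enumerate(msg)) % P
            for x in range(1, n + 1)]

def rs_recover(points, k):
    """Recover the k message coefficients from any k surviving (x, y)
    pairs by Lagrange interpolation, i.e. correct up to n - k erasures."""
    xs, ys = zip(*points)
    coeffs = [0] * k
    for j in range(k):
        basis = [1]          # coefficients of prod_{m != j} (x - xs[m])
        denom = 1            # prod_{m != j} (xs[j] - xs[m])
        for m in range(k):
            if m == j:
                continue
            denom = denom * (xs[j] - xs[m]) % P
            basis = [(b - xs[m] * a) % P
                     for a, b in zip(basis + [0], [0] + basis)]
        inv = pow(denom, P - 2, P)  # modular inverse via Fermat
        for i in range(k):
            coeffs[i] = (coeffs[i] + ys[j] * basis[i] * inv) % P
    return coeffs

msg = [7, 3, 1]                      # k = 3 message symbols
cw = rs_encode(msg, 6)               # n = 6 codeword symbols
survivors = [(1, cw[0]), (4, cw[3]), (6, cw[5])]  # 3 erasures survived
assert rs_recover(survivors, 3) == msg
```

This "any k of n suffice" behavior is exactly what makes RS codes capacity-achieving on the erasure channel, the property the quantum construction inherits.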
Low-Density Parity-Check (LDPC) Codes Constructed from Protographs
NASA Astrophysics Data System (ADS)
Thorpe, J.
2003-08-01
We introduce a new class of low-density parity-check (LDPC) codes constructed from a template called a protograph. The protograph serves as a blueprint for constructing LDPC codes of arbitrary size whose performance can be predicted by analyzing the protograph. We apply standard density evolution techniques to predict the performance of large protograph codes. Finally, we use a randomized search algorithm to find good protographs.
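The protograph-as-blueprint idea can be illustrated by the standard circulant lifting used for quasi-cyclic LDPC codes; the base matrix and random shifts below are illustrative choices, not the protographs found by the paper's search algorithm:

```python
import random

def lift_protograph(base, Z, seed=0):
    """Lift a protograph base matrix into a quasi-cyclic LDPC
    parity-check matrix: each 1 in the base matrix becomes a Z x Z
    cyclic shift of the identity, each 0 a Z x Z all-zero block.
    The base matrix used below is illustrative, not a standard design."""
    rng = random.Random(seed)
    m, n = len(base), len(base[0])
    H = [[0] * (n * Z) for _ in range(m * Z)]
    for r in range(m):
        for c in range(n):
            if base[r][c]:
                shift = rng.randrange(Z)
                for k in range(Z):
                    H[r * Z + k][c * Z + (k + shift) % Z] = 1
    return H

base = [[1, 1, 1, 0],
        [0, 1, 1, 1]]
H = lift_protograph(base, Z=4)
# each lifted row inherits the weight of its protograph row (3 here)
assert all(sum(row) == 3 for row in H)
```

The lifted code keeps the protograph's degree profile for every block size Z, which is why density evolution on the small protograph predicts the performance of arbitrarily large lifted codes.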
Building Energy Codes: Policy Overview and Good Practices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, Sadie
2016-02-19
Globally, 32% of total final energy consumption is attributed to the building sector. To reduce energy consumption, energy codes set minimum energy efficiency standards for the building sector. With effective implementation, building energy codes can support energy cost savings and complementary benefits associated with electricity reliability, air quality improvement, greenhouse gas emission reduction, increased comfort, and economic and social development. This policy brief seeks to support building code policymakers and implementers in designing effective building code programs.
ERIC Educational Resources Information Center
Nolan, Carson Y., Ed.
The second of a three-volume final report presents results of three studies on indexing systems for tape recordings used by blind persons. Study I is explained to have compared five tonal index codes in order to identify a code that required minimal display time, that had easily discriminable characters, and that could be easily learned. Results…
The Gift Code User Manual. Volume I. Introduction and Input Requirements
1975-07-01
The GIFT Code User Manual; Volume I. Introduction and Input Requirements. Type of report & period covered: Final. ...The GIFT code is a FORTRAN computer program. The basic input to the GIFT code is data called
A GPL Relativistic Hydrodynamical Code
NASA Astrophysics Data System (ADS)
Olvera, D.; Mendoza, S.
We are currently building a free (in the sense of a GNU GPL license) 2DRHD code to be used for different astrophysical situations. Our final target will be to include strong gravitational fields and magnetic fields. We intend to form a large group of developers, as is usually done for GPL codes.
International Code of Marketing of Breast-Milk Substitutes.
ERIC Educational Resources Information Center
World Health Organization, Geneva (Switzerland).
The World Health Organization's final draft of the "International Code of Marketing of Breast-milk Substitutes" is presented in its entirety. Recognizing that breast-feeding is an unequalled way of providing ideal food for the healthy growth and development of infants, the Code's aim is to contribute to the safe and adequate nutrition of…
MIFT: GIFT Combinatorial Geometry Input to VCS Code
1977-03-01
BRL Report No. 1967. MIFT: GIFT Combinatorial Geometry Input to VCS Code. Albert E... Type of report & period covered: Final. ...A version of the Vehicle Code System (VCS) called MORSE was modified to accept the GIFT combinatorial geometry package. GIFT, as opposed to the geometry package
78 FR 15011 - Environmental Impacts Statements; Notice of Availability
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-08
..., Final EIS, DOE, TX, W.A. Parish Post-Combustion CO2 Capture and Sequestration Project, Review Period.... 20130055, Final EIS, NPS, IA, Effigy Mounds National Monument Final General Management Plan, Review Period...] BILLING CODE 6560-50-P ...
On a Mathematical Theory of Coded Exposure
2014-08-01
formulae that give the MSE and SNR of the final crisp image 1. Assumes the Shannon-Whittaker framework that i) requires band limited (with a fre...represents the ideal crisp image, i.e., the image that one would observe if there were no noise whatsoever, no motion, and a perfect optical system...discrete. In addition, the image obtained by a coded exposure camera must undergo deconvolution to obtain the final crisp image. Note that the
Soft-decision decoding techniques for linear block codes and their error performance analysis
NASA Technical Reports Server (NTRS)
Lin, Shu
1996-01-01
The first paper presents a new minimum-weight trellis-based soft-decision iterative decoding algorithm for binary linear block codes. The second paper derives an upper bound on the probability of block error for multilevel concatenated codes (MLCC). The bound evaluates the difference in performance for different decompositions of some codes. The third paper investigates the bit error probability for maximum likelihood decoding of binary linear codes. The fourth and final paper in this report concerns itself with the construction of multilevel concatenated block modulation codes using a multilevel concatenation scheme for the frequency non-selective Rayleigh fading channel.
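For very small codes, soft-decision maximum-likelihood decoding of the kind analyzed in these papers can be done by brute-force correlation. This sketch uses a [7,4] Hamming code as a stand-in illustration; it is not the minimum-weight trellis algorithm of the first paper:

```python
from itertools import product

# Generator matrix of a [7,4] Hamming code (one common systematic form;
# chosen here only as a small illustrative code)
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def codewords():
    """Enumerate all 2^4 codewords c = m G (mod 2)."""
    for msg in product([0, 1], repeat=4):
        yield tuple(sum(m * g for m, g in zip(msg, col)) % 2
                    for col in zip(*G))

def soft_ml_decode(r):
    """Brute-force soft-decision ML decoding: with BPSK mapping
    0 -> +1, 1 -> -1 on an AWGN channel, the ML codeword maximizes
    the correlation sum_i r_i * (1 - 2 c_i) with the received reals."""
    return max(codewords(),
               key=lambda c: sum(ri * (1 - 2 * ci) for ri, ci in zip(r, c)))

# Noisy reception of the all-zero codeword: coordinate 2 flipped sign
# but with low reliability, so soft ML decoding returns all-zeros.
r = [0.9, 1.1, -0.2, 0.8, 1.0, 0.7, 1.2]
assert soft_ml_decode(r) == (0, 0, 0, 0, 0, 0, 0)
```

Exhaustive correlation scales as 2^k and is only viable for toy codes; the trellis-based algorithms in the report exist precisely to avoid this enumeration.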
Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing
2008-01-01
complements of one another and the DNA duplex formed is a Watson-Crick (WC) duplex. However, there are many instances when the formation of non-WC...that the user's requirements for probe selection are met based on the Watson-Crick probe locality within a target. The second type, called...AFRL-RI-RS-TR-2007-288, Final Technical Report, January 2008: Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing
76 FR 11339 - Update to NFPA 101, Life Safety Code, for State Home Facilities
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-02
... DEPARTMENT OF VETERANS AFFAIRS 38 CFR Part 51 RIN 2900-AN59 Update to NFPA 101, Life Safety Code..., Life Safety Code. The change is designed to assure that State Home facilities meet current industry- wide standards regarding life safety and fire safety. DATES: Effective Date: This final rule is...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-28
... e-Tag Information to Commission Staff; Notice Specifying webRegistry Code In Order No. 771,\\1\\ the... stated that, ``following issuance of this Final Rule and the Commission's registration in the OATI web... in the Purchasing-Seller Entity section of OATI webRegistry. This code should be used to designate...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-16
... Interest and Penalty Suspension Provisions Under Section 6404(g) of the Internal Revenue Code AGENCY.... SUMMARY: This document contains final regulations under section 6404(g)(2)(E) of the Internal Revenue Code... Procedure and Administration Regulations (26 CFR part 301) by adding rules under section 6404(g) relating to...
Nonlinear wave vacillation in the atmosphere
NASA Technical Reports Server (NTRS)
Antar, Basil N.
1987-01-01
The problem of vacillation in a baroclinically unstable flow field is studied through the time evolution of a single nonlinearly unstable wave. To this end, a computer code is being developed to solve numerically for the time evolution of the amplitude of such a wave. The final working code will be the end product of the development of a hierarchy of codes of increasing complexity. The first code in this series has been completed and is undergoing several diagnostic analyses to verify its validity. The development of this code is detailed.
Coding tools investigation for next generation video coding based on HEVC
NASA Astrophysics Data System (ADS)
Chen, Jianle; Chen, Ying; Karczewicz, Marta; Li, Xiang; Liu, Hongbin; Zhang, Li; Zhao, Xin
2015-09-01
The new state-of-the-art video coding standard, H.265/HEVC, was finalized in 2013 and achieves roughly 50% bit rate saving compared to its predecessor, H.264/MPEG-4 AVC. This paper provides evidence that there is still potential for further coding efficiency improvements. A brief overview of HEVC is first given in the paper. Then, our improvements to each main module of HEVC are presented. For instance, the recursive quadtree block structure is extended to support larger coding units and transform units. The motion information prediction scheme is improved by advanced temporal motion vector prediction, which inherits the motion information of each small block within a large block from a temporal reference picture. Cross-component prediction with a linear prediction model improves intra prediction, and overlapped block motion compensation improves the efficiency of inter prediction. Furthermore, coding of both intra and inter prediction residuals is improved by an adaptive multiple transform technique. Finally, in addition to the deblocking filter and SAO, an adaptive loop filter is applied to further enhance reconstructed picture quality. This paper describes the above-mentioned techniques in detail and evaluates their coding performance benefits under the common test conditions used during HEVC development. The simulation results show that significant performance improvement over the HEVC standard can be achieved, especially for high resolution video materials.
CORESAFE: A Formal Approach against Code Replacement Attacks on Cyber Physical Systems
2018-04-19
AFRL-AFOSR-JP-TR-2018-0035. CORESAFE: A Formal Approach against Code Replacement Attacks on Cyber Physical Systems. Sandeep Shukla, Indian Institute of Technology Kanpur, India. Final report for AOARD Grant FA2386-16-1-4099.
A VHDL Interface for Altera Design Files
1990-01-01
this requirement dictated that all prototype products developed during this research would have to mirror standard VHDL code. In fact, the final... product would have to meet the syntactic and semantic requirements of standard VHDL. The coding style used to create the transformation program was the...
Admiralty Inlet Advanced Turbulence Measurements: final data and code archive
Kilcher, Levi (ORCID:0000000183851131); Thomson, Jim (ORCID:0000000289290088); Harding, Samuel
2011-02-01
Data and code used in Kilcher, Thomson, Harding, and Nylund (2017), "Turbulence Measurements from Compliant Moorings - Part II: Motion Correction," doi: 10.1175/JTECH-D-16-0213.1, that are not already available in a public location. The links point to the Python source code used in the publication; all other files are source data used in the publication.
Schütz, U; Reichel, H; Dreinhöfer, K
2007-01-01
We introduce a grouping system for clinical practice that allows the separation of DRG coding into specific orthopaedic groups based on anatomic regions, operative procedures, therapeutic interventions, and morbidity-equivalent diagnosis groups. With this, a differentiated, aim-oriented analysis of internal DRG data becomes possible. The group-specific difference in coding quality between the DRG groups after primary coding by the orthopaedic surgeon and final coding by medical controlling is analysed. In a consecutive series of 1600 patients, parallel documentation and group-specific comparison of the relevant DRG parameters were carried out in every case after primary and final coding. In the analysis of the group-specific share of additional case-mix coding, the group "spine surgery" dominated, closely followed by the groups "arthroplasty" and "surgery due to infection, tumours, diabetes". Altogether, additional cost-weight-relevant coding was necessary most frequently in the latter group (84%), followed by the group "spine surgery" (65%). In DRGs representing conservative orthopaedic treatment, documented procedures had almost no influence on the cost weight. The introduced system of case-group analysis in internal DRG documentation can lead to the detection of specific problems in primary coding and cost-weight-relevant changes in the case mix. As an instrument for internal process control in the orthopaedic field, it can serve as a communicative interface between an economically oriented classification of hospital performance and specific problem solving by the medical staff involved in department management.
NASA Technical Reports Server (NTRS)
Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.
2016-01-01
With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for surface-to-surface radiation exchange in complex geometries is critical. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute geometric view factors for radiation problems involving multiple surfaces. Verification of the code's radiation capabilities and results of a code-to-code comparison are presented. Finally, a demonstration case of a two-dimensional ablating cavity with enclosure radiation accounting for a changing geometry is shown.
Code of Ethics for Electrical Engineers
NASA Astrophysics Data System (ADS)
Matsuki, Junya
The Institute of Electrical Engineers of Japan (IEEJ) has recently established rules of practice for its members, based on its code of ethics enacted in 1998. In this paper, first, the characteristics of the IEEJ 1998 ethical code are explained in detail, compared to the ethical codes of other fields of engineering. Secondly, the contents which shall be included in a modern code of ethics for electrical engineers are discussed. Thirdly, the newly established rules of practice and the modified code of ethics are presented. Finally, results are shown from questionnaires on the new ethical code and rules answered on May 23, 2007, by 51 electrical and electronic engineering students of the University of Fukui.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vance, J.N.; Holderness, J.H.; James, D.W.
1992-12-01
Waste stream scaling factors based on sampling programs are vulnerable to one or more of the following factors: sample representativeness, analytic accuracy, and measurement sensitivity. As an alternative to sample analyses or as a verification of the sampling results, this project proposes the use of the RADSOURCE code, which accounts for the release of fuel-source radionuclides. Once the release rates of these nuclides from fuel are known, the code develops scaling factors for waste streams based on easily measured Cobalt-60 (Co-60) and Cesium-137 (Cs-137). The project team developed mathematical models to account for the appearance rate of 10CFR61 radionuclides in reactor coolant. They based these models on the chemistry and nuclear physics of the radionuclides involved. Next, they incorporated the models into a computer code that calculates plant waste stream scaling factors based on reactor coolant gamma-isotopic data. Finally, the team performed special sampling at 17 reactors to validate the models in the RADSOURCE code.
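The scaling-factor idea described above can be sketched in a few lines: a hard-to-measure 10CFR61 nuclide is estimated from an easily measured gamma emitter via a plant-specific ratio. The nuclide names and activity values below are hypothetical illustrations, not output of the RADSOURCE code.

```python
def scaling_factor(nuclide_activity: float, key_activity: float) -> float:
    """Ratio of a hard-to-measure nuclide to its key gamma emitter."""
    return nuclide_activity / key_activity

# Hypothetical reactor-coolant activities (Bq/g); values are made up.
coolant = {"Co-60": 4.0e2, "Ni-63": 8.0e2, "Cs-137": 1.5e2, "Tc-99": 3.0e-1}

sf_ni63 = scaling_factor(coolant["Ni-63"], coolant["Co-60"])   # activation product keyed to Co-60
sf_tc99 = scaling_factor(coolant["Tc-99"], coolant["Cs-137"])  # fission product keyed to Cs-137

# Apply to a waste stream where only the key nuclides were measured.
waste_co60, waste_cs137 = 1.0e3, 2.0e2
est_ni63 = sf_ni63 * waste_co60    # estimated Ni-63 in the waste stream
est_tc99 = sf_tc99 * waste_cs137   # estimated Tc-99 in the waste stream
print(est_ni63, est_tc99)
```

The point of the approach is that once the ratios are fixed by the fuel-release models, only the two gamma emitters need routine measurement.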
2005-03-01
codes speed up consumer shopping, package shipping, and inventory tracking. RFID offers many advantages over bar codes, as the table below shows...sunlight” (Accenture, 2001, p. 4). Finally, one of the most significant advantages of RFID is the advent of anti-collision. Anti-collision allows an...RFID reader to read and/or write to multiple tags at one time, which is not possible for bar codes. Despite the many advantages of RFID over bar codes
EVALUATION OF AN INDIVIDUALLY PACED COURSE FOR AIRBORNE RADIO CODE OPERATORS. FINAL REPORT.
ERIC Educational Resources Information Center
BALDWIN, ROBERT O.; JOHNSON, KIRK A.
In this study, comparisons were made between an individually paced version of the Airborne Radio Code Operator (ARCO) course and two versions of the course in which the students progressed at a fixed pace. The ARCO course is a Class C school in which the student learns to send and receive military messages using the International Morse Code. The…
NASA Technical Reports Server (NTRS)
Cartier, D. E.
1973-01-01
A convolutional coding theory is given for the IME and the Heliocentric spacecraft. The amount of coding gain needed by the mission is determined. Recommendations are given for an encoder/decoder system to provide the gain along with an evaluation of the impact of the system on the space network in terms of costs and complexity.
2000-12-21
NASA is issuing new regulations entitled "International Space Station Crew," to implement certain provisions of the International Space Station (ISS) Intergovernmental Agreement (IGA) regarding ISS crewmembers' observance of an ISS Code of Conduct.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-19
...This document contains final regulations that provide guidance on the application of sections 162(a) and 263(a) of the Internal Revenue Code (Code) to amounts paid to acquire, produce, or improve tangible property. The final regulations clarify and expand the standards in the current regulations under sections 162(a) and 263(a). These final regulations replace and remove temporary regulations under sections 162(a) and 263(a) and withdraw proposed regulations that cross referenced the text of those temporary regulations. This document also contains final regulations under section 167 regarding accounting for and retirement of depreciable property and final regulations under section 168 regarding accounting for property under the Modified Accelerated Cost Recovery System (MACRS) other than general asset accounts. The final regulations will affect all taxpayers that acquire, produce, or improve tangible property. These final regulations do not finalize or remove the 2011 temporary regulations under section 168 regarding general asset accounts and disposition of property subject to section 168, which are addressed in the notice of proposed rulemaking on this subject in the Proposed Rules section in this issue of the Federal Register.
Micromagnetic Code Development of Advanced Magnetic Structures Final Report CRADA No. TC-1561-98
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cerjan, Charles J.; Shi, Xizeng
The specific goals of this project were to: Further develop the previously written micromagnetic code DADIMAG (DOE code release number 980017); Validate the code. The resulting code was expected to be more realistic and useful for simulations of magnetic structures of specific interest to Read-Rite programs. We also planned to further develop the code for use in internal LLNL programs. This project complemented LLNL CRADA TC-840-94 between LLNL and Read-Rite, which allowed for simulations of the advanced magnetic head development completed under the CRADA. TC-1561-98 was effective concurrently with LLNL non-exclusive copyright license (TL-1552-98) to Read-Rite for DADIMAG Version 2 executable code.
A Spanish version for the new ERA-EDTA coding system for primary renal disease.
Zurriaga, Óscar; López-Briones, Carmen; Martín Escobar, Eduardo; Saracho-Rotaeche, Ramón; Moina Eguren, Íñigo; Pallardó Mateu, Luis; Abad Díez, José María; Sánchez Miret, José Ignacio
2015-01-01
The European Renal Association and the European Dialysis and Transplant Association (ERA-EDTA) have issued a new English-language coding system for primary kidney disease (PKD) aimed at solving the problems that were identified in the list of "Primary renal diagnoses" that has been in use for over 40 years. In the context of the Registro Español de Enfermos Renales (Spanish Registry of Renal Patients, [REER]), a translation and adaptation of terms, definitions and notes for the new ERA-EDTA codes was needed to help those who have Spanish as their working language when using such codes. Bilingual nephrologists contributed a professional translation and were involved in a terminological adaptation process, which included a number of phases to contrast translation outputs. Codes, paragraphs, definitions and diagnostic criteria were reviewed, and the agreements and disagreements that arose for each term were labelled. Finally, the version accepted by a majority of reviewers was agreed upon. Wide agreement was reached in the first review phase, with only 5 points of discrepancy remaining, which were resolved in the final phase. Translation and adaptation into Spanish represent an improvement that will help introduce and use the new coding system for PKD, as it can help reduce the time devoted to coding and shorten health workers' period of adaptation to the new codes. Copyright © 2015 The Authors. Published by Elsevier España, S.L.U. All rights reserved.
A program for undergraduate research into the mechanisms of sensory coding and memory decay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calin-Jageman, R J
This is the final technical report for this DOE project, entitled "A program for undergraduate research into the mechanisms of sensory coding and memory decay". The report summarizes progress on the three research aims: 1) to identify physiological and genetic correlates of long-term habituation, 2) to understand mechanisms of olfactory coding, and 3) to foster a world-class undergraduate neuroscience program. Progress on the first aim has enabled comparison of learning-regulated transcripts across closely related learning paradigms and species, and results suggest that only a small core of transcripts serve truly general roles in long-term memory. Progress on the second aim has enabled testing of several mutant phenotypes for olfactory behaviors, and results show that responses are not fully consistent with the combinatorial coding hypothesis. Finally, 14 undergraduate students participated in this research, the neuroscience program attracted extramural funding, and we completed a successful summer program to ease transitions for community-college students into 4-year colleges to pursue STEM fields.
Rapid Prediction of Unsteady Three-Dimensional Viscous Flows in Turbopump Geometries
NASA Technical Reports Server (NTRS)
Dorney, Daniel J.
1998-01-01
A program is underway to improve the efficiency of a three-dimensional Navier-Stokes code and generalize it for nozzle and turbopump geometries. Code modifications have included the implementation of parallel processing software, incorporation of new physical models and generalization of the multiblock capability. The final report contains details of code modifications, numerical results for several nozzle and turbopump geometries, and the implementation of the parallelization software.
Model-Driven Engineering: Automatic Code Generation and Beyond
2015-03-01
and Weblogic as well as cloud environments such as Microsoft Azure and Amazon Web Services®. Finally, while the generated code has dependencies on...code generation in the context of the full system lifecycle from development to sustainment. Acquisition programs in government or large commercial...Acquirers are concerned with the full system lifecycle, and they need confidence that the development methods will enable the system to meet the functional
Cantwell, Kate; Morgans, Amee; Smith, Karen; Livingston, Michael; Dietze, Paul
2014-02-01
This paper aims to examine whether an adaptation of the International Classification of Disease (ICD) coding system can be applied retrospectively to final paramedic assessment data in an ambulance dataset with a view to developing more fine-grained, clinically relevant case definitions than are available through point-of-call data. Over 1.2 million case records were extracted from the Ambulance Victoria data warehouse. Data fields included dispatch code, cause (CN) and final primary assessment (FPA). Each FPA was converted to an ICD-10-AM code using word matching or best fit. ICD-10-AM codes were then converted into Major Diagnostic Categories (MDC). CN was aligned with the ICD-10-AM codes for external cause of morbidity and mortality. The most accurate results were obtained when ICD-10-AM codes were assigned using information from both FPA and CN. Comparison of cases coded as unconscious at point-of-call with the associated paramedic assessment highlighted the extra clinical detail obtained when paramedic assessment data are used. Ambulance paramedic assessment data can be aligned with ICD-10-AM and MDC with relative ease, allowing retrospective coding of large datasets. Coding of ambulance data using ICD-10-AM allows for comparison of not only ambulance service users but also with other population groups. WHAT IS KNOWN ABOUT THE TOPIC? There is no reliable and standard coding and categorising system for paramedic assessment data contained in ambulance service databases. WHAT DOES THIS PAPER ADD? This study demonstrates that ambulance paramedic assessment data can be aligned with ICD-10-AM and MDC with relative ease, allowing retrospective coding of large datasets. Representation of ambulance case types using ICD-10-AM-coded information obtained after paramedic assessment is more fine grained and clinically relevant than point-of-call data, which uses caller information before ambulance attendance. WHAT ARE THE IMPLICATIONS FOR PRACTITIONERS? 
This paper describes a model of coding using an internationally recognised standard coding and categorising system to support analysis of paramedic assessment. Ambulance data coded using ICD-10-AM allows for reliable reporting and comparison within the prehospital setting and across the healthcare industry.
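The mapping step described above (each free-text final primary assessment converted to an ICD-10-AM code "using word matching or best fit") can be sketched as a small lookup with a fuzzy fallback. The code table and the `difflib` fallback are illustrative assumptions, not the study's actual mapping rules.

```python
import difflib

# Toy fragment of an assessment-to-ICD-10-AM table (hypothetical entries).
ICD_LOOKUP = {
    "chest pain": "R07.4",
    "fracture of femur": "S72.9",
    "asthma": "J45.9",
    "unconscious": "R40.2",
}

def assign_icd(final_assessment: str) -> str:
    """Exact word match first, then a 'best fit' fuzzy match, else unknown."""
    text = final_assessment.lower().strip()
    if text in ICD_LOOKUP:                       # word matching
        return ICD_LOOKUP[text]
    close = difflib.get_close_matches(text, list(ICD_LOOKUP), n=1, cutoff=0.6)
    return ICD_LOOKUP[close[0]] if close else "R69"  # best fit, else "unknown cause"

print(assign_icd("Chest pain"))        # exact match
print(assign_icd("fractured femur"))   # best-fit match
```

Once every record carries an ICD-10-AM code, rolling codes up to Major Diagnostic Categories is a second, purely tabular mapping.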
Issues and opportunities: beam simulations for heavy ion fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman, A
1999-07-15
UCRL-JC-134975 PREPRINT code offering 3-D, axisymmetric, and ''transverse slice'' (steady flow) geometries, with a hierarchy of models for the ''lattice'' of focusing, bending, and accelerating elements. Interactive and script-driven code steering is afforded through an interpreter interface. The code runs with good parallel scaling on the T3E. Detailed simulations of machine segments and of complete small experiments, as well as simplified full-system runs, have been carried out, partially benchmarking the code. A magnetoinductive model, with module impedance and multi-beam effects, is under study. Experiments include an injector scalable to multi-beam arrays, a high-current beam transport and acceleration experiment, and a scaled final-focusing experiment. These ''phase I'' projects are laying the groundwork for the next major step in HIF development, the Integrated Research Experiment (IRE). Simulations aimed directly at the IRE must enable us to: design a facility with maximum power on target at minimal cost; set requirements for hardware tolerances, beam steering, etc.; and evaluate proposed chamber propagation modes. Finally, simulations must enable us to study all issues which arise in the context of a fusion driver, and must facilitate the assessment of driver options. In all of this, maximum advantage must be taken of emerging terascale computer architectures, requiring an aggressive code development effort. An organizing principle should be pursuit of the goal of integrated and detailed source-to-target simulation, with methods for analysis of the beam dynamics in the various machine concepts, using moment-based methods for purposes of design, waveform synthesis, steering-algorithm synthesis, etc.
Three classes of discrete-particle models should be coupled: (1) electrostatic/magnetoinductive PIC simulations should track the beams from the source through the final-focusing optics, passing details of the time-dependent distribution function to (2) electromagnetic or magnetoinductive PIC or hybrid PIC/fluid simulations in the fusion chamber (which would finally pass their particle-trajectory information to the radiation-hydrodynamics codes used for target design); in parallel, (3) detailed PIC, delta-f, core/test-particle, and perhaps continuum Vlasov codes should be used to study individual sections of the driver and chamber very carefully; consistency may be assured by linking data from the PIC sequence, and knowledge gained may feed back into that sequence.
78 FR 664 - Establishment of Drug Codes for 26 Substances
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-04
... DEPARTMENT OF JUSTICE Drug Enforcement Administration 21 CFR Part 1308 [Docket No. DEA-368] Establishment of Drug Codes for 26 Substances AGENCY: Drug Enforcement Administration (DEA), Department of Justice. ACTION: Final rule. SUMMARY: On July 9, 2012, the President signed into law the Synthetic Drug...
Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, L.M.; Hochstedler, R.D.
1997-02-01
Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
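The first technique named above, replacing linear searches of sorted tables with binary versions, is easy to illustrate. The energy grid below is a made-up stand-in for whatever sorted tables ITS actually searches; the point is only the O(n) to O(log n) swap.

```python
import bisect

grid = [0.0, 0.1, 0.3, 0.7, 1.5, 3.1]   # sorted lookup grid (hypothetical values)

def linear_index(x):
    """Original-style linear scan: O(n) comparisons."""
    i = 0
    while i < len(grid) - 1 and grid[i + 1] <= x:
        i += 1
    return i

def binary_index(x):
    """Binary-search replacement: O(log n) comparisons."""
    return bisect.bisect_right(grid, x) - 1

# Both locate the same containing interval for any in-range value.
for x in (0.05, 0.3, 2.0):
    assert linear_index(x) == binary_index(x)
print(binary_index(2.0))  # 4
```

For tables searched millions of times per particle history, this single change can account for a large share of the reported speed-up.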
Fiedler, Jan; Baker, Andrew H; Dimmeler, Stefanie; Heymans, Stephane; Mayr, Manuel; Thum, Thomas
2018-05-23
Non-coding RNAs are increasingly recognized not only as regulators of various biological functions but also as targets for a new generation of RNA therapeutics and biomarkers. We hereby review recent insights relating to non-coding RNAs including microRNAs (e.g. miR-126, miR-146a), long non-coding RNAs (e.g. MIR503HG, GATA6-AS, SMILR) and circular RNAs (e.g. cZNF292) and their role in vascular diseases. This includes identification and therapeutic use of hypoxia-regulated non-coding RNAs and endogenous non-coding RNAs that regulate intrinsic smooth muscle cell signalling, age-related non-coding RNAs and non-coding RNAs involved in the regulation of mitochondrial biology and metabolic control. Finally, we discuss non-coding RNA species with biomarker potential.
Nonlinear Transient Problems Using Structure Compatible Heat Transfer Code
NASA Technical Reports Server (NTRS)
Hou, Gene
2000-01-01
The report documents the recent effort to enhance a transient linear heat transfer code so as to solve nonlinear problems. The linear heat transfer code was originally developed by Dr. Kim Bey of NASA Langley and is called the Structure-Compatible Heat Transfer (SCHT) code. The report includes four parts. The first part outlines the formulation of the heat transfer problem of concern. The second and third parts give detailed procedures to construct the nonlinear finite element equations and the required Jacobian matrices for the nonlinear iterative method, the Newton-Raphson method. The final part summarizes the results of the numerical experiments on the newly enhanced SCHT code.
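The Newton-Raphson iteration at the heart of such an enhancement solves R(T) = 0 by repeatedly solving J(T) dT = -R(T). The scalar example below (a single radiating node in steady state) is a sketch of the iteration only, not the SCHT finite element formulation.

```python
def newton(residual, jacobian, T0, tol=1e-10, max_iter=50):
    """Scalar Newton-Raphson: T <- T - R(T)/J(T) until |R| < tol."""
    T = T0
    for _ in range(max_iter):
        r = residual(T)
        if abs(r) < tol:
            break
        T -= r / jacobian(T)
    return T

# Hypothetical nonlinear balance: radiative loss equals supplied heat,
# sigma*T^4 - q = 0 (values chosen only for illustration).
sigma, q = 5.67e-8, 1000.0
T = newton(lambda T: sigma * T**4 - q,
           lambda T: 4 * sigma * T**3,   # Jacobian = dR/dT
           T0=300.0)
print(round(T, 2))  # equilibrium temperature in kelvin
```

In the finite element setting, the scalars become a residual vector and a Jacobian matrix, and each Newton step is a linear solve, which is exactly why the report devotes two parts to constructing those matrices.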
Differential Cross Section Kinematics for 3-dimensional Transport Codes
NASA Technical Reports Server (NTRS)
Norbury, John W.; Dick, Frank
2008-01-01
In support of the development of 3-dimensional transport codes, this paper derives the relevant relativistic particle kinematic theory. Formulas are given for invariant, spectral and angular distributions in both the lab (spacecraft) and center of momentum frames, for collisions involving 2, 3 and n-body final states.
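A small numeric example of the frame bookkeeping such derivations involve: the invariant s = (ΣE)² - |Σp|² is the same in the lab frame (target at rest) and the center-of-momentum frame, so the total CM energy is √s. The proton-proton numbers below are a generic illustration, not taken from the paper. Units are MeV with c = 1.

```python
import math

def s_invariant(E_tot, p_tot):
    """Mandelstam s from total energy and total momentum magnitude."""
    return E_tot**2 - p_tot**2

m = 938.272                    # proton mass (MeV), a standard value
T_lab = 500.0                  # projectile kinetic energy in the lab (chosen arbitrarily)
E_proj = m + T_lab
p_proj = math.sqrt(E_proj**2 - m**2)

s = s_invariant(E_proj + m, p_proj)      # lab frame: target at rest
E_cm = math.sqrt(s)                      # total center-of-momentum energy
# Cross-check against the closed form s = 2 m^2 + 2 m E_proj for a fixed target.
assert abs(s - (2 * m**2 + 2 * m * E_proj)) < 1e-6
print(round(E_cm, 1))
```

The same invariant bookkeeping underlies the lab-to-CM transformations of the spectral and angular distributions the paper tabulates.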
77 FR 18716 - Transportation Security Administration Postal Zip Code Change; Technical Amendment
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-28
... organizational changes and it has no substantive effect on the public. DATES: Effective March 28, 2012. FOR... No. 1572-9] Transportation Security Administration Postal Zip Code Change; Technical Amendment AGENCY: Transportation Security Administration, DHS. ACTION: Final rule. SUMMARY: This rule is a technical change to...
VizieR Online Data Catalog: FAMA code for stellar parameters and abundances (Magrini+, 2013)
NASA Astrophysics Data System (ADS)
Magrini, L.; Randich, S.; Friel, E.; Spina, L.; Jacobson, H.; Cantat-Gaudin, T.; Donati, P.; Baglioni, R.; Maiorca, E.; Bragaglia, A.; Sordo, R.; Vallenari, A.
2013-07-01
FAMA v.1, July 2013, distributed with MOOG v2013 and Kurucz models.
Perl codes: read_out2.pl, read_final.pl, driver.pl, sclipping_26.0.pl, sclipping_final.pl, sclipping_26.1.pl, confronta.pl, fama.pl.
Model atmospheres and interpolator (Kurucz models): MODEL_ATMO.
MOOG_files: files to compile MOOG (the most recent version of MOOG can be obtained from http://www.as.utexas.edu/~chris/moog.html).
FAMAmoogfiles: files to update when compiling MOOG.
OUTPUT: directory in which the results will be stored; contains an sm macro to produce final plots.
automoog.par: file with parameters for FAMA: 1) OUTPUTdir; 2) MOOGdir; 3) modelsdir; 4) 1.0 (default) percentage of the dispersion of FeI abundances used to compute the errors on the stellar parameters (1.0 means 100%; e.g., for the error on Teff we allow the code to find the Teff corresponding to a slope given by σ(FeI)/range(EP)); 5) 1.2 (default) σ clipping for FeI lines; 6) 1.0 (default) σ clipping for FeII lines; 7) 1.0 (default) σ clipping for the other elements; 8) 1.0 (default) value of the QP parameter (higher values mean weaker convergence criteria).
star.iron: EWs in the correct format to test the code.
sun.par: initial parameters for the test (1 data file).
Aerothermo-Structural Analysis of Low Cost Composite Nozzle/Inlet Components
NASA Technical Reports Server (NTRS)
Shivakumar, Kuwigai; Challa, Preeli; Sree, Dave; Reddy, D.
1999-01-01
This research is a cooperative effort among the Turbomachinery and Propulsion Division of NASA Glenn, the CCMR of NC A&T State University, and Tuskegee University. NC A&T is the lead center and Tuskegee University is the participating institution. The objectives of the research were to develop an integrated aerodynamic, thermal and structural analysis code for the design of aircraft engine components, such as nozzles and inlets, made of textile composites; to conduct design studies on typical inlets for hypersonic transportation vehicles; to set up standard test examples; and finally to manufacture a scaled-down composite inlet. These objectives are accomplished through the following seven tasks: (1) identify the relevant public-domain codes for all three types of analysis; (2) evaluate the codes for accuracy of results and computational efficiency; (3) develop aero-thermal and thermal-structural mapping algorithms; (4) integrate all the codes into one single code; (5) write a graphical user interface to improve the user-friendliness of the code; (6) conduct test studies for a rocket-based combined-cycle engine inlet; and finally (7) fabricate a demonstration inlet model using textile preform composites. Tasks one, two and six are being pursued. We selected and evaluated NPARC for flow-field analysis, CSTEM for in-depth thermal analysis of inlets and nozzles, and FRAC3D for stress analysis. These codes have been independently verified for accuracy and performance. In addition, a graphical user interface based on micromechanics analysis for laminated as well as textile composites was developed. A demonstration of this code will be made at the conference. A rocket-based combined-cycle engine was selected for test studies. Flow-field analyses of various inlet geometries were carried out. Integration of the codes is being continued. The codes developed are being applied to a candidate example of the trailblazer engine proposed for space transportation.
A successful development of the code will provide a simpler, faster and user-friendly tool for conducting design studies of aircraft and spacecraft engines, applicable in high speed civil transport and space missions.
Normative lessons: codes of conduct, self-regulation and the law.
Parker, Malcolm H
2010-06-07
Good medical practice: a code of conduct for doctors in Australia provides uniform standards to be applied in relation to complaints about doctors to the new Medical Board of Australia. The draft Code was criticised for being prescriptive. The final Code employs apparently less authoritative wording than the draft Code, but the implicit obligations it contains are no less prescriptive. Although the draft Code was thought to potentially undermine trust in doctors, and stifle professional judgement in relation to individual patients, its general obligations always allowed for flexibility of application, depending on the circumstances of individual patients. Professional codes may contain some aspirational statements, but they always contain authoritative ones, and they share this feature with legal codes. In successfully diluting the apparent prescriptivity of the draft Code, the profession has lost an opportunity to demonstrate its commitment to the raison d'etre of self-regulation - the protection of patients. Professional codes are not opportunities for reflection, consideration and debate, but are outcomes of these activities.
Optical image encryption based on real-valued coding and subtracting with the help of QR code
NASA Astrophysics Data System (ADS)
Deng, Xiaopeng
2015-08-01
A novel optical image encryption based on real-valued coding and subtracting is proposed with the help of quick response (QR) code. In the encryption process, the original image to be encoded is first transformed into the corresponding QR code, and then the QR code is encoded into two phase-only masks (POMs) by using basic vector operations. Finally, the absolute values of the real or imaginary parts of the two POMs are chosen as the ciphertexts. In the decryption process, the QR code can be approximately restored by recording the intensity of the subtraction between the ciphertexts, and hence the original image can be retrieved without any quality loss by scanning the restored QR code with a smartphone. Simulation results and actual smartphone-collected results show that the method is feasible and has strong tolerance to noise, phase difference and the ratio between intensities of the two decryption light beams.
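A digital analogue of the subtract-to-decrypt step can be sketched by splitting a QR bit matrix into two real-valued "masks" whose element-wise subtraction restores it. This mimics only the arithmetic of the scheme; the optical phase-only-mask encoding and intensity recording in the paper are not reproduced here.

```python
import random

random.seed(7)                                   # deterministic for the demo
qr = [[1, 0, 1], [0, 1, 0], [1, 1, 0]]           # toy stand-in for a QR bit matrix

# Ciphertext 1: a random real-valued mask. Ciphertext 2: mask minus the bits.
mask1 = [[random.random() for _ in row] for row in qr]
mask2 = [[m - b for m, b in zip(r1, r2)] for r1, r2 in zip(mask1, qr)]

# Either ciphertext alone looks random; subtracting them restores the bits,
# after which a phone could scan the rebuilt QR code losslessly.
recovered = [[round(a - b) for a, b in zip(r1, r2)] for r1, r2 in zip(mask1, mask2)]
print(recovered == qr)   # True
```

The lossless final image recovery claimed in the abstract comes from this last step: even an approximately restored QR code decodes to the exact original payload.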
2017-04-13
modelling code, a parallel benchmark, and a communication-avoiding version of the QR algorithm. Further, several improvements to the OmpSs model were...movement; and a port of the dynamic load balancing library to OmpSs. Finally, several updates to the tools infrastructure were accomplished, including: an...OmpSs: a basic algorithm on image processing applications, a mini application representative of an ocean modelling code, a parallel benchmark, and a
Image Transmission via Spread Spectrum Techniques. Part A
1976-01-01
Code 408 DR. EDWIN H. WRENCH (714-225-6871) Code 408 and HARPER J. WHITEHOUSE (714-225-6315), Code 4002 Naval Undersea Center San Diego, California...progress report appears in two parts. Part A is a summary of work done in support of this program at the Naval Undersea Center. Part B contains final...a technical description of the bandwidth compression system developed at the Naval Undersea Center. This paper is an excerpt from the specifications
Further Developments in the Communication Link and Error Analysis (CLEAN) Simulator
NASA Technical Reports Server (NTRS)
Ebel, William J.; Ingels, Frank M.
1995-01-01
During the period 1 July 1993 - 30 June 1994, significant developments to the Communication Link and Error ANalysis (CLEAN) simulator were completed. Many of these were reported in the Semi-Annual report dated December 1993, which has been included in this report in Appendix A. Since December 1993, a number of additional modules have been added involving Unit-Memory Convolutional codes (UMC). These are: (1) Unit-Memory Convolutional Encoder module (UMCEncd); (2) Hard-decision Unit-Memory Convolutional Decoder using the Viterbi decoding algorithm (VitUMC); and (3) a number of utility modules designed to investigate the performance of UMCs, such as the UMC column distance function (UMCdc), UMC free distance function (UMCdfree), UMC row distance function (UMCdr), and UMC transformation (UMCTrans). The study of UMCs was driven, in part, by the desire to investigate high-rate convolutional codes which are better suited as inner codes for a concatenated coding scheme. A number of high-rate UMCs were found which are good candidates for inner codes. Besides the further developments of the simulation, a study was performed to construct a table of the best known Unit-Memory Convolutional codes. Finally, a preliminary study of the usefulness of the Periodic Convolutional Interleaver (PCI) was completed and documented in a technical note dated March 17, 1994. This technical note has also been included in this final report.
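The defining property of a unit-memory convolutional code is that each output block depends on the current input block and exactly one previous block: v_t = u_t·G0 + u_{t-1}·G1 over GF(2). The generator matrices below are arbitrary examples for illustration, not codes from the report's tables.

```python
# Hypothetical rate-2/3 unit-memory generators (k=2 input bits -> n=3 output bits).
G0 = [[1, 0, 1], [0, 1, 1]]
G1 = [[1, 1, 0], [0, 0, 1]]

def gf2_vecmat(u, G):
    """Row vector times matrix over GF(2)."""
    return [sum(ui * gij for ui, gij in zip(u, col)) % 2
            for col in zip(*G)]

def umc_encode(blocks):
    """v_t = u_t*G0 + u_{t-1}*G1 (mod 2), starting from an all-zero block."""
    prev, out = [0, 0], []
    for u in blocks:
        v0 = gf2_vecmat(u, G0)
        v1 = gf2_vecmat(prev, G1)
        out.append([(a + b) % 2 for a, b in zip(v0, v1)])
        prev = u
    return out

print(umc_encode([[1, 0], [1, 1]]))
```

Because the memory is a single block rather than a long shift register, high-rate UMCs keep trellis complexity manageable, which is what makes them attractive as inner codes in a concatenated scheme.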
Methods of alleviation of ionospheric scintillation effects on digital communications
NASA Technical Reports Server (NTRS)
Massey, J. L.
1974-01-01
The degradation of the performance of digital communication systems because of ionospheric scintillation effects can be reduced either by diversity techniques or by coding. The effectiveness of traditional space-diversity, frequency-diversity and time-diversity techniques is reviewed and design considerations isolated. Time-diversity signaling is then treated as an extremely simple form of coding. More advanced coding methods, such as diffuse threshold decoding and burst-trapping decoding, which appear attractive in combatting scintillation effects are discussed and design considerations noted. Finally, adaptive coding techniques appropriate when the general state of the channel is known are discussed.
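The claim that time-diversity signaling is "an extremely simple form of coding" can be made concrete: a 3x repetition code with interleaving spreads each bit's copies across time, so a scintillation fade wipes out at most one copy of each bit. The burst model below is an invented illustration, not one of the channel models the paper analyzes.

```python
def interleave(bits, depth=3):
    """Repeat each bit `depth` times, then spread the copies across
    `depth` separated time slots (one list per slot block)."""
    reps = [b for b in bits for _ in range(depth)]
    return [reps[i::depth] for i in range(depth)]

def deinterleave_and_vote(streams):
    """Majority vote over the time-separated copies of each bit."""
    return [int(sum(copies) >= 2) for copies in zip(*streams)]

msg = [1, 0, 1, 1, 0]
streams = interleave(msg)
streams[1] = [b ^ 1 for b in streams[1]]   # a fade burst corrupts one whole time slot
print(deinterleave_and_vote(streams) == msg)   # majority vote still recovers the message
```

The time separation between copies must exceed the fade duration for this to work, which is the design consideration the diversity discussion above turns on.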
Cracking the Code: Synchronizing Policy and Practice for Performance-Based Learning
ERIC Educational Resources Information Center
Patrick, Susan; Sturgis, Chris
2011-01-01
Performance-based learning is one of the keys to cracking open the assumptions that undergird the current educational codes, structures, and practices. By finally moving beyond the traditions of a time-based system, greater customized educational services can flourish, preparing more and more students for college and careers. This proposed policy…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-06
... 52.203-13, Contractor Code of Business Ethics and Conduct. This final rule corrects two omissions in... Subjects in 48 CFR Parts 203 and 252 Government procurement. Ynette R. Shelkin, Editor, Defense Acquisition... clause 52.203-13, Contractor Code of Business Ethics and Conduct. * * * * * PART 252--SOLICITATION...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-19
...) which specify that Illinois' surface coating VOC emission limitations shall not apply to touch-up and... Administrative Code (Ill. Adm. Code) by adding a ``small container exemption'' for pleasure craft surface coating... technology (RACT) policy. DATES: This final rule is effective on May 20, 2013. ADDRESSES: EPA has established...
Transfer reaction code with nonlocal interactions
Titus, L. J.; Ross, A.; Nunes, F. M.
2016-07-14
We present a suite of codes (NLAT for nonlocal adiabatic transfer) to calculate the transfer cross section for single-nucleon transfer reactions, (d,N)(d,N) or (N,d)(N,d), including nonlocal nucleon–target interactions, within the adiabatic distorted wave approximation. For this purpose, we implement an iterative method for solving the second order nonlocal differential equation, for both scattering and bound states. The final observables that can be obtained with NLAT are differential angular distributions for the cross sections of A(d,N)BA(d,N)B or B(N,d)AB(N,d)A. Details on the implementation of the TT-matrix to obtain the final cross sections within the adiabatic distorted wave approximation method are also provided.more » This code is suitable to be applied for deuteron induced reactions in the range of View the MathML sourceEd=10–70MeV, and provides cross sections with 4% accuracy.« less
NASA Technical Reports Server (NTRS)
Jordan, Kevin
1999-01-01
The following contains the final report on the activities related to the Cooperative Agreement between the human factors research group at NASA Ames Research Center and the Psychology Department at San Jose State University. The participating NASA Ames division has been, as the organization has changed, the Aerospace Human Factors Research Division (ASHFRD and Code FL), the Flight Management and Human Factors Research Division (Code AF), and the Human Factors Research and Technology Division (Code IH). The inclusive dates for the report are November 1, 1984 to January 31, 1999. Throughout the years, approximately 170 persons worked on the cooperative agreements in one capacity or another. The Cooperative Agreement provided for research personnel to collaborate with senior scientists in ongoing NASA ARC research. Finally, many post-MA/MS and post-doctoral personnel contributed to the projects. It is worth noting that 10 former cooperative agreement personnel were hired into civil service positions directly from the agreements.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1991-01-01
Shannon's capacity bound shows that coding can achieve large reductions in the required signal-to-noise ratio per information bit (Eb/N0, where Eb is the energy per bit and N0/2 is the double-sided noise density) in comparison to uncoded schemes. For bandwidth efficiencies of 2 bit/sym or greater, these improvements were obtained through the use of Trellis Coded Modulation and Block Coded Modulation. A method of obtaining these high efficiencies using multidimensional Multiple Phase Shift Keying (MPSK) and Quadrature Amplitude Modulation (QAM) signal sets with trellis coding is described. These schemes have advantages in decoding speed, phase transparency, and coding gain in comparison to other trellis coding schemes. Finally, a general parity check equation for rotationally invariant trellis codes is introduced, from which non-linear codes for two-dimensional MPSK and QAM signal sets are found. These codes are fully transparent to all rotations of the signal set.
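As an illustration of the capacity bound mentioned in this abstract (not code from the report itself), the minimum Eb/N0 required for reliable transmission at a given spectral efficiency eta follows from Shannon's formula Eb/N0 >= (2^eta - 1)/eta:

```python
import math

def shannon_limit_ebn0_db(eta):
    """Minimum Eb/N0 in dB for reliable transmission at spectral
    efficiency eta (bits/symbol): Eb/N0 >= (2**eta - 1) / eta."""
    return 10.0 * math.log10((2.0 ** eta - 1.0) / eta)

# At 2 bit/sym (the efficiency discussed above) the limit is about 1.76 dB;
# as eta -> 0 the bound approaches ln(2), i.e. about -1.59 dB.
print(round(shannon_limit_ebn0_db(2.0), 2))
```

The gap between this limit and an uncoded scheme's required Eb/N0 is the coding gain that trellis- and block-coded modulation pursue.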
On decoding of multi-level MPSK modulation codes
NASA Technical Reports Server (NTRS)
Lin, Shu; Gupta, Alok Kumar
1990-01-01
The decoding problem of multi-level block modulation codes is investigated. The hardware design of a soft-decision Viterbi decoder for some short length 8-PSK block modulation codes is presented. An effective way to reduce the hardware complexity of the decoder by reducing the branch metric and path metric, using a non-uniform floating-point to integer mapping scheme, is proposed and discussed. The simulation results of the design are presented. The multi-stage decoding (MSD) of multi-level modulation codes is also investigated. The cases of soft-decision and hard-decision MSD are considered and their performance is evaluated for several codes of different lengths and different minimum squared Euclidean distances. It is shown that the soft-decision MSD reduces the decoding complexity drastically, although it is suboptimum. The hard-decision MSD further simplifies the decoding while still maintaining a reasonable coding gain over the uncoded system, if the component codes are chosen properly. Finally, some basic 3-level 8-PSK modulation codes using BCH codes as component codes are constructed and their coding gains are found for hard-decision multistage decoding.
TEMPEST code simulations of hydrogen distribution in reactor containment structures. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trent, D.S.; Eyler, L.L.
The mass transport version of the TEMPEST computer code was used to simulate hydrogen distribution in geometric configurations relevant to reactor containment structures. Predicted results of Battelle-Frankfurt hydrogen distribution tests 1 to 6, and 12 are presented. Agreement between predictions and experimental data is good. Best agreement is obtained using the k-epsilon turbulence model in TEMPEST in flow cases where turbulent diffusion and stable stratification are dominant mechanisms affecting transport. The code's general analysis capabilities are summarized.
Implementing a strand of a scalable fault-tolerant quantum computing fabric.
Chow, Jerry M; Gambetta, Jay M; Magesan, Easwar; Abraham, David W; Cross, Andrew W; Johnson, B R; Masluk, Nicholas A; Ryan, Colm A; Smolin, John A; Srinivasan, Srikanth J; Steffen, M
2014-06-24
With favourable error thresholds and requiring only nearest-neighbour interactions on a lattice, the surface code is an error-correcting code that has garnered considerable attention. At the heart of this code is the ability to perform a low-weight parity measurement of local code qubits. Here we demonstrate high-fidelity parity detection of two code qubits via measurement of a third syndrome qubit. With high-fidelity gates, we generate entanglement distributed across three superconducting qubits in a lattice where each code qubit is coupled to two bus resonators. Via high-fidelity measurement of the syndrome qubit, we deterministically entangle the code qubits in either an even or odd parity Bell state, conditioned on the syndrome qubit state. Finally, to fully characterize this parity readout, we develop a measurement tomography protocol. The lattice presented naturally extends to larger networks of qubits, outlining a path towards fault-tolerant quantum computing.
NASA Astrophysics Data System (ADS)
Heller, René
2018-03-01
The SETI Encryption code, written in Python, creates a message for use in testing the decryptability of a simulated incoming interstellar message. The code uses images in a portable bit map (PBM) format, then writes the corresponding bits into the message, and finally returns both a PBM image and a text (TXT) file of the entire message. The natural constants (c, G, h) and the wavelength of the message are defined in the first few lines of the code, followed by the reading of the input files and their conversion into 757 strings of 359 bits to give one page. Each header of a page, i.e. the little-endian binary code translation of the tempo-spatial yardstick, is calculated and written on-the-fly for each page.
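The PBM-to-bits step described above can be sketched with a small parser for the plain (P1) PBM format; this is a hypothetical helper for illustration, not an excerpt from the actual SETI Encryption code:

```python
def pbm_to_bits(pbm_text):
    """Parse a plain (P1) PBM image and return its pixels as a bit string.
    Illustrative sketch of the image-to-message-bits step."""
    tokens = []
    for line in pbm_text.splitlines():
        line = line.split('#', 1)[0]  # strip PBM comments
        tokens.extend(line.split())
    assert tokens[0] == 'P1', 'plain PBM magic number expected'
    width, height = int(tokens[1]), int(tokens[2])
    bits = ''.join(tokens[3:])
    assert len(bits) == width * height
    return bits

# A 3x2 test image; in PBM, 1 is a black pixel and 0 is white.
img = "P1\n3 2\n1 0 1\n0 1 0\n"
print(pbm_to_bits(img))  # '101010'
```

The real code would then group such bit strings into fixed-length lines per page and prepend the per-page header.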
77 FR 72913 - Defining Larger Participants of the Consumer Debt Collection Market; Correction
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-07
... to and codified in the Code of Federal Regulations, which is published under 50 titles pursuant to 44 U.S.C. 1510. The Code of Federal Regulations is sold by the Superintendent of Documents... collection. The final rule contained four typographical errors, which this document corrects. Three of these...
Code Pulse: Software Assurance (SWA) Visual Analytics for Dynamic Analysis of Code
2014-09-01
Market Analysis... competitive market analysis to assess the tool potential. The final transition targets were selected and expressed along with our research on the topic... public release milestones. Details of our testing methodology are in our Software Test Plan deliverable, CP-STP-0001. A summary of this approach is
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-20
... Supplement; Positive Law Codification of Title 41 U.S.C. (DFARS Case 2011-D036) AGENCY: Defense Acquisition... DFARS to the new Codification of Title 41, United States Code, ``Public Contracts.'' DATES: Effective... of Title 41, United States Code (U.S.C.), entitled ``Public Contracts.'' The purpose of this final...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-10
... certification pursuant to section 401 of the Clean Water Act (CWA) from Hawaii and a final response on the... Robin Danesi at EPA Headquarters, Office of Water, Office of Wastewater Management, Mail Code 4203M... Headquarters, Office of Water, Office of Wastewater Management, Mail Code 4203M, 1200 Pennsylvania Ave., NW...
ERIC Educational Resources Information Center
Duffy, Thomas; And Others
This supplementary volume presents appendixes A-E associated with a 1-year study which determined what secondary school students were doing as they engaged in the Chelsea Bank computer software simulation activities. Appendixes present the SCANS Analysis Coding Sheet; coding problem analysis of 50 video segments; student and teacher interview…
40 CFR 272.1151 - State-administered program: Final authorization.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., P.O. Box 64526, St. Paul, Minnesota 55164-0526. (ii) Michigan Administrative Code, Rules 299.9101... seq. (c) Statement of Legal Authority. The Michigan Attorney General's Statements for final authorization signed by the Attorney General of Michigan on October 25, 1985, and supplements to that Statement...
Final Report for Geometric Observers and Particle Filtering for Controlled Active Vision
2016-12-15
Final Report (15-12-2016), covering 01Sep06 - 09May11: Geometric Observers and Particle Filtering for Controlled Active Vision, by Allen R. Tannenbaum, School of Electrical and Computer Engineering, Georgia Institute of... Report number 49414-NS.1. Contents include sections on Conformal Area Minimizing Flows and Particle Filters.
High-efficiency reconciliation for continuous variable quantum key distribution
NASA Astrophysics Data System (ADS)
Bai, Zengliang; Yang, Shenshen; Li, Yongmin
2017-04-01
Quantum key distribution (QKD) is the most mature application of quantum information technology. Information reconciliation is a crucial step in QKD and significantly affects the final secret key rates shared between two legitimate parties. We analyze and compare various construction methods of low-density parity-check (LDPC) codes and design high-performance irregular LDPC codes with a block length of 10^6. Starting from these good codes and exploiting the slice reconciliation technique based on multilevel coding and multistage decoding, we realize high-efficiency Gaussian key reconciliation with efficiency higher than 95% for signal-to-noise ratios above 1. Our demonstrated method can be readily applied in continuous variable QKD.
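The parity-check structure that underlies LDPC reconciliation can be shown with a toy binary code; this is a hedged sketch with a hand-written 3x6 matrix, whereas the codes in the paper are sparse, irregular, and have block length 10^6:

```python
# Toy parity-check matrix H for a rate-1/2 code (3 checks on 6 bits).
H = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 1, 0, 0, 1]]

def syndrome(H, word):
    """H . word mod 2; an all-zero syndrome means every check passes."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

codeword = [1, 0, 1, 1, 1, 0]   # satisfies all three checks
print(syndrome(H, codeword))     # [0, 0, 0]

corrupted = codeword[:]
corrupted[2] ^= 1                # flip one bit
print(syndrome(H, corrupted))    # [0, 1, 1]: checks 2 and 3 now fail
```

In reconciliation, one party sends such syndromes so the other can correct its correlated (noisy) data without revealing the data itself.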
Incorporating Manual and Autonomous Code Generation
NASA Technical Reports Server (NTRS)
McComas, David
1998-01-01
Code can be generated manually or using code-generation software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with autonomous code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an autonomous code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the manual-to-autonomous code interface; (2) applying object-oriented design to the manual flight code; (3) implementing the object-oriented design in C.
Convolutional encoding of self-dual codes
NASA Technical Reports Server (NTRS)
Solomon, G.
1994-01-01
There exist almost complete convolutional encodings of self-dual codes, i.e., block codes of rate 1/2 with weights w, w = 0 mod 4. The codes are of length 8m with the convolutional portion of length 8m-2 and the nonsystematic information of length 4m-1. The last two bits are parity checks on the two (4m-1) length parity sequences. The final information bit complements one of the extended parity sequences of length 4m. Solomon and van Tilborg have developed algorithms to generate these for the Quadratic Residue (QR) Codes of lengths 48 and beyond. For these codes and reasonable constraint lengths, there are sequential decodings for both hard and soft decisions. There are also possible Viterbi-type decodings that may be simple, as in a convolutional encoding/decoding of the extended Golay Code. In addition, the previously found constraint length K = 9 for the QR (48, 24;12) Code is lowered here to K = 8.
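A rate-1/2 convolutional encoding of the kind discussed above can be sketched generically; the generators below are the textbook (7,5) octal pair, an assumption for illustration, not Solomon's specific self-dual/QR construction:

```python
def conv_encode(bits, g1=0b111, g2=0b101, k=3):
    """Generic rate-1/2 convolutional encoder with constraint length k.
    Each input bit shifts into the state register and emits two parity
    bits, one per generator polynomial."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << k) - 1)
        out.append(bin(state & g1).count('1') % 2)  # first parity stream
        out.append(bin(state & g2).count('1') % 2)  # second parity stream
    return out

print(conv_encode([1, 0, 1, 1]))
```

For each information bit the encoder emits two coded bits, which is what makes the rate 1/2; Viterbi or sequential decoding then searches the resulting trellis.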
76 FR 78465 - Home Mortgage Disclosure (Regulation C)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-19
... 44 U.S.C. 1510. The Code of Federal Regulations is sold by the Superintendent of Documents... Mortgage Disclosure (Regulation C) AGENCY: Bureau of Consumer Financial Protection. ACTION: Interim final... a new Regulation C (Home Mortgage Disclosure). This interim final rule does not impose any new...
Ultralow Noise Monolithic Quantum Dot Photonic Oscillators
2013-10-28
Ultralow Noise Monolithic Quantum Dot Photonic Oscillators. Luke Lester, University of New Mexico, 10/28/2013. Final Performance Report (24-10-2013) for the period 01-06-2010 to 31-05-2013, Grant/Contract Number FA9550-10-1-0276. Distribution A.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-26
... costs and benefits of the rule and to identify any relevant changes in technology that have occurred... access to care; Whether the public health benefits of an action have been realized; Whether the public or... reviewing under E.O. 13563 is the Bar Code Final Rule. The Agency plans to reassess its costs and benefits...
Design of an Orbital Inspection Satellite
1986-12-01
Captain, USAF. December 1986. Approved for public release; distribution... The problem lends itself to the technique of multi-objective analysis. The final step is planning for action. This communicates the entire systems engineering
LLNL Mercury Project Trinity Open Science Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawson, Shawn A.
The Mercury Monte Carlo particle transport code is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. In the proposed Trinity Open Science calculations, I will investigate computer science aspects of the code which are relevant to convergence of the simulation quantities with increasing Monte Carlo particle counts.
CFD analysis of turbopump volutes
NASA Technical Reports Server (NTRS)
Ascoli, Edward P.; Chan, Daniel C.; Darian, Armen; Hsu, Wayne W.; Tran, Ken
1993-01-01
An effort is underway to develop a procedure for the regular use of CFD analysis in the design of turbopump volutes. Airflow data to be taken at NASA Marshall will be used to validate the CFD code and overall procedure. Initial focus has been on preprocessing (geometry creation, translation, and grid generation). Volute geometries have been acquired electronically and imported into the CATIA CAD system and RAGGS (Rockwell Automated Grid Generation System) via the IGES standard. An initial grid topology has been identified and grids have been constructed for turbine inlet and discharge volutes. For CFD analysis of volutes to be used regularly, a procedure must be defined to meet engineering design needs in a timely manner. Thus, a compromise must be established between making geometric approximations, the selection of grid topologies, and possible CFD code enhancements. While the initial grid developed approximated the volute tongue with a zero thickness, final computations should more accurately account for the geometry in this region. Additionally, grid topologies will be explored to minimize skewness and high aspect ratio cells that can affect solution accuracy and slow code convergence. Finally, as appropriate, code modifications will be made to allow for new grid topologies in an effort to expedite the overall CFD analysis process.
Medical image classification based on multi-scale non-negative sparse coding.
Zhang, Ruijie; Shen, Jian; Wei, Fushan; Li, Xiong; Sangaiah, Arun Kumar
2017-11-01
With the rapid development of modern medical imaging technology, medical image classification has become more and more important in medical diagnosis and clinical practice. Conventional medical image classification algorithms usually neglect the semantic gap between low-level features and high-level image semantics, which largely degrades classification performance. To solve this problem, we propose a multi-scale non-negative sparse coding based medical image classification algorithm. First, medical images are decomposed into multiple scale layers, so that diverse visual details can be extracted from different scale layers. Second, for each scale layer, a non-negative sparse coding model with Fisher discriminative analysis is constructed to obtain the discriminative sparse representation of medical images. Then, the obtained multi-scale non-negative sparse coding features are combined to form a multi-scale feature histogram as the final representation of a medical image. Finally, an SVM classifier is applied to conduct medical image classification. The experimental results demonstrate that the proposed algorithm can effectively utilize multi-scale and contextual spatial information of medical images, reduce the semantic gap to a large degree and improve medical image classification performance.
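The pooling-and-concatenation step that builds the final multi-scale descriptor can be sketched as follows; the helper names and the max-pooling choice are assumptions for illustration, not the paper's exact formulation:

```python
def pool_histogram(codes, n_atoms):
    """Max-pool non-negative sparse codes over an image's patches into
    one histogram of dictionary-atom activations (hypothetical helper)."""
    hist = [0.0] * n_atoms
    for code in codes:
        for i, v in enumerate(code):
            hist[i] = max(hist[i], v)
    return hist

def multiscale_feature(per_scale_codes, n_atoms):
    """Concatenate per-scale histograms into the final image descriptor."""
    feature = []
    for codes in per_scale_codes:
        feature.extend(pool_histogram(codes, n_atoms))
    return feature

# Two scale layers, a 3-atom dictionary, two patch codes per scale.
scales = [[[0.2, 0.0, 0.5], [0.1, 0.3, 0.0]],
          [[0.0, 0.4, 0.0], [0.6, 0.0, 0.1]]]
print(multiscale_feature(scales, 3))
```

The concatenated vector (length = number of scales x dictionary size) is then what the SVM classifies.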
Polyanskiy, Mikhail N.
2015-01-01
We describe a computer code for simulating the amplification of ultrashort mid-infrared laser pulses in CO2 amplifiers and their propagation through arbitrary optical systems. This code is based on a comprehensive model that includes an accurate consideration of the CO2 active medium and a physical optics propagation algorithm, and takes into account the interaction of the laser pulse with the material of the optical elements. Finally, the application of the code for optimizing an isotopic regenerative amplifier is described.
NASA Technical Reports Server (NTRS)
Riddick, Stephen E.; Hinton, David A.
2000-01-01
A study has been performed on a computer code modeling an aircraft wake vortex spacing system during final approach. This code represents an initial engineering model of a system to calculate reduced approach separation criteria needed to increase airport productivity. This report evaluates model sensitivity toward various weather conditions (crosswind, crosswind variance, turbulent kinetic energy, and thermal gradient), code configurations (approach corridor option, and wake demise definition), and post-processing techniques (rounding of provided spacing values, and controller time variance).
Model-Driven Engineering of Machine Executable Code
NASA Astrophysics Data System (ADS)
Eichberg, Michael; Monperrus, Martin; Kloppenburg, Sven; Mezini, Mira
Implementing static analyses of machine-level executable code is labor intensive and complex. We show how to leverage model-driven engineering to facilitate the design and implementation of programs doing static analyses. Further, we report on important lessons learned about the benefits and drawbacks of the following technologies: using the Scala programming language as the target of code generation, using XML Schema to express a metamodel, and using XSLT to implement (a) transformations and (b) a lint-like tool. Finally, we report on the use of Prolog for writing model transformations.
NASA Astrophysics Data System (ADS)
Braunmueller, F.; Tran, T. M.; Vuillemin, Q.; Alberti, S.; Genoud, J.; Hogge, J.-Ph.; Tran, M. Q.
2015-06-01
A new gyrotron simulation code for simulating the beam-wave interaction using a monomode time-dependent self-consistent model is presented. The new code TWANG-PIC is derived from the trajectory-based code TWANG by describing the electron motion in a gyro-averaged one-dimensional Particle-In-Cell (PIC) approach. In comparison to common PIC codes, it is distinguished by its computation speed, which makes its use in parameter scans and in experiment interpretation possible. A benchmark of the new code is presented as well as a comparative study between the two codes. This study shows that the inclusion of a time-dependence in the electron equations, as is the case in the PIC approach, is mandatory for simulating any kind of non-stationary oscillations in gyrotrons. Finally, the new code is compared with experimental results and some implications of the violated model assumptions in the TWANG code are disclosed for a gyrotron experiment in which non-stationary regimes have been observed and for a critical case that is of interest in high power gyrotron development.
Statistical properties of DNA sequences
NASA Technical Reports Server (NTRS)
Peng, C. K.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Mantegna, R. N.; Simons, M.; Stanley, H. E.
1995-01-01
We review evidence supporting the idea that the DNA sequence in genes containing non-coding regions is correlated, and that the correlation is remarkably long range--indeed, nucleotides thousands of base pairs distant are correlated. We do not find such a long-range correlation in the coding regions of the gene. We resolve the problem of the "non-stationarity" feature of the sequence of base pairs by applying a new algorithm called detrended fluctuation analysis (DFA). We address the claim of Voss that there is no difference in the statistical properties of coding and non-coding regions of DNA by systematically applying the DFA algorithm, as well as standard FFT analysis, to every DNA sequence (33301 coding and 29453 non-coding) in the entire GenBank database. Finally, we describe briefly some recent work showing that the non-coding sequences have certain statistical features in common with natural and artificial languages. Specifically, we adapt to DNA the Zipf approach to analyzing linguistic texts. These statistical properties of non-coding sequences support the possibility that non-coding regions of DNA may carry biological information.
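The DFA algorithm referred to above can be written compactly; this is a minimal sketch (non-overlapping windows, linear detrending), not the authors' implementation:

```python
import math
import random

def dfa(series, window_sizes):
    """Minimal detrended fluctuation analysis: integrate the
    mean-subtracted series, remove a least-squares linear trend from
    each non-overlapping window, and return the RMS fluctuation F(n)
    for each window size n."""
    mean = sum(series) / len(series)
    profile, total = [], 0.0
    for x in series:
        total += x - mean
        profile.append(total)
    fluct = {}
    for n in window_sizes:
        sq, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            xbar = (n - 1) / 2.0
            ybar = sum(seg) / n
            sxx = sum((i - xbar) ** 2 for i in range(n))
            sxy = sum((i - xbar) * (y - ybar) for i, y in enumerate(seg))
            slope = sxy / sxx
            for i, y in enumerate(seg):
                sq += (y - (ybar + slope * (i - xbar))) ** 2
                count += 1
        fluct[n] = math.sqrt(sq / count)
    return fluct

random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(2000)]
f = dfa(noise, [8, 16, 32, 64])
# For uncorrelated noise, F(n) grows roughly as n**0.5 (exponent 1/2);
# long-range correlated sequences give an exponent above 1/2.
```

The scaling exponent, estimated from the slope of log F(n) versus log n, is what distinguishes the long-range correlated non-coding sequences from the coding regions.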
Final report for the Tera Computer TTI CRADA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, G.S.; Pavlakos, C.; Silva, C.
1997-01-01
Tera Computer and Sandia National Laboratories have completed a CRADA, which examined the Tera Multi-Threaded Architecture (MTA) for use with large codes of importance to industry and DOE. The MTA is an innovative architecture that uses parallelism to mask latency between memories and processors. The physical implementation is a parallel computer with high cross-section bandwidth and GaAs processors designed by Tera, which support many small computation threads and fast, lightweight context switches between them. When any thread blocks while waiting for memory accesses to complete, another thread immediately begins execution so that high CPU utilization is maintained. The Tera MTA parallel computer has a single, global address space, which is appealing when porting existing applications to a parallel computer. This ease of porting is further enabled by compiler technology that helps break computations into parallel threads. DOE and Sandia National Laboratories were interested in working with Tera to further develop this computing concept. While Tera Computer would continue the hardware development and compiler research, Sandia National Laboratories would work with Tera to ensure that their compilers worked well with important Sandia codes, most particularly CTH, a shock physics code used for weapon safety computations. In addition to that important code, Sandia National Laboratories would complete research on a robotic path planning code, SANDROS, which is important in manufacturing applications, and would evaluate the MTA performance on this code. Finally, Sandia would work directly with Tera to develop 3D visualization codes, which would be appropriate for use with the MTA. Each of these tasks has been completed to the extent possible, given that Tera has just completed the MTA hardware. All of the CRADA work had to be done on simulators.
NASA Technical Reports Server (NTRS)
Bade, W. L.; Yos, J. M.
1975-01-01
The present, third volume of the final report is a programmer's manual for the code. It provides a listing of the FORTRAN 4 source program; a complete glossary of FORTRAN symbols; a discussion of the purpose and method of operation of each subroutine (including mathematical analyses of special algorithms); and a discussion of the operation of the code on IBM/360 and UNIVAC 1108 systems, including required control cards and the overlay structure used to accommodate the code to the limited core size of the 1108. In addition, similar information is provided to document the programming of the NOZFIT code, which is employed to set up nozzle profile curvefits for use in NATA.
Potential flow theory and operation guide for the panel code PMARC
NASA Technical Reports Server (NTRS)
Ashby, Dale L.; Dudley, Michael R.; Iguchi, Steve K.; Browne, Lindsey; Katz, Joseph
1991-01-01
The theoretical basis for PMARC, a low-order potential-flow panel code for modeling complex three-dimensional geometries, is outlined. Several of the advanced features currently included in the code, such as internal flow modeling, a simple jet model, and a time-stepping wake model, are discussed in some detail. The code is written using adjustable size arrays so that it can be easily redimensioned for the size problem being solved and the computer hardware being used. An overview of the program input is presented, with a detailed description of the input available in the appendices. Finally, PMARC results for a generic wing/body configuration are compared with experimental data to demonstrate the accuracy of the code. The input file for this test case is given in the appendices.
Gene and genon concept: coding versus regulation
2007-01-01
We analyse here the definition of the gene in order to distinguish, on the basis of modern insight in molecular biology, what the gene is coding for, namely a specific polypeptide, and how its expression is realized and controlled. Before the coding role of the DNA was discovered, a gene was identified with a specific phenotypic trait, from Mendel through Morgan up to Benzer. Subsequently, however, molecular biologists ventured to define a gene at the level of the DNA sequence in terms of coding. As is becoming ever more evident, the relations between information stored at DNA level and functional products are very intricate, and the regulatory aspects are as important and essential as the information coding for products. This approach led, thus, to a conceptual hybrid that confused coding, regulation and functional aspects. In this essay, we develop a definition of the gene that once again starts from the functional aspect. A cellular function can be represented by a polypeptide or an RNA. In the case of the polypeptide, its biochemical identity is determined by the mRNA prior to translation, and that is where we locate the gene. The steps from specific, but possibly separated sequence fragments at DNA level to that final mRNA then can be analysed in terms of regulation. For that purpose, we coin the new term “genon”. In that manner, we can clearly separate product and regulative information while keeping the fundamental relation between coding and function without the need to introduce a conceptual hybrid. In mRNA, the program regulating the expression of a gene is superimposed onto and added to the coding sequence in cis - we call it the genon. The complementary external control of a given mRNA by trans-acting factors is incorporated in its transgenon. A consequence of this definition is that, in eukaryotes, the gene is, in most cases, not yet present at DNA level. 
Rather, it is assembled by RNA processing, including differential splicing, from various pieces, as steered by the genon. It emerges finally as an uninterrupted nucleic acid sequence at mRNA level just prior to translation, in faithful correspondence with the amino acid sequence to be produced as a polypeptide. After translation, the genon has fulfilled its role and expires. The distinction between the protein coding information as materialised in the final polypeptide and the processing information represented by the genon allows us to set up a new information theoretic scheme. The standard sequence information determined by the genetic code expresses the relation between coding sequence and product. Backward analysis asks from which coding region in the DNA a given polypeptide originates. The (more interesting) forward analysis asks in how many polypeptides of how many different types a given DNA segment is expressed. This concerns the control of the expression process for which we have introduced the genon concept. Thus, the information theoretic analysis can capture the complementary aspects of coding and regulation, of gene and genon. PMID:18087760
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzpatrick, Richard
2007-09-24
Dr. Fitzpatrick has written an MHD code in order to investigate the interaction of tearing modes with flow and external magnetic perturbations, which has been successfully benchmarked against both linear and nonlinear theory and used to investigate error-field penetration in flowing plasmas. The same code was used to investigate the so-called Taylor problem. He employed the University of Chicago's FLASH code to further investigate the Taylor problem, discovering a new aspect of the problem. Dr. Fitzpatrick has written a 2-D Hall MHD code and used it to investigate the collisionless Taylor problem. Dr. Waelbroeck has performed an investigation of the scaling of the error-field penetration threshold in collisionless plasmas. Paul Watson and Dr. Fitzpatrick have written a fully-implicit extended-MHD code using the PETSC framework. Five publications have resulted from this grant work.
Biometrics encryption combining palmprint with two-layer error correction codes
NASA Astrophysics Data System (ADS)
Li, Hengjian; Qiu, Jian; Dong, Jiwen; Feng, Guang
2017-07-01
To bridge the gap between the fuzziness of biometrics and the exactitude of cryptography, a novel biometrics encryption method based on combining palmprint with two-layer error correction codes is proposed. Firstly, the randomly generated original keys are encoded by convolutional and cyclic two-layer coding. The first layer uses a convolutional code to correct burst errors. The second layer uses a cyclic code to correct random errors. Then, the palmprint features are extracted from the palmprint images. Next, they are fused together by an XOR operation, and the result is stored in a smart card. Finally, to extract the original keys, the information in the smart card is XORed with the user's palmprint features and then decoded with the convolutional and cyclic two-layer code. The experimental results and security analysis show that the method can recover the original keys completely. The proposed method is more secure than a single password factor, and has higher accuracy than a single biometric factor.
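The XOR key-binding and recovery scheme described above can be sketched as follows; as a hedged simplification, a 3x repetition code stands in for the paper's convolutional-plus-cyclic two-layer code, and the bit strings are toy stand-ins for real palmprint features:

```python
import random

R = 3  # repetition factor standing in for the two-layer ECC

def encode(key_bits):
    """Repeat each key bit R times (toy ECC encoder)."""
    return [b for b in key_bits for _ in range(R)]

def decode(bits):
    """Majority vote per R-bit block (toy ECC decoder)."""
    return [1 if sum(bits[i * R:(i + 1) * R]) * 2 > R else 0
            for i in range(len(bits) // R)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

key = [1, 0, 1, 1]                                   # randomly generated key
template = [random.randint(0, 1) for _ in range(len(key) * R)]  # enrollment features
helper = xor(encode(key), template)                  # stored on the smart card

probe = template[:]
probe[5] ^= 1                                        # a noisy fresh palmprint reading
recovered = decode(xor(helper, probe))               # XOR then ECC-decode
print(recovered == key)                              # True: the flip is corrected
```

XORing the helper with the probe cancels the template up to the measurement noise, which the error-correcting layer then removes; neither the key nor the template is recoverable from the helper alone.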
NASA Astrophysics Data System (ADS)
Wootton, James R.; Loss, Daniel
2018-05-01
The repetition code is an important primitive for the techniques of quantum error correction. Here we implement repetition codes of at most 15 qubits on the 16 qubit ibmqx3 device. Each experiment is run for a single round of syndrome measurements, achieved using the standard quantum technique of using ancilla qubits and controlled operations. The size of the final syndrome is small enough to allow for lookup table decoding using experimentally obtained data. The results show strong evidence that the logical error rate decays exponentially with code distance, as is expected and required for the development of fault-tolerant quantum computers. The results also give insight into the nature of noise in the device.
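The lookup-table decoding step can be illustrated with a classical sketch of a distance-5 repetition code (this simulates only the decoding logic, not the superconducting-qubit experiment; the table here is built from minimum-weight errors rather than from experimental data):

```python
from itertools import product

D = 5  # code distance: five code (qu)bits, four neighbouring parity checks

def syndrome(word):
    """Parities of neighbouring bits, as would be read out via ancillas."""
    return tuple(word[i] ^ word[i + 1] for i in range(D - 1))

# Lookup table: map each syndrome to a minimum-weight error pattern,
# by enumerating candidate errors in order of increasing weight.
table = {}
for err in sorted(product((0, 1), repeat=D), key=sum):
    table.setdefault(syndrome(err), err)

def decode(received):
    err = table[syndrome(received)]
    corrected = [r ^ e for r, e in zip(received, err)]
    return corrected[0]  # the logical bit

print(decode((1, 1, 0, 1, 1)))  # single flip on the middle qubit -> 1
```

Any error of weight at most (D-1)/2 is corrected, which is why the logical error rate is expected to fall exponentially with code distance.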
Solutions for Digital Video Transmission Technology Final Report CRADA No. TC02068.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, A. T.; Rivers, W.
This project aimed at the development of software for seismic data processing based on the Geotool code developed by the American company Multimax, Inc. The Geotool code was written in the early 1990s for the UNIX platform. Under Project #2821, functions of the old Geotool code were transferred into a commercial version for the Microsoft XP and Vista platforms, with the addition of new capabilities for visualization and data processing. The new version, Geotool+, was implemented using the up-to-date tool Microsoft Visual Studio 2005 and uses capabilities of the .NET platform. C++ was selected as the main programming language for Geotool+. The two-year project was extended by six months and the funding level increased from $600,000 to $670,000. All tasks were successfully completed and all deliverables were met for the project, even though both the industrial partner and the LLNL principal investigator left the project before its final report.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-30
... Department of Agriculture AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY: The Environmental Protection Agency (EPA) is taking final action to approve North Dakota Department of Agriculture's... Department of Agriculture (iv) The initials RMP mean Risk Management Plan (v) The initials CFR mean Code of...
Nonlinear to Linear Elastic Code Coupling in 2-D Axisymmetric Media.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Preston, Leiph
Explosions within the earth nonlinearly deform the local media, but at typical seismological observation distances, the seismic waves can be considered linear. Although nonlinear algorithms can simulate explosions in the very near field well, these codes are computationally expensive and inaccurate at propagating these signals to great distances. A linearized wave propagation code, coupled to a nonlinear code, provides an efficient mechanism to both accurately simulate the explosion itself and to propagate these signals to distant receivers. To this end we have coupled Sandia's nonlinear simulation algorithm CTH to a linearized elastic wave propagation code for 2-D axisymmetric media (axiElasti) by passing information from the nonlinear to the linear code via time-varying boundary conditions. In this report, we first develop the 2-D axisymmetric elastic wave equations in cylindrical coordinates. Next we show how we design the time-varying boundary conditions passing information from CTH to axiElasti, and finally we demonstrate the coupling code via a simple study of the elastic radius.
INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom; Javier Ortensi; Sonat Sen
2013-09-01
The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 is reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging set of simulation problems to model accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes.
A final OECD/NEA comparison report will compare the Phase I and III results of all other international participants in 2014, while the remaining Phase II transient case results will be reported in 2015.
A Large Scale Code Resolution Service Network in the Internet of Things
Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan
2012-01-01
In the Internet of Things a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large scale application scenarios a code resolution service faces some serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet based code resolution service named SkipNet-OCRS, which not only inherits DHT's advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Analysis shows that integrating SkipNet-OCRS into our resolution service network meets the proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS. PMID:23202207
Advanced coding and modulation schemes for TDRSS
NASA Technical Reports Server (NTRS)
Harrell, Linda; Kaplan, Ted; Berman, Ted; Chang, Susan
1993-01-01
This paper describes the performance of the Ungerboeck and pragmatic 8-Phase Shift Key (PSK) Trellis Code Modulation (TCM) coding techniques with and without a (255,223) Reed-Solomon outer code as they are used for Tracking Data and Relay Satellite System (TDRSS) S-Band and Ku-Band return services. The performance of these codes at high data rates is compared to uncoded Quadrature PSK (QPSK) and rate 1/2 convolutionally coded QPSK in the presence of Radio Frequency Interference (RFI), self-interference, and hardware distortions. This paper shows that the outer Reed-Solomon code is necessary to achieve a 10(exp -5) Bit Error Rate (BER) with an acceptable level of degradation in the presence of RFI. This paper also shows that the TCM codes with or without the Reed-Solomon outer code do not perform well in the presence of self-interference. In fact, the uncoded QPSK signal performs better than the TCM coded signal in the self-interference situation considered in this analysis. Finally, this paper shows that the E(sub b)/N(sub 0) degradation due to TDRSS hardware distortions is approximately 1.3 dB with a TCM coded signal or a rate 1/2 convolutionally coded QPSK signal and is 3.2 dB with an uncoded QPSK signal.
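For context, assuming the usual rate-2/3 mapping for 8-PSK TCM (the rate is not stated explicitly above, so this is an assumption), the overall information rate of the concatenated TCM plus (255,223) Reed-Solomon scheme works out as:

```python
# Back-of-the-envelope overall code rate for a concatenated scheme:
# inner rate-2/3 8-PSK TCM (assumed) with a (255, 223) Reed-Solomon outer code.
inner_rate = 2 / 3          # 2 information bits per 3-bit 8-PSK symbol (assumed)
outer_rate = 223 / 255      # Reed-Solomon (n, k) = (255, 223)
overall = inner_rate * outer_rate
assert round(overall, 3) == 0.583
```

That is, the outer code costs roughly 13% of throughput in exchange for the burst-error protection the paper finds necessary under RFI.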
2014-08-06
This final rule will update the prospective payment rates for Medicare inpatient hospital services provided by inpatient psychiatric facilities (IPFs). These changes will be applicable to IPF discharges occurring during the fiscal year (FY) beginning October 1, 2014 through September 30, 2015. This final rule will also address implementation of ICD-10-CM and ICD-10-PCS codes, finalize a new methodology for updating the cost-of-living adjustment (COLA), and finalize new quality measures and reporting requirements under the IPF quality reporting program.
DefEX: Hands-On Cyber Defense Exercise for Undergraduate Students
2011-07-01
... Injection, and 4) File Upload. Next, the students patched the associated flawed Perl and PHP Hypertext Preprocessor (PHP) code. Finally, students ... underlying script. The Zora XSS vulnerability existed in a PHP file that echoed unfiltered user input back to the screen. To eliminate the vulnerability, students filtered the input using the PHP htmlentities function and retested the code. The htmlentities function translates certain ambiguous characters into their corresponding HTML entities.
Box codes of lengths 48 and 72
NASA Technical Reports Server (NTRS)
Solomon, G.; Jin, Y.
1993-01-01
A self-dual code of length 48, dimension 24, with Hamming distance essentially equal to 12 is constructed here. There are only six code words of weight eight. All the other code words have weights that are multiples of four and have a minimum weight equal to 12. This code may be encoded systematically and arises from a strict binary representation of the (8,4;5) Reed-Solomon (RS) code over GF(64). The code may be considered as six interrelated (8,7;2) codes. The Mattson-Solomon representation of the cyclic decomposition of these codes and their parity sums are used to detect an odd number of errors in any of the six codes. These may then be used in a correction algorithm for hard or soft decision decoding. A (72,36;15) box code was constructed from a (63,35;8) cyclic code. The theoretical justification is presented herein. A second (72,36;15) code is constructed from an inner (63,27;16) Bose-Chaudhuri-Hocquenghem (BCH) code and expanded to length 72 using box code algorithms for extension. This code was simulated and verified to have a minimum distance of 15 with even-weight words congruent to zero modulo four. The decoding for hard and soft decision is still more complex than for the first code constructed above. Finally, an (8,4;5) RS code over GF(512) in the binary representation of the (72,36;15) box code gives rise to a (72,36;16*) code with nine words of weight eight, all the rest having weights greater than or equal to 16.
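The weights-divisible-by-four property noted for the length-48 code is the hallmark of doubly even self-dual codes. As a minimal illustration (using the [8,4,4] extended Hamming code, the smallest doubly even self-dual binary code, not the length-48 construction itself):

```python
import numpy as np
from itertools import product

# Generator matrix of the [8,4,4] extended Hamming code,
# the smallest doubly even self-dual binary code.
G = np.array([
    [1, 0, 0, 0, 0, 1, 1, 1],
    [0, 1, 0, 0, 1, 0, 1, 1],
    [0, 0, 1, 0, 1, 1, 0, 1],
    [0, 0, 0, 1, 1, 1, 1, 0],
], dtype=np.uint8)

# Enumerate all 16 codewords and their Hamming weights.
words = [(np.array(m, dtype=np.uint8) @ G) % 2 for m in product((0, 1), repeat=4)]
weights = sorted(int(w.sum()) for w in words)

assert all(w % 4 == 0 for w in weights)           # doubly even
assert min(w for w in weights if w > 0) == 4      # minimum distance 4
assert np.all((G @ G.T) % 2 == 0)                 # self-orthogonal rows => self-dual
```

The same enumeration idea scales (in principle) to verifying the weight structure claimed for the (48,24) and (72,36) codes, though 2^24 and 2^36 codewords require smarter counting in practice.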
Do plant cell walls have a code?
Tavares, Eveline Q P; Buckeridge, Marcos S
2015-12-01
A code is a set of rules that establish correspondence between two worlds: signs (consisting of encrypted information) and meaning (of the decrypted message). A third element, the adaptor, connects both worlds, assigning meaning to a code. We propose that a Glycomic Code exists in plant cell walls, where signs are represented by monosaccharides and phenylpropanoids and meaning is cell wall architecture, with its highly complex association of polymers. Cell wall biosynthetic mechanisms, structure, architecture and properties are addressed according to the Code Biology perspective, focusing on how they oppose cell wall deconstruction. Cell wall hydrolysis is mainly treated as a mechanism of decryption of the Glycomic Code. Evidence for encoded information in the fine structure of cell wall polymers is highlighted and the implications of the existence of the Glycomic Code are discussed. Aspects related to fine structure are responsible for polysaccharide packing and polymer-polymer interactions, affecting the final cell wall architecture. The question of whether polymer assemblies within a wall display similar properties to other biological macromolecules (i.e., proteins, DNA, histones) is addressed: do they display a code? Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Experimental QR code optical encryption: noise-free data recovering.
Barrera, John Fredy; Mira-Agudelo, Alejandro; Torroba, Roberto
2014-05-15
We report, to our knowledge for the first time, the experimental implementation of a quick response (QR) code as a "container" in an optical encryption system. A joint transform correlator architecture in an interferometric configuration is chosen as the experimental scheme. As the implementation is not possible in a single step, a multiplexing procedure to encrypt the QR code of the original information is applied. Once the QR code is correctly decrypted, the speckle noise present in the recovered QR code is eliminated by a simple digital procedure. Finally, the original information is retrieved completely free of any kind of degradation after reading the QR code. Additionally, we propose and implement a new protocol in which the reception of the encrypted QR code and its decryption, the digital block processing, and the reading of the decrypted QR code are performed employing only one device (smartphone, tablet, or computer). The overall method proves to produce an outcome attractive enough to make the adoption of the technique a plausible option. Experimental results are presented to demonstrate the practicality of the proposed security system.
Software Certification - Coding, Code, and Coders
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Holzmann, Gerard J.
2011-01-01
We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.
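The "code mechanically checked against the standard" step can be pictured with a toy checker (purely illustrative; this is not JPL's tooling, standard, or rule set): scan source text and flag calls to functions a hypothetical coding standard bans.

```python
import re

# Hypothetical banned-function list for a toy coding standard.
BANNED = {"gets", "sprintf", "strcpy"}

def check_source(source):
    """Return (line number, function) pairs for every banned call found."""
    violations = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for fn in BANNED:
            if re.search(rf"\b{fn}\s*\(", line):
                violations.append((lineno, fn))
    return violations

code = 'int main(void) {\n  char b[8];\n  gets(b);\n  return 0;\n}\n'
assert check_source(code) == [(3, "gets")]
```

Production static analyzers of the kind the abstract mentions work on parsed syntax and data flow rather than regexes, but the mechanical, repeatable nature of the check is the point.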
Han, Yaoqiang; Dang, Anhong; Ren, Yongxiong; Tang, Junxiong; Guo, Hong
2010-12-20
In free space optical communication (FSOC) systems, channel fading caused by atmospheric turbulence seriously degrades system performance. However, channel coding combined with diversity techniques can be exploited to mitigate channel fading. In this paper, based on an experimental study of channel fading effects, we propose to use turbo product code (TPC) as the channel coding scheme, which features good resistance to burst errors and no error floor. However, channel coding alone cannot cope with the burst errors caused by channel fading, so interleaving is also used. We investigate the efficiency of interleaving for different interleaving depths, and the optimum interleaving depth for TPC is determined. Finally, an experimental study of TPC with interleaving is demonstrated, and we show that TPC with interleaving can significantly mitigate channel fading in FSOC systems.
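A minimal block-interleaver sketch (illustrative, not the paper's exact interleaver) shows why interleaving helps a burst-sensitive decoder: a burst of channel errors is dispersed into isolated errors after deinterleaving.

```python
def interleave(bits, depth):
    # Write row-wise into `depth` rows, read column-wise.
    assert len(bits) % depth == 0
    width = len(bits) // depth
    rows = [bits[r * width:(r + 1) * width] for r in range(depth)]
    return [rows[r][c] for c in range(width) for r in range(depth)]

def deinterleave(bits, depth):
    # Inverse mapping: regroup columns, read row-wise.
    width = len(bits) // depth
    cols = [bits[c * depth:(c + 1) * depth] for c in range(width)]
    return [cols[c][r] for r in range(depth) for c in range(width)]

msg = list(range(24))
assert deinterleave(interleave(msg, 4), 4) == msg   # lossless round trip

tx = interleave(msg, depth=4)
for k in range(8, 12):          # a burst hits 4 consecutive channel symbols
    tx[k] = -1
rx = deinterleave(tx, depth=4)
err_positions = [i for i, v in enumerate(rx) if v == -1]
assert err_positions == [2, 8, 14, 20]   # burst dispersed, one error per segment
```

Deeper interleaving disperses longer fades, at the cost of latency, which is why the paper searches for an optimum depth.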
NASA Technical Reports Server (NTRS)
Wood, Jerry R.; Schmidt, James F.; Steinke, Ronald J.; Chima, Rodrick V.; Kunik, William G.
1987-01-01
Increased emphasis on sustained supersonic or hypersonic cruise has revived interest in the supersonic throughflow fan as a possible component in advanced propulsion systems. Use of a fan that can operate with a supersonic inlet axial Mach number is attractive from the standpoint of reducing the inlet losses incurred in diffusing the flow from a supersonic flight Mach number to a subsonic one at the fan face. The design of the experiment using advanced computational codes to calculate the components required is described. The rotor was designed using existing turbomachinery design and analysis codes modified to handle fully supersonic axial flow through the rotor. A two-dimensional axisymmetric throughflow design code plus a blade element code were used to generate fan rotor velocity diagrams and blade shapes. A quasi-three-dimensional, thin shear layer Navier-Stokes code was used to assess the performance of the fan rotor blade shapes. The final design was stacked and checked for three-dimensional effects using a three-dimensional Euler code interactively coupled with a two-dimensional boundary layer code. The nozzle design in the expansion region was analyzed with a three-dimensional parabolized viscous code which corroborated the results from the Euler code. A translating supersonic diffuser was designed using these same codes.
NASA Electronic Library System (NELS) optimization
NASA Technical Reports Server (NTRS)
Pribyl, William L.
1993-01-01
This is a compilation of NELS (NASA Electronic Library System) Optimization progress/problem, interim, and final reports for all phases. The NELS database was examined, particularly with respect to memory, disk contention, and CPU usage, to discover bottlenecks. Methods to increase the speed of the NELS code were investigated. The tasks included restructuring the existing code to interact with other code more effectively. Error-reporting code was added to help detect and remove bugs in NELS. Report-writing tools were recommended to integrate with the ASV3 system. The Oracle database management system and tools were to be installed on a Sun workstation, intended for demonstration purposes.
Statistical physics inspired energy-efficient coded-modulation for optical communications.
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2012-04-15
Because Shannon's entropy can be obtained by Stirling's approximation of thermodynamics entropy, the statistical physics energy minimization methods are directly applicable to the signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of D-dimensional transceiver and corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America
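The correspondence invoked in the first sentence can be sketched explicitly: for N transmitted symbols in which symbol i occurs n_i = N p_i times, Stirling's approximation ln n! ≈ n ln n − n reduces the Boltzmann entropy of the arrangement count W to Shannon's entropy.

```latex
S = k_B \ln W, \qquad W = \frac{N!}{\prod_i n_i!}, \qquad n_i = N p_i

\ln W \approx \bigl(N \ln N - N\bigr) - \sum_i \bigl(n_i \ln n_i - n_i\bigr)
       = -\,N \sum_i p_i \ln p_i

\frac{S}{k_B N} \approx -\sum_i p_i \ln p_i = H(p)
```

This identification is what lets energy-minimization machinery from statistical physics be applied directly to constellation design, as the abstract asserts.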
Evidence and Ethics in Individual Events: An Examination of an AFA-NIET Final Round.
ERIC Educational Resources Information Center
Cronn-Mills, Daniel; Schnoor, Larry G.
A study investigated the use of source citations and evidence in the final round of Informative Speaking at the 1998 American Forensic Association-National Individual Events Tournament (AFA-NIET). The "AFA Code of Forensics Program and Forensics Tournament Standards for College and Universities" was the framework for the analysis.…
2003-01-10
This final rule amends the fire safety standards for hospitals, long-term care facilities, intermediate care facilities for the mentally retarded, ambulatory surgery centers, hospices that provide inpatient services, religious nonmedical health care institutions, critical access hospitals, and Programs of All-Inclusive Care for the Elderly facilities. Further, this final rule adopts the 2000 edition of the Life Safety Code and eliminates references in our regulations to all earlier editions.
Code-Phase Clock Bias and Frequency Offset in PPP Clock Solutions.
Defraigne, Pascale; Sleewaegen, Jean-Marie
2016-07-01
Precise point positioning (PPP) is a zero-difference single-station technique that has proved to be very effective for time and frequency transfer, enabling the comparison of atomic clocks with a precision of a hundred picoseconds and a one-day stability below the 1e-15 level. It was, however, noted that for some receivers, a frequency difference is observed between the clock solution based on the code measurements and the clock solution based on the carrier-phase measurements. These observations reveal some inconsistency either between the code and carrier phases measured by the receiver or between the data analysis strategy of codes and carrier phases. One explanation for this discrepancy is the time offset that can exist for some receivers between the code and the carrier-phase latching. This paper explains how a code-phase bias in the receiver hardware can induce a frequency difference between the code and the carrier-phase clock solutions. The impact on PPP is then quantified. Finally, the possibility to determine this code-phase bias in the PPP modeling is investigated, and the first results are shown to be inappropriate due to the high level of code noise.
Haliasos, N; Rezajooi, K; O'neill, K S; Van Dellen, J; Hudovsky, Anita; Nouraei, Sar
2010-04-01
Clinical coding is the translation of documented clinical activities during an admission into a codified language. Healthcare Resource Groupings (HRGs) are derived from coding data and are used to calculate payment to hospitals in England, Wales and Scotland and to conduct national audit and benchmarking exercises. Coding is an error-prone process, and an understanding of its accuracy within neurosurgery is critical for financial, organizational and clinical governance purposes. We undertook a multidisciplinary audit of neurosurgical clinical coding accuracy. Neurosurgeons trained in coding assessed the accuracy of 386 patient episodes. Where clinicians felt a coding error was present, the case was discussed with an experienced clinical coder. Concordance between the initial coder-only clinical coding and the final clinician-coder multidisciplinary coding was assessed. At least one coding error occurred in 71/386 patients (18.4%). There were 36 diagnosis and 93 procedure errors, and in 40 cases the initial HRG changed (10.4%). Financially, this translated to a £111 revenue loss per patient episode, projecting to £171,452 of annual loss to the department. 85% of all coding errors were due to the accumulation of coding changes that occurred only once in the whole data set. Neurosurgical clinical coding is error-prone. This is financially disadvantageous, and with coding data being the source of comparisons within and between departments, coding inaccuracies paint a distorted picture of departmental activity and subspecialism in audit and benchmarking. Clinical engagement improves accuracy and is encouraged within a clinical governance framework.
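For orientation, the audit's headline figures are mutually consistent; the annual episode count below is inferred from the per-episode and projected annual losses, not stated in the study.

```python
# Figures reported in the audit (GBP); annual episode count is an inference.
episodes_audited = 386
episodes_with_error = 71
hrg_changed = 40
loss_per_episode = 111
projected_annual_loss = 171_452

assert round(100 * episodes_with_error / episodes_audited, 1) == 18.4
assert round(100 * hrg_changed / episodes_audited, 1) == 10.4

# Implied annual throughput behind the projection: ~1,545 episodes/year.
implied_annual_episodes = projected_annual_loss / loss_per_episode
assert round(implied_annual_episodes) == 1545
```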
Optimum Boundaries of Signal-to-Noise Ratio for Adaptive Code Modulations
2017-11-14
[2] Pursley, M. B. and Royster, T. C., "Adaptive-rate nonbinary LDPC coding for frequency-hop communications," IEEE ..., 1510–1521, Feb. 2015. ... This can cause a very narrowband noise near the center frequency during USRP signal acquisition and generation, which can cause a high BER. ... Final Report. Approved for public release; distribution is unlimited. Air Force Research Laboratory, Space Vehicles Directorate, 3550 Aberdeen Ave.
Software Library: A Reusable Software Issue.
1984-06-01
Keywords: Software Library; Program Library; Reusability; Generator. Abstract: ... Software Library. A particular example of the Software Library, the Program Library, is described as a prototype of a reusable library. Hierarchical ... programming libraries are described. Finally, non-code products in the Software Library are discussed.
NASA Technical Reports Server (NTRS)
1995-01-01
In the course of preparing the SD_SURF space debris analysis code, several problems and possibilities for improvement of the BUMPERII code were documented and sent to MSFC. These suggestions and problem reports are included here as a part of the contract final report. This includes reducing BUMPERII memory requirements, compiling problems with BUMPERII, FORTRAN-lint analysis of BUMPERII, and error in function PRV in BUMPERII.
Mechanisms of Temporal Pattern Discrimination by Human Observers
1994-02-15
Department of Psychology, University of Florida, Gainesville, Florida 32611. 15 February 1994. Final Technical Report for Period 1 October 1990 - ... AFOSR/NL. Cited references include a study contrasting novice and experienced performance (Journal of Experimental Psychology: Human Perception and Performance, 18, 50-71) and Berg, B. G. (1989), Analysis ...
Telepharmacy and bar-code technology in an i.v. chemotherapy admixture area.
O'Neal, Brian C; Worden, John C; Couldry, Rick J
2009-07-01
A program using telepharmacy and bar-code technology to increase the presence of the pharmacist at a critical risk point during chemotherapy preparation is described. Telepharmacy hardware and software were acquired, and an inspection camera was placed in a biological safety cabinet to allow the pharmacy technician to take digital photographs at various stages of the chemotherapy preparation process. Once the pharmacist checks the medication vials' agreement with the work label, the technician takes the product into the biological safety cabinet, where the appropriate patient is selected from the pending work list, a queue of patient orders sent from the pharmacy information system. The technician then scans the bar code on the vial. Assuming the bar code matches, the technician photographs the work label, vials, diluents and fluids to be used, and the syringe (before injecting the contents into the bag) along with the vial. The pharmacist views all images as a part of the final product-checking process. This process allows the pharmacist to verify that the correct quantity of medication was transferred from the primary source to a secondary container without being physically present at the time of transfer. Telepharmacy and bar coding provide a means to improve the accuracy of chemotherapy preparation by decreasing the likelihood of using the incorrect product or quantity of drug. The system facilitates the reading of small product labels and removes the need for a pharmacist to handle contaminated syringes and vials when checking the final product.
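The bar-code gate described above can be sketched as a simple match check before photographs are taken (all names, fields, and the NDC value below are illustrative, not the vendor system's API):

```python
# Hypothetical sketch of the bar-code verification gate: the technician's
# scan must match the product expected by the verified work label.
def barcode_matches(order, scanned_ndc):
    """Return True only when the scanned vial matches the order's product."""
    return scanned_ndc == order["expected_ndc"]

order = {"patient": "A123", "drug": "cisplatin", "expected_ndc": "00703-5748-11"}
assert barcode_matches(order, "00703-5748-11")        # proceed to photographs
assert not barcode_matches(order, "00703-5748-99")    # wrong product: stop
```

The safety value comes from where the check sits in the workflow: the mismatch is caught before any drug is drawn up, and the pharmacist's remote image review then verifies quantity rather than identity.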
U.S. Seismic Design Maps Web Application
NASA Astrophysics Data System (ADS)
Martinez, E.; Fee, J.
2015-12-01
The application computes earthquake ground motion design parameters compatible with the International Building Code and other seismic design provisions. It is the primary method for design engineers across the country to obtain ground motion parameters for multiple building codes when designing new buildings and other structures. Users specify the design code of interest, location, and other parameters to obtain the necessary ground motion information, consisting of a high-level executive summary as well as detailed information including maps, data, and graphs. Results are formatted such that they can be directly included in a final engineering report. In addition to single-site analysis, the application supports a batch mode for simultaneous consideration of multiple locations. Finally, an application programming interface (API) is available which allows other application developers to integrate this application's results into larger applications for additional processing. Development on the application has proceeded in an iterative manner, working with engineers through email, meetings, and workshops. Each iteration provided new features, improved performance, and usability enhancements. This development approach positioned the application to be integral to the structural design process; it is now used to produce over 1800 reports daily. Recent efforts have enhanced the application to be a data-driven, mobile-first, responsive web application. Development is ongoing, and the source code has recently been published into the open-source community on GitHub. Open-sourcing the code facilitates improved incorporation of user feedback to add new features, ensuring the application's continued success.
Bit Error Probability for Maximum Likelihood Decoding of Linear Block Codes
NASA Technical Reports Server (NTRS)
Lin, Shu; Fossorier, Marc P. C.; Rhee, Dojun
1996-01-01
In this paper, the bit error probability P(sub b) for maximum likelihood decoding of binary linear codes is investigated. The contribution of each information bit to P(sub b) is considered. For randomly generated codes, it is shown that the conventional approximation at high SNR P(sub b) is approximately equal to (d(sub H)/N)P(sub s), where P(sub s) represents the block error probability, holds for systematic encoding only. Also systematic encoding provides the minimum P(sub b) when the inverse mapping corresponding to the generator matrix of the code is used to retrieve the information sequence. The bit error performances corresponding to other generator matrix forms are also evaluated. Although derived for codes with a generator matrix randomly generated, these results are shown to provide good approximations for codes used in practice. Finally, for decoding methods which require a generator matrix with a particular structure such as trellis decoding or algebraic-based soft decision decoding, equivalent schemes that reduce the bit error probability are discussed.
NASA Astrophysics Data System (ADS)
Qin, Yi; Wang, Zhipeng; Wang, Hongjuan; Gong, Qiong
2018-07-01
We propose a binary image encryption method in a joint transform correlator (JTC) with the aid of run-length encoding (RLE) and Quick Response (QR) codes, which enables lossless retrieval of the primary image. The binary image is encoded with RLE to obtain highly compressed data, and the compressed binary image is then further scrambled using a chaos-based method. The compressed and scrambled binary image is transformed into one QR code that is finally encrypted in the JTC. The proposed method successfully, for the first time to our best knowledge, encodes a binary image into a QR code of identical size, and therefore may open a new way to extend the application of QR codes in optical security. Moreover, the preprocessing operations, including RLE, chaos scrambling and the QR code translation, append an additional security level to the JTC. We present digital results that confirm our approach.
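A minimal sketch of the RLE stage described above (the chaos-scrambling and QR-translation steps are omitted; the encoding convention here is an illustrative choice, not the paper's exact format):

```python
def rle_encode(bits):
    # Run-length encode a binary sequence as (first_bit, run lengths);
    # lossless, and highly compressive for images with long uniform runs.
    runs, count = [], 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return bits[0], runs

def rle_decode(first_bit, runs):
    # Expand the runs back into the original bit sequence.
    out, bit = [], first_bit
    for n in runs:
        out.extend([bit] * n)
        bit ^= 1
    return out

bits = [0, 0, 0, 1, 1, 0, 1, 1, 1, 1]
first, runs = rle_encode(bits)
assert (first, runs) == (0, [3, 2, 1, 4])
assert rle_decode(first, runs) == bits   # lossless round trip
```

Losslessness of this stage is what allows the paper to claim lossless retrieval of the primary image after decryption.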
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.
1988-01-01
A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, provide a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized, including both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).
The Athena Astrophysical MHD Code in Cylindrical Geometry
NASA Astrophysics Data System (ADS)
Skinner, M. A.; Ostriker, E. C.
2011-10-01
We have developed a method for implementing cylindrical coordinates in the Athena MHD code (Skinner & Ostriker 2010). The extension has been designed to alter the existing Cartesian-coordinates code (Stone et al. 2008) as minimally and transparently as possible. The numerical equations in cylindrical coordinates are formulated to maintain consistency with constrained transport, a central feature of the Athena algorithm, while making use of previously implemented code modules such as the eigensystems and Riemann solvers. Angular-momentum transport, which is critical in astrophysical disk systems dominated by rotation, is treated carefully. We describe modifications for cylindrical coordinates of the higher-order spatial reconstruction and characteristic evolution steps as well as the finite-volume and constrained transport updates. Finally, we have developed a test suite of standard and novel problems in one-, two-, and three-dimensions designed to validate our algorithms and implementation and to be of use to other code developers. The code is suitable for use in a wide variety of astrophysical applications and is freely available for download on the web.
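The geometric bookkeeping that cylindrical coordinates add to a finite-volume update can be illustrated with a toy 1-D radial advection solver (not the Athena implementation; the grid, velocity, and time step are made up). Weighting the face fluxes by the face radius is what keeps the r-weighted total conserved:

```python
import math

# Upwind finite-volume update for radial advection dq/dt + (1/r) d(r v q)/dr = 0
# on a uniform radial grid (toy problem; all parameters are illustrative).
N, r_min, dr, v, dt = 64, 0.5, 0.05, 1.0, 0.01
r_c = [r_min + (i + 0.5) * dr for i in range(N)]    # cell centers
r_f = [r_min + i * dr for i in range(N + 1)]        # cell faces
q = [math.exp(-((rc - 1.5) / 0.2) ** 2) for rc in r_c]   # initial profile

def step(q):
    # face flux r*v*q, upwinded for v > 0; zero flux at both boundaries
    flux = [0.0] * (N + 1)
    for i in range(1, N):
        flux[i] = r_f[i] * v * q[i - 1]
    return [q[i] - dt / (r_c[i] * dr) * (flux[i + 1] - flux[i]) for i in range(N)]

total_before = sum(qi * rc for qi, rc in zip(q, r_c))
q = step(q)
total_after = sum(qi * rc for qi, rc in zip(q, r_c))
# with zero boundary fluxes, the r-weighted total is conserved to round-off
```

The sum over cells telescopes to the difference of the two boundary fluxes, so conservation holds exactly by construction; the same discrete-telescoping idea, extended to the magnetic flux, is what constrained transport enforces in Athena.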
Mixture block coding with progressive transmission in packet video. Appendix 1: Item 2. M.S. Thesis
NASA Technical Reports Server (NTRS)
Chen, Yun-Chung
1989-01-01
Video transmission will become an important part of future multimedia communication because of dramatically increasing user demand for video and the rapid evolution of coding algorithms and VLSI technology. Video transmission will be part of the broadband integrated services digital network (B-ISDN). Asynchronous transfer mode (ATM) is a viable candidate for implementation of B-ISDN due to its inherent flexibility, service independence, and high performance. According to the characteristics of ATM, the information has to be coded into discrete cells which travel independently in the packet switching network. A practical realization of an ATM video codec called Mixture Block Coding with Progressive Transmission (MBCPT) is presented. This variable bit rate coding algorithm shows how constant-quality performance can be obtained according to user demand. Interactions between the codec and the network are emphasized, including packetization, service synchronization, flow control, and error recovery. Finally, some simulation results based on MBCPT coding with error recovery are presented.
TEA: A Code Calculating Thermochemical Equilibrium Abundances
NASA Astrophysics Data System (ADS)
Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver
2016-07-01
We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature-pressure pairs. We tested the code against the method of Burrows & Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows & Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
TOPLHA and ALOHA: comparison between Lower Hybrid wave coupling codes
NASA Astrophysics Data System (ADS)
Meneghini, Orso; Hillairet, J.; Goniche, M.; Bilato, R.; Voyer, D.; Parker, R.
2008-11-01
TOPLHA and ALOHA are wave coupling simulation tools for LH antennas. Both codes are able to account for realistic 3D antenna geometries and use a 1D plasma model. In the framework of a collaboration between the MIT and CEA laboratories, the two codes have been extensively compared. In TOPLHA the EM problem is self-consistently formulated by means of a set of multiple coupled integral equations whose domain is the triangles of the meshed antenna surface. TOPLHA currently uses the FELHS code for modeling the plasma response. ALOHA instead uses a mode matching approach and its own plasma model. Comparisons have been done for several plasma scenarios on different antenna designs: an array of independent waveguides, a multi-junction antenna, and a passive/active multi-junction antenna. When simulating the same geometry and plasma conditions, the two codes agree remarkably well, both in the reflection coefficients and in the launched spectra. That two different approaches to the same problem agree strengthens confidence in the final results.
Investigation of Near Shannon Limit Coding Schemes
NASA Technical Reports Server (NTRS)
Kwatra, S. C.; Kim, J.; Mo, Fan
1999-01-01
Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes, both of which can achieve performance near the Shannon limit. The performance of the schemes is obtained using computer simulations. There are three sections in this report. The first section is the introduction, which discusses fundamental knowledge about coding, block coding, and convolutional coding. In the second section, the basic concepts of convolutional turbo codes are introduced and the performance of turbo codes, especially high-rate turbo codes, is provided from the simulation results. After introducing all the parameters that help turbo codes achieve such good performance, it is concluded that the output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, performance bounds for turbo codes are given. Then, the relationships between the output weight distribution and factors such as the generator polynomial, the interleaver, and the puncturing pattern are examined. A criterion for the best selection of system components is provided. The puncturing pattern algorithm is discussed in detail, and different puncturing patterns are compared for each high rate. For most of the high-rate codes, the puncturing pattern does not show any significant effect on code performance if a pseudo-random interleaver is used in the system. For some special-rate codes with poor performance, an alternative puncturing algorithm is designed which restores their performance close to the Shannon limit. Finally, in section three, for iterative decoding of block codes, the method of building a trellis for block codes, the structure of the iterative decoding system, and the calculation of extrinsic values are discussed.
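The mechanics of a puncturing pattern can be sketched as follows: a periodic 0/1 mask selects which output bits of each stream are transmitted, raising a rate-1/3 turbo-style output (systematic plus two parity streams) to rate 1/2. The pattern and bit values are made up; pattern conventions vary between papers.

```python
def puncture(streams, pattern):
    """Puncture parallel output streams with a periodic pattern:
    pattern[s][t % period] == 1 keeps stream s at time t."""
    period = len(pattern[0])
    out = []
    for t in range(len(streams[0])):
        for s, stream in enumerate(streams):
            if pattern[s][t % period]:
                out.append(stream[t])
    return out

sys_bits = [1, 0, 1, 1]                  # systematic bits
p1 = [0, 1, 1, 0]                        # parity stream of encoder 1
p2 = [1, 1, 0, 0]                        # parity stream of encoder 2
# keep every systematic bit, alternate the two parity streams -> rate 1/2
pattern = [[1, 1], [1, 0], [0, 1]]
tx = puncture([sys_bits, p1, p2], pattern)
assert len(tx) == 2 * len(sys_bits)      # 8 coded bits for 4 info bits
```

The decoder reinserts erasures at the punctured positions before iterating, which is why, as the report finds, the choice of pattern often matters less than the interleaver when a pseudo-random interleaver is used.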
A decoding procedure for the Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Lim, R. S.
1978-01-01
A decoding procedure is described for the (n,k) t-error-correcting Reed-Solomon (RS) code, along with an implementation of the (31,15) RS code for the I4-TENEX central system. This code can be used for error correction in large archival memory systems. The principal features of the decoder are a Galois field arithmetic unit, implemented by microprogramming a microprocessor, and syndrome calculation using the g(x) encoding shift register. Complete decoding of the (31,15) code is expected to take less than 500 microseconds. The syndrome calculation is performed in hardware using the encoding shift register and a modified Chien search. The error location polynomial is computed using Lin's table, which is an interpretation of Berlekamp's iterative algorithm. The error location numbers are calculated using the Chien search. Finally, the error values are computed using Forney's method.
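The table-lookup arithmetic such a Galois field unit performs can be sketched for GF(2^5), the field underlying the (31,15) code. The primitive polynomial x^5 + x^2 + 1 used below is one common choice; the report does not state which polynomial its implementation used.

```python
# Log/antilog tables for GF(2^5) = GF(32), generated by repeated
# multiplication by x modulo the primitive polynomial x^5 + x^2 + 1
# (an assumption; any primitive degree-5 polynomial works the same way).
PRIM = 0b100101
antilog = []
x = 1
for _ in range(31):
    antilog.append(x)
    x <<= 1
    if x & 0b100000:                      # degree-5 overflow: reduce mod PRIM
        x ^= PRIM
log = {a: i for i, a in enumerate(antilog)}

def gf_mul(a, b):
    """Multiply in GF(32) via log/antilog lookup, the way a microprogrammed
    arithmetic unit would: add exponents mod 31."""
    if a == 0 or b == 0:
        return 0
    return antilog[(log[a] + log[b]) % 31]

# x * x^4 = x^5 = x^2 + 1, since x^5 + x^2 + 1 = 0 over GF(2)
assert gf_mul(0b00010, 0b10000) == 0b00101
```

Because x is primitive, the antilog table enumerates all 31 nonzero field elements, so two small tables replace every multiplication and division in syndrome evaluation and the Chien search.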
Estimating the costs of VA ambulatory care.
Phibbs, Ciaran S; Bhandari, Aman; Yu, Wei; Barnett, Paul G
2003-09-01
This article reports how we matched Current Procedural Terminology (CPT) codes with Medicare payment rates and aggregate Veterans Affairs (VA) budget data to estimate the cost of every VA ambulatory encounter. Converting CPT codes to encounter-level costs was more complex than a simple match of Medicare reimbursements to CPT codes. About 40 percent of the CPT codes used in the VA, representing about 20 percent of procedures, did not have a Medicare payment rate and required other cost estimates. Reconciling aggregated estimated costs to the VA budget allocations for outpatient care produced final VA cost estimates that were lower than projected Medicare reimbursements. The methods used to estimate costs for encounters could be replicated in other settings. They are potentially useful for any system that does not generate billing data, when CPT codes are simpler to collect than billing data, or when there is a need to standardize cost estimates across data sources.
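The match-with-fallback and budget-reconciliation steps can be sketched as follows. All codes, rates, and the budget figure below are invented for illustration; the article's actual estimation rules are more involved.

```python
# Sketch of the two-step estimate: match each encounter's CPT code to a
# Medicare rate, fall back to an alternative estimate when no rate exists,
# then rescale so totals reconcile with the budget allocation.
medicare_rate = {"99213": 75.0, "71020": 40.0}      # illustrative rates
fallback_rate = {"G0463": 110.0}                    # hypothetical no-Medicare-rate code

encounters = ["99213", "71020", "G0463"]
raw = [medicare_rate.get(c, fallback_rate.get(c, 0.0)) for c in encounters]

budget = 200.0                            # aggregate budget for these encounters
scale = budget / sum(raw)                 # reconciliation factor (< 1 here)
costs = [scale * r for r in raw]
assert abs(sum(costs) - budget) < 1e-9    # estimates reconcile with the budget
```

A scale factor below one reproduces the article's finding that reconciled VA cost estimates come out lower than projected Medicare reimbursements.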
Code of practice for food handler activities.
Smith, T A; Kanas, R P; McCoubrey, I A; Belton, M E
2005-08-01
The food industry regulates various aspects of food handler activities, according to legislation and customer expectations. The purpose of this paper is to provide a code of practice which delineates a set of working standards for food handler hygiene, handwashing, use of protective equipment, wearing of jewellery and body piercing. The code was developed by a working group of occupational physicians with expertise in both food manufacturing and retail, using a risk assessment approach. Views were also obtained from other occupational physicians working within the food industry and the relevant regulatory bodies. The final version of the code (available in full as Supplementary data in Occupational Medicine Online) therefore represents a broad consensus of opinion. The code of practice represents a set of minimum standards for food handler suitability and activities, based on a practical assessment of risk, for application in food businesses. It aims to provide useful working advice to food businesses of all sizes.
Toward performance portability of the Albany finite element analysis code using the Kokkos library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demeshko, Irina; Watkins, Jerry; Tezaur, Irina K.
Performance portability on heterogeneous high-performance computing (HPC) systems is a major challenge faced today by code developers: parallel code needs to be executed correctly as well as with high performance on machines with different architectures, operating systems, and software libraries. The finite element method (FEM) is a popular and flexible method for discretizing partial differential equations arising in a wide variety of scientific, engineering, and industrial applications that require HPC. This paper presents some preliminary results pertaining to our development of a performance portable implementation of the FEM-based Albany code. Performance portability is achieved using the Kokkos library. We present performance results for the Aeras global atmosphere dynamical core module in Albany. Finally, numerical experiments show that our single code implementation gives reasonable performance across three multicore/many-core architectures: NVIDIA general processing units (GPUs), Intel Xeon Phis, and multicore CPUs.
Development of a Model and Computer Code to Describe Solar Grade Silicon Production Processes
NASA Technical Reports Server (NTRS)
Srivastava, R.; Gould, R. K.
1979-01-01
Mathematical models, and computer codes based on these models, were developed to allow prediction of the product distribution in chemical reactors for converting gaseous silicon compounds to condensed-phase silicon. The following tasks were accomplished: (1) formulation of a model for silicon vapor separation/collection from the developing turbulent flow stream within the Westinghouse reactor; (2) modification of an available general parabolic code to solve the governing partial differential equations (of boundary layer type) which describe migration of the vapor to the reactor walls; (3) a parametric study using the boundary layer code to optimize the performance characteristics of the Westinghouse reactor; (4) calculations relating to the collection efficiency of the new AeroChem reactor; and (5) final testing of the modified LAPP code for use as a method of predicting Si(l) droplet sizes in these reactors.
A Degree Distribution Optimization Algorithm for Image Transmission
NASA Astrophysics Data System (ADS)
Jiang, Wei; Yang, Junjie
2016-09-01
Luby Transform (LT) codes are the first practical implementation of digital fountain codes. The coding behavior of an LT code is mainly decided by the degree distribution, which determines the relationship between source data and codewords. Two degree distributions were suggested by Luby. They work well in typical situations but not optimally in the case of finite encoding symbols. In this work, a degree distribution optimization algorithm is proposed to explore the potential of LT codes. First, a selection scheme of sparse degrees for LT codes is introduced. Then the probability distribution is optimized over the selected degrees. In image transmission, the bit stream is sensitive to channel noise, and even a single bit error may cause loss of synchronization between the encoder and the decoder. The proposed algorithm is therefore designed for the image transmission setting. Moreover, optimal class partition is studied for image transmission with unequal error protection. The experimental results are quite promising: compared with an LT code using the robust soliton distribution, the proposed algorithm noticeably improves the final quality of the recovered images at the same overhead.
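The baseline the paper compares against, Luby's robust soliton degree distribution, can be sketched as follows (the parameter values c and delta are illustrative, not taken from the paper):

```python
import math

def robust_soliton(k, c=0.1, delta=0.5):
    """Robust soliton degree distribution over degrees 1..k, following
    Luby's construction; c and delta here are illustrative defaults."""
    s = c * math.log(k / delta) * math.sqrt(k)
    rho = [0.0] * (k + 1)                 # ideal soliton component
    rho[1] = 1.0 / k
    for d in range(2, k + 1):
        rho[d] = 1.0 / (d * (d - 1))
    tau = [0.0] * (k + 1)                 # extra mass that aids decoding
    spike = int(round(k / s))
    for d in range(1, spike):
        tau[d] = s / (k * d)
    if 1 <= spike <= k:
        tau[spike] = s * math.log(s / delta) / k
    z = sum(rho) + sum(tau)               # normalizer
    return [(rho[d] + tau[d]) / z for d in range(k + 1)]

dist = robust_soliton(100)
assert abs(sum(dist) - 1.0) < 1e-9        # a proper probability distribution
```

An encoder samples a degree d from this distribution, XORs d randomly chosen source symbols into one codeword, and repeats; the paper's algorithm replaces the fixed distribution with one optimized over a sparse set of degrees.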
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-12
... that a refusal by a member to take action necessary to effectuate a final decision of a FINRA officer... necessary to effectuate a final decision of a FINRA officer or the UPC Committee under the UPC Code (FINRA Rule 11000 Series) or other FINRA rules that permit review of FINRA decisions by the UPC Committee...
2013-04-01
...completely change the entire landscape. For example, under the quantum computing regime, factoring prime numbers requires only polynomial time (i.e., Shor's...). Wireless Cybersecurity, Biao Chen, Syracuse University. Final Report AFRL-OSR-VA-TR-2013-0206, April 2013, reporting period 01-04-2009 to 30-11-2012.
Hybrid 3D model for the interaction of plasma thruster plumes with nearby objects
NASA Astrophysics Data System (ADS)
Cichocki, Filippo; Domínguez-Vázquez, Adrián; Merino, Mario; Ahedo, Eduardo
2017-12-01
This paper presents a hybrid particle-in-cell (PIC) fluid approach to model the interaction of a plasma plume with a spacecraft and/or any nearby object. Ions and neutrals are modeled with a PIC approach, while electrons are treated as a fluid. After a first iteration of the code, the domain is split into quasineutral and non-neutral regions, based on non-neutrality criteria, such as the relative charge density and the Debye length-to-cell size ratio. At the material boundaries of the former quasineutral region, a dedicated algorithm ensures that the Bohm condition is met. In the latter non-neutral regions, the electron density and electric potential are obtained by solving the coupled electron momentum balance and Poisson equations. Boundary conditions for both the electric current and potential are finally obtained with a plasma sheath sub-code and an equivalent circuit model. The hybrid code is validated by applying it to a typical plasma plume-spacecraft interaction scenario, and the physics and capabilities of the model are finally discussed.
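The region-splitting step can be illustrated with a small sketch: a cell is flagged non-neutral when the relative charge density is large or the Debye length is not small compared to the cell size. The threshold values below are assumptions for illustration, not taken from the paper.

```python
import math

EPS0, QE = 8.854e-12, 1.602e-19   # vacuum permittivity, elementary charge (SI)

def debye_length(n_e, T_e_eV):
    """Electron Debye length [m] for density n_e [m^-3], temperature in eV.
    With T in eV, lambda_D = sqrt(eps0 * T_eV / (n_e * e))."""
    return math.sqrt(EPS0 * T_e_eV / (n_e * QE))

def non_neutral(n_i, n_e, T_e_eV, cell_size, rel_charge_tol=0.05):
    """Flag a cell as non-neutral, mimicking the paper's two criteria:
    large relative charge density, or a Debye length comparable to the
    cell size (both tolerances are assumed values)."""
    rel_charge = abs(n_i - n_e) / max(n_i, n_e)
    return rel_charge > rel_charge_tol or debye_length(n_e, T_e_eV) > cell_size

# dense quasineutral plume region: tiny Debye length, matched densities
assert not non_neutral(1e16, 1e16, 2.0, 1e-3)
# a charge imbalance flags the cell regardless of the Debye length
assert non_neutral(1e16, 8e15, 2.0, 1e-3)
```

Cells that pass the test can use the cheap quasineutral electron-fluid closure, while flagged cells near material boundaries fall through to the coupled electron momentum balance and Poisson solve.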
NASA Technical Reports Server (NTRS)
Bartlett, E. P.; Morse, H. L.; Tong, H.
1971-01-01
Procedures and methods for predicting aerothermodynamic heating to delta orbiter shuttle vehicles were reviewed. A number of approximate methods were found to be adequate for large-scale parameter studies, but are considered inadequate for final design calculations. It is recommended that final design calculations be based on a computer code which accounts for nonequilibrium chemistry, streamline spreading, entropy swallowing, and turbulence. It is further recommended that this code be developed with the intent that it can be directly coupled with an exact inviscid flow field calculation when the latter becomes available. A nonsimilar, equilibrium chemistry computer code (BLIMP) was used to evaluate the effects of entropy swallowing, turbulence, and various three-dimensional approximations, and these solutions were compared with available wind tunnel data. It was found that, for wind tunnel conditions, the effects of entropy swallowing and three-dimensionality are small for laminar boundary layers, but entropy swallowing causes a significant increase in turbulent heat transfer. However, it is noted that even small effects (say, 10-20%) may be important for the shuttle reusability concept.
2014-03-10
This document contains final regulations providing guidance toemployers that are subject to the information reporting requirements under section 6056 of the Internal Revenue Code (Code), enacted by the Affordable Care Act (generally employers with at least 50 full-time employees, including full-time equivalent employees). Section 6056 requires those employers to report to the IRS information about the health care coverage, if any, they offered to full-time employees, in order to administer the employer shared responsibility provisions of section 4980H of the Code. Section 6056 also requires those employers to furnish related statements to employees that employees may use to determine whether, for each month of the calendar year, they may claim on their individual tax returns a premium tax credit under section 36B (premium tax credit). The regulations provide for a general reporting method and alternative reporting methods designed to simplify and reduce the cost of reporting for employers subject to the information reporting requirements under section 6056. The regulations affect those employers, employees and other individuals.
Multi-channel feature dictionaries for RGB-D object recognition
NASA Astrophysics Data System (ADS)
Lan, Xiaodong; Li, Qiming; Chong, Mina; Song, Jian; Li, Jun
2018-04-01
Hierarchical matching pursuit (HMP) is a popular feature learning method for RGB-D object recognition. However, the feature representation with only one dictionary for the RGB channels in HMP does not capture sufficient visual information. In this paper, we propose a multi-channel feature dictionary based feature learning method for RGB-D object recognition. The process of feature extraction in the proposed method consists of two layers, and the K-SVD algorithm is used to learn the dictionaries for sparse coding in both. In the first layer, we obtain features by performing max pooling on the sparse codes of the pixels in a cell, and the features of the cells in a patch are concatenated to generate joint patch features. The joint patch features from the first layer are then used to learn the dictionary and sparse codes of the second layer. Finally, spatial pyramid pooling can be applied to the joint patch features of either layer to generate the final object features. Experimental results show that our method, with first- or second-layer features, obtains comparable or better performance than some published state-of-the-art methods.
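The first-layer pooling-and-concatenation step can be sketched as follows (the K-SVD dictionary learning and sparse coding are omitted, and the code vectors below are made up):

```python
# First-layer feature: element-wise max pooling of per-pixel sparse codes
# within a cell, then concatenation of cell features into a patch feature.
def max_pool(codes):
    """Element-wise max over the sparse code vectors of one cell's pixels."""
    return [max(col) for col in zip(*codes)]

cell1 = [[0.0, 0.3, 0.0], [0.5, 0.0, 0.1]]   # 2 pixels, 3 dictionary atoms
cell2 = [[0.2, 0.0, 0.0], [0.0, 0.0, 0.4]]
patch_feature = max_pool(cell1) + max_pool(cell2)   # concatenation
assert patch_feature == [0.5, 0.3, 0.1, 0.2, 0.0, 0.4]
```

Max pooling keeps the strongest response per dictionary atom within each cell, giving translation tolerance inside the cell, while concatenation preserves the coarse spatial layout of the patch.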
Shared responsibility payment for not maintaining minimum essential coverage. Final regulations.
2013-08-30
This document contains final regulations on the requirement to maintain minimum essential coverage enacted by the Patient Protection and Affordable Care Act and the Health Care and Education Reconciliation Act of 2010, as amended by the TRICARE Affirmation Act and Public Law 111-173. These final regulations provide guidance to individual taxpayers on the liability under section 5000A of the Internal Revenue Code for the shared responsibility payment for not maintaining minimum essential coverage and largely finalize the rules in the notice of proposed rulemaking published in the Federal Register on February 1, 2013.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baron, Edward
2014-04-28
The progress over the course of the grant period was excellent. We went from 3-D test codes to full 3-D production codes. We studied several SNe Ia. Most of the support has gone for the 3 years of support of OU graduate student Brian Friesen, who is now mature in his fourth year of research. It is unfortunate that there will be no further DOE support to see him through to the completion of his PhD.
Psychometric challenges and proposed solutions when scoring facial emotion expression codes.
Olderbak, Sally; Hildebrandt, Andrea; Pinkpank, Thomas; Sommer, Werner; Wilhelm, Oliver
2014-12-01
Coding of facial emotion expressions is increasingly performed by automated emotion expression scoring software; however, there is limited discussion on how best to score the resulting codes. We present a discussion of facial emotion expression theories and a review of contemporary emotion expression coding methodology. We highlight methodological challenges pertinent to scoring software-coded facial emotion expression codes and present important psychometric research questions centered on comparing competing scoring procedures of these codes. Then, on the basis of a time series data set collected to assess individual differences in facial emotion expression ability, we derive, apply, and evaluate several statistical procedures, including four scoring methods and four data treatments, to score software-coded emotion expression data. These scoring procedures are illustrated to inform analysis decisions pertaining to the scoring and data treatment of other emotion expression questions and under different experimental circumstances. Overall, we found applying loess smoothing and controlling for baseline facial emotion expression and facial plasticity are recommended methods of data treatment. When scoring facial emotion expression ability, maximum score is preferred. Finally, we discuss the scoring methods and data treatments in the larger context of emotion expression research.
DataRocket: Interactive Visualisation of Data Structures
NASA Astrophysics Data System (ADS)
Parkes, Steve; Ramsay, Craig
2010-08-01
CodeRocket is a software engineering tool that provides cognitive support to the software engineer for reasoning about a method or procedure and for documenting the resulting code [1]. DataRocket is a software engineering tool designed to support visualisation of, and reasoning about, program data structures. DataRocket is part of the CodeRocket family of software tools developed by Rapid Quality Systems [2], a spin-out company from the Space Technology Centre at the University of Dundee. CodeRocket and DataRocket integrate seamlessly with existing architectural design and coding tools and provide extensive documentation with little or no effort on the part of the software engineer. Comprehensive, abstract, detailed design documentation is available early in a project so that it can be used for design reviews with project managers and non-expert stakeholders. Code and documentation remain fully synchronised even when changes are implemented in the code without reference to the existing documentation. At the end of a project, the press of a button suffices to produce the detailed design document. Existing legacy code can easily be imported into CodeRocket and DataRocket to reverse engineer detailed design documentation, making legacy code more manageable and adding substantially to its value. This paper introduces CodeRocket. It then explains the rationale for DataRocket and describes the key features of this new tool. Finally, the major benefits of DataRocket for different stakeholders are considered.
Final Report: Correctness Tools for Petascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mellor-Crummey, John
2014-10-27
In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, the University of Maryland, the University of Wisconsin-Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.
Shared responsibility for employers regarding health coverage. Final regulations.
2014-02-12
This document contains final regulations providing guidance to employers that are subject to the shared responsibility provisions regarding employee health coverage under section 4980H of the Internal Revenue Code (Code), enacted by the Affordable Care Act. These regulations affect employers referred to as applicable large employers (generally meaning, for each year, employers that had 50 or more full-time employees, including full-time equivalent employees, during the prior year). Generally, under section 4980H an applicable large employer that, for a calendar month, fails to offer to its full-time employees health coverage that is affordable and provides minimum value may be subject to an assessable payment if a full-time employee enrolls for that month in a qualified health plan for which the employee receives a premium tax credit.
NASA Astrophysics Data System (ADS)
Huang, Wei; Ma, Chengfu; Chen, Yuhang
2014-12-01
A method for simple and reliable displacement measurement with nanoscale resolution is proposed. The measurement is realized by combining a common optical microscopy imaging of a specially coded nonperiodic microstructure, namely two-dimensional zero-reference mark (2-D ZRM), and subsequent correlation analysis of the obtained image sequence. The autocorrelation peak contrast of the ZRM code is maximized with well-developed artificial intelligence algorithms, which enables robust and accurate displacement determination. To improve the resolution, subpixel image correlation analysis is employed. Finally, we experimentally demonstrate the quasi-static and dynamic displacement characterization ability of a micro 2-D ZRM.
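The correlation step can be illustrated in one dimension: the displacement is read off as the position of the cross-correlation peak between the measured signal and the nonperiodic mark code. The code and shift below are made up, and subpixel refinement is omitted.

```python
# Locate a coded (nonperiodic) mark by the position of the cross-correlation
# peak; the nonperiodicity keeps the autocorrelation sidelobes low, so the
# peak is unambiguous.
code = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1]   # illustrative ZRM-like code
shift = 4
signal = [0] * shift + code + [0] * 10           # mark displaced by 4 samples

def xcorr_peak(signal, template):
    """Return the offset maximizing the sliding inner product."""
    best_pos, best_val = 0, float("-inf")
    for pos in range(len(signal) - len(template) + 1):
        window = signal[pos:pos + len(template)]
        v = sum(s * t for s, t in zip(window, template))
        if v > best_val:
            best_pos, best_val = pos, v
    return best_pos

assert xcorr_peak(signal, code) == shift
```

In the paper's 2-D setting the same idea runs over image rows and columns of the imaged ZRM, with subpixel interpolation of the correlation peak supplying the nanoscale resolution.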
Giles, Tracey M; de Lacey, Sheryl; Muir-Cochrane, Eimear
2016-01-01
Grounded theory method has been described extensively in the literature. Yet, the varying processes portrayed can be confusing for novice grounded theorists. This article provides a worked example of the data analysis phase of a constructivist grounded theory study that examined family presence during resuscitation in acute health care settings. Core grounded theory methods are exemplified, including initial and focused coding, constant comparative analysis, memo writing, theoretical sampling, and theoretical saturation. The article traces the construction of the core category "Conditional Permission" from initial and focused codes, subcategories, and properties, through to its position in the final substantive grounded theory.
Learning Short Binary Codes for Large-scale Image Retrieval.
Liu, Li; Yu, Mengyang; Shao, Ling
2017-03-01
Large-scale visual information retrieval has become an active research area in this big data era. Recently, hashing/binary coding algorithms have proved effective for scalable retrieval applications. Most existing hashing methods require relatively long binary codes (i.e., over hundreds of bits, sometimes even thousands of bits) to achieve reasonable retrieval accuracies. However, for some realistic and unique applications, such as on wearable or mobile devices, only short binary codes can be used for efficient image retrieval due to the limitation of computational resources or bandwidth on these devices. In this paper, we propose a novel unsupervised hashing approach called min-cost ranking (MCR) specifically for learning powerful short binary codes (i.e., usually shorter than 100 bits) for scalable image retrieval tasks. By exploring the discriminative ability of each dimension of the data, MCR generates a one-bit binary code for each dimension and simultaneously ranks the discriminative separability of each bit according to the proposed cost function. Only the top-ranked bits with minimum cost values are then selected and grouped together to compose the final salient binary codes. Extensive experimental results on large-scale retrieval demonstrate that MCR achieves performance comparable to state-of-the-art hashing algorithms but with significantly shorter codes, leading to much faster large-scale retrieval.
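The select-the-cheapest-bits idea can be made concrete with a toy sketch: binarize each dimension at its median, score each resulting bit with a cost, and keep only the top-ranked bits. The cost used here (how unbalanced the bit is) is a simplified stand-in for the paper's actual cost function, and the data are invented.

```python
# Toy sketch of the MCR idea described above: one bit per data dimension,
# a cost score per bit, and only the minimum-cost bits kept for the final
# short code. The balance-based cost is an illustrative stand-in, not the
# paper's criterion.

def median(xs):
    s = sorted(xs)
    n = len(s)
    return (s[n // 2 - 1] + s[n // 2]) / 2 if n % 2 == 0 else s[n // 2]

def learn_short_code(data, k):
    """data: list of feature vectors. Returns k (dimension, threshold)
    pairs, ranked by the toy cost."""
    ranked = []
    for d in range(len(data[0])):
        col = [x[d] for x in data]
        t = median(col)
        ones = sum(v > t for v in col)
        balance = min(ones, len(col) - ones) / len(col)  # 0.5 = balanced
        ranked.append((1.0 - balance, d, t))   # balanced bits are "cheap"
    ranked.sort()
    return [(d, t) for _, d, t in ranked[:k]]

def encode(x, code):
    return [int(x[d] > t) for d, t in code]

data = [[0.1, 5.0, 3.3], [0.9, 5.0, 1.1], [0.2, 5.0, 3.0], [0.8, 5.0, 1.2]]
code = learn_short_code(data, 2)     # dimension 1 is constant -> costly
print([encode(x, code) for x in data])
```

The constant dimension gets the highest cost and is dropped, leaving a 2-bit code that still separates the two clusters in the toy data.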
Coset Codes Viewed as Terminated Convolutional Codes
NASA Technical Reports Server (NTRS)
Fossorier, Marc P. C.; Lin, Shu
1996-01-01
In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.
An assessment of multibody simulation tools for articulated spacecraft
NASA Technical Reports Server (NTRS)
Man, Guy K.; Sirlin, Samuel W.
1989-01-01
A survey of multibody simulation codes was conducted in the spring of 1988, to obtain an assessment of the state of the art in multibody simulation codes from the users of the codes. This survey covers the most often used articulated multibody simulation codes in the spacecraft and robotics community. There was no attempt to perform a complete survey of all available multibody codes in all disciplines. Furthermore, this is not an exhaustive evaluation of even robotics and spacecraft multibody simulation codes, as the survey was designed to capture feedback on issues most important to the users of simulation codes. We must keep in mind that the information received was limited and the technical background of the respondents varied greatly. Therefore, only the most often cited observations from the questionnaire are reported here. In this survey, it was found that no one code had both many users (reports) and no limitations. The first section is a report on multibody code applications. Following applications is a discussion of execution time, which is the most troublesome issue for flexible multibody codes. The representation of component flexible bodies, which affects both simulation setup time as well as execution time, is presented next. Following component data preparation, two sections address the accessibility or usability of a code, evaluated by considering its user interface design and examining the overall simulation integrated environment. A summary of user efforts at code verification is reported, before a tabular summary of the questionnaire responses. Finally, some conclusions are drawn.
GRILLIX: a 3D turbulence code based on the flux-coordinate independent approach
NASA Astrophysics Data System (ADS)
Stegmeir, Andreas; Coster, David; Ross, Alexander; Maj, Omar; Lackner, Karl; Poli, Emanuele
2018-03-01
The GRILLIX code is presented, with which plasma turbulence/transport in various geometries can be simulated in 3D. The distinguishing feature of the code is that it is based on the flux-coordinate independent (FCI) approach (Hariri and Ottaviani 2013 Comput. Phys. Commun. 184 2419; Stegmeir et al 2016 Comput. Phys. Commun. 198 139). Cylindrical or Cartesian grids are used, on which perpendicular operators are discretised via standard finite difference methods and parallel operators via a field line tracing and interpolation procedure (field line map). This offers very high flexibility with respect to geometry; in particular, a separatrix with X-point(s) or a magnetic axis can be treated easily, in contrast to approaches based on field-aligned coordinates, which suffer from coordinate singularities. Aiming finally at simulation of edge and scrape-off layer (SOL) turbulence, an isothermal electrostatic drift-reduced Braginskii model (Zeiler et al 1997 Phys. Plasmas 4 2134) has been implemented in GRILLIX. We present the numerical approach, which is based on a toroidally staggered formulation of the FCI, show verification of the code with the method of manufactured solutions, and present a benchmark based on a TORPEX blob experiment previously performed by several edge/SOL codes (Riva et al 2016 Plasma Phys. Control. Fusion 58 044005). Examples for slab, circular, limiter and diverted geometry are presented. Finally, the results show that the FCI approach in general, and GRILLIX in particular, is a viable approach for tackling the simulation of edge/SOL turbulence in diverted geometry.
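The verification strategy mentioned above, the method of manufactured solutions, is worth illustrating in its simplest form: pick an exact solution, derive the operator's exact result analytically, and confirm the discretization error shrinks at the scheme's design order. The 1-D example below is generic, not GRILLIX itself.

```python
# Generic illustration of the method-of-manufactured-solutions style of
# verification mentioned above (not GRILLIX code): manufacture u(x) = sin(x),
# whose exact Laplacian is -sin(x), apply the 3-point central difference, and
# check that halving the grid spacing cuts the error ~4x (second order).

import math

def laplacian_error(n):
    """Max error of the 3-point Laplacian of sin(x) on [0, pi], n cells."""
    h = math.pi / n
    err = 0.0
    for i in range(1, n):
        x = i * h
        num = (math.sin(x - h) - 2 * math.sin(x) + math.sin(x + h)) / h**2
        err = max(err, abs(num - (-math.sin(x))))
    return err

e1, e2 = laplacian_error(50), laplacian_error(100)
order = math.log(e1 / e2, 2)    # observed convergence order, ~2 here
print(round(order, 2))
```

A real MMS test does the same thing for the full operator set of the code, with a manufactured source term added to the evolved equations.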
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-03
... dumping margins likely to prevail is indicated in the ``Final Results of Sunset Review'' section of this... likelihood of continuation or recurrence of dumping and the magnitude of the margins likely to prevail if the... dumping margin likely to prevail [FR Doc. 2013-28952 Filed 12-2-13; 8:45 am] BILLING CODE 3510-DS-P ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.
1995-12-31
In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western Codes," VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and to assess whether VNIIEF codes are suitable for safety assessment computations for RBMK-type reactors. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), covering cell, polycell, and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computation with the NESTLE code (USA), performed in the geometry and using the neutron constants presented by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These computations were performed in two formulations, both developed in collaboration with NIKIET. The formulation of the first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of control and protection system (CPS) movement in a core.
NASA Astrophysics Data System (ADS)
Nasaruddin; Tsujioka, Tetsuo
An optical CDMA (OCDMA) system is a flexible technology for future broadband multiple access networks. A secure OCDMA network in broadband optical access technologies is also becoming an issue of great importance. In this paper, we propose novel reconfigurable wavelength-time (W-T) optical codes that lead to secure transmission in OCDMA networks. The proposed W-T optical codes are constructed by using quasigroups (QGs) for wavelength hopping and one-dimensional optical orthogonal codes (OOCs) for time spreading; we call them QGs/OOCs. Both QGs and OOCs are randomly generated by a computer search to ensure that an eavesdropper could not improve its interception performance by making use of the coding structure. Then, the proposed reconfigurable QGs/OOCs can provide more codewords, and many different code set patterns, which differ in both wavelength and time positions for given code parameters. Moreover, the bit error probability of the proposed codes is analyzed numerically. To realize the proposed codes, a secure system is proposed by employing reconfigurable encoders/decoders based on array waveguide gratings (AWGs), which allow the users to change their codeword patterns to protect against eavesdropping. Finally, the probability of breaking a certain codeword in the proposed system is evaluated analytically. The results show that the proposed codes and system can provide a large codeword pattern, and decrease the probability of breaking a certain codeword, to enhance OCDMA network security.
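The structure of such a wavelength-time codeword can be sketched very simply: a time-spreading pattern picks w chip slots out of L (standing in for a one-dimensional OOC codeword), and a random drawing of distinct wavelengths (standing in for a quasigroup row) assigns one wavelength per pulse. Reconfiguration then amounts to drawing a new pattern. All names and parameters here are illustrative, not the paper's construction.

```python
# Toy illustration of a reconfigurable wavelength-time (W-T) codeword in the
# spirit of the QGs/OOCs described above. The OOC and quasigroup components
# are replaced by simple random draws; this shows only the codeword shape,
# not the correlation-constrained search the paper performs.

import random

def make_wt_codeword(L, w, n_wavelengths, rng):
    """Return {time_slot: wavelength} with w pulses, wavelengths distinct."""
    slots = sorted(rng.sample(range(L), w))            # time spreading
    wavelengths = rng.sample(range(n_wavelengths), w)  # wavelength hopping
    return dict(zip(slots, wavelengths))

rng = random.Random(7)
cw = make_wt_codeword(L=16, w=4, n_wavelengths=8, rng=rng)
print(cw)
```

The one-pulse-per-wavelength property that the distinct draw enforces is what keeps cross-correlation between codewords low; the large number of possible draws is what makes eavesdropping on the code pattern hard.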
Adapting a Navier-Stokes code to the ICL-DAP
NASA Technical Reports Server (NTRS)
Grosch, C. E.
1985-01-01
The results of an experiment are reported, i.e., to adapt a Navier-Stokes code, originally developed on a serial computer, to concurrent processing on the ICL Distributed Array Processor (DAP). The algorithm used in solving the Navier-Stokes equations is briefly described. The architecture of the DAP and DAP FORTRAN are also described. The modifications of the algorithm to fit the DAP are given and discussed. Finally, performance results are given and conclusions are drawn.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1994-05-26
The Circular calls the attention of Coast Guard field units, marine surveyors, shippers and carriers of nuclear materials to the International Maritime Organization (IMO) Code for the Safe Carriage of Irradiated Nuclear Fuel, Plutonium and High-Level Radioactive Wastes in Flasks on Board Ships (IMO Resolution A.748(18)).
Coherent communication with continuous quantum variables
NASA Astrophysics Data System (ADS)
Wilde, Mark M.; Krovi, Hari; Brun, Todd A.
2007-06-01
The coherent bit (cobit) channel is a resource intermediate between classical and quantum communication. It produces coherent versions of teleportation and superdense coding. We extend the cobit channel to continuous variables by providing a definition of the coherent nat (conat) channel. We construct several coherent protocols that use both a position-quadrature and a momentum-quadrature conat channel with finite squeezing. Finally, we show that the quality of squeezing diminishes through successive compositions of coherent teleportation and superdense coding.
CO-FIRING COAL: FEEDLOT AND LITTER BIOMASS FUELS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. Kalyan Annamalai; Dr. John Sweeten; Dr. Sayeed Mukhtar
2000-10-24
The following are proposed activities for quarter 1 (6/15/00-9/14/00): (1) Finalize the allocation of funds within TAMU to co-principal investigators and the final task lists; (2) Acquire a 3D computer code for coal combustion and modify it for cofiring coal:feedlot biomass and coal:litter biomass fuels; (3) Develop a simple one-dimensional model for a fixed bed gasifier cofired with coal:biomass fuels; and (4) Prepare the boiler burner for reburn tests with feedlot biomass fuels. The following were achieved during Quarter 5 (6/15/00-9/14/00): (1) Funds are being allocated to co-principal investigators; the task list from Prof. Mukhtar has been received (Appendix A); (2) An order has been placed to acquire the Pulverized Coal Gasification and Combustion 3D (PCGC-3) computer code for coal combustion, to be modified for cofiring coal:feedlot biomass and coal:litter biomass fuels. The reason for selecting this code is the availability of source code for modification to include biomass fuels; (3) A simplified one-dimensional model has been developed; however, convergence has not yet been achieved; and (4) The length of the boiler burner has been increased to increase the residence time. A premixed propane burner has been installed to simulate coal combustion gases. First, coal as a reburn fuel will be used to generate baseline data, followed by methane, feedlot, and litter biomass fuels.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Technology (NIST) using NIST Special Publication 800-87, “Codes for the Identification of Federal and Federally Assisted Organizations,” at http://csrc.nist.gov/publications/nistpubs/800-87/sp800-87-Final.pdf...
Phonological coding during reading.
Leinenger, Mallorie
2014-11-01
The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
Lessons Learned through the Development and Publication of AstroImageJ
NASA Astrophysics Data System (ADS)
Collins, Karen
2018-01-01
As lead author of the scientific image processing software package AstroImageJ (AIJ), I will discuss the reasoning behind why we decided to release AIJ to the public, and the lessons we learned related to the development, publication, distribution, and support of AIJ. I will also summarize the AIJ code language selection, code documentation and testing approaches, code distribution, update, and support facilities used, and the code citation and licensing decisions. Since AIJ was initially developed as part of my graduate research and was my first scientific open source software publication, many of my experiences and difficulties encountered may parallel those of others new to scientific software publication. Finally, I will discuss the benefits and disadvantages of releasing scientific software that I now recognize after having AIJ in the public domain for more than five years.
Solving free-plasma-boundary problems with the SIESTA MHD code
NASA Astrophysics Data System (ADS)
Sanchez, R.; Peraza-Rodriguez, H.; Reynolds-Barredo, J. M.; Tribaldos, V.; Geiger, J.; Hirshman, S. P.; Cianciosa, M.
2017-10-01
SIESTA is a recently developed MHD equilibrium code designed to perform fast and accurate calculations of ideal MHD equilibria for 3D magnetic configurations. It is an iterative code that uses the solution obtained by the VMEC code to provide a background coordinate system and an initial guess of the solution. The final solution that SIESTA finds can exhibit magnetic islands and stochastic regions. In its original implementation, SIESTA addressed only fixed-boundary problems. This fixed boundary condition somewhat restricts its possible applications. In this contribution we describe a recent extension of SIESTA that enables it to address free-plasma-boundary situations, opening up the possibility of investigating problems with SIESTA in which the plasma boundary is perturbed either externally or internally. As an illustration, the extended version of SIESTA is applied to a configuration of the W7-X stellarator.
Design of wavefront coding optical system with annular aperture
NASA Astrophysics Data System (ADS)
Chen, Xinhua; Zhou, Jiankang; Shen, Weimin
2016-10-01
Wavefront coding can extend the depth of field of a traditional optical system by inserting a phase mask into the pupil plane. In this paper, the point spread function (PSF) of a wavefront coding system with an annular aperture is analyzed. The stationary phase method and the fast Fourier transform (FFT) method are used to compute the diffraction integral, respectively. The OTF invariance is analyzed for the annular aperture with a cubic phase mask under different obscuration ratios. With these analysis results, a wavefront coding system using a Maksutov-Cassegrain configuration is finally designed. It is an F/8.21 catadioptric system with an annular aperture, and its focal length is 821 mm. The strength of the cubic phase mask is optimized with a user-defined operand in Zemax. The Wiener filtering algorithm is used to restore the images, and numerical simulation proves the validity of the design.
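The defocus-invariance that makes wavefront coding work can be demonstrated numerically in one dimension: give the pupil a cubic phase, compute the PSF by FFT, and compare how much the PSF changes under defocus with and without the mask. The pupil model and all numeric values below are assumed for illustration; this is not the paper's annular-aperture analysis.

```python
# Minimal 1-D sketch (assumed parameters, not the paper's design) of why a
# cubic phase mask extends depth of field: with pupil phase alpha*u^3, the
# PSF changes far less under a defocus term psi*u^2 than for a plain pupil.

import numpy as np

def psf(defocus, alpha, n=512):
    u = np.linspace(-1, 1, n)                 # normalized pupil coordinate
    pupil = np.exp(1j * (alpha * u**3 + defocus * u**2))
    intensity = np.abs(np.fft.fftshift(np.fft.fft(pupil, 4 * n)))**2
    return intensity / intensity.sum()        # normalized PSF

def defocus_sensitivity(alpha):
    """L1 distance between the in-focus and a defocused PSF."""
    return np.abs(psf(0.0, alpha) - psf(8.0, alpha)).sum()

plain = defocus_sensitivity(alpha=0.0)
coded = defocus_sensitivity(alpha=60.0)
print(coded < plain)   # the coded PSF varies much less with defocus
```

The nearly defocus-invariant (but blurred) coded PSF is what the Wiener filtering step then deconvolves to recover a sharp image over the extended depth of field.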
Sandford, M.T. II; Handel, T.G.; Bradley, J.N.
1998-03-10
A method of embedding auxiliary information into the digital representation of host data created by a lossy compression technique is disclosed. The method applies to data compressed with lossy algorithms based on series expansion, quantization to a finite number of symbols, and entropy coding. Lossy compression methods represent the original data as integer indices having redundancy and uncertainty in value by one unit. Indices which are adjacent in value are manipulated to encode auxiliary data. By a substantially reverse process, the embedded auxiliary data can be retrieved easily by an authorized user. Lossy compression methods use lossless compression, also known as entropy coding, to reduce the intermediate representation as indices to its final size. The efficiency of this entropy coding is increased by manipulating the indices at the intermediate stage in the manner taught by the method. 11 figs.
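The adjacent-index idea can be illustrated with a parity scheme: since each quantization index is uncertain by one unit anyway, nudging it to an adjacent value to encode one auxiliary bit leaves the reconstruction essentially unchanged. Using the index's parity as the bit is a simplified stand-in for the patented method, and the index values are made up.

```python
# Illustrative parity-based sketch of the embedding idea above, not a
# reproduction of the patented scheme: each index is moved at most one unit
# so that its parity carries one auxiliary bit.

def embed(indices, bits):
    """Force each index's parity to match the corresponding auxiliary bit."""
    out = []
    for i, b in zip(indices, bits):
        out.append(i if i % 2 == b else i + 1)   # step to the adjacent index
    return out

def extract(indices):
    return [i % 2 for i in indices]

quantized = [12, 7, 40, 3, 18, 25]   # e.g. quantized transform coefficients
payload = [1, 1, 0, 0, 1, 0]
stego = embed(quantized, payload)
assert extract(stego) == payload                                # recovered
assert all(abs(a - b) <= 1 for a, b in zip(stego, quantized))   # <=1 unit
print(stego)
```

Extraction is the "substantially reverse process" of the abstract: the authorized user simply reads the parities back out, with no reference to the original indices.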
Transport of Light Ions in Matter
NASA Technical Reports Server (NTRS)
Wilson, J. W.; Cucinotta, F. A.; Tai, H.; Shinn, J. L.; Chun, S. Y.; Tripathi, R. K.; Sihver, L.
1998-01-01
A recent set of light ion experiments are analyzed using the Green's function method of solving the Boltzmann equation for ions of high charge and energy (the GRNTRN transport code) and the NUCFRG2 fragmentation database generator code. Although the NUCFRG2 code reasonably represents the fragmentation of heavy ions, the effects of light ion fragmentation require a more detailed nuclear model including shell structure and short range correlations appearing as tightly bound clusters in the light ion nucleus. The most recent NUCFRG2 code is augmented with a quasielastic alpha knockout model and semiempirical adjustments (up to 30 percent in charge removal) in the fragmentation process, allowing reasonable agreement with the experiments to be obtained. A final resolution of the appropriate cross sections must await the full development of a coupled channel reaction model in which shell structure and clustering can be accurately evaluated.
Viterbi decoding for satellite and space communication.
NASA Technical Reports Server (NTRS)
Heller, J. A.; Jacobs, I. M.
1971-01-01
Convolutional coding and Viterbi decoding, along with binary phase-shift keyed modulation, are presented as an efficient system for reliable communication on power-limited satellite and space channels. Performance results, obtained theoretically and through computer simulation, are given for optimum short constraint length codes for a range of code constraint lengths and code rates. System efficiency is compared for hard receiver quantization and 4- and 8-level soft quantization. The effects on performance of varying certain parameters relevant to decoder complexity and cost are examined. Quantitative performance degradation due to imperfect carrier phase coherence is evaluated and compared to that of an uncoded system. As an example of decoder performance versus complexity, a recently implemented 2-Mbit/sec constraint length 7 Viterbi decoder is discussed. Finally, a comparison is made between Viterbi and sequential decoding in terms of suitability to various system requirements.
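The algorithm under discussion is compact enough to sketch in full: below is a textbook hard-decision Viterbi decoder for the standard rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 octal). It illustrates the add-compare-select recursion the abstract evaluates; it is not the flight decoder described there.

```python
# Hedged sketch: encoder and hard-decision Viterbi decoder for the textbook
# rate-1/2, K=3 convolutional code (generators 7, 5 octal).

G = (0b111, 0b101)          # generator polynomials, constraint length 3

def conv_encode(bits):
    state = 0
    out = []
    for b in bits:
        reg = (b << 2) | state
        out += [bin(reg & g).count("1") & 1 for g in G]   # two parity bits
        state = reg >> 1
    return out

def viterbi_decode(received, n_bits):
    INF = float("inf")
    metrics = [0] + [INF] * 3                 # path metric per 2-bit state
    paths = [[] for _ in range(4)]
    for t in range(n_bits):
        r = received[2 * t: 2 * t + 2]
        new_m, new_p = [INF] * 4, [None] * 4
        for s in range(4):
            if metrics[s] == INF:
                continue
            for b in (0, 1):                  # extend each survivor (add)
                reg = (b << 2) | s
                expected = [bin(reg & g).count("1") & 1 for g in G]
                m = metrics[s] + sum(x != y for x, y in zip(r, expected))
                ns = reg >> 1
                if m < new_m[ns]:             # compare-select
                    new_m[ns], new_p[ns] = m, paths[s] + [b]
        metrics, paths = new_m, new_p
    best = min(range(4), key=metrics.__getitem__)
    return paths[best]

msg = [1, 0, 1, 1, 0, 0, 1]
coded = conv_encode(msg)
coded[3] ^= 1                      # flip one channel bit
print(viterbi_decode(coded, len(msg)) == msg)
```

With free distance 5, this code lets the decoder correct the single flipped bit: every competing trellis path ends up with a worse Hamming metric than the transmitted one.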
2014-11-26
This document contains final regulations relating to the requirement to maintain minimum essential coverage enacted by the Patient Protection and Affordable Care Act and the Health Care and Education Reconciliation Act of 2010, as amended by the TRICARE Affirmation Act and Public Law 111-173 (collectively, the Affordable Care Act). These final regulations provide individual taxpayers with guidance under section 5000A of the Internal Revenue Code on the requirement to maintain minimum essential coverage and rules governing certain types of exemptions from that requirement.
Operational evaluation of a DGPS / SATCOM VTS : final report
DOT National Transportation Integrated Search
1996-09-01
Satellite communications (SATCOM) using code division multiple access (CDMA) modulation and burst messaging provided a new dimension to communication channel capacity, operating dependability, and area of coverage. This technology, together with diff...
Code of Federal Regulations, 2011 CFR
2011-10-01
... applicable agency codes maintained by the National Institute of Standards and Technology (NIST) using NIST...,” at http://csrc.nist.gov/publications/nistpubs/800-87/sp800-87-Final.pdf. (d) Agencies exempt from the...
Force-free electrodynamics in dynamical curved spacetimes
NASA Astrophysics Data System (ADS)
McWilliams, Sean
2015-04-01
We present results on our study of force-free electrodynamics in curved spacetimes. Specifically, we present several improvements to what has become the established set of evolution equations, and we apply these to study the nonlinear stability of analytically known force-free solutions for the first time. We implement our method in a new pseudo-spectral code built on top of the SpEC code for evolving dynamic spacetimes. We then revisit these known solutions and attempt to clarify some interesting properties that render them analytically tractable. Finally, we preview some new work that similarly revisits the established approach to solving another problem in numerical relativity: the post-merger recoil from asymmetric gravitational-wave emission. These new results may have significant implications for the parameter dependence of recoils, and consequently for the statistical expectations for recoil velocities of merged systems.
Rodrigues, J M; Trombert-Paviot, B; Baud, R; Wagner, J; Meusnier-Carriot, F
1998-01-01
GALEN has developed a language independent common reference model based on a medically oriented ontology and practical tools and techniques for managing healthcare terminology including natural language processing. GALEN-IN-USE is the current phase which applied the modelling and the tools to the development or the updating of coding systems for surgical procedures in different national coding centers co-operating within the European Federation of Coding Centre (EFCC) to create a language independent knowledge repository for multicultural Europe. We used an integrated set of artificial intelligence terminology tools named CLAssification Manager workbench to process French professional medical language rubrics into intermediate dissections and to the Grail reference ontology model representation. From this language independent concept model representation we generate controlled French natural language. The French national coding centre is then able to retrieve the initial professional rubrics with different categories of concepts, to compare the professional language proposed by expert clinicians to the French generated controlled vocabulary and to finalize the linguistic labels of the coding system in relation with the meanings of the conceptual system structure.
Giulio, Massimo Di
2018-05-19
A discriminative statistical test among the different theories proposed to explain the origin of the genetic code is presented. The amino acids are gathered into polarity classes, the primary expression of the physicochemical theory of the origin of the genetic code, and into biosynthetic classes, the primary expression of the coevolution theory; these classes are then used in Fisher's exact test to establish their significance within the genetic code table. By attaching to the rows and columns of the genetic code probabilities that express the statistical significance of these classes, a χ value can finally be calculated for both the physicochemical theory and the coevolution theory, expressing the level of corroboration of each. The comparison between these two χ values shows that the coevolution theory is able to explain, in this strictly empirical analysis, the origin of the genetic code better than the physicochemical theory. Copyright © 2018 Elsevier B.V. All rights reserved.
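To make the statistical machinery concrete, here is a minimal one-sided Fisher's exact test for a 2x2 contingency table, the kind of test used above to ask whether an amino-acid class clusters non-randomly in the code table. The table below is an invented toy example, not data from the paper.

```python
# Minimal one-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]],
# computed directly from the hypergeometric null distribution. Toy data only.

from math import comb

def fisher_exact_greater(a, b, c, d):
    """P(observing >= a successes in the top-left cell) under the null."""
    row1, col1, n = a + b, a + c, a + b + c + d
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    return p

# Toy table: class membership vs. occupying a particular block of columns.
p = fisher_exact_greater(3, 1, 1, 3)
print(p)   # 17/70, about 0.243: no significant clustering in this toy table
```

In the paper's setting, one such test per class and per row/column grouping yields the per-class significance values that are then combined into the χ scores being compared.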
RELAP-7 Code Assessment Plan and Requirement Traceability Matrix
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.
2016-10-01
The RELAP-7 code, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed towards leveraging the advancements in computer science technology, numerical solution methods, and physical models over the last decades. Recently, INL has also been making an effort to establish a code assessment plan, which aims to ensure improved final product quality through the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including the software design, user interface, and technical requirements. To this end, we first survey the literature (i.e., international/domestic reports and research articles) addressing the desirable features generally required for advanced nuclear system safety analysis codes. In addition, the V&V (verification and validation) efforts as well as the legacy issues of several recently developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate and identify the code development process and its present capability.
Validation of the SINDA/FLUINT code using several analytical solutions
NASA Technical Reports Server (NTRS)
Keller, John R.
1995-01-01
The Systems Improved Numerical Differencing Analyzer and Fluid Integrator (SINDA/FLUINT) code has often been used to determine the transient and steady-state response of various thermal and fluid flow networks. While this code is an often used design and analysis tool, the validation of this program has been limited to a few simple studies. For the current study, the SINDA/FLUINT code was compared to four different analytical solutions. The thermal analyzer portion of the code (conduction and radiative heat transfer, SINDA portion) was first compared to two separate solutions. The first comparison examined a semi-infinite slab with a periodic surface temperature boundary condition. Next, a small, uniform temperature object (lumped capacitance) was allowed to radiate to a fixed temperature sink. The fluid portion of the code (FLUINT) was also compared to two different analytical solutions. The first study examined a tank filling process by an ideal gas in which there is both control volume work and heat transfer. The final comparison considered the flow in a pipe joining two infinite reservoirs of pressure. The results of all these studies showed that for the situations examined here, the SINDA/FLUINT code was able to match the results of the analytical solutions.
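The first SINDA comparison case above has a classical closed-form solution worth writing out: a semi-infinite slab whose surface temperature oscillates as A·sin(ωt) develops a decaying, phase-lagged thermal wave T(x,t) = A·exp(-kx)·sin(ωt - kx) with k = sqrt(ω/2α). The numeric values below are arbitrary illustrative choices, not those of the report.

```python
# Closed-form solution for the periodic-surface-temperature semi-infinite
# slab, the analytical benchmark named above. Parameter values are assumed
# for illustration.

import math

def slab_temperature(x, t, A=10.0, omega=0.05, alpha=1e-5):
    """Temperature rise above the mean at depth x (m) and time t (s)."""
    k = math.sqrt(omega / (2.0 * alpha))          # decay/phase constant (1/m)
    return A * math.exp(-x * k) * math.sin(omega * t - x * k)

# At the surface the imposed boundary condition is recovered exactly:
t = 100.0
print(abs(slab_temperature(0.0, t) - 10.0 * math.sin(0.05 * t)) < 1e-12)
```

A validation run then samples the SINDA transient at several depths and times and compares against this expression, exactly the style of code-to-analytical comparison the abstract describes.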
Trellises and Trellis-Based Decoding Algorithms for Linear Block Codes. Part 3
NASA Technical Reports Server (NTRS)
Lin, Shu
1998-01-01
Decoding algorithms based on the trellis representation of a code (block or convolutional) drastically reduce decoding complexity. The best known and most commonly used trellis-based decoding algorithm is the Viterbi algorithm. It is a maximum likelihood decoding algorithm. Convolutional codes with Viterbi decoding have been widely used for error control in digital communications over the last two decades. This chapter is concerned with the application of the Viterbi decoding algorithm to linear block codes. First, the Viterbi algorithm is presented. Then, optimum sectionalization of a trellis to minimize the computational complexity of a Viterbi decoder is discussed and an algorithm is presented. Some design issues for IC (integrated circuit) implementation of a Viterbi decoder are considered and discussed. Finally, a new decoding algorithm based on the principle of compare-select-add is presented. This new algorithm can be applied to both block and convolutional codes and is more efficient than the conventional Viterbi algorithm based on the add-compare-select principle. This algorithm is particularly efficient for rate 1/n antipodal convolutional codes and their high-rate punctured codes. It reduces computational complexity by one-third compared with the Viterbi algorithm.
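As a hedged illustration of the add-compare-select recursion described above, here is a hard-decision Viterbi decoder for a generic rate-1/2, constraint-length-3 convolutional code with octal generators 7 and 5 (a textbook example, not one of the specific codes studied in the chapter):

```python
def conv_encode(bits, state=0):
    """Rate-1/2 convolutional encoder, K=3, generators 7 and 5 (octal)."""
    out = []
    for u in bits:
        b1, b2 = (state >> 1) & 1, state & 1
        out += [u ^ b1 ^ b2, u ^ b2]      # g1 = 111, g2 = 101
        state = (u << 1) | b1
    return out

def viterbi_decode(received, n_steps):
    """Hard-decision Viterbi decoding by add-compare-select over 4 states."""
    INF = 10 ** 9
    metrics = [0, INF, INF, INF]          # trellis starts in state 0
    paths = [[], [], [], []]
    for t in range(n_steps):
        r0, r1 = received[2 * t], received[2 * t + 1]
        new_metrics, new_paths = [INF] * 4, [None] * 4
        for s in range(4):
            if metrics[s] >= INF:
                continue
            b1, b2 = (s >> 1) & 1, s & 1
            for u in (0, 1):
                o0, o1 = u ^ b1 ^ b2, u ^ b2
                ns = (u << 1) | b1
                m = metrics[s] + (o0 != r0) + (o1 != r1)   # add
                if m < new_metrics[ns]:                    # compare-select
                    new_metrics[ns] = m
                    new_paths[ns] = paths[s] + [u]
        metrics, paths = new_metrics, new_paths
    best = min(range(4), key=metrics.__getitem__)
    return paths[best]

msg = [1, 0, 1, 1, 0, 0, 1]
code = conv_encode(msg + [0, 0])          # two flush bits terminate the trellis
code[4] ^= 1                              # inject a single channel error
decoded = viterbi_decode(code, len(msg) + 2)[:len(msg)]
```

Because this code has free distance 5, the survivor path still recovers the message after a single channel error.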
Code Parallelization with CAPO: A User Manual
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Frumkin, Michael; Yan, Jerry; Biegel, Bryan (Technical Monitor)
2001-01-01
A software tool has been developed to assist the parallelization of scientific codes. This tool, CAPO, extends an existing parallelization toolkit, CAPTools, developed at the University of Greenwich, to generate OpenMP parallel codes for shared memory architectures. This is an interactive toolkit that transforms a serial Fortran application code into an equivalent parallel version of the software in a small fraction of the time normally required for a manual parallelization. We first discuss the way in which loop types are categorized and how efficient OpenMP directives can be defined and inserted into the existing code using in-depth interprocedural analysis. The use of the toolkit on a number of application codes, ranging from benchmarks to real-world applications, is then presented. This demonstrates the great potential of using the toolkit to quickly parallelize serial programs, as well as the good performance achievable on a large number of processors. The second part of the document gives references for the parameters and the graphical user interface implemented in the toolkit. Finally, a set of tutorials is included for hands-on experience with this toolkit.
NASA Technical Reports Server (NTRS)
Wang, Yongli; Benson, Robert F.
2011-01-01
Two software applications have been produced specifically for the analysis of some million digital topside ionograms produced by a recent analog-to-digital conversion effort of selected analog telemetry tapes from the Alouette-2, ISIS-1 and ISIS-2 satellites. One, TOPIST (TOPside Ionogram Scaler with True-height algorithm) from the University of Massachusetts Lowell, is designed for the automatic identification of the topside-ionogram ionospheric-reflection traces and their inversion into vertical electron-density profiles Ne(h). TOPIST also has the capability of manual intervention. The other application, from the Goddard Space Flight Center and based on the FORTRAN code of John E. Jackson from the 1960s, is designed as an IDL-based interactive program for the scaling of selected digital topside-sounder ionograms. The Jackson code has also been modified, with some effort, so as to run on modern computers. This modification was motivated by the need to scale selected ionograms from the millions of Alouette/ISIS topside-sounder ionograms that exist only on 35-mm film. During this modification, it became evident that it would be more efficient to design a new code, based on the capabilities of present-day computers, than to continue to modify the old code. Such a new code has been produced, and here we describe its capabilities and compare Ne(h) profiles produced by it with those produced by the Jackson code. The concept of the new code is to assume an initial Ne(h) and derive a final Ne(h) through an iteration process that makes the resulting apparent-height profile fit the scaled values within a certain error range. The new code can be used on the X-, O-, and Z-mode traces. It does not assume any predefined profile shape between two contiguous points, like the exponential rule used in Jackson's program.
Instead, Monotone Piecewise Cubic Interpolation is applied to the global profile to preserve its monotone nature, which also ensures better smoothness in the final profile than in Jackson's program. The new code uses the complete refractive-index expression for a cold collisionless plasma and can accommodate the IGRF, T96, and other geomagnetic field models.
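The monotonicity-preserving step can be sketched generically. Below is a Fritsch-Butland-type monotone cubic Hermite interpolant, assumed similar in spirit to, but not necessarily identical with, the scheme in the new code; the node values are invented for illustration:

```python
from bisect import bisect_right

def monotone_cubic(x, y, queries):
    """Piecewise-cubic Hermite interpolation with harmonic-mean slopes,
    which preserves monotonicity of the data (Fritsch-Butland choice)."""
    n = len(x)
    d = [(y[i + 1] - y[i]) / (x[i + 1] - x[i]) for i in range(n - 1)]
    m = [0.0] * n
    m[0], m[-1] = d[0], d[-1]
    for i in range(1, n - 1):
        if d[i - 1] * d[i] > 0:                   # same sign: harmonic mean
            m[i] = 2.0 * d[i - 1] * d[i] / (d[i - 1] + d[i])
        # opposite signs or a flat segment: slope 0 avoids overshoot
    out = []
    for q in queries:
        i = min(max(bisect_right(x, q) - 1, 0), n - 2)
        h = x[i + 1] - x[i]
        t = (q - x[i]) / h
        out.append((1 + 2 * t) * (1 - t) ** 2 * y[i]
                   + t * (1 - t) ** 2 * h * m[i]
                   + t * t * (3 - 2 * t) * y[i + 1]
                   + t * t * (t - 1) * h * m[i + 1])
    return out

heights = [0.0, 1.0, 2.0, 3.0]            # illustrative node data
ne = [0.0, 1.0, 1.0, 2.0]                 # monotone, with a flat segment
fine = [i / 50.0 * 3.0 for i in range(51)]
vals = monotone_cubic(heights, ne, fine)
```

Unlike an unconstrained cubic spline, this interpolant cannot overshoot across the flat segment, which is the property the abstract highlights for the Ne(h) profiles.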
NASA Technical Reports Server (NTRS)
Likhanskii, Alexandre
2012-01-01
This report is the final report of an SBIR Phase I project. It is identical to the final report submitted, after some proprietary information of an administrative nature has been removed. The development of a numerical simulation tool for the dielectric barrier discharge (DBD) plasma actuator is reported. The objectives of the project were to analyze and predict DBD operation over a wide range of ambient gas pressures. The tool overcomes the limitations of traditional DBD codes, which are restricted to low-speed applications and have weak predictive capabilities, and allows DBD actuator analysis and prediction from the subsonic to the hypersonic flow regime. The simulation tool is based on the VORPAL code developed by Tech-X Corporation. VORPAL's capabilities of modeling a DBD plasma actuator at low pressures (0.1 to 10 torr) using a kinetic plasma modeling approach, and at moderate to atmospheric pressures (1 to 10 atm) using a hydrodynamic plasma modeling approach, were demonstrated. In addition, results of experiments with a pulsed+bias DBD configuration that were performed for validation purposes are reported.
Perception of "no code" and the role of the nurse.
Honan, S; Helseth, C C; Bakke, J; Karpiuk, K; Krsnak, G; Torkelson, R
1991-01-01
CPR is now the rule rather than the exception and death is often viewed as the ultimate failure in modern medicine, rather than the final event of the natural life process (Stevens, 1986). The "No Code" concept has created a major dilemma in health care. An interagency collaborative study was conducted to ascertain the perceptions of nurses, physicians, and laypersons about this issue. This article deals primarily with the nurse's role and perceptions of the "No Code" issue. The comparison of nurses' perceptions with those of physicians and laypersons is unique to this study. Based on this research, suggestions are presented that will assist nursing educators and health care professionals in managing this complex dilemma.
An overview of data acquisition, signal coding and data analysis techniques for MST radars
NASA Technical Reports Server (NTRS)
Rastogi, P. K.
1986-01-01
An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high-power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, which are characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing are discussed, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques.
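As a small illustration of why phase codes are attractive for radar pulse compression, consider the aperiodic autocorrelation of the 13-element Barker code. This is a standard textbook example, not necessarily one of the specific codes the review covers:

```python
def aperiodic_autocorrelation(code):
    """Aperiodic autocorrelation of a +1/-1 phase code, one value per lag."""
    n = len(code)
    return [sum(code[i] * code[i + lag] for i in range(n - lag))
            for lag in range(n)]

barker13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]
acf = aperiodic_autocorrelation(barker13)
# acf[0] is the compressed-pulse peak (13); every other lag has magnitude <= 1
```

The large peak-to-sidelobe ratio is what lets a coded long pulse deliver the range resolution of a short pulse while keeping the average transmitted power high.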
Automated error correction in IBM quantum computer and explicit generalization
NASA Astrophysics Data System (ADS)
Ghosh, Debjit; Agarwal, Pratik; Pandey, Pratyush; Behera, Bikash K.; Panigrahi, Prasanta K.
2018-06-01
Construction of a fault-tolerant quantum computer remains a challenging problem due to unavoidable noise and fragile quantum states. However, this goal can be achieved by introducing quantum error-correcting codes. Here, we experimentally realize an automated error correction code and demonstrate the nondestructive discrimination of GHZ states in the IBM 5-qubit quantum computer. After performing quantum state tomography, we obtain the experimental results with a high fidelity. Finally, we generalize the investigated code to the maximally entangled n-qudit case, which can both detect and automatically correct any arbitrary phase-change error, any phase-flip error, any bit-flip error, or a combination of these errors.
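The simplest classical analogue of such error correction is the three-bit repetition code with syndrome decoding, sketched below. This is a textbook illustration of bit-flip correction, not the GHZ-state circuit realized in the paper:

```python
def encode(bit):
    """Protect one logical bit by triplicating it."""
    return [bit, bit, bit]

def correct(word):
    """Measure two parity syndromes and flip the single faulty bit, if any."""
    s1 = word[0] ^ word[1]            # compares bits 0 and 1
    s2 = word[1] ^ word[2]            # compares bits 1 and 2
    error_at = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[(s1, s2)]
    fixed = list(word)
    if error_at is not None:
        fixed[error_at] ^= 1
    return fixed

def decode(word):
    """Recover the logical bit by majority vote."""
    return int(sum(word) >= 2)

word = encode(1)
word[2] ^= 1                          # a single bit-flip error
recovered = decode(correct(word))
```

The quantum versions replace these parity checks with ancilla-assisted stabilizer measurements, which locate the error without destroying the encoded superposition.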
NARMER-1: a photon point-kernel code with build-up factors
NASA Astrophysics Data System (ADS)
Visonneau, Thierry; Pangault, Laurence; Malouch, Fadhel; Malvagi, Fausto; Dolci, Florence
2017-09-01
This paper presents an overview of NARMER-1, the new generation of photon point-kernel code developed by the Reactor Studies and Applied Mathematics Unit (SERMA) at the CEA Saclay Center. After a short introduction giving some historical background and the current context of development of the code, the paper describes the principles implemented in the calculation and the physical quantities computed, and surveys the generic features: programming language, computer platforms, geometry package, source description, etc. Specific and recent features are also detailed: exclusion sphere, tetrahedral meshes, parallel operation. Some points about verification and validation are then presented. Finally, we present some tools that can help the user with operations like visualization and pre-treatment.
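The point-kernel principle itself is compact enough to sketch: the uncollided flux from an isotropic point source is attenuated exponentially and spread over a sphere, then multiplied by a build-up factor to account for scattered photons. The linear build-up model below is deliberately simple and illustrative; NARMER-1's actual build-up treatment is not reproduced here:

```python
import math

def point_kernel_flux(S, mu, r, a=1.0):
    """Photon flux at distance r (cm) from an isotropic point source of
    strength S (photons/s), attenuated through mu*r mean free paths and
    multiplied by a simple linear build-up factor B = 1 + a*mu*r."""
    mfp = mu * r
    buildup = 1.0 + a * mfp            # illustrative build-up model
    return S * buildup * math.exp(-mfp) / (4.0 * math.pi * r * r)

# at one mean free path, this linear model doubles the uncollided flux
flux = point_kernel_flux(S=1.0e6, mu=0.1, r=10.0)
uncollided = point_kernel_flux(S=1.0e6, mu=0.1, r=10.0, a=0.0)
```

Production codes tabulate build-up factors against material and mean-free-path depth rather than using a single linear formula, but the kernel structure is the same.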
NASA Astrophysics Data System (ADS)
Vu, Thang X.; Duhamel, Pierre; Chatzinotas, Symeon; Ottersten, Bjorn
2017-12-01
This work studies the performance of a cooperative network which consists of two channel-coded sources, multiple relays, and one destination. To achieve high spectral efficiency, we assume that a single time slot is dedicated to relaying. Conventional network-coded-based cooperation (NCC) selects the best relay, which uses network coding to serve the two sources simultaneously. The bit error rate (BER) performance of NCC with channel coding, however, is still unknown. In this paper, we first study the BER of NCC via a closed-form expression and analytically show that NCC only achieves diversity of order two, regardless of the number of available relays and the channel code. Second, we propose a novel partial relaying-based cooperation (PARC) scheme to improve the system diversity in the finite signal-to-noise ratio (SNR) regime. In particular, closed-form expressions for the system BER and diversity order of PARC are derived as a function of the operating SNR value and the minimum distance of the channel code. We analytically show that the proposed PARC achieves full (instantaneous) diversity order in the finite SNR regime, given that an appropriate channel code is used. Finally, numerical results verify our analysis and demonstrate a large SNR gain of PARC over NCC in the SNR region of interest.
2006-09-22
This final rule adopts the substance of the April 15, 2004 tentative interim amendment (TIA) 00-1 (101), Alcohol Based Hand Rub Solutions, an amendment to the 2000 edition of the Life Safety Code, published by the National Fire Protection Association (NFPA). This amendment allows certain health care facilities to place alcohol-based hand rub dispensers in egress corridors under specified conditions. This final rule also requires that nursing facilities at least install battery-operated single station smoke alarms in resident rooms and common areas if they are not fully sprinklered or they do not have system-based smoke detectors in those areas. Finally, this final rule confirms as final the provisions of the March 25, 2005 interim final rule with changes and responds to public comments on that rule.
GAME: GAlaxy Machine learning for Emission lines
NASA Astrophysics Data System (ADS)
Ucci, G.; Ferrara, A.; Pallottini, A.; Gallerani, S.
2018-06-01
We present an updated, optimized version of GAME (GAlaxy Machine learning for Emission lines), a code designed to infer key interstellar medium physical properties from emission line intensities of ultraviolet/optical/far-infrared galaxy spectra. The improvements concern (a) an enlarged spectral library including Pop III stars, (b) the inclusion of spectral noise in the training procedure, and (c) an accurate evaluation of uncertainties. We extensively validate the optimized code and compare its performance against empirical methods and other available emission line codes (PYQZ and HII-CHI-MISTRY) on a sample of 62 SDSS stacked galaxy spectra and 75 observed HII regions. Very good agreement is found for metallicity. However, ionization parameters derived by GAME tend to be higher. We show that this is due to the use of too-limited libraries in the other codes. The main advantages of GAME are the simultaneous use of all the measured spectral lines and the extremely short computational times. We finally discuss the code's potential and limitations.
A new code for modelling the near field diffusion releases from the final disposal of nuclear waste
NASA Astrophysics Data System (ADS)
Vopálka, D.; Vokál, A.
2003-01-01
The canisters with spent nuclear fuel produced during the operation of the WWER reactors at the Czech power plants are planned, as in other countries, to be disposed of in an underground repository. The canisters will be surrounded by compacted bentonite that will retard the migration of safety-relevant radionuclides into the host rock. A new code was developed that enables modelling of the transport of the critical radionuclides from the canister through the bentonite layer in cylindrical geometry. The code solves the diffusion equation for various types of initial and boundary conditions by means of the finite difference method and can take into account the non-linear shape of the sorption isotherm. A comparison of the code reported here with the code PAGODA, which is based on an analytical solution of the transport equation, was made for the 4N+3 actinide chain, which includes 239Pu. A simple parametric study of the releases of 239Pu, 129I, and 14C into the geosphere is discussed.
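A minimal sketch of the finite-difference approach in cylindrical geometry follows. It assumes no sorption and illustrative parameter values; the actual code additionally handles non-linear sorption isotherms, decay, and radionuclide chains:

```python
def diffuse_cylindrical(D, r_in, r_out, n, dt, steps, c_in=1.0):
    """Explicit finite differences for dC/dt = D*(1/r)*d/dr(r*dC/dr)
    with C = c_in held at the canister wall r_in and C = 0 at the rock r_out."""
    dr = (r_out - r_in) / (n - 1)
    assert D * dt / dr ** 2 <= 0.5, "explicit scheme stability limit"
    r = [r_in + i * dr for i in range(n)]
    c = [0.0] * n
    c[0] = c_in
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            lap = ((c[i + 1] - 2 * c[i] + c[i - 1]) / dr ** 2
                   + (c[i + 1] - c[i - 1]) / (2 * r[i] * dr))
            new[i] = c[i] + D * dt * lap
        c = new
    return r, c

# run long enough to approach the steady profile ln(r_out/r)/ln(r_out/r_in)
r, c = diffuse_cylindrical(D=1e-2, r_in=0.5, r_out=1.0, n=21,
                           dt=0.015, steps=4000)
```

The cylindrical correction term (1/r)*dC/dr is what distinguishes this from a slab calculation; at steady state the profile is logarithmic in r rather than linear.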
Establishment of a New Drug Code for Marihuana Extract. Final rule.
2016-12-14
The Drug Enforcement Administration is creating a new Administration Controlled Substances Code Number for "Marihuana Extract." This code number will allow DEA and DEA-registered entities to track quantities of this material separately from quantities of marihuana. This, in turn, will aid in complying with relevant treaty provisions. Under international drug control treaties administered by the United Nations, some differences exist between the regulatory controls pertaining to marihuana extract versus those for marihuana and tetrahydrocannabinols. The DEA has previously established separate code numbers for marihuana and for tetrahydrocannabinols, but not for marihuana extract. To better track these materials and comply with treaty provisions, DEA is creating a separate code number for marihuana extract with the following definition: "Meaning an extract containing one or more cannabinoids that has been derived from any plant of the genus Cannabis, other than the separated resin (whether crude or purified) obtained from the plant." Extracts of marihuana will continue to be treated as Schedule I controlled substances.
Turbulence dissipation challenge: particle-in-cell simulations
NASA Astrophysics Data System (ADS)
Roytershteyn, V.; Karimabadi, H.; Omelchenko, Y.; Germaschewski, K.
2015-12-01
We discuss the application of three particle-in-cell (PIC) codes to problems relevant to the turbulence dissipation challenge. VPIC is a fully kinetic code extensively used to study a variety of diverse problems ranging from laboratory plasmas to astrophysics. PSC is a flexible fully kinetic code offering a variety of algorithms that can be advantageous for turbulence simulations, including high-order particle shapes, dynamic load balancing, and the ability to run efficiently on Graphics Processing Units (GPUs). Finally, HYPERS is a novel hybrid (kinetic ions + fluid electrons) code, which utilizes asynchronous time advance and a number of other advanced algorithms. We present examples drawn both from large-scale turbulence simulations and from the test problems outlined by the turbulence dissipation challenge. Special attention is paid to such issues as the small-scale intermittency of inertial-range turbulence, the mode content of the sub-proton range of scales, the formation of electron-scale current sheets and the role of magnetic reconnection, as well as the numerical challenges of applying PIC codes to simulations of astrophysical turbulence.
Feature reconstruction of LFP signals based on PLSR in the neural information decoding study.
Yonghui Dong; Zhigang Shang; Mengmeng Li; Xinyu Liu; Hong Wan
2017-07-01
To address the problems of low Signal-to-Noise Ratio (SNR) and multicollinearity when Local Field Potential (LFP) signals are used for decoding animal motion intention, a feature-reconstruction method for LFP signals based on partial least squares regression (PLSR) is proposed in this paper. First, the feature information of the LFP coding band is extracted based on the wavelet transform. Then the PLSR model is constructed from the extracted LFP coding features. According to the multicollinearity characteristics among the coding features, several latent variables that contribute greatly to the steering behavior are obtained, and new LFP coding features are reconstructed from them. Finally, the K-Nearest Neighbor (KNN) method is used to classify the reconstructed coding features to verify the decoding performance. The results show that the proposed method achieves the highest accuracy of the four methods compared, and its decoding performance is robust.
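The final classification step can be sketched generically. Below is a plain k-nearest-neighbour vote on toy two-dimensional features; the paper's reconstructed PLSR features, class labels, and parameters are not reproduced, so everything here is illustrative:

```python
from collections import Counter

def knn_predict(train_x, train_y, query, k=3):
    """Label `query` by majority vote among its k nearest training samples
    (squared Euclidean distance in feature space)."""
    order = sorted(range(len(train_x)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(train_x[i], query)))
    votes = Counter(train_y[i] for i in order[:k])
    return votes.most_common(1)[0][0]

features = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),   # one behaviour cluster
            (1.0, 1.1), (1.2, 0.9), (0.9, 1.0)]   # the other cluster
labels = ["left", "left", "left", "right", "right", "right"]
pred = knn_predict(features, labels, (0.15, 0.1))
```

KNN makes no distributional assumptions, which is one reason it pairs well with features already decorrelated by a latent-variable method such as PLSR.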
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santucci, P.; Guetat, P.
1993-12-31
This document describes the code CERISE (Code d'Evaluations Radiologiques Individuelles pour des Situations en Entreprise et dans l'Environnement). This code has been developed in the frame of European studies to establish acceptance criteria for very low-level radioactive waste and materials. The code is written in Fortran and runs on PC. It calculates doses received through the different pathways: external exposure, ingestion, inhalation and skin contamination. Twenty basic scenarios, determined from previous studies, are already elaborated. Calculations establish the relation between surface, specific and/or total activities, and doses. Results can be expressed as doses for an average activity unit, or as average activity limits for a set of reference doses (defined for each scenario analyzed). In this last case, the minimal activity values and the corresponding limiting scenarios are selected and summarized in a final table.
HITEMP Material and Structural Optimization Technology Transfer
NASA Technical Reports Server (NTRS)
Collier, Craig S.; Arnold, Steve (Technical Monitor)
2001-01-01
The feasibility of adding viscoelasticity and the Generalized Method of Cells (GMC) for micromechanical viscoelastic behavior to the commercial HyperSizer structural analysis and optimization code was investigated. The viscoelasticity methodology was developed in four steps. First, a simplified algorithm was devised to test the iterative time-stepping method for simple one-dimensional multiple-ply structures. Second, the GMC code was made into a callable subroutine and incorporated into the one-dimensional code to test its accuracy and usability. Third, the viscoelastic time-stepping and iterative scheme was incorporated into HyperSizer for homogeneous, isotropic viscoelastic materials. Finally, the GMC was included in a version of HyperSizer. MS Windows executable files implementing each of these steps are delivered with this report, as well as the source code. The findings of this research are that both viscoelasticity and GMC are feasible and valuable additions to HyperSizer, and that the door is open for more advanced nonlinear capability, such as viscoplasticity.
1980-12-01
Final report (...Models, by Kenneth K. Kuo and Mridul Kumar, Pennsylvania State University) for a research program conducted by Systems Associates and Pennsylvania State University for the Research Department; distribution included White Oak Laboratory, Silver Spring (Code 240, Sigmund Jacobs; G. B. Wilmot) and the Naval Underwater Systems Center, Newport (Code 5B331).
User's guide to resin infusion simulation program in the FORTRAN language
NASA Technical Reports Server (NTRS)
Weideman, Mark H.; Hammond, Vince H.; Loos, Alfred C.
1992-01-01
RTMCL is a user-friendly computer code that simulates the manufacture of fabric composites by the resin infusion process. The computer code is based on the process simulation model described in reference 1. Included in the user's guide is a detailed step-by-step description of how to run the program and how to enter and modify the input data set. Sample input and output files are included, along with an explanation of the results. Finally, a complete listing of the program is provided.
Self-adaptive multimethod optimization applied to a tailored heating forging process
NASA Astrophysics Data System (ADS)
Baldan, M.; Steinberg, T.; Baake, E.
2018-05-01
The presented paper describes an innovative self-adaptive multi-objective optimization code. The investigation aims to prove the superiority of this code over NSGA-II and to apply it to the design of an inductor for a “tailored” heating forging application. The choice of the frequency and the heating time is followed by the determination of the number of turns and their positions. Finally, a straightforward optimization is performed in order to minimize energy consumption using “optimal control”.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, John H.; Belcourt, Kenneth Noel
Completion of the CASL L3 milestone THM.CFD.P6.03 provides a tabular material properties capability to the Hydra code. A tabular interpolation package used in Sandia codes was modified to support the needs of multi-phase solvers in Hydra, and use of the interface is described. The package was released to Hydra under a government use license. A dummy physics module was created in Hydra to prototype use of the interpolation routines. Finally, a test using the dummy physics module verifies the correct behavior of the interpolation for a test water table.
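The core of any such tabular property lookup is an interpolation like the bilinear sketch below. This is illustrative only: the package delivered to Hydra is not reproduced, and the "water table" values here are a synthetic linear function chosen so the result can be checked exactly:

```python
from bisect import bisect_right

def bilinear(xs, ys, table, x, y):
    """Bilinearly interpolate table[i][j] defined on the grid xs x ys."""
    i = min(max(bisect_right(xs, x) - 1, 0), len(xs) - 2)
    j = min(max(bisect_right(ys, y) - 1, 0), len(ys) - 2)
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    return ((1 - tx) * (1 - ty) * table[i][j]
            + tx * (1 - ty) * table[i + 1][j]
            + (1 - tx) * ty * table[i][j + 1]
            + tx * ty * table[i + 1][j + 1])

# synthetic property f(T, p) = 2*T + 3*p is reproduced exactly by
# bilinear interpolation, which makes it a convenient verification case
temps, pressures = [300.0, 350.0, 400.0], [1.0, 2.0]
table = [[2 * t + 3 * p for p in pressures] for t in temps]
value = bilinear(temps, pressures, table, 325.0, 1.5)
```

Verifying an interpolation package against functions it must reproduce exactly is the same pattern as the dummy-physics water-table test described in the abstract.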
Multigrid solution of internal flows using unstructured solution adaptive meshes
NASA Technical Reports Server (NTRS)
Smith, Wayne A.; Blake, Kenneth R.
1992-01-01
This is the final report of the NASA Lewis SBIR Phase 2 Contract Number NAS3-25785, Multigrid Solution of Internal Flows Using Unstructured Solution Adaptive Meshes. The objective of this project, as described in the Statement of Work, is to develop and deliver to NASA a general three-dimensional Navier-Stokes code using unstructured solution-adaptive meshes for accuracy and multigrid techniques for convergence acceleration. The code will primarily be applied, but not necessarily limited, to high speed internal flows in turbomachinery.
Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye
2016-01-01
This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively. PMID:27271840
Olier, Ivan; Springate, David A; Ashcroft, Darren M; Doran, Tim; Reeves, David; Planner, Claire; Reilly, Siobhan; Kontopantelis, Evangelos
2016-01-01
The use of Electronic Health Records databases for medical research has become mainstream. In the UK, increasing use of Primary Care Databases is largely driven by almost complete computerisation and uniform standards within the National Health Service. Electronic Health Records research often begins with the development of a list of clinical codes with which to identify cases with a specific condition. We present a methodology and accompanying Stata and R commands (pcdsearch/Rpcdsearch) to help researchers in this task, using severe mental illness (SMI) as an example. We used the Clinical Practice Research Datalink, a UK Primary Care Database in which clinical information is largely organised using Read codes, a hierarchical clinical coding system. Pcdsearch is used to identify potentially relevant clinical codes and/or product codes from word-stubs and code-stubs suggested by clinicians. The returned code-lists are reviewed and codes relevant to the condition of interest are selected. The final code-list is then used to identify patients. We identified 270 Read codes linked to SMI and used them to identify cases in the database. We observed that our approach identified cases that would have been missed with a simpler approach using SMI registers defined within the UK Quality and Outcomes Framework. We describe a framework for researchers of Electronic Health Records databases for identifying patients with a particular condition or matching certain clinical criteria. The method is invariant to coding system or database and can be used with SNOMED CT, ICD, or other medical classification code-lists.
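The stub-matching step at the heart of such a tool can be sketched as follows. The function name, code identifiers, and descriptions below are invented for illustration and are not actual Read codes or the published pcdsearch implementation:

```python
def stub_search(code_list, code_stubs=(), term_stubs=()):
    """Return (code, description) pairs whose code starts with any code-stub
    or whose description contains any word-stub (case-insensitive)."""
    hits = []
    for code, desc in code_list:
        if (any(code.startswith(s) for s in code_stubs)
                or any(t.lower() in desc.lower() for t in term_stubs)):
            hits.append((code, desc))
    return hits

codes = [("E10..", "Schizophrenic disorders"),   # hypothetical entries
         ("E11..", "Affective psychoses"),
         ("H33..", "Asthma")]
candidates = stub_search(codes, code_stubs=("E1",),
                         term_stubs=("schizo", "psychos"))
```

In practice the returned candidate list would then be clinically reviewed, as the abstract describes, before the final code-list is applied to the database.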
On complexity of trellis structure of linear block codes
NASA Technical Reports Server (NTRS)
Lin, Shu
1990-01-01
The trellis structure of linear block codes (LBCs) is discussed. The state and branch complexities of a trellis diagram (TD) for an LBC are investigated. The TD with the minimum number of states is said to be minimal. The branch complexity of a minimal TD for an LBC is expressed in terms of the dimensions of specific subcodes of the given code. Upper and lower bounds are then derived on the number of states of a minimal TD for an LBC, and it is shown that a cyclic (or shortened cyclic) code is the worst in terms of state complexity among the LBCs of the same length and dimension. Furthermore, it is shown that the structural complexity of a minimal TD for an LBC depends on the order of its bit positions. This fact suggests that an appropriate permutation of the bit positions of a code may result in an equivalent code with a much simpler minimal TD. Boolean polynomial representation of the codewords of an LBC is also considered; this representation helps in the study of the trellis structure of the code. The Boolean polynomial representation of a code is applied to construct its minimal TD. In particular, the construction of minimal trellises for Reed-Muller codes and for the extended and permuted binary primitive BCH codes which contain Reed-Muller codes as subcodes is emphasized. Finally, the structural complexity of minimal trellises for the extended and permuted double-error-correcting BCH codes is analyzed and presented. It is shown that these codes have a relatively simple trellis structure and hence can be decoded with the Viterbi decoding algorithm.
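The notion of trellis state complexity can be made concrete for a toy code by grouping codeword prefixes that admit the same set of completions (the defining property of the minimal trellis). The brute-force sketch below, with an invented (4, 2) example code, is for illustration only and is obviously not how large BCH or Reed-Muller trellises are built:

```python
from itertools import product

def state_profile(gen):
    """Number of minimal-trellis states at each depth of a binary linear
    block code, found by grouping prefixes with identical suffix sets."""
    k, n = len(gen), len(gen[0])
    words = set()
    for coeffs in product((0, 1), repeat=k):
        words.add(tuple(sum(c * g for c, g in zip(coeffs, col)) % 2
                        for col in zip(*gen)))
    profile = []
    for depth in range(n + 1):
        suffix_sets = {}
        for w in words:
            suffix_sets.setdefault(w[:depth], set()).add(w[depth:])
        profile.append(len({frozenset(s) for s in suffix_sets.values()}))
    return profile

# a (4, 2) code built from two disjoint parity pairs
profile = state_profile([[1, 1, 0, 0], [0, 0, 1, 1]])
```

For this code the profile is [1, 2, 1, 2, 1]: the state count collapses back to one at mid-trellis, a small instance of how structure and bit ordering govern the state complexity that the paper bounds.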
Makwana, K. D.; Zhdankin, V.; Li, H.; ...
2015-04-10
We performed simulations of decaying magnetohydrodynamic (MHD) turbulence with a fluid and a kinetic code. The initial condition is an ensemble of long-wavelength, counter-propagating, shear-Alfvén waves, which interact and rapidly generate strong MHD turbulence. The total energy is conserved and the rate of turbulent energy decay is very similar in both codes, although the fluid code has numerical dissipation, whereas the kinetic code has kinetic dissipation. The inertial-range power spectrum index is similar in both codes. The fluid code shows a perpendicular wavenumber spectral slope of k⊥^(-1.3). The kinetic code shows a spectral slope of k⊥^(-1.5) for the smaller simulation domain, and k⊥^(-1.3) for the larger domain. We then estimate that collisionless damping mechanisms in the kinetic code can account for the dissipation of the observed nonlinear energy cascade. Current sheets are geometrically characterized. Their lengths and widths are in good agreement between the two codes. The length scales linearly with the driving scale of the turbulence. In the fluid code, their thickness is determined by the grid resolution, as there is no explicit diffusivity. In the kinetic code, their thickness is very close to the skin depth, irrespective of the grid resolution. Finally, this work shows that kinetic codes can reproduce the MHD inertial-range dynamics at large scales, while at the same time capturing important kinetic physics at small scales.
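The reported spectral slopes are the kind of quantity one extracts with a least-squares fit in log-log space. A minimal sketch on synthetic power-law data follows (the simulation spectra themselves are of course not reproduced here; the wavenumber range is arbitrary):

```python
import math

def loglog_slope(k, E):
    """Least-squares slope of log(E) versus log(k)."""
    lx = [math.log(v) for v in k]
    ly = [math.log(v) for v in E]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

k_perp = [2.0 ** i for i in range(1, 9)]
spectrum = [k ** -1.3 for k in k_perp]     # synthetic k^-1.3 inertial range
slope = loglog_slope(k_perp, spectrum)     # recovers -1.3
```

On real simulation spectra the fit would be restricted to the inertial range, away from the driving and dissipation scales, before quoting an index.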
Provisional Coding Practices: Are They Really a Waste of Time?
Krypuy, Matthew; McCormack, Lena
2006-11-01
In order to facilitate effective clinical coding and hence the precise financial reimbursement of acute services, in 2005 Western District Health Service (WDHS) (located in regional Victoria, Australia) undertook a provisional coding trial for inpatient medical episodes to determine the magnitude and accuracy of clinical documentation. Utilising clinical coding software installed on a laptop computer, provisional coding was undertaken for all current overnight inpatient episodes under each physician one day prior to attending their daily ward round. The provisionally coded episodes were re-coded upon the completion of the discharge summary and the final Diagnostic Related Group (DRG) allocation and weight were compared to the provisional DRG assignment. A total of 54 out of 220 inpatient medical episodes were provisionally coded. This represented approximately a 25% cross section of the population selected for observation. Approximately 67.6% of the provisionally allocated DRGs were accurate in contrast to 32.4% which were subject to change once the discharge summary was completed. The DRG changes were primarily due to: disease progression of a patient during their care episode which could not be identified by clinical coding staff due to discharge prior to the following scheduled ward round; the discharge destination of particular patients; and the accuracy of clinical documentation on the discharge summary. The information gathered from the provisional coding trial supported the hypothesis that clinical documentation standards were sufficient and adequate to support precise clinical coding and DRG assignment at WDHS. The trial further highlighted the importance of a complete and accurate discharge summary available during the coding process of acute inpatient episodes.
Context-aware and locality-constrained coding for image categorization.
Xiao, Wenhua; Wang, Bin; Liu, Yu; Bao, Weidong; Zhang, Maojun
2014-01-01
Improving the coding strategy for BOF (bag-of-features) based feature design has drawn increasing attention in recent image categorization work. However, ambiguity in the coding procedure still impedes further development. In this paper, we introduce a context-aware and locality-constrained coding (CALC) approach that uses context information to describe objects in a discriminative way. This is achieved by learning a word-to-word co-occurrence prior and imposing context information on locality-constrained coding. First, the local context of each category is evaluated by learning a word-to-word co-occurrence matrix representing the spatial distribution of local features in a neighborhood. Then, the learned co-occurrence matrix is used to measure the context distance between local features and code words. Finally, a coding strategy that simultaneously considers locality in feature space and in context space, while introducing feature weighting, is proposed. This novel coding strategy not only preserves semantic information in coding, but also alleviates the noise distortion of each class. Extensive experiments on several available datasets (Scene-15, Caltech-101, and Caltech-256) are conducted to validate the superiority of our algorithm in comparison with baselines and recently published methods. Experimental results show that our method significantly improves on the baselines and achieves performance comparable to, and even better than, the state of the art.
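The locality-constrained part of such a coder has a standard closed-form solution: shift the codebook to the feature, regularize the covariance by a distance-weighted diagonal, solve a linear system, and normalize to enforce the sum-to-one constraint. The sketch below follows that standard locality-constrained linear coding recipe only; CALC's context-distance term is omitted, and the codebook, λ value, and locality adaptor are illustrative assumptions:

```python
import numpy as np

def llc_code(x, B, lam=1e-4):
    """Locality-constrained code c for feature x over codebook B (K x D):
    minimizes ||x - c @ B||^2 + lam * ||d * c||^2 subject to sum(c) == 1,
    where d grows with the distance from x to each codeword."""
    d = np.linalg.norm(B - x, axis=1)
    d = np.exp(d / d.max())                # locality adaptor (assumed form)
    z = B - x                              # shift codebook to the feature
    C = z @ z.T + lam * np.diag(d ** 2)    # regularized data covariance
    c = np.linalg.solve(C, np.ones(len(B)))
    return c / c.sum()                     # enforce the shift-invariance constraint
```

When the feature coincides with a codeword, the code concentrates on that codeword, which is the "locality" behavior the abstract relies on.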
10 CFR 1014.10 - Action on approved claims.
Code of Federal Regulations, 2010 CFR
2010-01-01
... against the United States and against any employee of the Government whose act or omission gave rise to... 2677 of title 28, United States Code, that acceptance shall be final and conclusive on the claimant...
76 FR 42160 - Commercial Space Transportation Advisory Committee-Public Teleconference
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-18
... this teleconference consists of the following topics: Final discussion of the CONOPS report on reentry... the European Code of Conduct and the Long Term Sustainability of Space effort by the United Nations...
Lesson 1: Overview of the Final Rule
Cross-Media Electronic Reporting Regulation (CROMERR) 101: Fundamentals for States, Tribes, and Local Governments is designed for states, tribes, and local governments that administer EPA-authorized programs under Title 40 of the Code of Federal Regulations.
Coding visual features extracted from video sequences.
Baroffio, Luca; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano
2014-05-01
Visual features are successfully exploited in several applications (e.g., visual search, object recognition and tracking, etc.) due to their ability to efficiently represent image content. Several visual analysis tasks require features to be transmitted over a bandwidth-limited network, thus calling for coding techniques to reduce the required bit budget, while attaining a target level of efficiency. In this paper, we propose, for the first time, a coding architecture designed for local features (e.g., SIFT, SURF) extracted from video sequences. To achieve high coding efficiency, we exploit both spatial and temporal redundancy by means of intraframe and interframe coding modes. In addition, we propose a coding mode decision based on rate-distortion optimization. The proposed coding scheme can be conveniently adopted to implement the analyze-then-compress (ATC) paradigm in the context of visual sensor networks. That is, sets of visual features are extracted from video frames, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast to the traditional compress-then-analyze (CTA) paradigm, in which video sequences acquired at a node are compressed and then sent to a central unit for further processing. In this paper, we compare these coding paradigms using metrics that are routinely adopted to evaluate the suitability of visual features in the context of content-based retrieval, object recognition, and tracking. Experimental results demonstrate that, thanks to the significant coding gains achieved by the proposed coding scheme, ATC outperforms CTA with respect to all evaluation metrics.
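The rate-distortion mode decision mentioned above typically minimizes a Lagrangian cost J = D + λR over the candidate coding modes. A minimal sketch (the distortion and rate numbers are invented for illustration, not the paper's measurements):

```python
def choose_mode(modes, lam):
    """Return the coding mode minimizing the Lagrangian cost J = D + lam * R,
    where modes maps a mode name to a (distortion, rate_in_bits) pair."""
    return min(modes, key=lambda m: modes[m][0] + lam * modes[m][1])

# Hypothetical costs for one descriptor: intra coding has low distortion but a
# high rate; inter (predictive) coding trades some distortion for fewer bits.
modes = {"intra": (4.0, 100.0), "inter": (6.0, 40.0)}
```

At a small λ (bits are cheap) the intra mode wins; at a large λ the inter mode wins, which is exactly the trade-off the mode decision arbitrates per feature.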
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenthal, Andrew
The DOE grant, “An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group,” to New Mexico State University created the Solar America Board for Codes and Standards (Solar ABCs). From 2007 to 2013, with funding from this grant, Solar ABCs identified current issues, established a dialogue among key stakeholders, and catalyzed appropriate activities to support the development of codes and standards that facilitated the installation of high-quality, safe photovoltaic systems. Solar ABCs brought the following resources to the PV stakeholder community: formal coordination in the planning or revision of interrelated codes and standards, removing “stovepipes” that have only roofing experts working on roofing codes, PV experts on PV codes, fire enforcement experts working on fire codes, etc.; a conduit through which all interested stakeholders were able to see the steps being taken in the development or modification of codes and standards and participate directly in the processes; a central clearinghouse for new documents, standards, proposed standards, analytical studies, and recommendations of best practices available to the PV community; a forum of experts that invites and welcomes all interested parties into the process of performing studies, evaluating results, and building consensus on standards and code-related topics that affect all aspects of the market; and a biennial gap analysis to formally survey the PV community to identify needs that are unmet and inhibiting the market and necessary technical developments.
Caillet, P; Oberlin, P; Monnet, E; Guillon-Grammatico, L; Métral, P; Belhassen, M; Denier, P; Banaei-Bouchareb, L; Viprey, M; Biau, D; Schott, A-M
2017-10-01
Osteoporotic hip fractures (OHF) are associated with significant morbidity and mortality. The French medico-administrative database (SNIIRAM) offers an interesting opportunity to improve the management of OHF. However, the validity of studies conducted with this database relies heavily on the quality of the algorithm used to detect OHF. The aim of the REDSIAM network is to facilitate the use of the SNIIRAM database. The main objective of this study was to present and discuss several OHF-detection algorithms that could be used with this database. A non-systematic literature search was performed. The Medline database was explored for the period January 2005-August 2016. A snowball search was then carried out from the articles included, and field experts were contacted. The extraction was conducted using the chart developed by the REDSIAM network's "Methodology" task force. The ICD-10 codes used to detect OHF are mainly S72.0, S72.1, and S72.2. The performance of these algorithms is at best partially validated. Complementary use of medical and surgical procedure codes would affect their performance. Finally, few studies described how they dealt with fractures of non-osteoporotic origin, re-hospitalization, and potential contralateral fracture cases. Authors in the literature encourage the use of ICD-10 codes S72.0 to S72.2 to develop algorithms for OHF detection. These are the codes most frequently used for OHF in France. Depending on the study objectives, other ICD-10 codes and medical and surgical procedure codes could usefully be discussed for inclusion in the algorithm. Detection and management of duplicates and non-osteoporotic fractures should be considered in the process. Finally, when a study is based on such an algorithm, all these points should be precisely described in the publication.
Is phonology bypassed in normal or dyslexic development?
Pennington, B F; Lefly, D L; Van Orden, G C; Bookman, M O; Smith, S D
1987-01-01
A pervasive assumption in most accounts of normal reading and spelling development is that phonological coding is important early in development but is subsequently superseded by faster, orthographic coding which bypasses phonology. We call this assumption, which derives from dual process theory, the developmental bypass hypothesis. The present study tests four specific predictions of the developmental bypass hypothesis by comparing dyslexics and nondyslexics from the same families in a cross-sectional design. The four predictions are: 1) that phonological coding skill develops early in normal readers and soon reaches asymptote, whereas orthographic coding skill has a protracted course of development; 2) that the correlation of adult reading or spelling performance with phonological coding skill is considerably less than the correlation with orthographic coding skill; 3) that dyslexics who are mainly deficient in phonological coding skill should be able to bypass this deficit and eventually close the gap in reading and spelling performance; and 4) that the greatest differences between dyslexics and developmental controls on measures of phonological coding skill should be observed early rather than late in development. None of the four predictions of the developmental bypass hypothesis were upheld. Phonological coding skill continued to develop in nondyslexics until adulthood. It accounted for a substantial portion (32-53 percent) of the variance in reading and spelling performance in adult nondyslexics, whereas orthographic coding skill did not account for a statistically reliable portion of this variance. The dyslexics differed little across age in phonological coding skill, but made linear progress in orthographic coding skill, surpassing spelling-age (SA) controls by adulthood. Nonetheless, they did not close the gap in reading and spelling performance.
Finally, dyslexics were significantly worse than SA (and Reading Age [RA]) controls in phonological coding skill only in adulthood.
NASA Technical Reports Server (NTRS)
Sandlin, Doral R.; Bauer, Brent Alan
1993-01-01
This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. This paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential between different configurations. In addition, one study illustrates the module's ability to optimize a configuration's agility performance.
Direction-selective circuits shape noise to ensure a precise population code
Zylberberg, Joel; Cafaro, Jon; Turner, Maxwell H
2016-01-01
Neural responses are noisy, and circuit structure can correlate this noise across neurons. Theoretical studies show that noise correlations can have diverse effects on population coding, but these studies rarely explore stimulus dependence of noise correlations. Here, we show that noise correlations in responses of ON-OFF direction-selective retinal ganglion cells are strongly stimulus dependent, and we uncover the circuit mechanisms producing this stimulus dependence. A population model based on these mechanistic studies shows that stimulus-dependent noise correlations improve the encoding of motion direction two-fold compared to independent noise. This work demonstrates a mechanism by which a neural circuit effectively shapes its signal and noise in concert, minimizing corruption of signal by noise. Finally, we generalize our findings beyond direction coding in the retina and show that stimulus-dependent correlations will generally enhance information coding in populations of diversely tuned neurons. PMID:26796691
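The population-coding claim above can be illustrated with the standard linear Fisher information I = f'(s)ᵀ C⁻¹ f'(s) for Gaussian noise: for a pair of oppositely tuned neurons, positive noise correlation increases the information. A minimal sketch (the tuning slopes and correlation values are illustrative assumptions, not the paper's data):

```python
import numpy as np

def linear_fisher(df, cov):
    """Linear Fisher information df^T C^-1 df for Gaussian population noise."""
    return float(df @ np.linalg.solve(cov, df))

df = np.array([1.0, -1.0])             # oppositely tuned pair: slopes of f(s)

def cov(rho):
    """Unit-variance noise covariance with correlation rho."""
    return np.array([[1.0, rho], [rho, 1.0]])
```

For this pair the information works out to I = 2/(1 - ρ), so ρ = 0.6 gives 5.0 versus 2.0 for independent noise: correlations aligned against the signal direction help rather than hurt.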
Surface Modeling and Grid Generation of Orbital Sciences X34 Vehicle. Phase 1
NASA Technical Reports Server (NTRS)
Alter, Stephen J.
1997-01-01
The surface modeling and grid generation requirements, motivations, and methods used to develop Computational Fluid Dynamic volume grids for the X34-Phase 1 are presented. The requirements set forth by the Aerothermodynamics Branch at the NASA Langley Research Center serve as the basis for the final techniques used in the construction of all volume grids, including grids for parametric studies of the X34. The Integrated Computer Engineering and Manufacturing code for Computational Fluid Dynamics (ICEM/CFD), the Grid Generation code (GRIDGEN), the Three-Dimensional Multi-block Advanced Grid Generation System (3DMAGGS) code, and Volume Grid Manipulator (VGM) code are used to enable the necessary surface modeling, surface grid generation, volume grid generation, and grid alterations, respectively. All volume grids generated for the X34, as outlined in this paper, were used for CFD simulations within the Aerothermodynamics Branch.
Eisenbach, Markus; Larkin, Jeff; Lutjens, Justin; ...
2016-07-12
The Locally Self-consistent Multiple Scattering (LSMS) code solves the first-principles Density Functional Theory Kohn–Sham equation for a wide range of materials, with a special focus on metals, alloys, and metallic nanostructures. It has traditionally exhibited near-perfect scalability on massively parallel high-performance computer architectures. In this paper, we present our efforts to exploit GPUs to accelerate the LSMS code to enable first-principles calculations of O(100,000) atoms and statistical physics sampling of finite-temperature properties. We reimplement the scattering matrix calculation for GPUs with a block matrix inversion algorithm that only uses accelerator memory. Finally, using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility, we achieve a sustained performance of 14.5 PFlop/s and a speedup of 8.6 compared to the CPU-only code.
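Block matrix inversion of the kind mentioned above is usually built on the Schur complement identity for a 2x2 blocking. A minimal CPU sketch in NumPy (the actual LSMS kernel runs in GPU accelerator memory; the block size and test matrix here are illustrative assumptions):

```python
import numpy as np

def block_inverse(M, b):
    """Invert M via 2x2 blocking [[A, B], [C, D]] using the Schur complement
    S = D - C A^-1 B, so only block-sized inversions are ever performed."""
    A, B = M[:b, :b], M[:b, b:]
    C, D = M[b:, :b], M[b:, b:]
    Ainv = np.linalg.inv(A)
    S = D - C @ Ainv @ B                     # Schur complement of A in M
    Sinv = np.linalg.inv(S)
    TL = Ainv + Ainv @ B @ Sinv @ C @ Ainv   # top-left block of M^-1
    TR = -Ainv @ B @ Sinv
    BL = -Sinv @ C @ Ainv
    return np.block([[TL, TR], [BL, Sinv]])
```

Recursing on the A and S inversions gives the usual blocked algorithm, which maps well onto accelerators because each step is dense matrix-matrix work.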
An overview of new video coding tools under consideration for VP10: the successor to VP9
NASA Astrophysics Data System (ADS)
Mukherjee, Debargha; Su, Hui; Bankoski, James; Converse, Alex; Han, Jingning; Liu, Zoe; Xu, Yaowu
2015-09-01
Google started an open-source project, entitled the WebM Project, in 2010 to develop royalty-free video codecs for the web. The present-generation codec developed in the WebM Project, called VP9, was finalized in mid-2013 and is currently being served extensively by YouTube, resulting in billions of views per day. Even though adoption of VP9 outside Google is still in its infancy, the WebM Project has already embarked on an ambitious effort to develop a next-edition codec, VP10, that achieves at least a generational bitrate reduction over the current-generation codec VP9. Although the project is still in its early stages, a set of new experimental coding tools has already been added to baseline VP9, achieving modest coding gains over a sufficiently large test set. This paper provides a technical overview of these coding tools.
The Simple Video Coder: A free tool for efficiently coding social video data.
Barto, Daniel; Bird, Clark W; Hamilton, Derek A; Fink, Brandi C
2017-08-01
Videotaping of experimental sessions is a common practice across many disciplines of psychology, ranging from clinical therapy, to developmental science, to animal research. Audio-visual data are a rich source of information that can be easily recorded; however, analysis of the recordings presents a major obstacle to project completion. Coding behavior is time-consuming and often requires ad hoc training of a student coder. In addition, existing software is either prohibitively expensive or cumbersome, which leaves researchers with inadequate tools to quickly process video data. We offer the Simple Video Coder, free, open-source software for behavior coding that is flexible in accommodating different experimental designs, is intuitive for students to use, and produces outcome measures of event timing, frequency, and duration. Finally, the software also offers extraction tools to splice video into coded segments suitable for training future human coders or for use as input to pattern classification algorithms.
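Outcome measures like those above (frequency and total duration per behavior code) reduce to a simple aggregation over coded (label, start, end) events. A minimal sketch (the event rows and labels are invented for illustration; this is not the tool's actual output format):

```python
from collections import defaultdict

def summarize(events):
    """Frequency and total duration (seconds) per behavior code,
    given rows of (label, start_time, end_time)."""
    freq, dur = defaultdict(int), defaultdict(float)
    for label, start, end in events:
        freq[label] += 1
        dur[label] += end - start
    return dict(freq), dict(dur)

# Hypothetical coded session: two grooming bouts and one rear.
events = [("groom", 0.0, 2.5), ("rear", 3.0, 4.0), ("groom", 5.0, 6.0)]
```

The same table of events also drives the splicing step: each (start, end) pair delimits a video segment to extract for coder training.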
An approach for coupled-code multiphysics core simulations from a common input
Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; ...
2014-12-10
This study describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and set up the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak Ridge National Laboratory using 1156 cores, and a synopsis of the solution results and code performance is presented. Finally, ongoing development of this approach is also briefly described.
44 CFR 11.19 - Action on approved claim.
Code of Federal Regulations, 2010 CFR
2010-10-01
... States and against any employee of the Government whose act or omission gave rise to the claim, by reason... made under section 2672 or 2677 of title 28, United States Code, is final and conclusive on the...
44 CFR 11.19 - Action on approved claim.
Code of Federal Regulations, 2011 CFR
2011-10-01
... States and against any employee of the Government whose act or omission gave rise to the claim, by reason... made under section 2672 or 2677 of title 28, United States Code, is final and conclusive on the...
Special Consolidated Checklists for Organic Air Emission Standards
This checklist consolidates changes made to the Federal code by the December 6, 1994 final rule regarding Subpart CC standards [(59 FR 62896); Revision Checklist 154] and subsequent revisions which have occurred through December 31, 2002.
77 FR 4059 - Meeting of the California Desert District Advisory Council
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-26
... California Desert District manager, five field office managers, and council subgroups. Final agenda items... A. Raml, California Desert District Manager. [FR Doc. 2012-1630 Filed 1-25-12; 8:45 am] BILLING CODE...
Criteria for a catastrophically disabled determination for purposes of enrollment. Final rule.
2013-12-03
The Department of Veterans Affairs (VA) is amending its regulation concerning the manner in which VA determines that a veteran is catastrophically disabled for purposes of enrollment in priority group 4 for VA health care. As amended by this rulemaking, the regulation articulates the clinical criteria that identify an individual as catastrophically disabled, instead of using the corresponding International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) and Current Procedural Terminology (CPT®) codes. The revisions ensure that the regulation is not out of date when new versions of those codes are published. The revisions also broaden some of the descriptions for a finding of catastrophic disability. Additionally, the final rule does not rely on the Folstein Mini Mental State Examination (MMSE) as a criterion for determining whether a veteran meets the definition of catastrophically disabled, because we have determined that the MMSE is no longer a necessary clinical assessment tool.
Wolf-Rayet stars, black holes and the first detected gravitational wave source
NASA Astrophysics Data System (ADS)
Bogomazov, A. I.; Cherepashchuk, A. M.; Lipunov, V. M.; Tutukov, A. V.
2018-01-01
The recently discovered burst of gravitational waves GW150914 provides a good new opportunity to verify the current view of the evolution of close binary stars. Modern population synthesis codes help to study this evolution from two main-sequence stars up to the formation of the two final remnants: degenerate dwarfs, neutron stars, or black holes (Masevich and Tutukov, 1988). To study the evolution of the GW150914 progenitor we use the "Scenario Machine" code presented by Lipunov et al. (1996). The scenario modeling conducted in this study allowed us to describe the evolution of systems whose final stage is a massive BH+BH merger. We find that the initial mass of the primary component can be 100-140 M⊙ and the initial separation of the components can be 50-350 R⊙. Our calculations show the plausibility of modern evolutionary scenarios for binary stars and of the population synthesis modeling based on them.
Architecture for time or transform domain decoding of reed-solomon codes
NASA Technical Reports Server (NTRS)
Hsu, In-Shek (Inventor); Truong, Trieu-Kie (Inventor); Deutsch, Leslie J. (Inventor); Shao, Howard M. (Inventor)
1989-01-01
Two pipeline (255,223) RS decoders, one a time-domain decoder and the other a transform-domain decoder, use the same first part to develop an errata locator polynomial τ(x) and an errata evaluator polynomial A(x). Both the time-domain decoder and the transform-domain decoder have a modified GCD that uses an input multiplexer and an output demultiplexer to reduce the number of GCD cells required. The time-domain decoder uses a Chien search and polynomial evaluator on the GCD outputs τ(x) and A(x) for the final decoding steps, while the transform-domain decoder uses a transform error pattern algorithm operating on τ(x) and the initial syndrome computation S(x), followed by an inverse transform algorithm, in sequence, for the final decoding steps prior to adding the received RS coded message to produce a decoded output message.
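The Chien search named above finds the roots of the error-locator polynomial by evaluating it at every field element in turn. A minimal sketch over the small prime field GF(7) rather than the GF(2^8) a (255,223) decoder uses; the polynomial below is an invented example with roots 2 and 5, not one from the patent:

```python
def chien_search(locator, p):
    """Brute-force root search for a polynomial over GF(p), coefficients
    given low-order first, mimicking what a Chien search does in GF(2^m)."""
    roots = []
    for x in range(p):
        acc, xk = 0, 1
        for c in locator:          # Horner-free evaluation, term by term
            acc = (acc + c * xk) % p
            xk = xk * x % p
        if acc == 0:
            roots.append(x)
    return roots

# sigma(x) = (x - 2)(x - 5) = x^2 + 3 over GF(7): coefficients [3, 0, 1]
sigma = [3, 0, 1]
```

A hardware Chien search does the same sweep with one multiply-accumulate cell per coefficient, stepping x through successive powers of a primitive element.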
NASA Astrophysics Data System (ADS)
Noriega-Mendoza, H.; Aguilar, L. A.
2018-04-01
We performed high precision, N-body simulations of the cold collapse of initially spherical, collisionless systems using the GYRFALCON code of Dehnen (2000). The collapses produce very prolate spheroidal configurations. After the collapse, the systems are simulated for 85 and 170 half-mass radius dynamical timescales, during which energy conservation is better than 0.005%. We use this period to extract individual particle orbits directly from the simulations. We then use the TAXON code of Carpintero and Aguilar (1998) to classify 1 to 1.5% of the extracted orbits from our final, relaxed configurations: less than 15% are chaotic orbits, 30% are box orbits and 60% are tube orbits (long and short axis). Our goal has been to prove that direct orbit extraction is feasible, and that there is no need to "freeze" the final N-body system configuration to extract a time-independent potential.
58th SOW Low-Dust Helicopter Landing Zone Final Environmental Assessment
2012-11-01
Acronyms: AQCR, Air Quality Control Region; BASH, Bird/wildlife-Aircraft Strike Hazard; CEQ, Council on Environmental Quality; CFR, Code of Federal Regulations. ... force would continue to be applied to minimize risks to aircrews and the general population. No unacceptable hazards to military personnel, the public ... Draft and Final EA: As a result of comments received on the Draft EA, Section 3.1.2, Global Climate Change, and Hazardous and Toxic Materials and Waste ...
Final Regulatory Evaluation: Metropolitan Washington Airports Policy,
1981-10-01
Sponsoring Agency Code: APO-220. Abstract: This final regulatory evaluation examines the potential impacts of rules to ... to recover combined direct and allocated maintenance and operation, depreciation, and interest charges on the landing field areas of Washington National ... Revenues increased 6.5 percent in 1980, totaling $25.3 million, which equates to $1.73 per passenger handled. At the same
1984-12-01
Octol** explosive. The experimental charges were lightly confined with aluminum bodies and had cone diameters of 84 mm. The charges modelled using HEMP ... solved using the following relationships: ... where V is the radial component of ... The final velocity vector is equal to the vector addition of the flow and ... Miles L. Lampson, "The Influence of Convergence-Velocity Gradients on the Formation ...
Applications of potential theory computations to transonic aeroelasticity
NASA Technical Reports Server (NTRS)
Edwards, J. W.
1986-01-01
Unsteady aerodynamic and aeroelastic stability calculations based upon transonic small disturbance (TSD) potential theory are presented. Results from the two-dimensional XTRAN2L code and the three-dimensional XTRAN3S code are compared with experiment to demonstrate the ability of TSD codes to treat transonic effects. The necessity of nonisentropic corrections to transonic potential theory is demonstrated. Dynamic computational effects resulting from the choice of grid and boundary conditions are illustrated. Unsteady airloads for a number of parameter variations including airfoil shape and thickness, Mach number, frequency, and amplitude are given. Finally, samples of transonic aeroelastic calculations are given. A key observation is the extent to which unsteady transonic airloads calculated by inviscid potential theory may be treated in a locally linear manner.
Wavelet-based audio embedding and audio/video compression
NASA Astrophysics Data System (ADS)
Mendenhall, Michael J.; Claypoole, Roger L., Jr.
2001-12-01
Watermarking, traditionally used for copyright protection, is used in a new and exciting way. An efficient wavelet-based watermarking technique embeds audio information into a video signal. Several effective compression techniques are applied to compress the resulting audio/video signal in an embedded fashion. This wavelet-based compression algorithm incorporates bit-plane coding, index coding, and Huffman coding. To demonstrate the potential of this audio embedding and audio/video compression algorithm, we embed an audio signal into a video signal and then compress. Results show that overall compression rates of 15:1 can be achieved. The video signal is reconstructed with a median PSNR of nearly 33 dB. Finally, the audio signal is extracted from the compressed audio/video signal without error.
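Bit-plane coding of the kind listed above transmits coefficient bits from most to least significant, so a truncated stream still yields a coarse reconstruction. A minimal sketch on small integer coefficients (the values are illustrative, not the paper's wavelet coefficients):

```python
import numpy as np

def bit_planes(coeffs, nbits):
    """Split nonnegative integer coefficients into bit-planes, MSB first."""
    return [(coeffs >> b) & 1 for b in range(nbits - 1, -1, -1)]

def reconstruct(planes, nbits):
    """Rebuild coefficients from however many leading planes were received."""
    out = np.zeros(planes[0].shape, dtype=int)
    for i, plane in enumerate(planes):
        out |= plane << (nbits - 1 - i)
    return out

coeffs = np.array([13, 6, 1])        # hypothetical quantized coefficients
planes = bit_planes(coeffs, 4)
```

Decoding all four planes restores [13, 6, 1] exactly; decoding only the first two gives the coarse approximation [12, 4, 0], which is what makes the embedded (progressive) bitstream possible.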
Vallat, B; Wilson, D W
2003-08-01
The authors discuss the mission, organisation and resources of Veterinary Services in the new international trading environment and examine how the standards for Veterinary Services, contained in the OIE (World Organisation for Animal Health) International Animal Health Code (the Code), help provide the necessary support for Veterinary Services to meet their rights and obligations under the provisions of the Sanitary and Phytosanitary (SPS) Agreement of the World Trade Organization (WTO). The authors describe the challenges of gaining access to international trading markets through surveillance and control of OIE-listed diseases. Finally, the approach in the Code to the principles underpinning the quality of Veterinary Services and to guidelines for evaluating Veterinary Services is discussed.
Validating a Monotonically-Integrated Large Eddy Simulation Code for Subsonic Jet Acoustics
NASA Technical Reports Server (NTRS)
Ingraham, Daniel; Bridges, James
2017-01-01
The results of subsonic jet validation cases for the Naval Research Lab's Jet Engine Noise REduction (JENRE) code are reported. Two set points from the Tanna matrix, set point 3 (Ma = 0.5, unheated) and set point 7 (Ma = 0.9, unheated), are attempted on three different meshes. After a brief discussion of the JENRE code and the meshes constructed for this work, the turbulent statistics for the axial velocity are presented and compared to experimental data, with favorable results. Preliminary simulations for set point 23 (Ma = 0.5, Tj/T∞ = 1.764) on one of the meshes are also described. Finally, the proposed configuration for the farfield noise prediction with JENRE's Ffowcs Williams-Hawkings solver is detailed.
NASA Technical Reports Server (NTRS)
Bobbitt, Percy J.
1992-01-01
A discussion is given of the many factors that affect sonic booms with particular emphasis on the application and development of improved computational fluid dynamics (CFD) codes. The benefits that accrue from interference (induced) lift, distributing lift using canard configurations, the use of wings with dihedral or anhedral and hybrid laminar flow control for drag reduction are detailed. The application of the most advanced codes to a wider variety of configurations along with improved ray-tracing codes to arrive at more accurate and, hopefully, lower sonic booms is advocated. Finally, it is speculated that when all of the latest technology is applied to the design of a supersonic transport it will be found environmentally acceptable.
Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van
2018-04-01
In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in close-geometry gamma spectroscopy. It included the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated by a deterministic method using the ETNA code for both p-type HPGe detectors, and the two codes showed good agreement. Finally, the validity of the developed procedure was confirmed by a proficiency test to calculate the activities of various radionuclides. The radioactivity measurements with both detectors using the advanced analytical procedure received "Accepted" statuses in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lezberg, Erwin A.; Mularz, Edward J.; Liou, Meng-Sing
1991-03-01
The objectives and accomplishments of research in chemical reacting flows, including both experimental and computational problems are described. The experimental research emphasizes the acquisition of reliable reacting-flow data for code validation, the development of chemical kinetics mechanisms, and the understanding of two-phase flow dynamics. Typical results from two nonreacting spray studies are presented. The computational fluid dynamics (CFD) research emphasizes the development of efficient and accurate algorithms and codes, as well as validation of methods and modeling (turbulence and kinetics) for reacting flows. Major developments of the RPLUS code and its application to mixing concepts, the General Electric combustor, and the Government baseline engine for the National Aerospace Plane are detailed. Finally, the turbulence research in the newly established Center for Modeling of Turbulence and Transition (CMOTT) is described.
NASA Technical Reports Server (NTRS)
Warren, Gary
1988-01-01
The SOS code is used to compute the resonance modes (frequency-domain information) of sample devices and, separately, to compute the transient behavior of the same devices. A code, DOT, is created to compute appropriate dot products of the time-domain and frequency-domain results. The transient behavior of individual modes in the device is then plotted. Modes in a coupled-cavity traveling-wave tube (CCTWT) section excited by a beam are analyzed in separate simulations. Mode energy vs. time and mode phase vs. time are computed, and it is determined for each case whether the transient waves are forward or backward waves. Finally, the hot-test mode frequencies of the CCTWT section are computed.
Olier, Ivan; Springate, David A.; Ashcroft, Darren M.; Doran, Tim; Reeves, David; Planner, Claire; Reilly, Siobhan; Kontopantelis, Evangelos
2016-01-01
Background The use of Electronic Health Records databases for medical research has become mainstream. In the UK, increasing use of Primary Care Databases is largely driven by almost complete computerisation and uniform standards within the National Health Service. Electronic Health Records research often begins with the development of a list of clinical codes with which to identify cases with a specific condition. We present a methodology and accompanying Stata and R commands (pcdsearch/Rpcdsearch) to help researchers in this task. We present severe mental illness (SMI) as an example. Methods We used the Clinical Practice Research Datalink, a UK Primary Care Database in which clinical information is largely organised using Read codes, a hierarchical clinical coding system. Pcdsearch is used to identify potentially relevant clinical codes and/or product codes from word-stubs and code-stubs suggested by clinicians. The returned code-lists are reviewed and codes relevant to the condition of interest are selected. The final code-list is then used to identify patients. Results We identified 270 Read codes linked to SMI and used them to identify cases in the database. We observed that our approach identified cases that would have been missed with a simpler approach using SMI registers defined within the UK Quality and Outcomes Framework. Conclusion We described a framework for researchers of Electronic Health Records databases, for identifying patients with a particular condition or matching certain clinical criteria. The method is invariant to coding system or database and can be used with SNOMED CT, ICD or other medical classification code-lists. PMID:26918439
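The stub-based search step can be sketched in a few lines of Python (a hypothetical simplification, not the actual pcdsearch/Rpcdsearch commands; the code list and stubs below are invented for illustration):

```python
def search_codes(codelist, word_stubs=(), code_stubs=()):
    """Return entries whose description contains any word-stub
    (case-insensitive) or whose code starts with any code-stub.
    `codelist` is a sequence of (code, description) pairs."""
    stubs = [w.lower() for w in word_stubs]
    hits = []
    for code, desc in codelist:
        d = desc.lower()
        if any(w in d for w in stubs) or any(code.startswith(c) for c in code_stubs):
            hits.append((code, desc))
    return hits
```

As in the paper's workflow, the returned candidate list would then be reviewed by clinicians before the final code-list is fixed.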
VICTORIA: A mechanistic model for radionuclide behavior in the reactor coolant system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaperow, J.H.; Bixler, N.E.
1996-12-31
VICTORIA is the U.S. Nuclear Regulatory Commission's (NRC's) mechanistic, best-estimate code for analysis of fission product release from the core and subsequent transport in the reactor vessel and reactor coolant system. VICTORIA requires thermal-hydraulic data (i.e., temperatures, pressures, and velocities) as input. In the past, these data have been taken from the results of calculations from thermal-hydraulic codes such as SCDAP/RELAP5, MELCOR, and MAAP. Validation and assessment of VICTORIA 1.0 have been completed. An independent peer review of VICTORIA, directed by Brookhaven National Laboratory and supported by experts in the areas of fuel release, fission product chemistry, and aerosol physics, has been undertaken. This peer review, which will independently assess the code's capabilities, is nearing completion with the peer review committee's final report expected in Dec 1996. A limited amount of additional development is expected as a result of the peer review. Following this additional development, the NRC plans to release VICTORIA 1.1 and an updated and improved code manual. Future plans mainly involve use of the code for plant calculations to investigate specific safety issues as they arise. Also, the code will continue to be used in support of the Phebus experiments.
NASA Astrophysics Data System (ADS)
Al Zain, Jamal; El Hajjaji, O.; El Bardouni, T.; Boukhal, H.; Jaï, Otman
2018-06-01
The MNSR is a pool-type research reactor that is difficult to model because of the importance of neutron leakage. The aim of this study is to evaluate a 2-D transport model of the reactor with the latest release of the DRAGON code and a 3-D diffusion model with the DONJON code. The DRAGON code is used to generate the group macroscopic cross sections of all the reactor components at different temperatures; these group constants are then used in the DONJON code, with 69 energy groups, for full-core diffusion calculations of the effective multiplication factor (keff), the neutron flux at different water and fuel temperatures, and the feedback reactivity coefficients, which account for variations in fuel and moderator temperatures as well as the void coefficient. Only one parameter was changed at a time while all other parameters were kept constant. Finally, good agreement between the calculated and measured values has been obtained for all of the feedback reactivity coefficients and the neutron flux.
Wang, Licheng; Wang, Zidong; Han, Qing-Long; Wei, Guoliang
2017-09-06
The synchronization control problem is investigated for a class of discrete-time dynamical networks with packet dropouts via a coding-decoding-based approach. The data is transmitted through digital communication channels and only the sequence of finite coded signals is sent to the controller. A series of mutually independent Bernoulli distributed random variables is utilized to model the packet dropout phenomenon occurring in the transmissions of coded signals. The purpose of the addressed synchronization control problem is to design a suitable coding-decoding procedure for each node, based on which an efficient decoder-based control protocol is developed to guarantee that the closed-loop network achieves the desired synchronization performance. By applying a modified uniform quantization approach and the Kronecker product technique, criteria for ensuring the detectability of the dynamical network are established by means of the size of the coding alphabet, the coding period and the probability information of packet dropouts. Subsequently, by resorting to the input-to-state stability theory, the desired controller parameter is obtained in terms of the solutions to a certain set of inequality constraints which can be solved effectively via available software packages. Finally, two simulation examples are provided to demonstrate the effectiveness of the obtained results.
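The coded transmission with Bernoulli packet dropouts can be illustrated with a minimal Python sketch (a toy under stated assumptions, not the paper's decoder-based control protocol: here the uniform quantizer is a plain rounding rule, and the decoder simply holds its last reconstruction when a packet is dropped):

```python
import random

def uniform_quantize(x, step):
    """Index of the uniform-quantizer cell containing x."""
    return round(x / step)

def transmit(samples, step, dropout_p, rng):
    """Quantize each sample to a finite code; with probability `dropout_p`
    (i.i.d. Bernoulli, as in the paper's channel model) the coded packet
    is lost and the decoder keeps its previous reconstruction."""
    decoded, last = [], 0.0
    for x in samples:
        if rng.random() >= dropout_p:            # packet arrives
            last = uniform_quantize(x, step) * step
        decoded.append(last)
    return decoded
```

With no dropouts the reconstruction error is bounded by half a quantizer step; as the dropout probability grows, the decoder increasingly extrapolates from stale data, which is why the paper ties detectability criteria to the dropout statistics.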
Information Theory, Inference and Learning Algorithms
NASA Astrophysics Data System (ADS)
Mackay, David J. C.
2003-10-01
Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
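The book's theme of teaching theory alongside practical codes can be illustrated with the simplest error-correcting code it opens with: a rate-1/3 repetition code with majority-vote decoding over a binary symmetric channel (a textbook sketch, not taken from the book's own code):

```python
import random

def encode(bits, n=3):
    """Rate-1/n repetition code: repeat each bit n times."""
    return [b for b in bits for _ in range(n)]

def bsc(codeword, flip_p, rng):
    """Binary symmetric channel: flip each bit independently with probability flip_p."""
    return [b ^ (rng.random() < flip_p) for b in codeword]

def decode(received, n=3):
    """Majority-vote decoding of each block of n received bits."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]
```

The repetition code corrects any single bit flip per block, but only by tripling the bandwidth; the sparse-graph codes treated later in the book approach channel capacity at far better rates.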
Michel, Christian J
2017-04-18
In 1996, a set X of 20 trinucleotides was identified in genes of both prokaryotes and eukaryotes which has on average the highest occurrence in reading frame compared to its two shifted frames. Furthermore, this set X has an interesting mathematical property as X is a maximal C3 self-complementary trinucleotide circular code. In 2015, by quantifying the inspection approach used in 1996, the circular code X was confirmed in the genes of bacteria and eukaryotes and was also identified in the genes of plasmids and viruses. The method was based on the preferential occurrence of trinucleotides among the three frames at the gene population level. We extend here this definition at the gene level. This new statistical approach considers all the genes, i.e., of large and small lengths, with the same weight for searching the circular code X. As a consequence, the concept of circular code, in particular the reading frame retrieval, is directly associated to each gene. At the gene level, the circular code X is strengthened in the genes of bacteria, eukaryotes, plasmids, and viruses, and is now also identified in the genes of archaea. The genes of mitochondria and chloroplasts contain a subset of the circular code X. Finally, by studying viral genes, the circular code X was found in DNA genomes, RNA genomes, double-stranded genomes, and single-stranded genomes.
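Counting trinucleotide occurrences per reading frame, the basic operation behind identifying the preferential frame of a trinucleotide, can be sketched in Python (a minimal illustration of the counting step only, not the authors' statistical method):

```python
from collections import Counter

def frame_counts(seq):
    """Count trinucleotide occurrences in the three frames of a coding
    sequence; frame 0 is the reading frame, frames 1 and 2 are shifted."""
    counts = [Counter(), Counter(), Counter()]
    for f in range(3):
        for i in range(f, len(seq) - 2, 3):
            counts[f][seq[i:i + 3]] += 1
    return counts

def preferred_frame(seq, trinucleotide):
    """Frame (0, 1 or 2) in which `trinucleotide` occurs most often."""
    counts = frame_counts(seq)
    return max(range(3), key=lambda f: counts[f][trinucleotide])
```

A trinucleotide belonging to the circular code X would, on average over many genes, have frame 0 as its preferred frame.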
Race coding and the other-race effect in face recognition.
Rhodes, Gillian; Locke, Vance; Ewing, Louise; Evangelista, Emma
2009-01-01
Other-race faces are generally recognised more poorly than own-race faces. According to Levin's influential race-coding hypothesis, this other-race recognition deficit results from spontaneous coding of race-specifying information, at the expense of individuating information, in other-race faces. Therefore, requiring participants to code race-specifying information for all faces should eliminate the other-race effect by reducing recognition of own-race faces to the level of other-race faces. We tested this prediction in two experiments. Race coding was induced by requiring participants to rate study faces on race typicality (experiment 1) or to categorise them by race (experiment 2). Neither manipulation reduced the other-race effect, providing no support for the race-coding hypothesis. Instead, race-coding instructions marginally increased the other-race effect in experiment 1 and had no effect in experiment 2. These results do not support the race-coding hypothesis. Surprisingly, a control task of rating the attractiveness of study faces increased the other-race effect, indicating that deeper encoding of faces does not necessarily reduce the effect (experiment 1). Finally, the normally robust other-race effect was absent when participants were instructed to individuate other-race faces (experiment 2). We suggest that poorer recognition of other-race faces may reflect reduced perceptual expertise with such faces and perhaps reduced motivation to individuate them.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Edmund J.; Anderson, Michael T.
In May 2010, the NRC issued a proposed notice of rulemaking that includes a provision to add a new section to its rules to require licensees to implement ASME Code Case N-770, ‘‘Alternative Examination Requirements and Acceptance Standards for Class 1 PWR Piping and Vessel Nozzle Butt Welds Fabricated with UNS N06082 or UNS W86182 Weld Filler Material With or Without the Application of Listed Mitigation Activities, Section XI, Division 1,’’ with 15 conditions. Code Case N-770 contains baseline and inservice inspection (ISI) requirements for unmitigated butt welds fabricated with Alloy 82/182 material and preservice and ISI requirements for mitigated butt welds. The NRC stated that application of ASME Code Case N-770 is necessary because the inspections currently required by the ASME Code, Section XI, were not written to address stress corrosion cracking of Alloy 82/182 butt welds, and the safety consequences of inadequate inspections can be significant. The NRC expects to issue the final rule incorporating this code case into its regulations in the spring 2011 time frame. This paper discusses the new examination requirements, the conditions that NRC is imposing, and the major concerns with implementation of the new Code Case.
NASA Astrophysics Data System (ADS)
Pei, Yong; Modestino, James W.
2004-12-01
Digital video delivered over wired-to-wireless networks is expected to suffer quality degradation from both packet loss and bit errors in the payload. In this paper, the quality degradation due to packet loss and bit errors in the payload are quantitatively evaluated and their effects are assessed. We propose the use of a concatenated forward error correction (FEC) coding scheme employing Reed-Solomon (RS) codes and rate-compatible punctured convolutional (RCPC) codes to protect the video data from packet loss and bit errors, respectively. Furthermore, the performance of a joint source-channel coding (JSCC) approach employing this concatenated FEC coding scheme for video transmission is studied. Finally, we describe an improved end-to-end architecture using an edge proxy in a mobile support station to implement differential error protection for the corresponding channel impairments expected on the two networks. Results indicate that with an appropriate JSCC approach and the use of an edge proxy, FEC-based error-control techniques together with passive error-recovery techniques can significantly improve the effective video throughput and lead to acceptable video delivery quality over time-varying heterogeneous wired-to-wireless IP networks.
Analysis of film cooling in rocket nozzles
NASA Technical Reports Server (NTRS)
Woodbury, Keith A.; Karr, Gerald R.
1992-01-01
Progress during the reporting period is summarized. Analysis of film cooling in rocket nozzles by computational fluid dynamics (CFD) computer codes is desirable for two reasons. First, it allows prediction of resulting flow fields within the rocket nozzle, in particular the interaction of the coolant boundary layer with the main flow. This facilitates evaluation of potential cooling configurations with regard to total thrust, etc., before construction and testing of any prototype. Secondly, CFD simulation of film cooling allows for assessment of the effectiveness of the proposed cooling in limiting nozzle wall temperature rises. This latter objective is the focus of the current work. The desired objective is to use the Finite Difference Navier Stokes (FDNS) code to predict wall heat fluxes or wall temperatures in rocket nozzles. As prior work has revealed that the FDNS code is deficient in the thermal modeling of boundary conditions, the first step is to correct these deficiencies in the FDNS code. Next, these changes must be tested against available data. Finally, the code will be used to model film cooling of a particular rocket nozzle. The third task of this research, using the modified code to compute the flow of hot gases through a nozzle, is described.
Minimizing embedding impact in steganography using trellis-coded quantization
NASA Astrophysics Data System (ADS)
Filler, Tomáš; Judas, Jan; Fridrich, Jessica
2010-01-01
In this paper, we propose a practical approach to minimizing embedding impact in steganography based on syndrome coding and trellis-coded quantization and contrast its performance with bounds derived from appropriate rate-distortion bounds. We assume that each cover element can be assigned a positive scalar expressing the impact of making an embedding change at that element (single-letter distortion). The problem is to embed a given payload with minimal possible average embedding impact. This task, which can be viewed as a generalization of matrix embedding or writing on wet paper, has been approached using heuristic and suboptimal tools in the past. Here, we propose a fast and very versatile solution to this problem that can theoretically achieve performance arbitrarily close to the bound. It is based on syndrome coding using linear convolutional codes with the optimal binary quantizer implemented using the Viterbi algorithm run in the dual domain. The complexity and memory requirements of the embedding algorithm are linear w.r.t. the number of cover elements. For practitioners, we include detailed algorithms for finding good codes and their implementation. Finally, we report extensive experimental results for a large set of relative payloads and for different distortion profiles, including the wet paper channel.
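Syndrome coding for minimal embedding impact can be illustrated with classic matrix embedding using a Hamming code, the simpler predecessor that the paper's trellis-coded construction generalizes (a toy sketch, not the paper's convolutional-code/Viterbi scheme: with the [7,4] Hamming code, three message bits are carried by seven cover bits at the cost of at most one changed bit):

```python
import numpy as np

# Parity-check matrix of the [7,4] Hamming code: column j holds the
# binary representation of j+1 (MSB in row 0).
H = np.array([[(j >> k) & 1 for j in range(1, 8)] for k in range(2, -1, -1)])

def syndrome(x):
    """3-bit syndrome of a length-7 bit vector; this carries the message."""
    return tuple(H.dot(x) % 2)

def embed(cover, message):
    """Matrix embedding: flip at most one of the 7 cover bits so that the
    stego vector's syndrome equals the 3 message bits."""
    stego = cover.copy()
    d = np.array(syndrome(cover)) ^ np.array(message)
    val = d[0] * 4 + d[1] * 2 + d[2]   # value of the H column that must be added
    if val:
        stego[val - 1] ^= 1            # that column sits at index val-1
    return stego

def extract(stego):
    """Recipient recovers the message as the syndrome of the stego bits."""
    return syndrome(stego)
```

The paper's contribution is to replace the block code with convolutional codes and a distortion-aware Viterbi quantizer, so that non-uniform single-letter distortions can be driven close to the rate-distortion bound.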
MrLavaLoba: A new probabilistic model for the simulation of lava flows as a settling process
NASA Astrophysics Data System (ADS)
de'Michieli Vitturi, Mattia; Tarquini, Simone
2018-01-01
A new code to simulate lava flow spread, MrLavaLoba, is presented. In the code, erupted lava is itemized in parcels having an elliptical shape and prescribed volume. New parcels bud from existing ones according to a probabilistic law influenced by the local steepest slope direction and by tunable input settings. MrLavaLoba is to be counted among the probabilistic codes for the simulation of lava flows, because it is not intended to mimic the actual process of flowing or to provide directly the progression with time of the flow field, but rather to estimate the most probable inundated area and the final thickness of the lava deposit. The code's flexibility allows it to produce variable lava flow spread and emplacement according to different dynamics (e.g. pahoehoe or channelized-'a'ā). For a given scenario, it is shown that model outputs converge, in probabilistic terms, towards a single solution. The code is applied to real cases in Hawaii and Mt. Etna, and the obtained maps are shown. The model is written in Python and the source code is available at http://demichie.github.io/MrLavaLoba/.
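The parcel-budding idea can be sketched as a toy probabilistic model (this is not the MrLavaLoba source, which is available at the URL above; the displacement rule, the angular-spread parameter, and the unit step length here are invented for illustration):

```python
import math
import random

def parcel_budding_sketch(n_parcels, steepest_dir, spread, rng):
    """Toy sketch of probabilistic parcel budding: each new parcel buds
    from a randomly chosen existing one and is displaced one unit roughly
    along the steepest-slope direction, within a tunable angular spread.
    Parcels are reduced to center points; ellipse shape/volume omitted."""
    parcels = [(0.0, 0.0)]                       # vent location
    for _ in range(n_parcels):
        x, y = rng.choice(parcels)               # parent parcel
        angle = steepest_dir + rng.uniform(-spread, spread)
        parcels.append((x + math.cos(angle), y + math.sin(angle)))
    return parcels
```

Repeating many such runs and overlaying the parcel footprints yields a map of inundation probability, which is the kind of output a probabilistic lava-flow code aims at.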
Integrated coding-aware intra-ONU scheduling for passive optical networks with inter-ONU traffic
NASA Astrophysics Data System (ADS)
Li, Yan; Dai, Shifang; Wu, Weiwei
2016-12-01
Recently, with the soaring of traffic among optical network units (ONUs), network coding (NC) is becoming an appealing technique for improving the performance of passive optical networks (PONs) with such inter-ONU traffic. However, in existing NC-based PONs, NC can only be implemented by buffering inter-ONU traffic at the optical line terminal (OLT) to wait for the establishment of the coding condition; such passive, uncertain waiting severely limits the effect of the NC technique. In this paper, we study integrated coding-aware intra-ONU scheduling, in which the scheduling of inter-ONU traffic within each ONU is undertaken by the OLT to actively facilitate the forming of codable inter-ONU traffic based on the global inter-ONU traffic distribution, so that the performance of PONs with inter-ONU traffic can be significantly improved. We first design two report message patterns and an inter-ONU traffic transmission framework as the basis for the integrated coding-aware intra-ONU scheduling. Three specific scheduling strategies are then proposed for adapting to diverse global inter-ONU traffic distributions. The effectiveness of the work is finally evaluated by both theoretical analysis and simulations.
Direct G-code manipulation for 3D material weaving
NASA Astrophysics Data System (ADS)
Koda, S.; Tanaka, H.
2017-04-01
The process of conventional 3D printing begins by first building a 3D model, then converting the model to G-code via slicer software, feeding the G-code to the printer, and finally starting the print. The most simple and popular 3D printing technique is Fused Deposition Modeling. However, in this method, the printing path that the printer head can take is restricted by the G-code. Therefore, printed 3D models with complex patterns have structural errors such as holes or gaps between the printed material lines. In addition, the structural density and the material's position in the printed model are difficult to control. We implemented a G-code editor, Fabrix, for making more precise and functional printed models with both single and multiple materials. Models with different stiffness are fabricated by controlling the printing density of the filament materials with our method. In addition, multi-material 3D printing has the potential to expand the achievable physical properties through material combination and G-code editing. These results show that the new printing method provides more creative and functional 3D printing techniques.
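Direct G-code manipulation of the kind described, for example varying printing density by rescaling the extrusion amount of moves, can be sketched in Python (a hypothetical example, not the Fabrix implementation; rescaling the E word of G1 moves is an assumed stand-in for its density control):

```python
def scale_extrusion(gcode_lines, factor):
    """Rescale the E (extrusion) value on G1 move lines, a direct
    G-code edit of the kind used to vary printing density."""
    out = []
    for line in gcode_lines:
        if line.startswith("G1 ") and " E" in line:
            words = []
            for w in line.split():
                if w.startswith("E"):
                    w = "E%.5f" % (float(w[1:]) * factor)   # rescaled extrusion
                words.append(w)
            line = " ".join(words)
        out.append(line)
    return out
```

Because the edit operates on the G-code itself rather than on the 3D model, it can change deposition behavior in ways a slicer alone does not expose.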
A robust recognition and accurate locating method for circular coded diagonal target
NASA Astrophysics Data System (ADS)
Bao, Yunna; Shang, Yang; Sun, Xiaoliang; Zhou, Jiexin
2017-10-01
As a category of special control points which can be automatically identified, artificial coded targets have been widely developed in the fields of computer vision, photogrammetry, augmented reality, etc. In this paper, a new circular coded target designed by RockeTech Technology Corp. Ltd. is analyzed and studied, called the circular coded diagonal target (CCDT). A novel detection and recognition method with good robustness is proposed and implemented in Visual Studio. In this algorithm, the ellipse features of the center circle are first used for rough positioning. Then, according to the characteristics of the center diagonal target, a circular frequency filter is designed to choose the correct center circle and eliminate non-target noise. The precise positioning of the coded target is done by a correlation-coefficient extreme-value fitting method. Finally, the coded target recognition is achieved by decoding the binary sequence in the outer ring of the extracted target. To test the proposed algorithm, both simulated and real experiments were carried out. The results show that the CCDT recognition and accurate locating method proposed in this paper can robustly recognize and accurately locate the targets in complex and noisy backgrounds.
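Decoding the binary sequence in a target's outer ring typically requires a rotation-invariant identity, since the ring can be read starting from any position. A minimal Python sketch of that idea (an assumption for illustration; the paper's exact decoding rule is not reproduced here):

```python
def canonical_code(bits):
    """Rotation-invariant identity of a circular bit ring: the smallest
    integer value over all cyclic rotations of the sequence."""
    n = len(bits)
    best = None
    for r in range(n):
        rotated = bits[r:] + bits[:r]          # cyclic rotation by r
        value = int("".join(map(str, rotated)), 2)
        if best is None or value < best:
            best = value
    return best
```

Two readings of the same physical target, started at different angular positions, then map to the same identity.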
A study of data coding technology developments in the 1980-1985 time frame, volume 2
NASA Technical Reports Server (NTRS)
Ingels, F. M.; Shahsavari, M. M.
1978-01-01
The source parameters of digitized analog data are discussed. Different data compression schemes are outlined and analysis of their implementation are presented. Finally, bandwidth compression techniques are given for video signals.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-22
... manufacturing and petroleum refineries. Carpet and rug mills (NAICS code 314110). Fiber, yarn, and thread mills... the abrasion of carpets. This is of particular concern for children since they engage in a variety of...
Potential metrics for designating and monitoring oversize/overweight corridors : final report.
DOT National Transportation Integrated Search
2016-11-11
This report: - Discusses how oversize/overweight (OS/OW) corridors are currently designated in Texas. - Provides information on OS/OW corridors as detailed in the Texas Transportation Code. - Lists potential metrics (as identified by stakeholder work...
Flexural anchorage performance at diagonal crack locations : final report.
DOT National Transportation Integrated Search
2010-12-01
Large numbers of reinforced concrete deck girder bridges that were constructed during the interstate system expansion of the 1950s have developed diagonal cracking in the stems. Though compliant with design codes when constructed, many of these bridg...
Brownfields Samoa Peninsula Project: Phase I Sustainable Site Analysis Final Report
This report provides an analysis and scoring using the Leadership in Energy and Environmental Design, Neighborhood Development Rating System, and the Land and Natural Development Code in order to assess the proposed redevelopment a master plan.
Engineering properties of brittle repair materials : final report : volume I.
DOT National Transportation Integrated Search
1992-09-01
Most codes of practice prescribe procedures for selecting patch configuration and materials based on tests devised for evaluating new pavement materials. This study is aimed at examining the special consideration to be given to such evaluation proced...
Engineering properties of brittle repair materials : final report : volume II.
DOT National Transportation Integrated Search
1992-09-01
Most codes of practice prescribe procedures for selecting patch configuration and materials based on tests devised for evaluating new pavement materials. This study is aimed at examining the special consideration to be given to such evaluation proced...
NASA Astrophysics Data System (ADS)
Georg, Peter; Richtmann, Daniel; Wettig, Tilo
2018-03-01
We describe our experience porting the Regensburg implementation of the DD-αAMG solver from QPACE 2 to QPACE 3. We first review how the code was ported from the first generation Intel Xeon Phi processor (Knights Corner) to its successor (Knights Landing). We then describe the modifications in the communication library necessitated by the switch from InfiniBand to Omni-Path. Finally, we present the performance of the code on a single processor as well as the scaling on many nodes, where in both cases the speedup factor is close to the theoretical expectations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haber, Eldad
2014-03-17
The focus of the research was: developing adaptive meshes for the solution of Maxwell's equations; developing a parallel framework for time-dependent inverse Maxwell's equations; developing multilevel methods for optimization problems with inequality constraints; a new inversion code for inverse Maxwell's equations at the 0th frequency (DC resistivity); and a new inversion code for inverse Maxwell's equations in the low-frequency regime. Although the research concentrated on electromagnetic forward and inverse problems, the results were also applied to the problem of image registration.
2011-04-01
A computer implementation of the method just introduced is provided. It uses the Scilab programming language, and the Young's modulus is calculated as a final output. Given a laminate without Z-pins, its thickness, the lamina stacking sequence, and each lamina's engineering elastic constants, the second Scilab code can be used to find the equivalent properties. Given the EL thickness, the second Scilab code is employed once again; this time, though, a new Young's modulus estimate is produced.
Energy Levels and Oscillator Strengths for Ne-like Iron Ions
NASA Astrophysics Data System (ADS)
Zhong, J. Y.; Zhang, J.; Zhao, G.; Lu, X.
2004-02-01
Energy levels and oscillator strengths among the 27 fine-structure levels belonging to the (1s22s2)2p6, 2p53s, 2p53p and 2p53d configurations of the neon-like iron ion have been calculated using three atomic structure codes: RCN/RCG, AUTOSTRUCTURE (AS) and GRASP. The relativistic corrections of the wave functions are taken into account in the RCN/RCG calculations. The results agree well with experimental and theoretical data wherever available. Finally, the accuracy of the three codes was analyzed.
Superdense Coding over Optical Fiber Links with Complete Bell-State Measurements
Williams, Brian P.; Sadlier, Ronald J.; Humble, Travis S.
2017-02-01
Adopting quantum communication to modern networking requires transmitting quantum information through a fiber-based infrastructure. In this paper, we report the first demonstration of superdense coding over optical fiber links, taking advantage of a complete Bell-state measurement enabled by time-polarization hyperentanglement, linear optics, and common single-photon detectors. Finally, we demonstrate the highest single-qubit channel capacity to date utilizing linear optics, 1.665 ± 0.018, and we provide a full experimental implementation of a hybrid, quantum-classical communication protocol for image transfer.
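A channel capacity such as the 1.665 bits per photon reported above is, for superdense coding, conventionally obtained as the Shannon mutual information of the measured Bell-state confusion statistics. The sketch below (illustrative only, not the authors' analysis code) computes that quantity from a row-stochastic confusion matrix; a perfect four-outcome Bell-state measurement yields the 2-bit superdense-coding limit.

```python
import numpy as np

def channel_capacity(conf, priors=None):
    """Mutual information I(X;Y) in bits for a channel whose row-stochastic
    confusion matrix conf[i][j] = P(decode j | sent i)."""
    conf = np.asarray(conf, dtype=float)
    n = conf.shape[0]
    if priors is None:
        priors = np.full(n, 1.0 / n)     # equiprobable messages
    joint = priors[:, None] * conf       # P(i, j)
    py = joint.sum(axis=0)               # marginal P(j)
    mask = joint > 0
    # I(X;Y) = sum_ij P(i,j) * log2( P(i,j) / (P(i) * P(j)) )
    return float((joint[mask] * np.log2(joint[mask] /
                  (priors[:, None] * py[None, :])[mask])).sum())

# A perfect 4-outcome Bell-state measurement carries 2 bits per photon:
print(channel_capacity(np.eye(4)))  # 2.0
```

Any misidentification of Bell states shrinks the mutual information below 2 bits, which is why the experimental value sits at 1.665.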
Recent advances in hypersonic technology
NASA Technical Reports Server (NTRS)
Dwoyer, Douglas L.
1990-01-01
This paper will focus on recent advances in hypersonic aerodynamic prediction techniques. Current capabilities of existing numerical methods for predicting high Mach number flows will be discussed and shortcomings will be identified. Physical models available for inclusion into modern codes for predicting the effects of transition and turbulence will also be outlined and their limitations identified. Chemical reaction models appropriate to high-speed flows will be addressed, and the impact of their inclusion in computational fluid dynamics codes will be discussed. Finally, the problem of validating predictive techniques for high Mach number flows will be addressed.
Theoretical, Experimental, and Computational Evaluation of Disk-Loaded Circular Wave Guides
NASA Technical Reports Server (NTRS)
Wallett, Thomas M.; Qureshi, A. Haq
1994-01-01
A disk-loaded circular wave guide structure and test fixture were fabricated. The dispersion characteristics were found by theoretical analysis, experimental testing, and computer simulation using the codes ARGUS and SOS. Interaction impedances were computed based on the corresponding dispersion characteristics. Finally, an equivalent circuit model for one period of the structure was chosen using equivalent circuit models for cylindrical wave guides of different radii. Optimum values for the discrete capacitors and inductors describing discontinuities between cylindrical wave guides were found using the computer code TOUCHSTONE.
Current and anticipated uses of the thermal hydraulics codes at the NRC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caruso, R.
1997-07-01
The focus of thermal-hydraulic computer code usage in nuclear regulatory organizations has undergone a considerable shift since the codes were originally conceived. Less work is being done in the area of "Design Basis Accidents," and much more emphasis is being placed on analysis of operational events, probabilistic risk/safety assessment, and maintenance practices. All of these areas need support from thermal-hydraulic computer codes to model the behavior of plant fluid systems, and they all need the ability to perform large numbers of analyses quickly. It is therefore important for the T/H codes of the future to be able to support these needs by providing robust, easy-to-use tools that produce easy-to-understand results for a wider community of nuclear professionals. These tools need to take advantage of the great advances that have occurred recently in computer software by providing users with graphical user interfaces for both input and output. In addition, reduced costs of computer memory and other hardware have removed the need for excessively complex data structures and numerical schemes, which make the codes more difficult and expensive to modify, maintain, and debug, and which increase problem run times. Future versions of the T/H codes should also be structured in a modular fashion, to allow for the easy incorporation of new correlations, models, or features, and to simplify maintenance and testing. Finally, it is important that future T/H code developers work closely with the code user community, to ensure that the codes meet the needs of those users.
NASA Astrophysics Data System (ADS)
Trejos, Sorayda; Fredy Barrera, John; Torroba, Roberto
2015-08-01
We present for the first time an optical encrypting-decrypting protocol for recovering messages without speckle noise. This is a digital holographic technique using a 2f scheme to process QR code entries. In the procedure, letters used to compose eventual messages are individually converted into a QR code, and then each QR code is divided into portions. Through a holographic technique, we store each processed portion. After filtering and repositioning, we add all processed data to create a single pack, thus simplifying the handling and recovery of multiple QR code images; this represents the first multiplexing procedure applied to processed QR codes. All QR codes are recovered in a single step and in the same plane, showing neither cross-talk nor noise problems as in other methods. Experiments have been conducted using an interferometric configuration, and comparisons between unprocessed and recovered QR codes have been performed, showing differences between them due to the involved processing. Recovered QR codes can be successfully scanned thanks to their noise tolerance. Finally, the appropriate sequence in the scanning of the recovered QR codes brings a noiseless retrieved message. Additionally, to procure maximum security, the multiplexed pack can be multiplied by a digital diffuser to encrypt it. The encrypted pack is easily decoded by multiplying the multiplexed pack by the complex conjugate of the diffuser. As it is a digital operation, no noise is added. Therefore, this technique is threefold robust, involving multiplexing, encryption, and the need for a sequence to retrieve the outcome.
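The digital-diffuser step described above can be sketched numerically: multiplying the multiplexed complex field by a unit-magnitude random-phase mask encrypts it, and multiplying by the complex conjugate of the same mask restores it exactly, since the operation is purely digital. A toy numpy sketch, with a random field standing in for the multiplexed QR pack:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the multiplexed pack of processed QR codes (complex field).
pack = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))

# Digital diffuser: unit-magnitude random-phase mask.
diffuser = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=pack.shape))

encrypted = pack * diffuser                # encrypt: multiply by the diffuser
decrypted = encrypted * np.conj(diffuser)  # decrypt: conjugate of the mask

print(np.allclose(decrypted, pack))        # True: no noise is added
```

Because |diffuser| = 1 everywhere, multiplying by the conjugate undoes the encryption exactly, which is the "no noise is added" property claimed in the abstract.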
Making Homes Healthy: International Code Council Processes and Patterns.
Coyle, Edward C; Isett, Kimberley R; Rondone, Joseph; Harris, Rebecca; Howell, M Claire Batten; Brandus, Katherine; Hughes, Gwendolyn; Kerfoot, Richard; Hicks, Diana
2016-01-01
Americans spend more than 90% of their time indoors, so it is important that homes are healthy environments. Yet many homes contribute to preventable illnesses via poor air quality, pests, safety hazards, and other factors. Efforts have been made to promote healthy housing through code changes, but results have been mixed. In support of such efforts, we analyzed the International Code Council's (ICC) building code change process to uncover patterns of content and context that may contribute to successful adoptions of model codes. Objective: to discover patterns of facilitators of and barriers to code amendment proposals. Design: a mixed-methods study of ICC records of past code change proposals (N = 2660). There were 4 possible outcomes for each code proposal studied: accepted as submitted, accepted as modified, accepted as modified by public comment, and denied. We found numerous correlates for final adoption of model codes proposed to the ICC. The number of proponents listed on a proposal was inversely correlated with success. Organizations that submitted more than 15 proposals had a higher chance of success than those that submitted fewer than 15. Proposals submitted by federal agencies correlated with a higher chance of success. Public comments in favor of a proposal correlated with an increased chance of success, while negative public comment had an even stronger negative correlation. To increase the chance of success, public health officials should submit their code changes through internal ICC committees or a federal agency, limit the number of cosponsors of the proposal, work with (or become) an active proposal submitter, and encourage public comment in favor of passage through their broader coalition.
The design of the CMOS wireless bar code scanner applying optical system based on ZigBee
NASA Astrophysics Data System (ADS)
Chen, Yuelin; Peng, Jian
2008-03-01
The traditional bar code scanner is constrained by the length of its data line, while the farthest range of the wireless bar code scanners on the market is generally between 30 m and 100 m. By rebuilding a traditional CCD optical bar code scanner, a CMOS code scanner based on ZigBee was designed to meet market demands. The scan system consists of a CMOS image sensor and the embedded chip S3C2401X. When a two-dimensional bar code is read, inaccurate or wrong bar code results can arise from image defilement, disturbance, poor imaging conditions, signal interference, and unstable system voltage; we therefore put forward a method that uses matrix evaluation and Reed-Solomon arithmetic to correct these errors. In order to construct the whole wireless optical bar code system and to ensure its ability to transmit bar code image signals digitally over long distances, ZigBee is used to transmit data to the base station; this module is designed on top of the image acquisition system, and the circuit diagram of the wireless transmitting/receiving CC2430 module is established. By porting embedded Linux to the MCU, a practical wireless CMOS optical bar code scanner and multi-task system is constructed. Finally, communication performance is tested with the evaluation software SmartRF. In open space, every ZigBee node can achieve 50 m transmission with high reliability, and by adding more ZigBee nodes the transmission distance can be extended to several thousand meters.
Valenzuela-Miranda, Diego; Gallardo-Escárate, Cristian
2016-12-01
Despite the high prevalence and impact to Chilean salmon aquaculture of the intracellular bacterium Piscirickettsia salmonis, the molecular underpinnings of host-pathogen interactions remain unclear. Herein, the interplay of coding and non-coding transcripts has been proposed as a key mechanism involved in immune response. Therefore, the aim of this study was to show how coding and non-coding transcripts are modulated during the infection process of Atlantic salmon with P. salmonis. For this, RNA-seq was conducted in brain, spleen, and head kidney samples, revealing different transcriptional profiles according to bacterial load. Additionally, while most of the regulated genes were annotated to diverse biological processes during infection, a common response associated with clathrin-mediated endocytosis and iron homeostasis was present in all tissues. Interestingly, while endocytosis-promoting factors and clathrin inductions were upregulated, endocytic receptors were mainly downregulated. Furthermore, the regulation of genes related to iron homeostasis suggested an intracellular accumulation of iron, a process in which heme biosynthesis/degradation pathways might play an important role. Regarding the non-coding response, 918 putative long non-coding RNAs were identified, of which 425 were newly characterized for S. salar. Finally, co-localization and co-expression analyses revealed a strong correlation between the modulation of long non-coding RNAs and genes associated with endocytosis and iron homeostasis. These results represent the first comprehensive study of putative interplaying mechanisms of coding and non-coding RNAs during bacterial infection in salmonids. Copyright © 2016 Elsevier Ltd. All rights reserved.
2014-09-11
This final rule introduces regulatory flexibilities and general improvements for certification to the 2014 Edition EHR certification criteria (2014 Edition). It also codifies a few revisions and updates to the ONC HIT Certification Program for certification to the 2014 Edition and future editions of certification criteria as well as makes administrative updates to the Code of Federal Regulations.
Prediction of global ionospheric VTEC maps using an adaptive autoregressive model
NASA Astrophysics Data System (ADS)
Wang, Cheng; Xin, Shaoming; Liu, Xiaolu; Shi, Chuang; Fan, Lei
2018-02-01
In this contribution, an adaptive autoregressive model is proposed and developed to predict global ionospheric vertical total electron content maps (VTEC). Specifically, the spherical harmonic (SH) coefficients are predicted based on the autoregressive model, and the order of the autoregressive model is determined adaptively using the F-test method. To test our method, final CODE and IGS global ionospheric map (GIM) products, as well as altimeter TEC data during low and mid-to-high solar activity period collected by JASON, are used to evaluate the precision of our forecasting products. Results indicate that the predicted products derived from the model proposed in this paper have good consistency with the final GIMs in low solar activity, where the annual mean of the root-mean-square value is approximately 1.5 TECU. However, the performance of predicted vertical TEC in periods of mid-to-high solar activity has less accuracy than that during low solar activity periods, especially in the equatorial ionization anomaly region and the Southern Hemisphere. Additionally, in comparison with forecasting products, the final IGS GIMs have the best consistency with altimeter TEC data. Future work is needed to investigate the performance of forecasting products using the proposed method in an operational environment, rather than using the SH coefficients from the final CODE products, to understand the real-time applicability of the method.
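The adaptive order selection described above can be sketched as follows: fit AR(p) and AR(p+1) by least squares to a coefficient time series, and accept the extra lag only if an F-ratio on the residual sums of squares clears a threshold. This is a hedged numpy sketch; the fixed threshold and the toy series are illustrative, not the authors' settings, and the paper draws its threshold from the F distribution.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) fit: x[t] ~ sum_k a[k] * x[t-1-k].
    Returns coefficients and the residual sum of squares."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    X = np.column_stack([x[p - 1 - k: n - 1 - k] for k in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ a) ** 2))
    return a, rss

def select_order(x, p_max=8, f_crit=4.0):
    """Grow the AR order while an F-ratio says the extra lag helps.
    f_crit is an illustrative fixed threshold standing in for an
    F-distribution critical value at a chosen significance level."""
    p = 1
    _, rss = fit_ar(x, p)
    while p < p_max:
        _, rss_next = fit_ar(x, p + 1)
        dof = len(x) - 2 * (p + 1)           # approx. residual dof
        f = (rss - rss_next) / max(rss_next / dof, 1e-30)
        if f < f_crit:
            break
        p, rss = p + 1, rss_next
    return p

# Exact AR(1) data x[t] = 0.5 * x[t-1] is recovered perfectly at order 1:
x = [0.5 ** i for i in range(20)]
a, rss = fit_ar(x, 1)
print(round(float(a[0]), 6))  # 0.5
```

In the paper's setting the same order test is applied per spherical harmonic coefficient series, so each coefficient can carry its own adaptively chosen order.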
Auto-Regulatory RNA Editing Fine-Tunes mRNA Re-Coding and Complex Behaviour in Drosophila
Savva, Yiannis A.; Jepson, James E. C.; Sahin, Asli; Sugden, Arthur U.; Dorsky, Jacquelyn S.; Alpert, Lauren; Lawrence, Charles; Reenan, Robert A.
2014-01-01
Auto-regulatory feedback loops are a common molecular strategy used to optimize protein function. In Drosophila many mRNAs involved in neuro-transmission are re-coded at the RNA level by the RNA editing enzyme dADAR, leading to the incorporation of amino acids that are not directly encoded by the genome. dADAR also re-codes its own transcript, but the consequences of this auto-regulation in vivo are unclear. Here we show that hard-wiring or abolishing endogenous dADAR auto-regulation dramatically remodels the landscape of re-coding events in a site-specific manner. These molecular phenotypes correlate with altered localization of dADAR within the nuclear compartment. Furthermore, auto-editing exhibits sexually dimorphic patterns of spatial regulation and can be modified by abiotic environmental factors. Finally, we demonstrate that modifying dAdar auto-editing affects adaptive complex behaviors. Our results reveal the in vivo relevance of auto-regulatory control over post-transcriptional mRNA re-coding events in fine-tuning brain function and organismal behavior. PMID:22531175
Specific and Modular Binding Code for Cytosine Recognition in Pumilio/FBF (PUF) RNA-binding Domains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, Shuyun; Wang, Yang; Cassidy-Amstutz, Caleb
2011-10-28
Pumilio/fem-3 mRNA-binding factor (PUF) proteins possess a recognition code for bases A, U, and G, allowing designed RNA sequence specificity of their modular Pumilio (PUM) repeats. However, recognition side chains in a PUM repeat for cytosine are unknown. Here we report identification of a cytosine-recognition code by screening random amino acid combinations at conserved RNA recognition positions using a yeast three-hybrid system. This C-recognition code is specific and modular, as specificity can be transferred to different positions in the RNA recognition sequence. A crystal structure of a modified PUF domain reveals specific contacts between an arginine side chain and the cytosine base. We applied the C-recognition code to design PUF domains that recognize targets with multiple cytosines and to generate engineered splicing factors that modulate alternative splicing. Finally, we identified a divergent yeast PUF protein, Nop9p, that may recognize natural target RNAs with cytosine. This work deepens our understanding of natural PUF protein target recognition and expands the ability to engineer PUF domains to recognize any RNA sequence.
Differential expression and emerging functions of non-coding RNAs in cold adaptation.
Frigault, Jacques J; Morin, Mathieu D; Morin, Pier Jr
2017-01-01
Several species undergo substantial physiological and biochemical changes to confront the harsh conditions associated with winter. Small mammalian hibernators and cold-hardy insects are examples of natural models of cold adaptation that have been amply explored. While the molecular picture associated with cold adaptation has started to become clearer in recent years, notably through the use of high-throughput experimental approaches, the underlying cold-associated functions attributed to several non-coding RNAs, including microRNAs (miRNAs) and long non-coding RNAs (lncRNAs), remain to be better characterized. Nevertheless, key pioneering work has provided clues on the likely relevance of these molecules in cold adaptation. With an emphasis on mammalian hibernation and insect cold hardiness, this work first reviews various molecular changes documented so far in these processes. The cascades leading to miRNA and lncRNA production as well as the mechanisms of action of these non-coding RNAs are subsequently described. Finally, we present examples of differentially expressed non-coding RNAs in models of cold adaptation and elaborate on the potential significance of this modulation with respect to low-temperature adaptation.
Malnutrition coding 101: financial impact and more.
Giannopoulos, Georgia A; Merriman, Louise R; Rumsey, Alissa; Zwiebel, Douglas S
2013-12-01
Recent articles have addressed the characteristics associated with adult malnutrition as published by the Academy of Nutrition and Dietetics (the Academy) and the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.). This article describes a successful interdisciplinary program developed by the Department of Food and Nutrition at New York-Presbyterian Hospital to maintain and monitor clinical documentation, ensure accurate International Classification of Diseases 9th Edition (ICD-9) coding, and identify subsequent incremental revenue resulting from the early identification, documentation, and treatment of malnutrition in an adult inpatient population. The first step in the process requires registered dietitians to identify patients with malnutrition; then clear and specifically worded diagnostic statements that include the type and severity of malnutrition are documented in the medical record by the physician, nurse practitioner, or physician's assistant. This protocol allows the Health Information Management/Coding department to accurately assign ICD-9 codes associated with protein-energy malnutrition. Once clinical coding is complete, a final diagnosis-related group (DRG) is generated to ensure appropriate hospital reimbursement. Successful interdisciplinary programs such as this can drive optimal care and ensure appropriate reimbursement.
A Secure and Robust Approach to Software Tamper Resistance
NASA Astrophysics Data System (ADS)
Ghosh, Sudeep; Hiser, Jason D.; Davidson, Jack W.
Software tamper-resistance mechanisms have increasingly assumed significance as a technique to prevent unintended uses of software. Closely related to anti-tampering techniques are obfuscation techniques, which make code difficult to understand or analyze and, therefore, challenging to modify meaningfully. This paper describes a secure and robust approach to software tamper resistance and obfuscation using process-level virtualization. The proposed techniques involve novel uses of software checksumming guards and encryption to protect an application. In particular, a virtual machine (VM) is assembled with the application at software build time such that the application cannot run without the VM. The VM provides just-in-time decryption of the program and dynamism for the application's code. The application's code is in turn used to protect the VM, ensuring a level of circular protection. Finally, to prevent the attacker from obtaining an analyzable snapshot of the code, the VM periodically discards all decrypted code. We describe a prototype implementation of these techniques and evaluate the run-time performance of applications using our system. We also discuss how our system provides stronger protection against tampering attacks than previously described tamper-resistance approaches.
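The guard idea, code that hashes other code and refuses to run it if the hash has changed, can be sketched in a few lines. The Python sketch below hashes a function's bytecode; the system described in the paper operates on native code inside the protecting VM, so this is only an analogy, and all names here are hypothetical.

```python
import hashlib

def fingerprint(fn):
    """Digest of a function's bytecode: a stand-in for a checksum guard's
    reference value recorded at build time."""
    return hashlib.sha256(fn.__code__.co_code).hexdigest()

def licensed_feature(x):
    return 2 * x + 1

# "Build time": record the expected digest of the protected code.
EXPECTED = fingerprint(licensed_feature)

def guarded_call(fn, *args):
    """Run fn only if its bytecode still matches the recorded digest."""
    if fingerprint(fn) != EXPECTED:
        raise RuntimeError("tamper detected")
    return fn(*args)

print(guarded_call(licensed_feature, 20))  # 41

def tampered(x):   # an attacker's replacement with different bytecode
    return 2 * x
```

Calling `guarded_call(tampered, 20)` raises the tamper error, since the replacement's bytecode digest no longer matches the build-time value.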
A novel approach of an absolute coding pattern based on Hamiltonian graph
NASA Astrophysics Data System (ADS)
Wang, Ya'nan; Wang, Huawei; Hao, Fusheng; Liu, Liqiang
2017-02-01
In this paper, a novel approach to an optical absolute rotary encoder coding pattern is presented. The concept is based on the principle of the absolute encoder: finding a unique sequence that ensures an unambiguous shaft position at any angle. We design a single-ring and an n-by-2 matrix absolute encoder coding pattern using variations of the Hamiltonian graph principle. 12 encoding bits are used in the single ring, read by a linear-array CCD, to achieve a 1080-position cyclic encoding. Besides, a 2-by-2 matrix is used as a unit in the 2-track disk to achieve a 16-bit encoding pattern using an area-array CCD sensor (as a sample). Finally, a higher resolution can be gained by electronic subdivision of the signals. Compared with the conventional Gray or binary code pattern (for a 2^n resolution), this new pattern has a higher resolution (2^n·n) with fewer coding tracks, which means the new pattern can lead to a smaller encoder, which is essential in industrial production.
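The core requirement of such a single-track absolute pattern is that every cyclic window of n consecutive bits is unique, so a single read fixes the shaft angle. A small sketch of that check (the paper's 12-bit, 1080-position sequence is not given in the abstract, so toy sequences are used):

```python
def has_unique_windows(seq, n):
    """True if every cyclic window of n symbols in seq occurs exactly once,
    i.e. reading n consecutive code bits identifies the position uniquely."""
    wrapped = seq + seq[:n - 1]
    windows = [wrapped[i:i + n] for i in range(len(seq))]
    return len(set(windows)) == len(windows)

print(has_unique_windows("0011", 2))  # True: windows 00, 01, 11, 10
print(has_unique_windows("0101", 2))  # False: 01 and 10 each repeat
```

De Bruijn-style sequences (closely related to Hamiltonian cycles on code graphs) pass this check by construction, e.g. "00010111" with n = 3 covers all eight 3-bit windows exactly once.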
ICD-10 procedure codes produce transition challenges.
Boyd, Andrew D; Li, Jianrong 'John'; Kenost, Colleen; Zaim, Samir Rachid; Krive, Jacob; Mittal, Manish; Satava, Richard A; Burton, Michael; Smith, Jacob; Lussier, Yves A
2018-01-01
The transition of procedure coding from ICD-9-CM-Vol-3 to ICD-10-PCS has generated problems for the medical community at large, resulting from the lack of clarity required to integrate two non-congruent coding systems. We hypothesized that quantifying these issues with network topology analyses offers a better understanding of the issues, and therefore we developed solutions (online tools) to empower hospital administrators and researchers to address these challenges. Five topologies were identified: "identity" (I), "class-to-subclass" (C2S), "subclass-to-class" (S2C), "convoluted" (C), and "no mapping" (NM). The procedure codes in the 2010 Illinois Medicaid dataset (3,290 patients, 116 institutions) were categorized as C=55%, C2S=40%, I=3%, NM=2%, and S2C=1%. The majority of the problematic and ambiguous mappings (convoluted) pertained to operations in ophthalmology, cardiology, urology, gynecology-obstetrics, and dermatology. Finally, the algorithms were expanded into a user-friendly tool to identify problematic topologies and specify lists of procedural codes utilized by medical professionals and researchers for mitigating error-prone translations, simplifying research, and improving quality. http://www.lussiergroup.org/transition-to-ICD10PCS
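The five topologies can be reproduced from the forward and backward mapping multiplicities alone. A hedged sketch with toy codes, not real ICD entries; the authors' formal definitions may differ in edge cases:

```python
def classify(code, fwd, bwd):
    """Classify a source code by its mapping topology.
    fwd: dict ICD-9 -> set of ICD-10-PCS codes; bwd: the reverse map."""
    targets = fwd.get(code, set())
    if not targets:
        return "no mapping"
    sources = set().union(*(bwd[t] for t in targets))
    if len(targets) == 1 and len(sources) == 1:
        return "identity"
    if len(sources) == 1:
        return "class-to-subclass"   # one source fans out to many targets
    if len(targets) == 1:
        return "subclass-to-class"   # many sources collapse into one target
    return "convoluted"              # many-to-many entanglement

fwd = {"A": {"X"}, "B": {"Y", "Z"}, "C": {"W"}, "D": {"W"},
       "E": set(), "G": {"P", "Q"}, "H": {"P", "Q"}}
bwd = {"X": {"A"}, "Y": {"B"}, "Z": {"B"}, "W": {"C", "D"},
       "P": {"G", "H"}, "Q": {"G", "H"}}
for c in "ABCEG":
    print(c, classify(c, fwd, bwd))
```

Running the classifier over a full crosswalk table and tallying the labels reproduces the kind of percentage breakdown the paper reports for the Illinois Medicaid data.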
Verification and Validation: High Charge and Energy (HZE) Transport Codes and Future Development
NASA Technical Reports Server (NTRS)
Wilson, John W.; Tripathi, Ram K.; Mertens, Christopher J.; Blattnig, Steve R.; Clowdsley, Martha S.; Cucinotta, Francis A.; Tweed, John; Heinbockel, John H.; Walker, Steven A.; Nealy, John E.
2005-01-01
In the present paper, we give the formalism for further developing a fully three-dimensional HZETRN code using marching procedures; the development of a new Green's function code is also discussed. The final Green's function code is capable of validation not only in the space environment but also in ground-based laboratories with directed beams of ions of specific energy, characterized with detailed diagnostic particle spectrometer devices. Special emphasis is given to verification of the computational procedures and validation of the resultant computational model using laboratory and spaceflight measurements. Due to historical requirements, two parallel development paths for computational model implementation, using marching procedures and Green's function techniques, are followed. A new version of the HZETRN code capable of simulating HZE ions with either laboratory or space boundary conditions is under development. Validation of computational models at this time is particularly important for President Bush's initiative to develop infrastructure for human exploration, with the first target demonstration of the Crew Exploration Vehicle (CEV) in low Earth orbit in 2008.
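The "marching procedure" can be illustrated in its simplest one-dimensional, straight-ahead form: step an attenuated flux forward in depth and compare with the exact exponential solution. This is a toy single-species attenuation model with made-up parameters, far simpler than HZETRN's coupled transport equations.

```python
import math

def march_flux(phi0, sigma, depth, steps):
    """Explicit forward march of d(phi)/dx = -sigma * phi,
    the simplest straight-ahead attenuation problem."""
    dx = depth / steps
    phi = phi0
    for _ in range(steps):
        phi -= sigma * phi * dx     # first-order marching update
    return phi

phi = march_flux(1.0, 1.0, 1.0, 100000)
exact = math.exp(-1.0)
print(abs(phi - exact) < 1e-4)  # True: converges to the analytic solution
```

Verification in the paper's sense is exactly this kind of comparison, checking that the marching discretization converges to a known analytic solution before the model is validated against measurements.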
2014-01-01
Background The genome is pervasively transcribed but most transcripts do not code for proteins, constituting non-protein-coding RNAs. Despite increasing numbers of functional reports of individual long non-coding RNAs (lncRNAs), assessing the extent of functionality among the non-coding transcriptional output of mammalian cells remains intricate. In the protein-coding world, transcripts differentially expressed in the context of processes essential for the survival of multicellular organisms have been instrumental in the discovery of functionally relevant proteins and their deregulation is frequently associated with diseases. We therefore systematically identified lncRNAs expressed differentially in response to oncologically relevant processes and cell-cycle, p53 and STAT3 pathways, using tiling arrays. Results We found that up to 80% of the pathway-triggered transcriptional responses are non-coding. Among these we identified very large macroRNAs with pathway-specific expression patterns and demonstrated that these are likely continuous transcripts. MacroRNAs contain elements conserved in mammals and sauropsids, which in part exhibit conserved RNA secondary structure. Comparing evolutionary rates of a macroRNA to adjacent protein-coding genes suggests a local action of the transcript. Finally, in different grades of astrocytoma, a tumor disease unrelated to the initially used cell lines, macroRNAs are differentially expressed. Conclusions It has been shown previously that the majority of expressed non-ribosomal transcripts are non-coding. We now conclude that differential expression triggered by signaling pathways gives rise to a similar abundance of non-coding content. It is thus unlikely that the prevalence of non-coding transcripts in the cell is a trivial consequence of leaky or random transcription events. PMID:24594072
Multimedia techniques for construction education and training : final report.
DOT National Transportation Integrated Search
2017-02-01
The current profession of civil engineering often focuses education and training on code compliance rather than constructability and construction techniques. Also, it is well accepted that it takes a decade or more for engineers to develop a high-lev...
STANDARD FIELD CODES FOR NORTH AMERICAN AMPHIBIANS. (R825795)
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
DOT National Transportation Integrated Search
2014-11-15
The simplified procedure in design codes for determining earthquake response spectra involves estimating site coefficients to adjust available rock accelerations to site accelerations. Several investigators have noted concerns with the site coeff...
Final report : Kentucky research peer exchange : October 12–14, 2011.
DOT National Transportation Integrated Search
2012-03-16
Title 23 of the Code of Federal Regulations (23 CFR) establishes requirements for state departments of transportation (DOTs) to conduct periodic reviews of their research, development, and technology (RD&T) programs. One of the tools available...
78 FR 4766 - Authority Citation Correction
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-23
...-19-11] Authority Citation Correction AGENCY: Securities and Exchange Commission. ACTION: Final rule..., respectively) that each included an inaccurate amendatory instruction pertaining to an authority citation. The Commission is publishing this technical amendment to accurately reflect the authority citation in the Code of...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Computer Reader finalization costs, cost per image, and Remote Bar Code Sorter leakage; (8) Percentage of... processing units costs for Carrier Route, High Density, and Saturation mail; (j) Mail processing unit costs...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Computer Reader finalization costs, cost per image, and Remote Bar Code Sorter leakage; (8) Percentage of... processing units costs for Carrier Route, High Density, and Saturation mail; (j) Mail processing unit costs...
Code of Federal Regulations, 2013 CFR
2013-07-01
... Computer Reader finalization costs, cost per image, and Remote Bar Code Sorter leakage; (8) Percentage of... processing units costs for Carrier Route, High Density, and Saturation mail; (j) Mail processing unit costs...
Code of Federal Regulations, 2014 CFR
2014-07-01
... Computer Reader finalization costs, cost per image, and Remote Bar Code Sorter leakage; (8) Percentage of... processing units costs for Carrier Route, High Density, and Saturation mail; (j) Mail processing unit costs...
Code of Federal Regulations, 2012 CFR
2012-07-01
... Computer Reader finalization costs, cost per image, and Remote Bar Code Sorter leakage; (8) Percentage of... processing units costs for Carrier Route, High Density, and Saturation mail; (j) Mail processing unit costs...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adigun, Babatunde John; Fensin, Michael Lorne; Galloway, Jack D.
Our burnup study examined the effect of a predicted critical control rod position on the nuclide predictability of several axial and radial locations within a 4×4 graphite-moderated gas-cooled reactor fuel cluster geometry. To achieve this, a control rod position estimator (CRPE) tool was developed within the framework of the linkage code Monteburns between the transport code MCNP and the depletion code CINDER90, and four methodologies were proposed within the tool for maintaining criticality. Two of the proposed methods used an inverse multiplication approach, where the amount of fissile material in a set configuration is slowly altered until criticality is attained, in estimating the critical control rod position. Another method carried out several MCNP criticality calculations at different control rod positions, then used a linear fit to estimate the critical rod position. The final method used a second-order polynomial fit of several MCNP criticality calculations at different control rod positions to estimate the critical rod position. The results showed that consistency in the prediction of power densities as well as uranium and plutonium isotopics was mutual among the methods within the CRPE tool that predicted the critical position consistently well. Finally, while the CRPE tool is currently limited to manipulating a single control rod, future work could be geared toward implementing additional criticality search methodologies along with additional features.
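The linear and second-order fitting strategies can be sketched with synthetic k_eff samples: fit k_eff versus rod position, then solve for k_eff = 1. The data below are invented for illustration; the actual tool fits MCNP criticality calculations.

```python
import numpy as np

def critical_position(positions, keff, order):
    """Fit keff(z) with a polynomial of the given order and return the
    real root of keff(z) = 1 closest to the middle of the sampled range."""
    coeffs = np.polyfit(positions, keff, order)
    coeffs[-1] -= 1.0                       # solve keff(z) - 1 = 0
    roots = np.roots(coeffs)
    real = np.real(roots[np.abs(np.imag(roots)) < 1e-8])
    mid = 0.5 * (min(positions) + max(positions))
    return float(min(real, key=lambda r: abs(r - mid)))

# Synthetic responses: keff rises linearly with withdrawal position z,
# crossing 1.0 at z = 50.
z = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
keff = 0.95 + 0.001 * z
print(round(critical_position(z, keff, 1), 3))  # 50.0 (linear fit)
print(round(critical_position(z, keff, 2), 3))  # 50.0 (quadratic term ~ 0)
```

With genuinely curved keff(z) responses the second-order fit would capture the bowing that a straight line misses, which is the motivation for offering both methods in the tool.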
College Students’ Perceived Differences Between the Terms Real Meal, Meal, and Snack
Banna, Jinan; Richards, Rickelle; Brown, Lora Beth
2017-01-01
Objective To assess qualitatively and quantitatively college students’ perceived differences between a real meal, meal, and snack. Design A descriptive study design was used to administer an 11-item online survey to college students. Setting Two university campuses in the western US. Participants Pilot testing was conducted with 20 students. The final survey was completed by 628 ethnically diverse students. Main Outcome Measures Students’ perceptions of the terms real meal, meal, and snack. Analysis Three researchers coded the data independently, reconciled differences via conference calls, and agreed on a final coding scheme. Data were reevaluated based on the coding scheme. Means, frequencies, Pearson chi-square, and t test statistics were used. Results More than half of students perceived a difference between the terms real meal and meal. Most (97.6%) perceived a difference between the terms meal and snack. A marked difference in the way students defined these terms was evident, with a real meal deemed nutritious and healthy and meeting dietary recommendations, compared with meals, which were considered anything to eat. Conclusions and Implications These findings suggest that the term real meal may provide nutrition educators with a simple phrase to use in educational campaigns to promote healthful food intake among college students. PMID:27993555
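The Pearson chi-square statistic used in the analysis above can be computed in a few lines on a contingency table; the counts below are invented for illustration, not the survey data.

```python
def chi2_statistic(observed):
    """Pearson chi-square statistic for a contingency table,
    sum over cells of (observed - expected)^2 / expected."""
    rows = [sum(r) for r in observed]
    cols = [sum(c) for c in zip(*observed)]
    total = sum(rows)
    chi2 = 0.0
    for i, row in enumerate(observed):
        for j, o in enumerate(row):
            e = rows[i] * cols[j] / total   # expected under independence
            chi2 += (o - e) ** 2 / e
    return chi2

# Rows exactly proportional: no association, so chi2 = 0.
print(chi2_statistic([[10, 20], [30, 60]]))  # 0.0
# Strongly associated table: large chi2.
print(chi2_statistic([[30, 10], [10, 30]]))  # 20.0
```

The statistic is then compared against a chi-square critical value with (rows − 1)(columns − 1) degrees of freedom to decide significance.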
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenquist, Ian; Tonks, Michael
2016-10-01
Light water reactor fuel pellets are fabricated by sintering to final densities of 95% or greater. During reactor operation, the porosity remaining in the fuel after fabrication decreases further due to irradiation-assisted densification. While empirical models have been developed to describe this densification process, a mechanistic model is needed as part of the ongoing work by the NEAMS program to develop a more predictive fuel performance code. In this work we will develop a phase field model of sintering of UO2 in the MARMOT code and validate it by comparison to published sintering data. We will then add the capability to capture irradiation effects in the model, and use it to develop a mechanistic model of densification that will go into the BISON code, adding another essential piece to the microstructure-based materials models. The final step will be to add the effects of applied fields, to model field-assisted sintering of UO2. The results of the phase field model will be validated by comparison to data from field-assisted sintering. Tasks over three years: 1. Develop a sintering model for UO2 in MARMOT. 2. Expand the model to account for irradiation effects. 3. Develop a mechanistic macroscale model of densification for BISON.
Hierarchical Recurrent Neural Hashing for Image Retrieval With Hierarchical Convolutional Features.
Lu, Xiaoqiang; Chen, Yaxiong; Li, Xuelong
Hashing has been an important and effective technology in image retrieval due to its computational efficiency and fast search speed. Traditional hashing methods usually learn hash functions that obtain binary codes from hand-crafted features, which cannot optimally represent the information in a sample. Recently, deep learning methods have achieved better performance, since deep architectures can learn more effective image representation features. However, these methods use only semantic features to generate hash codes through a shallow projection and ignore texture details. In this paper, we propose a novel hashing method, hierarchical recurrent neural hashing (HRNH), which exploits a hierarchical recurrent neural network to generate effective hash codes. This paper makes three contributions. First, a deep hashing method is proposed to extensively exploit both spatial details and semantic information, in which we leverage hierarchical convolutional features to construct an image pyramid representation. Second, the proposed deep network can directly take convolutional feature maps as input, preserving their spatial structure. Finally, we propose a new loss function that accounts for the quantization error of binarizing the continuous embeddings into discrete binary codes while simultaneously maintaining the semantic similarity and balance properties of the hash codes. Experimental results on four widely used data sets demonstrate that the proposed HRNH achieves superior performance over other state-of-the-art hashing methods.
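As a rough illustration of the quantization-error idea in the loss function above, the sketch below binarizes continuous embeddings by sign and measures how far they sit from their binary codes. It is a simplified stand-in, not the HRNH network or its actual loss.

```python
import numpy as np

def binarize(embeddings):
    """Sign binarization: continuous embeddings -> {-1, +1} hash codes."""
    return np.where(embeddings >= 0, 1.0, -1.0)

def quantization_loss(embeddings):
    """Mean squared gap between embeddings and their binary codes --
    the kind of term a quantization-aware hashing loss penalizes."""
    return np.mean((embeddings - binarize(embeddings)) ** 2)

def hamming_distance(code_a, code_b):
    """Retrieval compares binary codes by Hamming distance."""
    return int(np.sum(code_a != code_b))

e = np.array([0.9, -0.8, 0.1, -0.2])
codes = binarize(e)
```

Embeddings near zero (like 0.1 above) contribute most of the quantization loss, which is exactly why such a loss term pushes the network to produce embeddings that are already close to ±1.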
NASA Astrophysics Data System (ADS)
The present conference on the development status of communications systems in the context of electronic warfare gives attention to topics in spread spectrum code acquisition, digital speech technology, fiber-optic communications, free-space optical communications, the networking of HF systems, and applications and evaluation methods for digital speech. Also treated are issues in local area network system design, coding techniques and applications, technology applications for HF systems, receiver technologies, software development status, channel simulation/prediction methods, C3 networking, spread spectrum networks, the improvement of communication efficiency and reliability through technical control methods, mobile radio systems, and adaptive antenna arrays. Finally, communications system cost analyses, spread spectrum performance, voice and image coding, switched networks, and microwave GaAs ICs are considered.
Chemically Reacting One-Dimensional Gas-Particle Flows
NASA Technical Reports Server (NTRS)
Tevepaugh, J. A.; Penny, M. M.
1975-01-01
The governing equations for the one-dimensional flow of a gas-particle system are discussed. Gas-particle effects are coupled via the system momentum and energy equations with the gas assumed to be chemically frozen or in chemical equilibrium. A computer code for calculating the one-dimensional flow of a gas-particle system is discussed and a user's input guide presented. The computer code provides for the expansion of the gas-particle system from a specified starting velocity and nozzle inlet geometry. Though general in nature, the final output of the code is a startline for initiating the solution of a supersonic gas-particle system in rocket nozzles. The startline includes gasdynamic data defining gaseous startline points from the nozzle centerline to the nozzle wall and particle properties at points along the gaseous startline.
NASA Astrophysics Data System (ADS)
Bagli, Enrico; Guidi, Vincenzo
2013-08-01
DYNECHARM++, a toolkit for the simulation of coherent interactions between high-energy charged particles and complex crystal structures, has been developed. The code is written in C++, taking advantage of object-oriented programming methods. It is capable of evaluating the electrical characteristics of complex atomic structures and of simulating and tracking particle trajectories within them. A calculation method for the electrical characteristics based on their expansion in Fourier series has been adopted. Two different approaches to simulating the interaction have been adopted, relying on the full integration of particle trajectories under the continuum potential approximation and on the definition of cross sections of coherent processes. Finally, the code has proved able to reproduce experimental results and to simulate the interaction of charged particles with complex structures.
Multiple Access Schemes for Lunar Missions
NASA Technical Reports Server (NTRS)
Deutsch, Leslie; Hamkins, Jon; Stocklin, Frank J.
2010-01-01
Two years ago, the NASA Coding, Modulation, and Link Protocol (CMLP) study was completed. The study, led by the authors of this paper, recommended codes, modulation schemes, and desired attributes of link protocols for all space communication links in NASA's future space architecture. Portions of the NASA CMLP team were reassembled to resolve one open issue: the use of multiple access (MA) communication from the lunar surface. The CMLP-MA team analyzed and simulated two candidate multiple access schemes that were identified in the original CMLP study: Code Division MA (CDMA) and Frequency Division MA (FDMA) based on a bandwidth-efficient Continuous Phase Modulation (CPM) with a superimposed Pseudo-Noise (PN) ranging signal (CPM/PN). This paper summarizes the results of the analysis and simulation of the CMLP-MA study and describes the final recommendations.
Post-Newtonian Dynamical Modeling of Supermassive Black Holes in Galactic-scale Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rantala, Antti; Pihajoki, Pauli; Johansson, Peter H.
We present KETJU, a new extension of the widely used smoothed particle hydrodynamics simulation code GADGET-3. The key feature of the code is the inclusion of algorithmically regularized regions around every supermassive black hole (SMBH). This allows for simultaneously following global galactic-scale dynamical and astrophysical processes, while solving the dynamics of SMBHs, SMBH binaries, and surrounding stellar systems at subparsec scales. The KETJU code includes post-Newtonian terms in the equations of motion of the SMBHs, which enables a new SMBH merger criterion based on the gravitational wave coalescence timescale, pushing the merger separation of SMBHs down to ∼0.005 pc. We test the performance of our code by comparison to NBODY7 and rVINE. We set up dynamically stable multicomponent merger progenitor galaxies to study the SMBH binary evolution during galaxy mergers. In our simulation sample the SMBH binaries do not suffer from the final-parsec problem, which we attribute to the nonspherical shape of the merger remnants. For bulge-only models, the hardening rate decreases with increasing resolution, whereas for models that in addition include massive dark matter halos, the SMBH binary hardening rate becomes practically independent of the mass resolution of the stellar bulge. The SMBHs coalesce on average 200 Myr after the formation of the SMBH binary. However, small differences in the initial SMBH binary eccentricities can result in large differences in the SMBH coalescence times. Finally, we discuss the future prospects of KETJU, which allows for a straightforward inclusion of gas physics in the simulations.
Visuospatial memory computations during whole-body rotations in roll.
Van Pelt, S; Van Gisbergen, J A M; Medendorp, W P
2005-08-01
We used a memory-saccade task to test whether the location of a target, briefly presented before a whole-body rotation in roll, is stored in egocentric or in allocentric coordinates. To make this distinction, we exploited the fact that subjects, when tilted sideways in darkness, make systematic errors when indicating the direction of gravity (an allocentric task) even though they have a veridical percept of their self-orientation in space. We hypothesized that if spatial memory is coded allocentrically, these distortions affect the coding of remembered targets and their readout after a body rotation. Alternatively, if coding is egocentric, updating for body rotation becomes essential and errors in performance should be related to the amount of intervening rotation. Subjects (n = 6) were tested making saccades to remembered world-fixed targets after passive body tilts. Initial and final tilt angles ranged between 120 degrees CCW and 120 degrees CW. The results showed that subjects made large systematic directional errors in their saccades (up to 90 degrees). These errors did not occur in the absence of intervening body rotation, ruling out a memory degradation effect. Regression analysis showed that the errors were closely related to the amount of subjective allocentric distortion at both the initial and final tilt angle, rather than to the amount of intervening rotation. We conclude that the brain uses an allocentric reference frame, possibly gravity-based, to code visuospatial memories during whole-body tilts. This supports the notion that the brain can define information in multiple frames of reference, depending on sensory inputs and task demands.
2013-08-06
This final rule updates the payment rates used under the prospective payment system for skilled nursing facilities (SNFs) for fiscal year (FY) 2014. In addition, it revises and rebases the SNF market basket, revises and updates the labor-related share, and makes certain technical and conforming revisions in the regulations text. This final rule also includes a policy for reporting the SNF market basket forecast error in certain limited circumstances and adds a new item to the Minimum Data Set (MDS), Version 3.0, for reporting the number of distinct therapy days. Finally, this final rule adopts a change to the diagnosis code used to determine which residents will receive the AIDS add-on payment, effective for services provided on or after the October 1, 2014 implementation date for conversion to ICD-10-CM.
Commercial Motor Vehicle (CMV) Driver Restart Study: Final Report
DOT National Transportation Integrated Search
2017-03-01
A congressionally-mandated naturalistic study was conducted to evaluate the operational, safety, fatigue, and health impacts of the restart provisions in Sections 395.3(c) and 395.3(d) of Title 49, Code of Federal Regulations. A total of 235 commerci...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-22
... Code of Federal Regulations [CFR] Parts 1500- 1508), Department of the Navy (DoN) NEPA regulations (32... of acquiring additional property and constructing the necessary infrastructure to allow the use of... constructing the
NUSC Technical Publications Guide.
1985-05-01
Facility personnel, especially that of A. Castelluzzo, E. Deland, J. Gesel, and E. Szlosek (all of Code 4343). Reviewed and Approved: 14 July 1980...their technical content and format. Review and approve the manual outline, the review manuscript, and the final camera-reproducible copy. Conduct in
ResBos2: Precision Resummation for the LHC ERA
NASA Astrophysics Data System (ADS)
Isaacson, Joshua Paul
With the precision of data at the LHC, it is important to advance theoretical calculations to match it. Previously, the ResBos code was insufficient to adequately describe LHC data; this motivated the development of the ResBos2 package. This thesis discusses some of the major improvements implemented in the code to prepare it for the precision of the LHC. The resummation for color-singlet particles is improved from approximate NNLL+NLO accuracy to N3LL+NNLO accuracy. The ResBos2 calculation of the total cross section for Drell-Yan processes is validated against fixed-order calculations, to ensure that the calculations are performed correctly. This allows predictions of the transverse momentum and φ*_η distributions for the Z boson that are consistent with ATLAS data at a collider energy of √s = 8 TeV. The effects of the choice of resummation scheme are also investigated for the Collins-Soper-Sterman and Catani-de Florian-Grazzini formalisms. It is shown that, as long as each calculation is performed such that the order of the B coefficient is exactly one order higher than that of the C and H coefficients, the two formalisms are consistent. Additionally, the improved theoretical prediction will help reduce the theoretical uncertainty on the mass of the W boson, by reducing the uncertainty in extrapolating the dσ/dp_T^W distribution from the data for the dσ/dp_T^Z distribution via the ratio of the theory predictions for the Z and W transverse momentum.
In addition to improving the accuracy of the color-singlet final-state resummation calculations, the ResBos2 code introduces resummation for non-color-singlet final states. The Higgs-plus-jet calculation is detailed as an example of one such process. It is shown that this resummation is possible, but the resummation formalism must be modified: the major modification is the inclusion of the jet cone-size dependence in the Sudakov form factor. This result resolves, analytically, the Sudakov shoulder singularity. The ResBos2 predictions are compared to both fixed-order and parton-shower calculations, and are shown to be consistent for all of the distributions considered, up to the theoretical uncertainty. As the LHC continues to accumulate data and improve its precision on these observables, analytic resummation calculations for non-color-singlet final states will provide a strong check of perturbative QCD. Finally, the terms needed to match to N3LO are calculated in this work. Once the results of the perturbative calculation become publicly available, the ResBos2 code can easily be extended to include these corrections and be used to predict the total cross section at N3LO as well.
Analysis of Coherent Microwave Data Collected on the Ocean Over Two Decades
2011-11-14
Final Report, 14-11-2011, covering 1 Dec 2009 to 30 Sep 2011: Analysis of Coherent Microwave Data Collected on the Ocean over Two Decades... The objective of this project was to perform further analysis of data sets that had been collected over the past two decades. To this...and can cause cross sections at HH to exceed those at VV, in disagreement with composite surface theory, 3) shadowing is not a factor in low-grazing
2003-01-01
AFRL-IF-RS-TR-2002-315, Final Technical Report, January 2003: The EMERALD Mission-Based Correlation System – An Experimental Data... (final report covering Jan 02 – Jul 02). This project was established to experiment on the efficacy of the SRI EMERALD Mission-based
Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
Final Report, “Exploiting Global View for Resilience”
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chien, Andrew
2017-03-29
Final technical report for the "Exploiting Global View for Resilience" project. The GVR project aims to create a new approach to portable, resilient applications. The GVR approach builds on a global view data model, adding versioning (multi-version), user control of timing and rate (multi-stream), and flexible cross-layer error signalling and recovery. With a versioned array as a portable abstraction, GVR enables application programmers to exploit deep scientific and application code insights to manage resilience (and its overhead) in a flexible, portable fashion.
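The versioned-array abstraction described above can be sketched minimally: the application takes explicit snapshots and rolls back after an error. The class and method names here are invented for illustration and do not reflect the actual GVR interface.

```python
class VersionedArray:
    """Toy versioned array: the application takes explicit snapshots and can
    restore any of them after an error. Method names are invented and do not
    reflect the actual GVR interface."""

    def __init__(self, data):
        self.data = list(data)
        self._versions = []

    def version_inc(self):
        """Snapshot the current contents; return the new version number."""
        self._versions.append(list(self.data))
        return len(self._versions) - 1

    def restore(self, version):
        """Roll the array back to a previously taken version."""
        self.data = list(self._versions[version])

a = VersionedArray([1, 2, 3])
v0 = a.version_inc()     # version 0 captured
a.data[0] = 99           # simulated corruption after the snapshot
a.restore(v0)            # recover the last good state
```

The point of making versioning explicit, as in the project description, is that the application decides when and how often to snapshot, trading resilience overhead against recovery granularity.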
An extensive coronagraphic simulation applied to LBT
NASA Astrophysics Data System (ADS)
Vassallo, D.; Carolo, E.; Farinato, J.; Bergomi, M.; Bonavita, M.; Carlotti, A.; D'Orazi, V.; Greggio, D.; Magrin, D.; Mesa, D.; Pinna, E.; Puglisi, A.; Stangalini, M.; Verinaud, C.; Viotto, V.
2016-08-01
In this article we report the results of a comprehensive simulation program aimed at investigating the coronagraphic capabilities of SHARK-NIR, a camera selected to proceed to the final design phase at the Large Binocular Telescope. For this purpose, we developed a dedicated simulation tool based on physical optics propagation. The code propagates wavefronts through the SHARK optical train in an end-to-end fashion and can implement any kind of coronagraph. Detection limits can finally be computed, exploring a wide range of Strehl values and observing conditions.
Multimodal Discriminative Binary Embedding for Large-Scale Cross-Modal Retrieval.
Wang, Di; Gao, Xinbo; Wang, Xiumei; He, Lihuo; Yuan, Bo
2016-10-01
Multimodal hashing, which conducts effective and efficient nearest neighbor search across heterogeneous data in large-scale multimedia databases, has been attracting increasing interest, given the explosive growth of multimedia content on the Internet. Recent multimodal hashing research mainly aims at learning compact binary codes that preserve the semantic information given by labels. The overwhelming majority of these methods are similarity-preserving approaches that approximate a pairwise similarity matrix with the Hamming distances between the to-be-learnt binary hash codes. However, these methods ignore the discriminative property in the hash learning process, which leaves hash codes from different classes indistinguishable and therefore reduces the accuracy and robustness of nearest neighbor search. To this end, we present a novel multimodal hashing method, named multimodal discriminative binary embedding (MDBE), which focuses on learning discriminative hash codes. First, the proposed method formulates hash function learning in terms of classification, where the binary codes generated by the learned hash functions are expected to be discriminative. Then it exploits the label information to discover the shared structures inside the heterogeneous data. Finally, the learned structures are preserved, so that hash codes in the same class are similar. Hence, the proposed MDBE preserves both discriminability and similarity for hash codes, and enhances retrieval accuracy. Thorough experiments on benchmark data sets demonstrate that the proposed method achieves excellent accuracy and competitive computational efficiency compared with state-of-the-art methods for the large-scale cross-modal retrieval task.
Michel, Christian J.
2017-01-01
In 1996, a set X of 20 trinucleotides was identified in genes of both prokaryotes and eukaryotes which has, on average, the highest occurrence in the reading frame compared to its two shifted frames. Furthermore, this set X has an interesting mathematical property, as X is a maximal C3 self-complementary trinucleotide circular code. In 2015, by quantifying the inspection approach used in 1996, the circular code X was confirmed in the genes of bacteria and eukaryotes and was also identified in the genes of plasmids and viruses. The method was based on the preferential occurrence of trinucleotides among the three frames at the gene population level. We extend here this definition to the gene level. This new statistical approach considers all the genes, i.e., of both large and small lengths, with the same weight when searching for the circular code X. As a consequence, the concept of a circular code, in particular the reading frame retrieval, is directly associated with each gene. At the gene level, the circular code X is strengthened in the genes of bacteria, eukaryotes, plasmids, and viruses, and is now also identified in the genes of archaea. The genes of mitochondria and chloroplasts contain a subset of the circular code X. Finally, by studying viral genes, the circular code X was found in DNA genomes, RNA genomes, double-stranded genomes, and single-stranded genomes. PMID:28420220
Video coding for 3D-HEVC based on saliency information
NASA Astrophysics Data System (ADS)
Yu, Fang; An, Ping; Yang, Chao; You, Zhixiang; Shen, Liquan
2016-11-01
As an extension of High Efficiency Video Coding (HEVC), 3D-HEVC has been widely researched under the impetus of the new-generation coding standard in recent years. Compared with H.264/AVC, its compression efficiency is doubled while keeping the same video quality. However, its higher encoding complexity and longer encoding time are not negligible. To reduce the computational complexity and guarantee the subjective quality of virtual views, this paper presents a novel video coding method for 3D-HEVC based on saliency information, which is an important part of the Human Visual System (HVS). First, the relationship between the current coding unit and its adjacent units is used to adjust the maximum depth of each largest coding unit (LCU) and determine the SKIP mode reasonably. Then, according to the saliency information of each frame, the texture and its corresponding depth map are divided into three regions: salient area, middle area, and non-salient area. Afterwards, different quantization parameters are assigned to the different regions to conduct low-complexity coding. Finally, the compressed video generates new viewpoint videos through the renderer tool. As shown in our experiments, the proposed method saves more bit rate than other approaches and achieves up to a 38% reduction in encoding time without subjective quality loss in compression or rendering.
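The region-dependent quantization described above can be illustrated with a toy mapping from a per-region saliency score to a quantization parameter (QP): a lower QP quantizes more finely and spends more bits on salient regions. The thresholds and ±4 offsets below are invented for illustration, not the paper's values.

```python
def assign_qp(saliency, base_qp=30, lo=0.33, hi=0.66):
    """Map a per-region saliency score in [0, 1] to a quantization parameter:
    salient regions get a lower QP (finer quantization, more bits), and
    non-salient regions a higher QP. Thresholds and the +/-4 offsets are
    invented for illustration."""
    if saliency >= hi:        # salient area
        return base_qp - 4
    if saliency >= lo:        # middle area
        return base_qp
    return base_qp + 4        # non-salient area
```

Because viewers rarely notice coarser quantization in non-salient regions, this kind of mapping trades invisible quality loss for bit-rate and complexity savings.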
Sajad, Amirsaman; Sadeh, Morteza; Yan, Xiaogang; Wang, Hongying; Crawford, John Douglas
2016-01-01
The frontal eye fields (FEFs) participate in both working memory and sensorimotor transformations for saccades, but their role in integrating these functions through time remains unclear. Here, we tracked FEF spatial codes through time using a novel analytic method applied to the classic memory-delay saccade task. Three-dimensional recordings of head-unrestrained gaze shifts were made in two monkeys trained to make gaze shifts toward briefly flashed targets after a variable delay (450-1500 ms). A preliminary analysis of visual and motor response fields in 74 FEF neurons eliminated most potential models for spatial coding at the neuron population level, as in our previous study (Sajad et al., 2015). We then focused on the spatiotemporal transition from an eye-centered target code (T; preferred in the visual response) to an eye-centered intended gaze position code (G; preferred in the movement response) during the memory delay interval. We treated neural population codes as a continuous spatiotemporal variable by dividing the space spanning T and G into intermediate T-G models and dividing the task into discrete steps through time. We found that FEF delay activity, especially in visuomovement cells, progressively transitions from T through intermediate T-G codes that approach, but do not reach, G. This was followed by a final discrete transition from these intermediate T-G delay codes to a "pure" G code in movement cells without delay activity. These results demonstrate that FEF activity undergoes a series of sensory-memory-motor transformations, including a dynamically evolving spatial memory signal and an imperfect memory-to-motor transformation.
Coded Excitation Plane Wave Imaging for Shear Wave Motion Detection
Song, Pengfei; Urban, Matthew W.; Manduca, Armando; Greenleaf, James F.; Chen, Shigao
2015-01-01
Plane wave imaging has greatly advanced the field of shear wave elastography thanks to its ultrafast imaging frame rate and large field of view (FOV). However, plane wave imaging also has decreased penetration due to the lack of transmit focusing, which makes it challenging to use plane waves for shear wave detection in deep tissues and in obese patients. This study investigated the feasibility of implementing coded excitation in plane wave imaging for shear wave detection, with the hypothesis that coded ultrasound signals can provide superior detection penetration and shear wave signal-to-noise ratio (SNR) compared to conventional ultrasound signals. Both phase encoding (Barker code) and frequency encoding (chirp code) methods were studied. A first phantom experiment showed an approximate penetration gain of 2-4 cm for the coded pulses. Two subsequent phantom studies showed that all coded pulses outperformed the conventional short imaging pulse by providing superior sensitivity to small motion and robustness to weak ultrasound signals. Finally, an in vivo liver case study on an obese subject (Body Mass Index = 40) demonstrated the feasibility of using the proposed method for in vivo applications, and showed that all coded pulses could provide higher-SNR shear wave signals than the conventional short pulse. These findings indicate that with coded excitation shear wave detection, one can benefit from the ultrafast imaging frame rate and large FOV provided by plane wave imaging while preserving good penetration and shear wave signal quality, which is essential for obtaining robust shear elasticity measurements of tissue. PMID:26168181
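The two pulse-coding schemes named above can be sketched as signal generation: a Barker-13 phase code (whose autocorrelation sidelobes never exceed 1) and a linear-FM chirp. Sampling rates and sweep parameters are illustrative, and this is waveform generation only, not the shear wave detection pipeline.

```python
import numpy as np

# Barker-13 phase code: a binary sequence whose aperiodic autocorrelation
# sidelobes never exceed 1 in magnitude (main lobe = 13).
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

ac = np.correlate(barker13, barker13, mode="full")  # length 25, peak at index 12

def chirp(fs, duration, f0, f1):
    """Linear frequency-modulated (chirp) pulse sweeping f0 -> f1 Hz.
    Sampling rate and sweep band are illustrative parameters."""
    n = int(round(fs * duration))
    t = np.arange(n) / fs
    k = (f1 - f0) / duration                 # sweep rate, Hz per second
    return np.cos(2 * np.pi * (f0 * t + 0.5 * k * t ** 2))

pulse = chirp(fs=20e6, duration=5e-6, f0=2e6, f1=8e6)
```

The compression gain on receive comes from correlating the echo with the transmitted code: the long coded pulse carries more energy than a short pulse, while the sharp autocorrelation peak restores axial resolution.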
Users manual for program NYQUIST: Liquid rocket nyquist plots developed for use on a PC computer
NASA Astrophysics Data System (ADS)
Armstrong, Wilbur C.
1992-06-01
The piping in a liquid rocket can assume complex configurations due to multiple tanks, multiple engines, and structures that must be piped around. The capability to handle some of these complex configurations has been incorporated into the NYQUIST code, along with the capability to modify the input on line. The configurations allowed include multiple tanks, multiple engines, and the splitting of a pipe into unequal segments going to different (or the same) engines. The program handles the following element types: straight pipes, bends, inline accumulators, tuned stub accumulators, Helmholtz resonators, parallel resonators, pumps, split pipes, multiple tanks, and multiple engines. The code is too large to compile as one program using Microsoft FORTRAN 5; therefore, it was broken into two segments, NYQUIST1.FOR and NYQUIST2.FOR, which are compiled separately and then linked together. The final run code is not too large (approximately 344,000 bytes).
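A Nyquist plot is the locus of a transfer function evaluated along the imaginary axis. The sketch below computes such points for a simple first-order element; this stands in for, and is far simpler than, the chained pipe/bend/accumulator/pump models the NYQUIST code assembles.

```python
import numpy as np

def nyquist_points(num, den, freqs_hz):
    """Evaluate H(s) = num(s)/den(s) at s = j*2*pi*f over a frequency sweep
    and return the (Re, Im) pairs a Nyquist plot draws. Polynomial
    coefficients are in descending powers of s, as numpy.polyval expects."""
    s = 2j * np.pi * np.asarray(freqs_hz)
    h = np.polyval(num, s) / np.polyval(den, s)
    return h.real, h.imag

# Illustrative first-order element H(s) = 1 / (s + 1); a real feed system
# would chain pipe, bend, accumulator, and pump elements instead.
re, im = nyquist_points([1.0], [1.0, 1.0], np.logspace(-2, 2, 200))
```

For this element the locus starts near (1, 0) at low frequency and spirals into the origin through the lower half-plane, the classic first-order Nyquist semicircle.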
NASA Technical Reports Server (NTRS)
Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.
1983-01-01
Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Some 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the data base indicates a significant improvement in predictive capability.
High-Fidelity Coupled Monte-Carlo/Thermal-Hydraulics Calculations
NASA Astrophysics Data System (ADS)
Ivanov, Aleksandar; Sanchez, Victor; Ivanov, Kostadin
2014-06-01
Monte Carlo methods have been used as reference reactor physics calculation tools worldwide. Advances in computer technology allow the calculation of detailed flux distributions in both space and energy. In most cases, however, those calculations are done under the assumption of homogeneous material density and temperature distributions. The aim of this work is to develop a consistent methodology for providing realistic three-dimensional thermal-hydraulic distributions by coupling the in-house developed sub-channel code SUBCHANFLOW with the standard Monte Carlo transport code MCNP. In addition to the innovative technique of on-the-fly material definition, a flux-based weight-window technique has been introduced to improve both the magnitude and the distribution of the relative errors. Finally, a coupled code system for the simulation of steady-state reactor physics problems has been developed. Besides the problem of effective feedback data interchange between the codes, the treatment of the temperature dependence of the continuous-energy nuclear data has been investigated.
NASA Astrophysics Data System (ADS)
Navon, I. M.; Yu, Jian
A FORTRAN computer program is presented and documented applying the Turkel-Zwas explicit large time-step scheme to a hemispheric barotropic model with constraint restoration of integral invariants of the shallow-water equations. We then detail the algorithms embodied in the EXSHALL code, particularly those related to the efficiency and stability of the Turkel-Zwas scheme and to the quadratic constraint restoration method, which is based on a variational approach. In particular, we provide details about the high-latitude filtering, Shapiro filtering, and Robert filtering algorithms used in the code. We explain in detail the various subroutines in the EXSHALL code, with emphasis on the algorithms implemented, and present flowcharts of some major subroutines. Finally, we provide a visual example illustrating a 4-day run using real initial data, along with a sample printout and graphic isoline contours of the height and velocity fields.
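Two of the filters named above have compact textbook forms; a minimal sketch follows (coefficient values are the common textbook choices, not necessarily those used in EXSHALL):

```python
# Illustrative forms of two filters mentioned in the abstract.

def shapiro_filter(u, passes=1):
    """Second-order Shapiro smoothing of a periodic 1-D field:
    u_i <- (u_{i-1} + 2 u_i + u_{i+1}) / 4, applied `passes` times."""
    n = len(u)
    for _ in range(passes):
        u = [(u[(i - 1) % n] + 2.0 * u[i] + u[(i + 1) % n]) / 4.0
             for i in range(n)]
    return u

def robert_asselin_filter(u_prev, u_now, u_next, nu=0.1):
    """Robert (Asselin) time filter damping the leapfrog computational
    mode: u_f = u_now + nu * (u_next - 2 u_now + u_prev)."""
    return [un + nu * (ux - 2.0 * un + up)
            for up, un, ux in zip(u_prev, u_now, u_next)]
```

Both are linear smoothers: the Shapiro filter damps grid-scale spatial noise while leaving constant fields untouched, and the Robert filter suppresses the 2Δt oscillation that leapfrog time stepping admits.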
NASA Astrophysics Data System (ADS)
Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.
1983-05-01
Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Some 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the database indicates a significant improvement in predictive capability.
Users manual for program NYQUIST: Liquid rocket nyquist plots developed for use on a PC computer
NASA Technical Reports Server (NTRS)
Armstrong, Wilbur C.
1992-01-01
The piping in a liquid rocket can assume complex configurations due to multiple tanks, multiple engines, and structures that must be piped around. The capability to handle some of these complex configurations has been incorporated into the NYQUIST code, along with the capability to modify the input on line. The configurations allowed include multiple tanks, multiple engines, and the splitting of a pipe into unequal segments going to different (or the same) engines. The program handles the following element types: straight pipes, bends, inline accumulators, tuned stub accumulators, Helmholtz resonators, parallel resonators, pumps, split pipes, multiple tanks, and multiple engines. The code is too large to compile as one program using Microsoft FORTRAN 5; therefore, it was broken into two segments, NYQUIST1.FOR and NYQUIST2.FOR, which are compiled separately and then linked together. The final run code is not too large (approximately 344,000 bytes).
Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine
2014-03-01
Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.
Code-Time Diversity for Direct Sequence Spread Spectrum Systems
Hassan, A. Y.
2014-01-01
Time diversity is achieved in direct sequence spread spectrum by receiving different faded, delayed copies of the transmitted symbols from different uncorrelated channel paths when the transmission signal bandwidth is greater than the coherence bandwidth of the channel. In this paper, a new time diversity scheme, called code-time diversity, is proposed for spread spectrum systems. In this scheme, N spreading codes are used to transmit one data symbol over N successive symbol intervals. The diversity order of the proposed scheme equals the number of spreading codes N multiplied by the number of uncorrelated channel paths L. The paper presents the transmitted signal model. Two demodulator structures are proposed based on the received signal models for Rayleigh flat and frequency-selective fading channels. The probability of error in the proposed diversity scheme is also calculated for the same two fading channels. Finally, simulation results are presented and compared with those of maximal ratio combining (MRC) and multiple-input multiple-output (MIMO) systems. PMID:24982925
NASA Astrophysics Data System (ADS)
Lu, Weihua; Chen, Xinjian; Zhu, Weifang; Yang, Lei; Cao, Zhaoyuan; Chen, Haoyu
2015-03-01
In this paper, we propose a method based on the Freeman chain code to segment and count rhesus choroid-retinal vascular endothelial cells (RF/6A) automatically in fluorescence microscopy images. The proposed method consists of four main steps. First, a threshold filter and morphological transform were applied to reduce the noise. Second, the boundary information was used to generate the Freeman chain codes. Third, the concave points were found based on the relationship between the difference of the chain code and the curvature. Finally, cell segmentation and counting were completed based on the number of concave points and the area and shape of the cells. The proposed method was tested on 100 fluorescence microscopy cell images; the average true positive rate (TPR) is 98.13% and the average false positive rate (FPR) is 4.47%. The preliminary results showed the feasibility and efficiency of the proposed method.
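Steps two and three above rest on the standard 8-direction Freeman chain code and its cyclic first difference (a discrete curvature). A minimal sketch of those two ingredients (the paper's cell-specific concavity thresholds are not reproduced):

```python
# Freeman 8-direction chain code for an ordered, 8-connected closed
# boundary. Direction 0 is +x, numbering counter-clockwise.

DIRECTIONS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
              (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def freeman_chain_code(boundary):
    """Chain code: one direction symbol per step around the boundary."""
    code = []
    n = len(boundary)
    for i in range(n):
        (x0, y0), (x1, y1) = boundary[i], boundary[(i + 1) % n]
        code.append(DIRECTIONS[(x1 - x0, y1 - y0)])
    return code

def first_difference(code):
    """Cyclic first difference of the chain code; concave points are
    flagged where this discrete curvature deviates sharply."""
    n = len(code)
    return [(code[(i + 1) % n] - code[i]) % 8 for i in range(n)]
```

For a unit square traversed counter-clockwise the chain code is [0, 2, 4, 6] and the first difference is a constant [2, 2, 2, 2], i.e., uniform convex turning with no concave points.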
Final Report Advanced Quasioptical Launcher System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeffrey Neilson
2010-04-30
This program developed an analytical design tool for designing antenna and mirror systems to convert whispering gallery RF modes to Gaussian or HE11 modes. Whispering gallery modes are generated by gyrotrons used for electron cyclotron heating of fusion plasmas in tokamaks. These modes cannot be easily transmitted and must be converted to free space or waveguide modes compatible with transmission line systems. This program improved the capability of SURF3D/LOT, which was initially developed in a previous SBIR program. This suite of codes revolutionized quasi-optical launcher design, and this code, or equivalent codes, are now used worldwide. This program added functionality to SURF3D/LOT to allow creation of more compact launcher and mirror systems and to provide direct coupling to corrugated waveguide within the vacuum envelope of the gyrotron. Analysis was also extended to include full-wave analysis of mirror transmission line systems. The code includes a graphical user interface and is available for advanced design of launcher systems.
Speech Rhythms and Multiplexed Oscillatory Sensory Coding in the Human Brain
Gross, Joachim; Hoogenboom, Nienke; Thut, Gregor; Schyns, Philippe; Panzeri, Stefano; Belin, Pascal; Garrod, Simon
2013-01-01
Cortical oscillations are likely candidates for segmentation and coding of continuous speech. Here, we monitored continuous speech processing with magnetoencephalography (MEG) to unravel the principles of speech segmentation and coding. We demonstrate that speech entrains the phase of low-frequency (delta, theta) and the amplitude of high-frequency (gamma) oscillations in the auditory cortex. Phase entrainment is stronger in the right and amplitude entrainment is stronger in the left auditory cortex. Furthermore, edges in the speech envelope phase reset auditory cortex oscillations thereby enhancing their entrainment to speech. This mechanism adapts to the changing physical features of the speech envelope and enables efficient, stimulus-specific speech sampling. Finally, we show that within the auditory cortex, coupling between delta, theta, and gamma oscillations increases following speech edges. Importantly, all couplings (i.e., brain-speech and also within the cortex) attenuate for backward-presented speech, suggesting top-down control. We conclude that segmentation and coding of speech relies on a nested hierarchy of entrained cortical oscillations. PMID:24391472
Physics Based Model for Cryogenic Chilldown and Loading. Part IV: Code Structure
NASA Technical Reports Server (NTRS)
Luchinsky, D. G.; Smelyanskiy, V. N.; Brown, B.
2014-01-01
This is the fourth report in a series of technical reports that describe the application of a separated two-phase flow model to the cryogenic loading operation. In this report we present the structure of the code. The code consists of five major modules: (1) geometry module; (2) solver; (3) material properties; (4) correlations; and finally (5) stability control module. The two key modules, the solver and correlations, are further divided into a number of submodules. Most of the physics and knowledge databases related to the properties of cryogenic two-phase flow are included in the cryogenic correlations module. The functional form of those correlations is not well established and is a subject of extensive research. Multiple parametric forms for various correlations are currently available; some of them are included in the correlations module and will be described in detail in a separate technical report. Here we describe the overall structure of the code and focus on the details of the solver and stability control modules.
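The five-module layout described above can be sketched schematically. All class names and the toy closures below (e.g., a Blasius-form friction factor, a CFL-style time-step limit) are our illustrative stand-ins, not the report's actual code:

```python
# Schematic sketch of the five-module structure from the abstract.

class Geometry:            # (1) pipe/tank network description
    def __init__(self, n_cells):
        self.n_cells = n_cells

class MaterialProperties:  # (3) fluid property lookups (stubbed)
    def density(self, temp_k):
        return 1000.0 - 0.5 * temp_k          # placeholder correlation

class Correlations:        # (4) pluggable two-phase flow closures
    def wall_friction(self, reynolds):
        return 0.316 * reynolds ** -0.25      # Blasius-form example

class StabilityControl:    # (5) e.g., time-step limiting
    def dt(self, cfl, dx, velocity):
        return cfl * dx / max(velocity, 1e-12)

class Solver:              # (2) advances the flow using the other modules
    def __init__(self, geom, props, corr, stab):
        self.geom, self.props, self.corr, self.stab = geom, props, corr, stab

    def step(self, temp_k, velocity, dx=0.1, cfl=0.5):
        # A real solver would assemble and advance the two-phase flow
        # equations here; we only return the stability-limited time step.
        return self.stab.dt(cfl, dx, velocity)
```

The point of such a layout is that the poorly established correlations, as the abstract notes, live behind one interface and can be swapped without touching the solver.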
Converting Panax ginseng DNA and chemical fingerprints into two-dimensional barcode.
Cai, Yong; Li, Peng; Li, Xi-Wen; Zhao, Jing; Chen, Hai; Yang, Qing; Hu, Hao
2017-07-01
In this study, we investigated how to convert the Panax ginseng DNA sequence code and chemical fingerprints into a two-dimensional code. In order to improve the compression efficiency, GATC2Bytes and digital merger compression algorithms are proposed. HPLC chemical fingerprint data of 10 groups of P. ginseng from Northeast China and the internal transcribed spacer 2 (ITS2) sequence code as the DNA sequence code were ready for conversion. In order to convert such data into a two-dimensional code, the following six steps were performed: First, the chemical fingerprint characteristic data sets were obtained through the inflection filtering algorithm. Second, precompression processing of such data sets was undertaken. Third, precompression processing was undertaken with the P. ginseng DNA (ITS2) sequence codes. Fourth, the precompressed chemical fingerprint data and the DNA (ITS2) sequence code were combined in accordance with the set data format. Such combined data can be compressed by Zlib, an open source data compression algorithm. Finally, the compressed data generated a two-dimensional code called a quick response code (QR code). Through the abovementioned conversion process, it can be found that the number of bytes needed for storing P. ginseng chemical fingerprints and its DNA (ITS2) sequence code can be greatly reduced. After GATC2Bytes algorithm processing, the ITS2 compression rate reaches 75%, and the chemical fingerprint compression rate exceeds 99.65% via filtration and digital merger compression algorithm processing. Therefore, the overall compression ratio even exceeds 99.36%. The capacity of the formed QR code is around 0.5k, which can easily and successfully be read and identified by any smartphone. P. ginseng chemical fingerprints and its DNA (ITS2) sequence code can form a QR code after data processing, and therefore the QR code can be a perfect carrier of the authenticity and quality of P. ginseng information.
This study provides a theoretical basis for the development of a quality traceability system of traditional Chinese medicine based on a two-dimensional code.
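The reported 75% ITS2 compression rate is exactly what plain 2-bits-per-base packing of a GATC alphabet achieves relative to 1-byte-per-base text. A hedged sketch of that idea plus the final Zlib stage (the study's actual GATC2Bytes algorithm may differ in detail):

```python
# Sketch of a "GATC to bytes" step: each base needs only 2 bits, so
# 4 bases pack into 1 byte (75% smaller than ASCII text), and the
# packed bytes can then be compressed further with zlib.
import zlib

BASE2BITS = {"G": 0, "A": 1, "T": 2, "C": 3}   # illustrative mapping

def pack_bases(seq):
    """Pack a GATC string into bytes, 4 bases per byte (zero-padded)."""
    bits = [BASE2BITS[b] for b in seq]
    while len(bits) % 4:
        bits.append(0)                          # pad the final byte
    out = bytearray()
    for i in range(0, len(bits), 4):
        b0, b1, b2, b3 = bits[i:i + 4]
        out.append(b0 << 6 | b1 << 4 | b2 << 2 | b3)
    return bytes(out)

seq = "GATTACA" * 40                            # toy 280-base sequence
packed = pack_bases(seq)                        # 70 bytes: 75% reduction
compressed = zlib.compress(packed)              # final Zlib stage
```

A real implementation would also store the original length (to undo the padding) and a header distinguishing the fingerprint and sequence segments, per the combined data format the abstract describes.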
Multiplexed Detection of Cytokines Based on Dual Bar-Code Strategy and Single-Molecule Counting.
Li, Wei; Jiang, Wei; Dai, Shuang; Wang, Lei
2016-02-02
Cytokines play important roles in the immune system and have been regarded as biomarkers. Because a single cytokine is not specific and accurate enough for strict diagnosis in practice, in this work we constructed a multiplexed detection method for cytokines based on a dual bar-code strategy and single-molecule counting. Taking interferon-γ (IFN-γ) and tumor necrosis factor-α (TNF-α) as model analytes, first, the magnetic nanobead was functionalized with the second antibody and primary bar-code strands, forming a magnetic nanoprobe. Then, through the specific reaction of the second antibody and the antigen fixed by the primary antibody, a sandwich-type immunocomplex was formed on the substrate. Next, the primary bar-code strands as amplification units triggered multibranched hybridization chain reaction (mHCR), producing nicked double-stranded polymers with multiple branched arms, which served as secondary bar-code strands. Finally, the secondary bar-code strands hybridized with the multimolecule labeled fluorescence probes, generating enhanced fluorescence signals. The numbers of fluorescence dots were counted one by one for quantification with an epi-fluorescence microscope. By integrating the primary and secondary bar-code-based amplification strategy and the multimolecule labeled fluorescence probes, this method displayed an excellent sensitivity, with detection limits of 5 fM for both targets. Unlike the typical bar-code assay, in which the bar-code strands must be released and identified on a microarray, this method is more direct. Moreover, because of the selective immune reaction and the dual bar-code mechanism, the resulting method could detect the two targets simultaneously. Multiplexed analysis in human serum was also performed, suggesting that our strategy is reliable and has great potential application in early clinical diagnosis.
Spherical hashing: binary code embedding with hyperspheres.
Heo, Jae-Pil; Lee, Youngwoon; He, Junfeng; Chang, Shih-Fu; Yoon, Sung-Eui
2015-11-01
Many binary code embedding schemes have been actively studied recently, since they can provide efficient similarity search, and compact data representations suitable for handling large scale image databases. Existing binary code embedding techniques encode high-dimensional data by using hyperplane-based hashing functions. In this paper we propose a novel hypersphere-based hashing function, spherical hashing, to map more spatially coherent data points into a binary code compared to hyperplane-based hashing functions. We also propose a new binary code distance function, spherical Hamming distance, tailored for our hypersphere-based binary coding scheme, and design an efficient iterative optimization process to achieve both balanced partitioning for each hash function and independence between hashing functions. Furthermore, we generalize spherical hashing to support various similarity measures defined by kernel functions. Our extensive experiments show that our spherical hashing technique significantly outperforms state-of-the-art techniques based on hyperplanes across various benchmarks with sizes ranging from one million to 75 million GIST, BoW, and VLAD descriptors. The performance gains are consistent and large, up to 100 percent improvement over the second-best tested method. These results confirm the unique merits of using hyperspheres to encode proximity regions in high-dimensional spaces. Finally, our method is intuitive and easy to implement.
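The two central ingredients can be sketched directly: bit i is set when the point falls inside pivot sphere i, and the spherical Hamming distance (to our reading of the paper, the XOR count normalized by the count of common set bits) rewards codes that share bounded regions. Pivots and radii below are illustrative; the paper learns them via the iterative optimization mentioned above:

```python
# Sketch of hypersphere-based binary encoding and spherical Hamming
# distance. Pivot centers and radii are hypothetical placeholders.

def spherical_encode(x, pivots, radii):
    """Binary code: bit i = 1 iff dist(x, pivot_i) <= radius_i."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    return [1 if dist(x, p) <= r else 0 for p, r in zip(pivots, radii)]

def spherical_hamming(a, b):
    """|a XOR b| / |a AND b|: common 1-bits mean both points lie in the
    same bounded sphere, which tightens the implied distance bound."""
    xor = sum(ai != bi for ai, bi in zip(a, b))
    common = sum(ai & bi for ai, bi in zip(a, b))
    return xor / common if common else float("inf")
```

Unlike a hyperplane bit, a sphere bit set to 1 confines the point to a bounded region, which is why shared 1-bits carry more information than shared 0-bits and motivate the normalization.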
Final report on LDRD project : coupling strategies for multi-physics applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopkins, Matthew Morgan; Moffat, Harry K.; Carnes, Brian
Many current and future modeling applications at Sandia, including ASC milestones, will critically depend on the simultaneous solution of vastly different physical phenomena. Issues due to code coupling are often not addressed, understood, or even recognized. The objectives of the LDRD have been both in theory and in code development. We show that we have provided a fundamental analysis of coupling, i.e., when strong coupling vs. a successive substitution strategy is needed. We have enabled the implementation of tighter coupling strategies through additions to the NOX and Sierra code suites to make coupling strategies available now, leveraging existing functionality to do this. Specifically, we have built into NOX the capability to handle fully coupled simulations from multiple codes, and we have also built into NOX the capability to handle Jacobian-free Newton-Krylov simulations that link multiple applications. We show how this capability may be accessed from within the Sierra Framework as well as from outside of Sierra. The critical impact from this LDRD is that we have shown how, and have delivered strategies for, enabling strong Newton-based coupling while respecting the modularity of existing codes. This will facilitate the use of these codes in a coupled manner to solve multi-physics applications.
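The Jacobian-free Newton-Krylov capability mentioned above hinges on one trick: a Krylov solver only ever needs Jacobian-vector products, and a finite difference of the coupled residual supplies them without forming the Jacobian, so each physics code stays a black box. A minimal sketch of that product (this is the generic technique, not NOX API):

```python
# Jacobian-free matrix-vector product: approximate J(u) @ v as
# (F(u + eps*v) - F(u)) / eps, where F is the coupled residual
# assembled from the individual physics codes.

def jfnk_matvec(F, u, v, eps=1e-7):
    """First-order finite-difference approximation of J(u) @ v."""
    Fu = F(u)                                    # baseline residual
    u_pert = [ui + eps * vi for ui, vi in zip(u, v)]
    Fp = F(u_pert)                               # perturbed residual
    return [(fp - f0) / eps for fp, f0 in zip(Fp, Fu)]
```

Because only residual evaluations are needed, this respects the modularity the abstract emphasizes: each coupled code contributes its block of F without exposing derivatives.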
NASA Technical Reports Server (NTRS)
Kwatra, S. C.
1998-01-01
A large number of papers have been published attempting to give some analytical basis for the performance of Turbo-codes. It has been shown that performance improves with increased interleaver length. Also, procedures have been given to pick the best constituent recursive systematic convolutional codes (RSCCs). However, testing by computer simulation is still required to verify these results. This thesis begins by describing the encoding and decoding schemes used. Next, simulation results on several memory-4 RSCCs are shown. It is found that the best BER performance at low Eb/No is not given by the RSCCs that were found using the analytic techniques given so far. Next, results are given from simulations using a smaller memory RSCC for one of the constituent encoders. A significant reduction in decoding complexity is obtained with minimal loss in performance. Simulation results are then given for a rate 1/3 Turbo-code, with the result that this code performed as well as a rate 1/2 Turbo-code as measured by the distance from their respective Shannon limits. Finally, the results of simulations where an inaccurate noise variance measurement was used are given. From this it was observed that Turbo-decoding is fairly stable with regard to noise variance measurement.
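The thesis's point that simulation must back up analysis is easy to illustrate on a far simpler system than a turbo code: a Monte-Carlo BER harness for uncoded BPSK over AWGN, checked against the closed-form answer. This is our illustrative example of the methodology, not the thesis's simulator:

```python
# Monte-Carlo bit-error-rate estimate for uncoded BPSK over AWGN,
# compared against the analytic result 0.5 * erfc(sqrt(Eb/No)).
import math
import random

def ber_bpsk_sim(ebno_db, n_bits=200_000, seed=1):
    random.seed(seed)
    ebno = 10.0 ** (ebno_db / 10.0)
    sigma = math.sqrt(1.0 / (2.0 * ebno))     # noise std, unit-energy bits
    errors = 0
    for _ in range(n_bits):
        bit = random.getrandbits(1)
        tx = 1.0 if bit else -1.0             # BPSK mapping
        rx = tx + random.gauss(0.0, sigma)    # AWGN channel
        errors += (rx > 0.0) != bool(bit)     # hard-decision detector
    return errors / n_bits

def ber_bpsk_theory(ebno_db):
    ebno = 10.0 ** (ebno_db / 10.0)
    return 0.5 * math.erfc(math.sqrt(ebno))
```

The same skeleton, with the channel encoder/decoder dropped in around the channel, is how the memory-4 RSCC comparisons above would be run; the agreement between simulated and analytic curves here is the sanity check before trusting the harness on a code with no closed-form BER.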
Dimerization drives EGFR endocytosis through two sets of compatible endocytic codes.
Wang, Qian; Chen, Xinmei; Wang, Zhixiang
2015-03-01
We have shown previously that epidermal growth factor (EGF) receptor (EGFR) endocytosis is controlled by EGFR dimerization. However, it is not clear how the dimerization drives receptor internalization. We propose that EGFR endocytosis is driven by dimerization, bringing two sets of endocytic codes, one contained in each receptor monomer, in close proximity. Here, we tested this hypothesis by generating specific homo- or hetero-dimers of various receptors and their mutants. We show that ErbB2 and ErbB3 homodimers are endocytosis deficient owing to the lack of endocytic codes. Interestingly, EGFR-ErbB2 or EGFR-ErbB3 heterodimers are also endocytosis deficient. Moreover, the heterodimer of EGFR and the endocytosis-deficient mutant EGFRΔ1005-1017 is also impaired in endocytosis. These results indicate that two sets of endocytic codes are required for receptor endocytosis. We found that an EGFR-PDGFRβ heterodimer is endocytosis deficient, although both EGFR and PDGFRβ homodimers are endocytosis-competent, indicating that two compatible sets of endocytic codes are required. Finally, we found that to mediate the endocytosis of the receptor dimer, the two sets of compatible endocytic codes, one contained in each receptor molecule, have to be spatially coordinated. © 2015. Published by The Company of Biologists Ltd.
RETRAN03 benchmarks for Beaver Valley plant transients and FSAR analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beaumont, E.T.; Feltus, M.A.
1993-01-01
Any best-estimate code (e.g., RETRAN03) results must be validated against plant data and final safety analysis report (FSAR) predictions. Two independent means of benchmarking are necessary to ensure that the results are not biased toward a particular data set and to establish a certain degree of accuracy. The code results need to be compared with previous results and show improvements over previous code results. Ideally, the two best means of benchmarking a thermal-hydraulics code are comparing results from previous versions of the same code along with actual plant data. This paper describes RETRAN03 benchmarks against RETRAN02 results, actual plant data, and FSAR predictions. RETRAN03, the Electric Power Research Institute's latest version of the RETRAN thermal-hydraulic analysis codes, offers several upgrades over its predecessor, RETRAN02 Mod5. RETRAN03 can use either implicit or semi-implicit numerics, whereas RETRAN02 Mod5 uses only semi-implicit numerics. Another major upgrade deals with slip model options: RETRAN03 added several new models, including a five-equation model for more accurate modeling of two-phase flow. RETRAN02 Mod5 should give similar but slightly more conservative results than RETRAN03 when executed with RETRAN02 Mod5 options.
Current and anticipated uses of thermal-hydraulic codes in Spain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelayo, F.; Reventos, F.
1997-07-01
Spanish activities in the field of applied thermal-hydraulics are steadily increasing as the codes become practicable enough to efficiently sustain engineering decisions in the nuclear power industry. A great deal of effort has been devoted to reaching this point. This paper briefly describes this process, points at the current applications, and draws conclusions on the limitations. Finally, it identifies the applications where the use of T-H codes would be worthwhile in the future; this in turn implies further development of the codes to widen the scope of application and improve general performance. Due to the different uses of the codes, the applications mainly come from the authority, industry, universities, and research institutions. The main conclusion of this paper is that further code development is justified if the following requisites are considered: (1) Safety relevance of scenarios not presently covered is established. (2) A substantial gain in margins or the capability to use realistic assumptions is obtained. (3) A general consensus on the licensability and methodology for application is reached. The role of the Regulatory Body is stressed, as the most relevant outcome of the project may be related to the evolution of the licensing frame.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-31
... the Land Use Code to allow a new zone for Yesler Terrace, street vacation, preliminary and final plat... project that shifts environmental review from the time a permit application is made to an earlier phase in...
Code of Federal Regulations, 2014 CFR
2014-10-01
... subclassification within a test group which is based on engine code, transmission type and gear ratios, final drive... the non-combustion reaction of a consumable fuel, typically hydrogen. Fuel cell electric vehicle means...-groups in each regulatory category to which fuel consumption requirements apply, and are defined as...
Numerical Simulations of 3D Seismic Data Final Report CRADA No. TC02095.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedmann, S. J.; Kostov, C.
This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and Schlumberger Cambridge Research (SCR) to develop synthetic seismic data sets and supporting codes.
Unsteady Propeller Hydrodynamics
2001-06-01
coupling routines, making the code more robust while decreasing the computational burden relative to current methods. Finally, a higher order quadratic influence function technique was implemented within the wake to more accurately define the induction velocity at the trailing edge, which has suffered in the past due to lack of discretization.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-20
... for the year to which they apply, of rents for existing or newly constructed rental dwelling units, as... Census geography. Furthermore, The Census Bureau will not continue to support both ZIP code and ZCTA...
78 FR 52607 - Unified Registration System
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-23
...'' and click ``Search.'' Next, click ``Open Docket Folder'' in the ``Actions'' column. Finally, in the... comments received are posted without change to http://www.regulations.gov . Anyone is able to search the... License CFR Code of Federal Regulations CMV Commercial Motor Vehicle CR Compliance Review CSA Compliance...
Department of Interior Semiannual Regulatory Agenda
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-26
... INTERIOR (DOI) DEPARTMENT OF THE INTERIOR Office of the Secretary 25 CFR Ch. I 30 CFR Chs. II and VII 36... Plugging and Platform Decommissioning 1010-AD61 Department of the Interior (DOI) Final Rule Stage United...-AX19 BILLING CODE 4310--55--S
Snipes, Shedra A; Cooper, Sharon P; Shipp, Eva M
2017-01-01
This article describes how perceived discrimination shapes the way Latino farmworkers encounter injuries and seek out treatment. After 5 months of ethnographic fieldwork, 89 open-ended, semistructured interviews were analyzed. NVivo was used to code and qualitatively organize the interviews and field notes. Codes, notes, and co-occurring dynamics were then used to iteratively assess the data for major themes. The primary source of perceived discrimination was the "boss" or farm owner. Immigrant status also significantly influenced how farmworkers perceived the discrimination. Specifically, the ability to speak English and length of stay in the United States were related to stronger perceptions of discrimination. Finally, farm owners compelled their Latino employees to work through their injuries without treatment. This ethnographic account brings attention to how discrimination and lack of worksite protections are implicated in farmworkers' injury experiences and suggests the need for policies that better safeguard vulnerable workers.
Maximum a posteriori joint source/channel coding
NASA Technical Reports Server (NTRS)
Sayood, Khalid; Gibson, Jerry D.
1991-01-01
A maximum a posteriori probability (MAP) approach to joint source/channel coder design is presented in this paper. This method explores a technique for designing joint source/channel codes, rather than ways of distributing bits between source coders and channel coders. For a nonideal source coder, MAP arguments are used to design a decoder that takes advantage of redundancy in the source coder output to perform error correction. Once the decoder is obtained, it is analyzed with the purpose of obtaining 'desirable properties' of the channel input sequence for improving overall system performance. Finally, an encoder design that incorporates these properties is proposed.
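The MAP idea above, in its smallest form: the decoder picks the symbol maximizing likelihood times prior, so residual redundancy (a skewed prior) left by a non-ideal source coder can overrule the raw channel observation. A toy single-bit example over a binary symmetric channel (our illustration of the principle, not the paper's coder):

```python
# MAP decoding of one bit over a binary symmetric channel (BSC):
# choose x maximizing P(received | x) * P(x). The prior P(x=1) = p_one
# models redundancy remaining in the source coder output.

def map_decode_bit(received, p_flip, p_one):
    """argmax over x in {0, 1} of likelihood * prior."""
    like1 = (1.0 - p_flip) if received == 1 else p_flip
    like0 = p_flip if received == 1 else (1.0 - p_flip)
    post1 = like1 * p_one            # unnormalized posterior for x = 1
    post0 = like0 * (1.0 - p_one)    # unnormalized posterior for x = 0
    return 1 if post1 >= post0 else 0
```

With a noisy channel (p_flip = 0.4) and a strong prior toward 0 (p_one = 0.1), a received 1 is still decoded as 0, which is exactly the error-correcting effect of source redundancy the abstract describes; with a uniform prior the decoder reduces to maximum likelihood.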
Modeling of Interactions of Ablated Plumes
2008-02-01
The code was tested and verified using the Sedov-Taylor explosion problem. A 300 x 300 grid is used so that a single code run takes 30 minutes in a ... still air and b) temperature contours along with the vector field for 20 km at t = 10 ps.
[Figure residue, Figure 9: formation of secondary shock waves; a)-c) pressure contours and d) heat transfer at the TPS. Final report, AFOSR FA9550-07-1-0457, February 2008.]
Transitional flow in thin tubes for space station freedom radiator
NASA Technical Reports Server (NTRS)
Loney, Patrick; Ibrahim, Mounir
1995-01-01
A two-dimensional finite volume method is used to predict the film coefficients in the transitional flow region (laminar or turbulent) for the radiator panel tubes. The code used to perform this analysis is CAST (Computer Aided Simulation of Turbulent Flows). The information gathered from this code is then used to augment a Sinda85 model that predicts overall performance of the radiator. A final comparison is drawn between results from a Sinda85 model using the Sinda85-provided transition-region heat transfer correlations and a Sinda85 model using the CAST-generated data.
Analysis Of The Boeing FEL Mirror Measurements
NASA Astrophysics Data System (ADS)
Knapp, Charles E.; Viswanathan, Vriddhachalam K.; Appert, Quentin D.
1989-07-01
The aberrations have been measured for the finished mirrors that are part of the Burst Mode ring resonator of the Free Electron Laser (FEL) being constructed at the Boeing Aerospace Company in Seattle, Washington. This paper presents an analysis of these measurements using GLAD, a diffraction ray-tracing code. The diffraction losses within the resonator due to the aberrations are presented. The analysis was conducted in two different modes, a paraxial approximation and a full 3-D calculation, and good agreement between the two approaches is shown. Finally, a proposed solution to the problems caused by the aberrations is presented and analyzed.
Preliminary Work in Atmospheric Turbulence Profiles with the Differential Multi-image Motion Monitor
2016-09-01
Center Pacific's (SSC Pacific) Optical Channel Characterization in Maritime Atmospheres (OCCIMA) Python code is demonstrated with examples that match ... OCCIMA Python code, show how to model the DM3 and anisoplanatic jitter measurements, and finally demonstrate how the turbulence strength profile ... Python modules.
[Figure residue: anisoplanatic jitter (λ/D) versus separation at target plane (m), 0.0-2.0 m.]
Future capabilities for the Deep Space Network
NASA Technical Reports Server (NTRS)
Berner, J. B.; Bryant, S. H.; Andrews, K. S.
2004-01-01
This paper will look at three new capabilities that are in different stages of development. First, turbo decoding, which provides improved telemetry performance for data rates up to about 1 Mbps, will be discussed. Next, pseudo-noise ranging will be presented. Pseudo-noise ranging has several advantages over the current sequential ranging, namely easier operations, improved performance, and the capability to be used in a regenerative implementation on a spacecraft. Finally, low-density parity-check (LDPC) decoding will be discussed. LDPC codes can provide performance that matches or slightly exceeds that of turbo codes, but are designed for use in the 10 Mbps range.
Intercode comparison of gyrokinetic global electromagnetic modes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Görler, T., E-mail: tobias.goerler@ipp.mpg.de; Tronko, N.; Hornsby, W. A.
Aiming to fill the lack of sophisticated test cases for global electromagnetic gyrokinetic codes, a new hierarchical benchmark is proposed. Starting from established test sets with adiabatic electrons, fully gyrokinetic electrons and electrostatic fluctuations are taken into account before finally studying global electromagnetic micro-instabilities. Results from up to five codes are shown, involving representatives of different numerical approaches such as particle-in-cell, Eulerian, and semi-Lagrangian methods. By means of spectrally resolved growth rates and frequencies and mode-structure comparisons, agreement can be confirmed on ion-gyro-radius scales, thus providing confidence in the correct implementation of the underlying equations.
Performance of concatenated Reed-Solomon/Viterbi channel coding
NASA Technical Reports Server (NTRS)
Divsalar, D.; Yuen, J. H.
1982-01-01
The concatenated Reed-Solomon (RS)/Viterbi coding system is reviewed. The performance of the system is analyzed and results are derived with a new simple approach. A functional model for the input RS symbol error probability is presented. Based on this new functional model, we compute the performance of a concatenated system in terms of RS word error probability, output RS symbol error probability, bit error probability due to decoding failure, and bit error probability due to decoding error. Finally we analyze the effects of the noisy carrier reference and the slow fading on the system performance.
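The RS word error probability under the usual independent-symbol-error assumption is a standard binomial tail over the decoder's correction radius. A minimal sketch follows; the (255, 223) code with t = 16 correctable symbol errors is the common deep-space configuration and is assumed here for illustration, not taken from the paper:

```python
from math import comb

def rs_word_error_prob(n, t, p):
    """Probability that an n-symbol RS word fails to decode, assuming
    independent symbol errors with probability p and a bounded-distance
    decoder that corrects up to t symbol errors."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(t + 1, n + 1))

# Example: a (255, 223) RS code corrects t = 16 symbol errors.
pw = rs_word_error_prob(255, 16, 0.01)
```

The same binomial machinery extends to the output symbol and bit error probabilities discussed in the abstract, by conditioning on how many symbols are in error when decoding fails.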
Data Representation, Coding, and Communication Standards.
Amin, Milon; Dhir, Rajiv
2015-06-01
The immense volume of cases signed out by surgical pathologists on a daily basis gives little time to think about exactly how data are stored. An understanding of the basics of data representation has implications that affect a pathologist's daily practice. This article covers the basics of data representation and its importance in the design of electronic medical record systems. Coding in surgical pathology is also discussed. Finally, a summary of communication standards in surgical pathology is presented, including suggested resources that establish standards for select aspects of pathology reporting. Copyright © 2015 Elsevier Inc. All rights reserved.
Non-coding RNAs in cardiac fibrosis: emerging biomarkers and therapeutic targets.
Chen, Zhongxiu; Li, Chen; Lin, Ke; Cai, Huawei; Ruan, Weiqiang; Han, Junyang; Rao, Li
2017-12-14
Non-coding RNAs (ncRNAs) are a class of RNA molecules that do not encode proteins. ncRNAs are involved in cell proliferation, apoptosis, differentiation, metabolism, and other physiological processes as well as the pathogenesis of diseases. Cardiac fibrosis is increasingly recognized as a common final pathway in advanced heart diseases. Many studies have shown that the occurrence and development of cardiac fibrosis is closely related to the regulation of ncRNAs. This review will highlight recent updates regarding the involvement of ncRNAs in cardiac fibrosis, and their potential as emerging biomarkers and therapeutic targets.
Vectorized Monte Carlo methods for reactor lattice analysis
NASA Technical Reports Server (NTRS)
Brown, F. B.
1984-01-01
Some of the new computational methods and equivalent mathematical representations of physics models used in the MCV code, a vectorized continuous-energy Monte Carlo code for use on the CYBER-205 computer, are discussed. While the principal application of MCV is the neutronics analysis of repeating reactor lattices, the new methods used in MCV should be generally useful for vectorizing Monte Carlo for other applications. For background, a brief overview of the vector processing features of the CYBER-205 is included, followed by a discussion of the fundamentals of Monte Carlo vectorization. The physics models used in the MCV vectorized Monte Carlo code are then summarized. The new methods used in scattering analysis are presented along with details of several key, highly specialized computational routines. Finally, speedups relative to CDC-7600 scalar Monte Carlo are discussed.
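The core idea of Monte Carlo vectorization — advancing a whole bank of particle histories one event at a time, rather than following each history to completion — can be sketched in a few lines of array code. The toy 1-D slab model below is purely illustrative (hypothetical cross-sections, no relation to the MCV physics models):

```python
import numpy as np

def absorbed_fraction(n_particles, slab=5.0, sigma_t=1.0, absorb_frac=0.3, seed=0):
    """Toy event-based vectorized Monte Carlo for a 1-D slab: each loop
    iteration advances ALL live histories at once, which is the structure
    a vector processor (or SIMD unit) can exploit."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_particles)              # particle positions
    alive = np.ones(n_particles, dtype=bool)
    absorbed = 0
    while alive.any():
        idx = np.flatnonzero(alive)
        # sample flight distances for every live particle in one call
        x[idx] += rng.exponential(1.0 / sigma_t, idx.size)
        escaped = x[idx] > slab
        alive[idx[escaped]] = False        # leaked out of the slab
        coll = idx[~escaped]
        hit = rng.random(coll.size) < absorb_frac   # absorption vs. scatter
        absorbed += int(hit.sum())
        alive[coll[hit]] = False
        # scattered particles simply continue forward in this toy model
    return absorbed / n_particles
```

The scalar equivalent would loop over particles one by one; the event-based restructuring shown here is the kind of reorganization the abstract describes, not the MCV algorithms themselves.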
Maxwell: A semi-analytic 4D code for earthquake cycle modeling of transform fault systems
NASA Astrophysics Data System (ADS)
Sandwell, David; Smith-Konter, Bridget
2018-05-01
We have developed a semi-analytic approach (and computational code) for rapidly calculating 3D time-dependent deformation and stress caused by screw dislocations embedded within an elastic layer overlying a Maxwell viscoelastic half-space. The Maxwell model is developed in the Fourier domain to exploit the computational advantages of the convolution theorem, hence substantially reducing the computational burden associated with an arbitrarily complex distribution of force couples necessary for fault modeling. The new aspect of this development is the ability to model lateral variations in shear modulus. Ten benchmark examples are provided for testing and verification of the algorithms and code. One final example simulates interseismic deformation along the San Andreas Fault System where lateral variations in shear modulus are included to simulate lateral variations in lithospheric structure.
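The computational advantage mentioned above comes from the convolution theorem: a spatial convolution of a force-couple distribution with an elastic Green's function becomes a pointwise product in the Fourier domain, reducing O(n²) work to O(n log n). A minimal 1-D sketch with a hypothetical kernel and circular boundary (not the Maxwell code's actual kernels):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
source = rng.standard_normal(n)                  # stand-in for force couples
x = np.arange(n)
greens = np.exp(-np.minimum(x, n - x) / 10.0)    # hypothetical decaying elastic kernel

# direct circular convolution: O(n^2)
direct = np.array([sum(source[j] * greens[(i - j) % n] for j in range(n))
                   for i in range(n)])

# Fourier-domain product: O(n log n), the trick the semi-analytic code exploits
spectral = np.real(np.fft.ifft(np.fft.fft(source) * np.fft.fft(greens)))
```

Both arrays agree to machine precision; in 2D the saving compounds, which is what makes an arbitrarily complex distribution of force couples tractable.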
Software Considerations for Subscale Flight Testing of Experimental Control Laws
NASA Technical Reports Server (NTRS)
Murch, Austin M.; Cox, David E.; Cunningham, Kevin
2009-01-01
The NASA AirSTAR system has been designed to address the challenges associated with safe and efficient subscale flight testing of research control laws in adverse flight conditions. In this paper, software elements of this system are described, with an emphasis on components which allow for rapid prototyping and deployment of aircraft control laws. Through model-based design and automatic coding a common code-base is used for desktop analysis, piloted simulation and real-time flight control. The flight control system provides the ability to rapidly integrate and test multiple research control laws and to emulate component or sensor failures. Integrated integrity monitoring systems provide aircraft structural load protection, isolate the system from control algorithm failures, and monitor the health of telemetry streams. Finally, issues associated with software configuration management and code modularity are briefly discussed.
Evidence for the implication of the histone code in building the genome structure.
Prakash, Kirti; Fournier, David
2018-02-01
Histones are punctuated with small chemical modifications that alter their interaction with DNA. One attractive hypothesis stipulates that certain combinations of these histone modifications may function, alone or together, as a part of a predictive histone code to provide ground rules for chromatin folding. We consider four features that relate histone modifications to chromatin folding: charge neutralisation, molecular specificity, robustness and evolvability. Next, we present evidence for the association among different histone modifications at various levels of chromatin organisation and show how these relationships relate to function such as transcription, replication and cell division. Finally, we propose a model where the histone code can set critical checkpoints for chromatin to fold reversibly between different orders of the organisation in response to a biological stimulus. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Reichert, R. S.; Biringen, S.; Howard, J. E.
1999-01-01
LINER is a system of Fortran 77 codes which performs a 2D analysis of acoustic wave propagation and noise suppression in a rectangular channel with a continuous liner at the top wall. This new implementation is designed to streamline the usage of the several codes making up LINER, resulting in a useful design tool. Major input parameters are placed in two main data files, input.inc and nurn.prm. Output data appear in the form of ASCII files as well as a choice of GNUPLOT graphs. Section 2 briefly describes the physical model. Section 3 discusses the numerical methods; Section 4 gives a detailed account of program usage, including input formats and graphical options. A sample run is also provided. Finally, Section 5 briefly describes the individual program files.
ICD-10 procedure codes produce transition challenges
Boyd, Andrew D.; Li, Jianrong ‘John’; Kenost, Colleen; Zaim, Samir Rachid; Krive, Jacob; Mittal, Manish; Satava, Richard A.; Burton, Michael; Smith, Jacob; Lussier, Yves A.
2018-01-01
The transition of procedure coding from ICD-9-CM-Vol-3 to ICD-10-PCS has generated problems for the medical community at large, resulting from the lack of clarity required to integrate two non-congruent coding systems. We hypothesized that quantifying these issues with network topology analyses offers a better understanding of the issues, and therefore we developed solutions (online tools) to empower hospital administrators and researchers to address these challenges. Five topologies were identified: “identity” (I), “class-to-subclass” (C2S), “subclass-to-class” (S2C), “convoluted” (C), and “no mapping” (NM). The procedure codes in the 2010 Illinois Medicaid dataset (3,290 patients, 116 institutions) were categorized as C=55%, C2S=40%, I=3%, NM=2%, and S2C=1%. The majority of the problematic and ambiguous (convoluted) mappings pertained to operations in ophthalmology, cardiology, urology, gyneco-obstetrics, and dermatology. Finally, the algorithms were expanded into a user-friendly tool to identify problematic topologies and specify lists of procedural codes utilized by medical professionals and researchers for mitigating error-prone translations, simplifying research, and improving quality. http://www.lussiergroup.org/transition-to-ICD10PCS PMID:29888037
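The five topologies can be illustrated with a toy classifier that buckets source codes by mapping multiplicity. This is a simplified sketch over hypothetical codes, not the authors' network analysis or their online tool:

```python
from collections import defaultdict

def classify_topologies(pairs):
    """Bucket (source, target) mapping pairs into the five topologies named
    in the abstract, using multiplicity as a proxy: identity (1:1),
    class-to-subclass (1:many), subclass-to-class (many:1),
    convoluted (many:many), and no mapping (source with no target)."""
    fwd, rev = defaultdict(set), defaultdict(set)
    for src, dst in pairs:
        if dst is None:
            fwd[src]                      # register a source with no targets
            continue
        fwd[src].add(dst)
        rev[dst].add(src)
    topo = {}
    for src, dsts in fwd.items():
        if not dsts:
            topo[src] = "NM"
        elif len(dsts) == 1:
            only = next(iter(dsts))
            topo[src] = "I" if len(rev[only]) == 1 else "S2C"
        else:
            topo[src] = "C2S" if all(len(rev[d]) == 1 for d in dsts) else "C"
    return topo
```

Real GEM mappings carry flags (approximate, combination, choice-list) that a production classifier would have to honor; the multiplicity test above only captures the topological skeleton.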
Scaling features of noncoding DNA
NASA Technical Reports Server (NTRS)
Stanley, H. E.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Peng, C. K.; Simons, M.
1999-01-01
We review evidence supporting the idea that the DNA sequence in genes containing noncoding regions is correlated, and that the correlation is remarkably long range: bases thousands of base pairs apart are correlated. We do not find such a long-range correlation in the coding regions of the gene, and utilize this fact to build a Coding Sequence Finder Algorithm, which uses statistical ideas to locate the coding regions of an unknown DNA sequence. Finally, we describe briefly some recent work adapting to DNA the Zipf approach to analyzing linguistic texts, and the Shannon approach to quantifying the "redundancy" of a linguistic text in terms of a measurable entropy function, and reporting that noncoding regions in eukaryotes display a larger redundancy than coding regions. Specifically, we consider the possibility that this result is solely a consequence of nucleotide concentration differences, as first noted by Bonhoeffer and his collaborators. We find that cytosine-guanine (CG) concentration does have a strong "background" effect on redundancy. However, we find that for the purine-pyrimidine binary mapping rule, which is not affected by the difference in CG concentration, the Shannon redundancy for the set of analyzed sequences is larger for noncoding regions compared to coding regions.
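The purine-pyrimidine mapping and a block-entropy redundancy estimate can be sketched as follows. This is an illustrative estimator, assuming a simple n-block frequency entropy; it is not the authors' exact procedure:

```python
from math import log2

def shannon_redundancy(seq, block=3):
    """Map a DNA string to the purine/pyrimidine binary alphabet
    (A/G -> R, C/T -> Y) and estimate the Shannon redundancy
    1 - H_block/block from overlapping block frequencies, where
    1 bit/symbol is the maximum for a binary alphabet."""
    binary = "".join("R" if base in "AG" else "Y" for base in seq)
    blocks = [binary[i:i + block] for i in range(len(binary) - block + 1)]
    counts = {}
    for b in blocks:
        counts[b] = counts.get(b, 0) + 1
    total = len(blocks)
    h = -sum(c / total * log2(c / total) for c in counts.values())
    return 1.0 - h / block
```

A highly repetitive sequence scores near the maximum redundancy, while an unpredictable one scores near zero, which is the contrast the abstract draws between noncoding and coding regions.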
DOE Office of Scientific and Technical Information (OSTI.GOV)
MAGEE,GLEN I.
Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
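For readers unfamiliar with the scheme, a generic systematic Reed-Solomon encoder over GF(2^8), applied block-by-block with a shortened final block, might look like the sketch below. This is the textbook construction for illustration only; the AURA project's optimized encoder is not reproduced here, and the (255, 223) parameters are an assumption:

```python
# GF(2^8) log/antilog tables (primitive polynomial 0x11d)
GF_EXP, GF_LOG = [0] * 512, [0] * 256
v = 1
for i in range(255):
    GF_EXP[i], GF_LOG[v] = v, i
    v <<= 1
    if v & 0x100:
        v ^= 0x11d
for i in range(255, 512):
    GF_EXP[i] = GF_EXP[i - 255]

def gf_mul(a, b):
    return 0 if a == 0 or b == 0 else GF_EXP[GF_LOG[a] + GF_LOG[b]]

def poly_mul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] ^= gf_mul(a, b)       # addition in GF(2^8) is XOR
    return r

def rs_generator(nsym):
    # generator polynomial: product of (x + alpha^i) for i = 0..nsym-1
    g = [1]
    for i in range(nsym):
        g = poly_mul(g, [1, GF_EXP[i]])
    return g

def rs_encode_block(msg, nsym):
    """Systematic encode: append nsym parity symbols, the remainder of
    msg(x) * x^nsym divided by the generator polynomial."""
    gen = rs_generator(nsym)               # an optimized encoder would precompute this
    rem = list(msg) + [0] * nsym
    for i in range(len(msg)):
        c = rem[i]
        if c:
            for j in range(1, len(gen)):
                rem[i + j] ^= gf_mul(gen[j], c)
    return list(msg) + rem[len(msg):]

def poly_eval(p, x):
    # Horner evaluation, coefficients highest-degree first
    y = 0
    for c in p:
        y = gf_mul(y, x) ^ c
    return y

def rs_encode_multiblock(data, k=223, nsym=32):
    """Encode consecutive k-symbol message blocks; the final block is
    shortened when len(data) is not a multiple of k, as in the abstract."""
    return [rs_encode_block(data[i:i + k], nsym) for i in range(0, len(data), k)]
```

Recomputing the generator per block, as above, is exactly the kind of redundant work the paper's optimization effort would eliminate; the sketch favors clarity over speed.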
A project based on multi-configuration Dirac-Fock calculations for plasma spectroscopy
NASA Astrophysics Data System (ADS)
Comet, M.; Pain, J.-C.; Gilleron, F.; Piron, R.
2017-09-01
We present a project dedicated to hot plasma spectroscopy based on a Multi-Configuration Dirac-Fock (MCDF) code, initially developed by J. Bruneau. The code is briefly described and the use of the transition state method for plasma spectroscopy is detailed. Then an opacity code for local-thermodynamic-equilibrium plasmas using MCDF data, named OPAMCDF, is presented. Transition arrays for which the number of lines is too large to be handled in a Detailed Line Accounting (DLA) calculation can be modeled within the Partially Resolved Transition Array method or using the Unresolved Transition Arrays formalism in jj-coupling. An improvement of the original Partially Resolved Transition Array method is presented which gives a better agreement with DLA computations. Comparisons with some absorption and emission experimental spectra are shown. Finally, the capability of the MCDF code to compute atomic data required for collisional-radiative modeling of plasma at non local thermodynamic equilibrium is illustrated. In addition to photoexcitation, this code can be used to calculate photoionization, electron impact excitation and ionization cross-sections as well as autoionization rates in the Distorted-Wave or Close Coupling approximations. Comparisons with cross-sections and rates available in the literature are discussed.
2017-08-03
This final rule updates the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2018 as required by the statute. As required by section 1886(j)(5) of the Social Security Act (the Act), this rule includes the classification and weighting factors for the IRF prospective payment system's (IRF PPS) case-mix groups and a description of the methodologies and data used in computing the prospective payment rates for FY 2018. This final rule also revises the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) diagnosis codes that are used to determine presumptive compliance under the "60 percent rule," removes the 25 percent payment penalty for inpatient rehabilitation facility patient assessment instrument (IRF-PAI) late transmissions, removes the voluntary swallowing status item (Item 27) from the IRF-PAI, summarizes comments regarding the criteria used to classify facilities for payment under the IRF PPS, provides for a subregulatory process for certain annual updates to the presumptive methodology diagnosis code lists, adopts the use of height/weight items on the IRF-PAI to determine patient body mass index (BMI) greater than 50 for cases of single-joint replacement under the presumptive methodology, and revises and updates measures and reporting requirements under the IRF quality reporting program (QRP).
78 FR 33890 - Limitation on Claims Against Proposed Public Transportation Projects
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-05
... DEPARTMENT OF TRANSPORTATION Federal Transit Administration Limitation on Claims Against Proposed Public Transportation Projects AGENCY: Federal Transit Administration (FTA), DOT. ACTION: Notice. SUMMARY... advising the public of final agency actions subject to Section 139(l) of Title 23, United States Code (U.S...
78 FR 4191 - Limitation on Claims Against Proposed Public Transportation Projects
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-18
... DEPARTMENT OF TRANSPORTATION Federal Transit Administration Limitation on Claims Against Proposed Public Transportation Projects AGENCY: Federal Transit Administration (FTA), DOT. ACTION: Notice. SUMMARY... advising the public of final agency actions subject to Section 139(l) of Title 23, United States Code (U.S...
78 FR 16764 - Limitation on Claims Against Proposed Public Transportation Projects
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-18
... DEPARTMENT OF TRANSPORTATION Federal Transit Administration Limitation on Claims Against Proposed Public Transportation Projects AGENCY: Federal Transit Administration (FTA), DOT. ACTION: Notice. SUMMARY... advising the public of final agency actions subject to Section 139(l) of Title 23, United States Code (U.S...
75 FR 69881 - Responding to Disruptive Patients
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-16
... to visitations and communications, clothing, personal possessions, money, social interaction..., notice of proposed rulemaking published at 75 FR 30,306. Effect of Rulemaking Title 38 of the Code of... annually for inflation) in any given year. This final rule will have no such effect on State, local, or...
77 FR 43535 - Grantee Codes for Certified Radiofrequency Equipment
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-25
... Radiofrequency Equipment AGENCY: Federal Communications Commission. ACTION: Final rule. SUMMARY: This document... certify new equipment. DATES: Effective August 24, 2012. FOR FURTHER INFORMATION CONTACT: Hugh Van Tuyl... may also be downloaded at: www.fcc.gov . Summary of the Order 1. The Commission operates an equipment...
Sajad, Amirsaman; Sadeh, Morteza; Yan, Xiaogang; Wang, Hongying
2016-01-01
Abstract The frontal eye fields (FEFs) participate in both working memory and sensorimotor transformations for saccades, but their role in integrating these functions through time remains unclear. Here, we tracked FEF spatial codes through time using a novel analytic method applied to the classic memory-delay saccade task. Three-dimensional recordings of head-unrestrained gaze shifts were made in two monkeys trained to make gaze shifts toward briefly flashed targets after a variable delay (450-1500 ms). A preliminary analysis of visual and motor response fields in 74 FEF neurons eliminated most potential models for spatial coding at the neuron population level, as in our previous study (Sajad et al., 2015). We then focused on the spatiotemporal transition from an eye-centered target code (T; preferred in the visual response) to an eye-centered intended gaze position code (G; preferred in the movement response) during the memory delay interval. We treated neural population codes as a continuous spatiotemporal variable by dividing the space spanning T and G into intermediate T–G models and dividing the task into discrete steps through time. We found that FEF delay activity, especially in visuomovement cells, progressively transitions from T through intermediate T–G codes that approach, but do not reach, G. This was followed by a final discrete transition from these intermediate T–G delay codes to a “pure” G code in movement cells without delay activity. These results demonstrate that FEF activity undergoes a series of sensory–memory–motor transformations, including a dynamically evolving spatial memory signal and an imperfect memory-to-motor transformation. PMID:27092335
Nouraei, S A R; O'Hanlon, S; Butler, C R; Hadovsky, A; Donald, E; Benjamin, E; Sandhu, G S
2009-02-01
To audit the accuracy of otolaryngology clinical coding and identify ways of improving it. Prospective multidisciplinary audit, using the 'national standard clinical coding audit' methodology supplemented by 'double-reading and arbitration'. Teaching-hospital otolaryngology and clinical coding departments. Otolaryngology inpatient and day-surgery cases. Concordance between initial coding performed by a coder (first cycle) and final coding by a clinician-coder multidisciplinary team (MDT; second cycle) for primary and secondary diagnoses and procedures, and Health Resource Groupings (HRG) assignment. 1250 randomly selected cases were studied. Coding errors occurred in 24.1% of cases (301/1250). The clinician-coder MDT reassigned 48 primary diagnoses and 186 primary procedures and identified a further 209 initially missed secondary diagnoses and procedures. In 203 cases, the patient's initial HRG changed. Incorrect coding caused an average revenue loss of £174.90 per patient (14.7%), and 60% of the total income variance was due to miscoding of eight highly complex head and neck cancer cases. The 'HRG drift' created the appearance of disproportionate resource utilisation when treating 'simple' cases. At our institution the total cost of maintaining a clinician-coder MDT was 4.8 times lower than the income regained through the double-reading process. This large audit of otolaryngology practice identifies a substantial degree of error in coding on discharge. This leads to significant loss of departmental revenue, and given that the same data are used for benchmarking and for making decisions about resource allocation, it distorts the picture of clinical practice. These problems can be rectified by implementing a cost-effective clinician-coder double-reading multidisciplinary team as part of a data-assurance clinical governance framework, which we recommend should be established in hospitals.
Code for Multiblock CFD and Heat-Transfer Computations
NASA Technical Reports Server (NTRS)
Fabian, John C.; Heidmann, James D.; Lucci, Barbara L.; Ameri, Ali A.; Rigby, David L.; Steinthorsson, Erlendur
2006-01-01
The NASA Glenn Research Center General Multi-Block Navier-Stokes Convective Heat Transfer Code, Glenn-HT, has been used extensively to predict heat transfer and fluid flow for a variety of steady gas turbine engine problems. Recently, the Glenn-HT code has been completely rewritten in Fortran 90/95, a more object-oriented language that allows programmers to create code that is more modular and makes more efficient use of data structures. The new implementation takes full advantage of the capabilities of the Fortran 90/95 programming language. As a result, the Glenn-HT code now provides dynamic memory allocation, modular design, and unsteady flow capability. This allows for the heat-transfer analysis of a full turbine stage. The code has been demonstrated for an unsteady inflow condition, and gridding efforts have been initiated for a full turbine stage unsteady calculation. This analysis will be the first to simultaneously include the effects of rotation, blade interaction, film cooling, and tip clearance with recessed tip on turbine heat transfer and cooling performance. Future plans call for the application of the new Glenn-HT code to a range of gas turbine engine problems of current interest to the heat-transfer community. The new unsteady flow capability will allow researchers to predict the effect of unsteady flow phenomena upon the convective heat transfer of turbine blades and vanes. Work will also continue on the development of conjugate heat-transfer capability in the code, where simultaneous solution of convective and conductive heat-transfer domains is accomplished. Finally, advanced turbulence and fluid flow models and automatic gridding techniques are being developed that will be applied to the Glenn-HT code and solution process.
Final Technical Report for GO17004 Regulatory Logic: Codes and Standards for the Hydrogen Economy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakarado, Gary L.
The objectives of this project are to: develop a robust supporting research and development program to provide critical hydrogen behavior data and a detailed understanding of hydrogen combustion and safety across a range of scenarios, needed to establish setback distances in building codes and minimize the overall data gaps in code development; support and facilitate the completion of technical specifications by the International Organization for Standardization (ISO) for gaseous hydrogen refueling (TS 20012) and standards for on-board liquid (ISO 13985) and gaseous or gaseous blend (ISO 15869) hydrogen storage by 2007; support and facilitate the effort, led by the NFPA, to complete the draft Hydrogen Technologies Code (NFPA 2) by 2008; with experimental data and input from Technology Validation Program element activities, support and facilitate the completion of standards for bulk hydrogen storage (e.g., NFPA 55) by 2008; facilitate the adoption of the most recently available model codes (e.g., from the International Code Council [ICC]) in key regions; complete preliminary research and development on hydrogen release scenarios to support the establishment of setback distances in building codes and provide a sound basis for model code development and adoption; support and facilitate the development of Global Technical Regulations (GTRs) by 2010 for hydrogen vehicle systems under the United Nations Economic Commission for Europe, World Forum for Harmonization of Vehicle Regulations and Working Party on Pollution and Energy Program (ECE-WP29/GRPE); and support and facilitate the completion by 2012 of necessary codes and standards needed for the early commercialization and market entry of hydrogen energy technologies.
Adigun, Babatunde John; Fensin, Michael Lorne; Galloway, Jack D.; ...
2016-10-01
Our burnup study examined the effect of a predicted critical control rod position on the nuclide predictability of several axial and radial locations within a 4×4 graphite-moderated gas-cooled reactor fuel cluster geometry. To achieve this, a control rod position estimator (CRPE) tool was developed within the framework of the linkage code Monteburns between the transport code MCNP and the depletion code CINDER90, and four methodologies were proposed within the tool for maintaining criticality. Two of the proposed methods used an inverse multiplication approach - where the amount of fissile material in a set configuration is slowly altered until criticality is attained - in estimating the critical control rod position. Another method carried out several MCNP criticality calculations at different control rod positions, then used a linear fit to estimate the critical rod position. The final method used a second-order polynomial fit of several MCNP criticality calculations at different control rod positions to estimate the critical rod position. The results showed that consistency in prediction of power densities as well as uranium and plutonium isotopics was mutual among methods within the CRPE tool that predicted the critical position consistently well. Finally, while the CRPE tool is currently limited to manipulating a single control rod, future work could be geared toward implementing additional criticality search methodologies along with additional features.
Our burnup study examined the effect of a predicted critical control rod position on the nuclide predictability of several axial and radial locations within a 4×4 graphite moderated gas cooled reactor fuel cluster geometry. To achieve this, a control rod position estimator (CRPE) tool was developed within the framework of the linkage code Monteburns between the transport code MCNP and depletion code CINDER90, and four methodologies were proposed within the tool for maintaining criticality. Two of the proposed methods used an inverse multiplication approach - where the amount of fissile material in a set configuration is slowly altered until criticalitymore » is attained - in estimating the critical control rod position. Another method carried out several MCNP criticality calculations at different control rod positions, then used a linear fit to estimate the critical rod position. The final method used a second-order polynomial fit of several MCNP criticality calculations at different control rod positions to guess the critical rod position. The results showed that consistency in prediction of power densities as well as uranium and plutonium isotopics was mutual among methods within the CRPE tool that predicted critical position consistently well. Finall, while the CRPE tool is currently limited to manipulating a single control rod, future work could be geared toward implementing additional criticality search methodologies along with additional features.« less
Bergna, Miguel A; García, Gabriel R; Alchapar, Ramon; Altieri, Hector; Casas, Juan C Figueroa; Larrateguy, Luis; Nannini, Luis J; Pascansky, Daniel; Grabre, Pedro; Zabert, Gustavo; Miravitlles, Marc
2015-06-01
The CODE questionnaire (COPD detection questionnaire), a simple screening questionnaire with a binary (yes/no) response scale, was developed for the identification of patients with chronic obstructive pulmonary disease (COPD). We conducted a survey of 468 subjects with a smoking history in 10 public hospitals in Argentina. Patients with a previous diagnosis of COPD, asthma or other respiratory illness were excluded. Items that measured conceptual domains in terms of characteristics of symptoms, smoking history and demographic data were considered. 96 (20.5%) subjects had a diagnosis of COPD according to the 2010 Global Initiative for Chronic Obstructive Lung Disease strategy document. The variables selected for the final questionnaire were based on univariate and multivariate analyses and clinical criteria. Finally, we selected the presence or absence of six variables (age ≥50 years, smoking history ≥30 pack-years, male sex, chronic cough, chronic phlegm and dyspnoea). Of patients without any of these six variables (0 points), none had COPD. The ability of the CODE questionnaire to discriminate between subjects with and without COPD was good (the area under the receiver operating characteristic curve was 0.75). Higher scores were associated with a greater probability of COPD. The CODE questionnaire is a brief, accurate questionnaire that can identify smoking individuals likely to have COPD. Copyright ©ERS 2015.
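Scoring the six selected variables is simple enough to sketch. One point per positive item is assumed here, since the abstract reports only the items themselves and that a score of zero ruled out COPD; the actual questionnaire's weighting and cut-offs are not reproduced:

```python
def code_score(age, pack_years, male, chronic_cough, chronic_phlegm, dyspnoea):
    """Count the six binary CODE items from the abstract: age >= 50 years,
    smoking history >= 30 pack-years, male sex, chronic cough, chronic
    phlegm, and dyspnoea. Assumes one point each (a sketch, not the
    published scoring rules)."""
    items = [age >= 50, pack_years >= 30, bool(male),
             bool(chronic_cough), bool(chronic_phlegm), bool(dyspnoea)]
    return sum(items)
```

Per the abstract, a score of 0 carried no observed COPD cases, and higher scores were associated with greater probability of disease.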
Navier-Stokes computation of compressible turbulent flows with a second order closure, part 1
NASA Technical Reports Server (NTRS)
Haminh, Hieu; Kollmann, Wolfgang; Vandromme, Dany
1990-01-01
A second order closure turbulence model for compressible flows is developed and implemented in a 2D Reynolds-averaged Navier-Stokes solver. From the beginning, when a kappa-epsilon turbulence model was implemented in the bidiagonal implicit method of MacCormack (referred to as the MAC3 code), to the final stage of implementing a full second order closure in the efficient line Gauss-Seidel algorithm, much work was done, individually and collectively. Besides the collaboration itself, the final product of this work is a second order closure derived from the Launder, Reece, and Rodi model to account for near-wall effects, called the FRAME model, which stands for FRench-AMerican-Effort. During the reporting period, two different problems were worked out. The first was to provide Ames researchers with a reliable compressible boundary layer code including a wide collection of turbulence models for quick testing of new terms, both in two-equation and in second order closure (LRR and FRAME) form. The second topic was to complete the implementation of the FRAME model in the MAC5 code. The work related to these two contributions is reported, including the treatment of dilatation in the presence of strong shocks. This work, conducted during a stay at the Center for Turbulence Research with Zeman, also aimed to cross-check earlier assumptions by Rubesin and Vandromme.
Global Nursing Issues and Development: Analysis of World Health Organization Documents.
Wong, Frances Kam Yuet; Liu, Huaping; Wang, Hui; Anderson, Debra; Seib, Charrlotte; Molasiotis, Alex
2015-11-01
To analyze World Health Organization (WHO) documents to identify global nursing issues and development. Qualitative content analysis. Documents published by the six WHO regions between 2007 and 2012 with key words related to nurse/midwife or nursing/midwifery were included. Themes, categories, and subcategories were derived. Coding reached 80% agreement among three independent coders, and discrepant codes were resolved by consensus. Thirty-two documents from the regions of Europe (n = 19), the Americas (n = 6), the Western Pacific (n = 4), Africa (n = 1), the Eastern Mediterranean (n = 1), and Southeast Asia (n = 1) were examined. A total of 385 units of analysis dispersed in 31 subcategories under four themes were derived. The four themes derived (number of units of analysis, %) were Management & Leadership (206, 53.5), Practice (75, 19.5), Education (70, 18.2), and Research (34, 8.8). The key nursing issues of concern at the global level are workforce, the impacts of nursing in health care, professional status, and education of nurses. International alliances can help advance nursing, but the visibility of nursing in the WHO needs to be strengthened. Organizational leadership is important in order to optimize the use of nursing competence in practice and inform policy makers regarding the value of nursing to promote people's health. © 2015 Sigma Theta Tau International.
Yoriyaz, Hélio; Moralles, Maurício; Siqueira, Paulo de Tarso Dalledone; Guimarães, Carla da Costa; Cintra, Felipe Belonsi; dos Santos, Adimir
2009-11-01
Radiopharmaceutical applications in nuclear medicine require a detailed dosimetric estimate of the radiation energy delivered to human tissues. Over the past years, several publications have addressed the problem of internal dose estimation in volumes of several sizes considering photon and electron sources, most of them using Monte Carlo radiation transport codes. Despite the widespread use of these codes, owing to the variety of resources and potentials they offer for dose calculations, several aspects such as physical models, cross sections, and numerical approximations used in the simulations remain objects of study. An accurate dose estimate depends on the correct selection of a set of simulation options that should be carefully chosen. This article presents an analysis of several simulation options provided by two of the most widely used codes worldwide: MCNP and GEANT4. For this purpose, comparisons of absorbed fraction estimates obtained with different physical models, cross sections, and numerical approximations are presented for spheres of several sizes composed of five different biological tissues. Considerable discrepancies were found in some cases, not only between the two codes but also between different cross sections and algorithms within the same code. The maximum differences found between the two codes are 5.0% and 10% for photons and electrons, respectively. Even for problems as simple as spheres with uniform radiation sources, the set of parameters chosen in any Monte Carlo code significantly affects the final results of a simulation, demonstrating the importance of the correct choice of parameters.
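As a minimal illustration of the kind of code-to-code comparison such a study performs (the absorbed-fraction values below are placeholders for illustration, not the article's data; only the discrepancy metric itself is shown):

```python
def percent_difference(af_a, af_b):
    """Relative difference of two absorbed-fraction estimates, in percent,
    using the mean of the two estimates as the reference."""
    return 200.0 * abs(af_a - af_b) / (af_a + af_b)

# Hypothetical absorbed fractions for one sphere size and tissue.
mcnp_photon, geant4_photon = 0.040, 0.041
mcnp_electron, geant4_electron = 0.95, 0.91

print(f"photon discrepancy:   {percent_difference(mcnp_photon, geant4_photon):.1f}%")
print(f"electron discrepancy: {percent_difference(mcnp_electron, geant4_electron):.1f}%")
```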
Zipf's Law in Short-Time Timbral Codings of Speech, Music, and Environmental Sound Signals
Haro, Martín; Serrà, Joan; Herrera, Perfecto; Corral, Álvaro
2012-01-01
Timbre is a key perceptual feature that allows discrimination between different sounds. Timbral sensations are highly dependent on the temporal evolution of the power spectrum of an audio signal. In order to quantitatively characterize such sensations, the shape of the power spectrum has to be encoded in a way that preserves certain physical and perceptual properties. Therefore, it is common practice to encode short-time power spectra using psychoacoustical frequency scales. In this paper, we study and characterize the statistical properties of such encodings, here called timbral code-words. In particular, we report on rank-frequency distributions of timbral code-words extracted from 740 hours of audio coming from disparate sources such as speech, music, and environmental sounds. Analogously to text corpora, we find a heavy-tailed Zipfian distribution with exponent close to one. Importantly, this distribution is found independently of different encoding decisions and regardless of the audio source. Further analysis of the intrinsic characteristics of the most and least frequent code-words reveals that the most frequent code-words tend to have a more homogeneous structure. We also find that the speech and music databases have specific, distinctive code-words, while in the case of the environmental sounds such database-specific code-words are absent. Finally, we find that a Yule-Simon process with memory provides a reasonable quantitative approximation for our data, suggesting the existence of a common simple generative mechanism for all considered sound sources. PMID:22479497
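As an illustration of the rank-frequency analysis described in the abstract (not the authors' code), a Zipf exponent can be estimated from a stream of code-words by log-log regression of frequency on rank; the toy stream below is constructed so its frequencies fall off as 1/rank:

```python
import math
from collections import Counter

def zipf_exponent(tokens):
    """Estimate the Zipf exponent from a token sequence by ordinary
    least-squares regression of log(frequency) on log(rank)."""
    freqs = sorted(Counter(tokens).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope  # Zipf's law: frequency ~ rank^(-exponent)

# Toy "code-word" stream whose frequencies fall off exactly as 1/rank.
stream = []
for rank, word in enumerate(["cw0", "cw1", "cw2", "cw3", "cw4"], start=1):
    stream += [word] * (120 // rank)
print(f"estimated exponent: {zipf_exponent(stream):.2f}")  # close to 1.00
```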
CARES/LIFE Software Commercialization
NASA Technical Reports Server (NTRS)
1995-01-01
The NASA Lewis Research Center has entered into a letter agreement with BIOSYM Technologies Inc. (now merged with Molecular Simulations Inc. (MSI)). Under this agreement, NASA will provide a developmental copy of the CARES/LIFE computer program to BIOSYM for evaluation. This computer code predicts the time-dependent reliability of a thermomechanically loaded component. BIOSYM will become familiar with CARES/LIFE, provide results of computations useful in validating the code, evaluate it for potential commercialization, and submit suggestions for improvements or extensions to the code or its documentation. If BIOSYM/Molecular Simulations reaches a favorable evaluation of CARES/LIFE, NASA will enter into negotiations for a cooperative agreement with BIOSYM/Molecular Simulations to further develop the code--adding features such as a user-friendly interface and other improvements. This agreement would give BIOSYM intellectual property rights in the modified codes, which they could protect and then commercialize. NASA would provide BIOSYM with the NASA-developed source codes and would agree to cooperate with BIOSYM in further developing the code. In return, NASA would receive certain use rights in the modified CARES/LIFE program. Presently BIOSYM Technologies Inc. has been involved with integration issues concerning its merger with Molecular Simulations Inc., since both companies used to compete in the computational chemistry market, and to some degree, in the materials market. Consequently, evaluation of the CARES/LIFE software is on hold for a month or two while the merger is finalized. Their interest in CARES continues, however, and they expect to get back to the evaluation by early November 1995.
Code Blue Emergencies: A Team Task Analysis and Educational Initiative.
Price, James W; Applegarth, Oliver; Vu, Mark; Price, John R
2012-01-01
The objective of this study was to identify factors that have a positive or negative influence on resuscitation team performance during emergencies in the operating room (OR) and post-operative recovery unit (PAR) at a major Canadian teaching hospital. This information was then used to implement a team training program for code blue emergencies. In 2009/10, all OR and PAR nurses and 19 anesthesiologists at Vancouver General Hospital (VGH) were invited to complete an anonymous, 10-minute written questionnaire regarding their code blue experience. Survey questions were devised by 10 recovery room and operating room nurses as well as 5 anesthesiologists representing 4 different hospitals in British Columbia. Three iterations of the survey were reviewed by a pilot group of nurses and anesthesiologists, and their feedback was integrated into the final version of the survey. Both nursing staff (n = 49) and anesthesiologists (n = 19) supported code blue training and believed that team training would improve patient outcome. Nurses noted that it was often difficult to identify the leader of the resuscitation team. Both nursing staff and anesthesiologists strongly agreed that too many people attending the code blue with no assigned role hindered team performance. Identifiable leadership and clear communication of roles were identified as keys to resuscitation team functioning. Decreasing the number of people attending code blue emergencies with no specific role, increased access to mock code blue training, and debriefing after crises were all identified as areas requiring improvement. Initial team training exercises have been well received by staff.
Overview of the NASA Glenn Flux Reconstruction Based High-Order Unstructured Grid Code
NASA Technical Reports Server (NTRS)
Spiegel, Seth C.; DeBonis, James R.; Huynh, H. T.
2016-01-01
A computational fluid dynamics code based on the flux reconstruction (FR) method is currently being developed at NASA Glenn Research Center to ultimately provide a large-eddy simulation capability that is both accurate and efficient for complex aeropropulsion flows. The FR approach offers a simple and efficient method that is easy to implement and accurate to an arbitrary order on common grid cell geometries. The governing compressible Navier-Stokes equations are discretized in time using various explicit Runge-Kutta schemes, with the default being the 3-stage/3rd-order strong stability preserving scheme. The code is written in modern Fortran (i.e., Fortran 2008) and parallelization is attained through MPI for execution on distributed-memory high-performance computing systems. An h-refinement study of the isentropic Euler vortex problem is able to empirically demonstrate the capability of the FR method to achieve super-accuracy for inviscid flows. Additionally, the code is applied to the Taylor-Green vortex problem, performing numerous implicit large-eddy simulations across a range of grid resolutions and solution orders. The solution found by a pseudo-spectral code is commonly used as a reference solution to this problem, and the FR code is able to reproduce this solution using approximately the same grid resolution. Finally, an examination of the code's performance demonstrates good parallel scaling, as well as an implementation of the FR method with a computational cost/degree-of-freedom/time-step that is essentially independent of the solution order of accuracy for structured geometries.
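The abstract names the default time integrator but does not show code; the textbook Shu-Osher form of the 3-stage, 3rd-order strong stability preserving Runge-Kutta scheme can be sketched as follows (the scalar test problem and step size are illustrative, not from the NASA solver):

```python
import math

def ssprk3_step(f, u, t, dt):
    """One step of the 3-stage, 3rd-order strong stability preserving
    Runge-Kutta scheme (Shu-Osher form) for du/dt = f(t, u)."""
    u1 = u + dt * f(t, u)
    u2 = 0.75 * u + 0.25 * (u1 + dt * f(t + dt, u1))
    return u / 3.0 + 2.0 / 3.0 * (u2 + dt * f(t + 0.5 * dt, u2))

# Scalar test problem: du/dt = -u, exact solution u(t) = exp(-t).
u, t, dt = 1.0, 0.0, 0.01
for _ in range(100):
    u = ssprk3_step(lambda t, u: -u, u, t, dt)
    t += dt
print(abs(u - math.exp(-1.0)))  # small third-order global error
```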
Identification of coding and non-coding mutational hotspots in cancer genomes.
Piraino, Scott W; Furney, Simon J
2017-01-05
The identification of mutations that play a causal role in tumour development, so-called "driver" mutations, is of critical importance for understanding how cancers form and how they might be treated. Several large cancer sequencing projects have identified genes that are recurrently mutated in cancer patients, suggesting a role in tumourigenesis. While the landscape of coding drivers has been extensively studied and many of the most prominent driver genes are well characterised, comparatively less is known about the role of mutations in the non-coding regions of the genome in cancer development. The continuing fall in genome sequencing costs has resulted in a concomitant increase in the number of cancer whole genome sequences being produced, facilitating systematic interrogation of both the coding and non-coding regions of cancer genomes. To examine the mutational landscapes of tumour genomes we have developed a novel method to identify mutational hotspots in tumour genomes using both mutational data and information on evolutionary conservation. We have applied our methodology to over 1300 whole cancer genomes and show that it identifies prominent coding and non-coding regions that are known or highly suspected to play a role in cancer. Importantly, we applied our method to the entire genome, rather than relying on predefined annotations (e.g. promoter regions), and we highlight recurrently mutated regions that may have resulted from increased exposure to mutational processes rather than selection, some of which have been identified previously as targets of selection. Finally, we implicate several pan-cancer and cancer-specific candidate non-coding regions, which could be involved in tumourigenesis. We have developed a framework to identify mutational hotspots in cancer genomes, which is applicable to the entire genome. This framework identifies known and novel coding and non-coding mutational hotspots and can be used to differentiate candidate driver regions from likely passenger regions susceptible to somatic mutation.
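The framework described combines mutation recurrence with evolutionary conservation; as a much-simplified sketch of the recurrence component alone (the window size, significance threshold, and toy data below are assumptions for illustration), one can flag fixed-size windows whose mutation count is improbable under a uniform background model:

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def hotspot_windows(mutation_positions, genome_length, window=100, alpha=1e-3):
    """Flag fixed-size windows whose mutation count is improbably high
    under a uniform background model (no conservation term here)."""
    background = len(mutation_positions) / genome_length  # per-base rate
    counts = {}
    for pos in mutation_positions:
        counts[pos // window] = counts.get(pos // window, 0) + 1
    return [w for w, k in counts.items()
            if binom_sf(k, window, background) < alpha]

# Toy genome: sparse background mutations plus a dense cluster near 5000.
muts = list(range(0, 10000, 500)) + [5010, 5020, 5025, 5040, 5060, 5075]
print(hotspot_windows(muts, genome_length=10000))  # flags window 50
```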
New Methods for Design and Computation of Freeform Optics
2015-07-09
Vladimir Oliker, Emory University. Final Technical Report AFRL-OSR-VA-TR-2015-0160, reporting period May 01, 2012 - April 30, 2015, award FA9550-12--1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Catterall, Simon
This final report summarizes the work carried out by the Syracuse component of a multi-institutional SciDAC grant led by USQCD. This grant supported software development for theoretical high energy physics. The Syracuse component specifically targeted the development of code for the numerical simulation of N=4 super Yang-Mills theory. The final report describes this work and summarizes the results achieved in exploring the structure of this theory. It also describes the personnel (students and a postdoc) who were directly or indirectly involved in this project. A list of publications is also included.
Final technical report for DE-SC00012633 AToM (Advanced Tokamak Modeling)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Christopher; Orlov, Dmitri; Izzo, Valerie
This final report for the AToM project documents contributions from University of California, San Diego researchers over the period of 9/1/2014 – 8/31/2017. The primary focus of these efforts was on performing validation studies of core tokamak transport models using the OMFIT framework, including development of OMFIT workflow scripts. Additional work was performed to develop tools for use of the nonlinear magnetohydrodynamics code NIMROD in OMFIT, and its use in the study of runaway electron dynamics in tokamak disruptions.
Test-Case Generation using an Explicit State Model Checker Final Report
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Gao, Jimin
2003-01-01
In the project 'Test-Case Generation using an Explicit State Model Checker' we have extended an existing tool infrastructure for formal modeling to export Java code so that we can use the NASA Ames tool Java Pathfinder (JPF) for test-case generation. We have completed a translator from our source language RSML-e to Java and conducted initial studies of how JPF can be used as a testing tool. In this final report, we provide a detailed description of the translation approach as implemented in our tools.
Concreteness Effects and Syntactic Modification in Written Composition.
ERIC Educational Resources Information Center
Sadoski, Mark; Goetz, Ernest T.
1998-01-01
Investigates whether concreteness was related to a key characteristic of written composition--the cumulative sentence with a final modifier--which has been consistently associated with higher quality writing. Supports the conceptual-peg hypothesis of dual coding theory, with concrete verbs providing the pegs on which cumulative sentences are…
MHEG Based Distance Learning System on Information Superhighway.
ERIC Educational Resources Information Center
Lee, SeiHoon; Yoon, KyungSeob; Wang, ChangJong
As the need for distance education grows, requirements for the development of high-speed network-based real-time distance learning systems increase. MHEG-5 is the fifth part of the MHEG (Multimedia and Hypermedia information coding Experts Group) standard, and it defines a final-form representation for application interchange. This paper…
Consolidated Checklist for C8 Title 40 of the Code of Federal Regulations (CFR) Part 268
This Consolidated Checklist corresponds to the 40 CFR Part 268, published on July 1, 2002, and as amended by the following final rules: 67 FR 48393, July 24, 2002 (Revision Checklist 200); and 67 FR 62618, October 7, 2002 (Revision Checklist 201).
78 FR 75471 - Section 3504 Agent Employment Tax Liability
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... under section 3504 of the Internal Revenue Code to perform acts required of employers who are home care... home care services, which are subject to taxes under the Federal Unemployment Tax Act. The final... amendments to the existing regulatory language designed to update citations and be consistent with the...
NASA Technical Reports Server (NTRS)
Kowalski, E. J.
1979-01-01
A computerized method that uses engine performance data to estimate the installed performance of aircraft gas turbine engines is described. The installation effects accounted for include engine weight and dimensions, inlet and nozzle internal performance and drag, inlet and nacelle weight, and nacelle drag.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
...-AM78 Prevailing Rate Systems; North American Industry Classification System Based Federal Wage System... 2007 North American Industry Classification System (NAICS) codes currently used in Federal Wage System... (OPM) issued a final rule (73 FR 45853) to update the 2002 North American Industry Classification...
76 FR 21712 - Notice of Availability for Final PEA and Draft FONSI
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-18
... operation of small-scale wind energy projects at United States Marine Corps (USMC) facilities throughout the... NEPA (40 Code of Federal Regulations [CFR] Parts 1500-1508), and Marine Corps NEPA directives (Marine... FONSI are available for electronic viewing at http://marines.mil/unit/marforres/MFRHQ/FACILITIES...
GINSU: Guaranteed Internet Stack Utilization
2005-11-01
Trusted Information Systems, Inc. Final Technical Report AFRL-IF-RS-TR-2005-383, November 2005. Sponsored by the Defense Advanced Research Projects Agency (DARPA Order No. ARPS). Approved for public release. Subject terms: computer architecture, data links, Internet, protocol stacks.
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
40 CFR 1060.105 - What diurnal requirements apply for equipment?
Code of Federal Regulations, 2014 CFR
2014-07-01
...) AIR POLLUTION CONTROLS CONTROL OF EVAPORATIVE EMISSIONS FROM NEW AND IN-USE NONROAD AND STATIONARY... meet the diurnal emission standards adopted by the California Air Resources Board in the Final Regulation Order, Article 1, Chapter 15, Division 3, Title 13, California Code of Regulations, July 26, 2004...
40 CFR 1060.105 - What diurnal requirements apply for equipment?
Code of Federal Regulations, 2013 CFR
2013-07-01
...) AIR POLLUTION CONTROLS CONTROL OF EVAPORATIVE EMISSIONS FROM NEW AND IN-USE NONROAD AND STATIONARY... meet the diurnal emission standards adopted by the California Air Resources Board in the Final Regulation Order, Article 1, Chapter 15, Division 3, Title 13, California Code of Regulations, July 26, 2004...
40 CFR 1060.105 - What diurnal requirements apply for equipment?
Code of Federal Regulations, 2010 CFR
2010-07-01
...) AIR POLLUTION CONTROLS CONTROL OF EVAPORATIVE EMISSIONS FROM NEW AND IN-USE NONROAD AND STATIONARY... equipment may optionally meet the diurnal emission standards adopted by the California Air Resources Board in the Final Regulation Order, Article 1, Chapter 15, Division 3, Title 13, California Code of...
40 CFR 1060.105 - What diurnal requirements apply for equipment?
Code of Federal Regulations, 2011 CFR
2011-07-01
...) AIR POLLUTION CONTROLS CONTROL OF EVAPORATIVE EMISSIONS FROM NEW AND IN-USE NONROAD AND STATIONARY... meet the diurnal emission standards adopted by the California Air Resources Board in the Final Regulation Order, Article 1, Chapter 15, Division 3, Title 13, California Code of Regulations, July 26, 2004...
40 CFR 1060.105 - What diurnal requirements apply for equipment?
Code of Federal Regulations, 2012 CFR
2012-07-01
...) AIR POLLUTION CONTROLS CONTROL OF EVAPORATIVE EMISSIONS FROM NEW AND IN-USE NONROAD AND STATIONARY... meet the diurnal emission standards adopted by the California Air Resources Board in the Final Regulation Order, Article 1, Chapter 15, Division 3, Title 13, California Code of Regulations, July 26, 2004...
Automatic mathematical modeling for real time simulation program (AI application)
NASA Technical Reports Server (NTRS)
Wang, Caroline; Purinton, Steve
1989-01-01
A methodology is described for automatic mathematical modeling and simulation model generation. The major objective was to create a user-friendly environment for engineers to design, maintain, and verify their models; to automatically convert the mathematical models into conventional code for computation; and, finally, to document the models automatically.
21 CFR 133.5 - Methods of analysis.
Code of Federal Regulations, 2014 CFR
2014-04-01
... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture...: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html): (a) Moisture content—section 16.233 “Method I (52)—Official Final Action”, under the heading “Moisture”. (b) Milkfat...
77 FR 54917 - Findings of Research Misconduct
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-06
... values for inter-observer reliabilities when coding was done by only one observer, in both cases leading... Research Integrity (ORI) has taken final action in the following case: Marc Hauser, Ph.D., Harvard... collaborators that he miscoded some of the trials and that the study failed to provide support for the initial...
26 CFR 601.202 - Closing agreements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 20 2010-04-01 2010-04-01 false Closing agreements. 601.202 Section 601.202... STATEMENT OF PROCEDURAL RULES Rulings and Other Specific Matters § 601.202 Closing agreements. (a) General... fact, shall be final and conclusive. (2) Closing agreements under section 7121 of the Code may relate...
77 FR 50211 - Proposed Collection; Comment Request for Regulation Project
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-20
... taxation of fringe benefits and exclusions from gross income for certain fringe Benefits, listed property...-63-88 (Final and temporary regulations) Taxation of Fringe Benefits and Exclusions From Gross Income... Code section 274(d). The regulation also provides guidance on the taxation of fringe benefits and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... Promulgation of Implementation Plans; North Carolina; State Implementation Plan Miscellaneous Revisions AGENCY... a revision to the North Carolina State Implementation Plan submitted on February 3, 2010, through... particulate matter found in the Code of Federal Regulations. In the Final Rules Section of this Federal...
75 FR 11637 - Proposed Collection; Comment Request for Regulation Project
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-11
... DEPARTMENT OF THE TREASURY Internal Revenue Service [REG-114998-99] Proposed Collection; Comment... comments concerning an existing final regulation, REG-114998-99 (TD 8941), Obligations of States and... Number: REG-114998-99. Abstract: Section 421(f)(4) of the Internal Revenue Code of 1986 permits a person...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-05
... Counselor Certification Code of Professional Ethics; (2) VR services to transition-age youth; (3... social and electronic media, especially as it relates to confidentiality and appropriateness of the use of the media; (9) exposure to the business perspective; (10) critical thinking and decision-making...
76 FR 53906 - Availability of Final Toxicological Profile for RDX
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-30
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Toxic Substances and Disease Registry [ATSDR... Disease Registry (ATSDR), Department of Health and Human Services (HHS). ACTION: Notice of availability... 10 of the U.S. Code directs the Secretary of Defense to notify the Secretary of Health and Human...
75 FR 32659 - Contributed Property
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-09
... with the intent of subchapter K of the Code. The final regulations affect partnerships and their... general subchapter K anti-abuse rule found in Sec. 1.701-2. In light of the fact that these regulations... the partners' Federal tax liability in a manner inconsistent with the intent of subchapter K, the IRS...
Learning Compact Binary Face Descriptor for Face Recognition.
Lu, Jiwen; Liong, Venice Erin; Zhou, Xiuzhuang; Zhou, Jie
2015-10-01
Binary feature descriptors such as local binary patterns (LBP) and its variations have been widely used in many face recognition systems due to their excellent robustness and strong discriminative power. However, most existing binary face descriptors are hand-crafted, which requires strong prior knowledge to engineer them by hand. In this paper, we propose a compact binary face descriptor (CBFD) feature learning method for face representation and recognition. Given each face image, we first extract pixel difference vectors (PDVs) in local patches by computing the difference between each pixel and its neighboring pixels. Then, we learn a feature mapping to project these pixel difference vectors into low-dimensional binary vectors in an unsupervised manner, where 1) the variance of all binary codes in the training set is maximized, 2) the loss between the original real-valued codes and the learned binary codes is minimized, and 3) the binary codes distribute evenly over the learned bins, so that the redundant information in PDVs is removed and compact binary codes are obtained. Lastly, we cluster and pool these binary codes into a histogram feature as the final representation for each face image. Moreover, we propose a coupled CBFD (C-CBFD) method that reduces the modality gap of heterogeneous faces at the feature level, making our method applicable to heterogeneous face recognition. Extensive experimental results on five widely used face datasets show that our methods outperform state-of-the-art face descriptors.
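A heavily simplified sketch of the pipeline the abstract describes, with a random projection standing in for the learned feature mapping (the paper learns that mapping by optimizing the three criteria listed; all sizes and the random inputs here are illustrative assumptions):

```python
import numpy as np

def pdvs(image):
    """Pixel difference vectors: each interior pixel minus its 8 neighbours."""
    h, w = image.shape
    vecs = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            centre = image[y, x]
            neigh = [image[y + dy, x + dx]
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0)]
            vecs.append([n - centre for n in neigh])
    return np.array(vecs, dtype=float)

def binary_histogram(image, projection):
    """Project PDVs, binarize by sign, and pool the binary codes into a
    normalized histogram (the final per-image representation)."""
    codes = (pdvs(image) @ projection) > 0            # one binary code per pixel
    ints = codes.astype(int) @ (1 << np.arange(codes.shape[1]))  # code -> bin
    hist = np.bincount(ints, minlength=2 ** codes.shape[1])
    return hist / hist.sum()

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(16, 16))
proj = rng.standard_normal((8, 4))  # stand-in for the learned mapping
print(binary_histogram(img, proj).shape)  # 2^4 = 16 histogram bins
```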
NASA Astrophysics Data System (ADS)
Eaves, Nick A.; Zhang, Qingan; Liu, Fengshan; Guo, Hongsheng; Dworkin, Seth B.; Thomson, Murray J.
2016-10-01
Mitigation of soot emissions from combustion devices is a global concern. For example, recent EURO 6 regulations for vehicles have placed stringent limits on soot emissions. Design engineers can only achieve the goal of reduced soot emissions if they have the tools to do so. Due to the complex nature of soot formation, which includes growth and oxidation, detailed numerical models are required to gain fundamental insights into the mechanisms of soot formation. A detailed description is given of the CoFlame FORTRAN code, which models sooting laminar coflow diffusion flames. The code solves axial and radial velocity, temperature, species conservation, and soot aggregate and primary particle number density equations. The sectional particle dynamics model includes nucleation, PAH condensation and HACA surface growth, surface oxidation, coagulation, fragmentation, particle diffusion, and thermophoresis. The code utilizes a distributed memory parallelization scheme with strip-domain decomposition. The public release of the CoFlame code, which has been refined in terms of coding structure, to the research community accompanies this paper. CoFlame is validated against experimental data for reattachment length in an axi-symmetric pipe with a sudden expansion, and ethylene-air and methane-air diffusion flames for multiple soot morphological parameters and gas-phase species. Finally, the parallel performance and computational costs of the code are investigated.
Uppal, Shitanshu; Shahin, Mark S; Rathbun, Jill A; Goff, Barbara A
2017-02-01
In 2015, there was an 18% reduction in the Relative Value Units (RVUs) that the Center for Medicare and Medicaid Services (CMS) assigned to the Current Procedural Terminology (CPT) code 58571 (Laparoscopy, surgical, with total hysterectomy, for uterus 250 g or less; with removal of tube(s) and/or ovary(s); i.e., TLH+BSO). The other CPT codes for laparoscopic hysterectomy and laparoscopic supracervical hysterectomy (58541-58544 and 58570-58573) lost between 12 and 23% of their assigned RVUs. In 2016, the laparoscopic lymph node dissection codes 38570 (Laparoscopy, surgical; with retroperitoneal lymph node sampling (biopsy), single or multiple), 38571 (Laparoscopy, surgical; with bilateral total pelvic lymphadenectomy), and 38572 (Laparoscopy, surgical; with bilateral total pelvic lymphadenectomy and para-aortic lymph node sampling (biopsy), single or multiple) lost between 5.5 and 16.3% of their RVUs. The goals of this article from the Society of Gynecologic Oncology (SGO) Task Force on Coding and Reimbursement are 1) to inform SGO members about why CMS identified these codes as part of its misvalued-services screening program and then finalized a reduction in their payment levels; and 2) to outline the role individual providers have in the methodology CMS uses to determine the reimbursement of a surgical procedure. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Myhill, Elizabeth A.; Boss, Alan P.
1993-01-01
In Boss & Myhill (1992) we described the derivation and testing of a spherical coordinate-based scheme for solving the hydrodynamic equations governing the gravitational collapse of nonisothermal, nonmagnetic, inviscid, radiative, three-dimensional protostellar clouds. Here we discuss a Cartesian coordinate-based scheme based on the same set of hydrodynamic equations. As with the spherical coordinate-based code, the Cartesian coordinate-based scheme employs explicit Eulerian methods which are both spatially and temporally second-order accurate. We begin by describing the hydrodynamic equations in Cartesian coordinates and the numerical methods used in this particular code. Following Finn & Hawley (1989), we pay special attention to the proper implementation of high-order accuracy, finite difference methods. We evaluate the ability of the Cartesian scheme to handle shock propagation problems, and through convergence testing, we show that the code is indeed second-order accurate. To compare the Cartesian scheme discussed here with the spherical coordinate-based scheme discussed in Boss & Myhill (1992), the two codes are used to calculate the standard isothermal collapse test case described by Bodenheimer & Boss (1981). We find that with the improved codes, the intermediate bar-configuration found previously disappears, and the cloud fragments directly into a binary protostellar system. Finally, we present the results from both codes of a new test for nonisothermal protostellar collapse.
Klann, Jeffrey G; Phillips, Lori C; Turchin, Alexander; Weiler, Sarah; Mandl, Kenneth D; Murphy, Shawn N
2015-12-11
Interoperable phenotyping algorithms, needed to identify patient cohorts meeting eligibility criteria for observational studies or clinical trials, require medical data in a consistent structured, coded format. Data heterogeneity limits such algorithms' applicability. Existing approaches are often: not widely interoperable; or, have low sensitivity due to reliance on the lowest common denominator (ICD-9 diagnoses). In the Scalable Collaborative Infrastructure for a Learning Healthcare System (SCILHS) we endeavor to use the widely-available Current Procedural Terminology (CPT) procedure codes with ICD-9. Unfortunately, CPT changes drastically year-to-year - codes are retired/replaced. Longitudinal analysis requires grouping retired and current codes. BioPortal provides a navigable CPT hierarchy, which we imported into the Informatics for Integrating Biology and the Bedside (i2b2) data warehouse and analytics platform. However, this hierarchy does not include retired codes. We compared BioPortal's 2014AA CPT hierarchy with Partners Healthcare's SCILHS datamart, comprising three-million patients' data over 15 years. 573 CPT codes were not present in 2014AA (6.5 million occurrences). No existing terminology provided hierarchical linkages for these missing codes, so we developed a method that automatically places missing codes in the most specific "grouper" category, using the numerical similarity of CPT codes. Two informaticians reviewed the results. We incorporated the final table into our i2b2 SCILHS/PCORnet ontology, deployed it at seven sites, and performed a gap analysis and an evaluation against several phenotyping algorithms. The reviewers found the method placed the code correctly with 97 % precision when considering only miscategorizations ("correctness precision") and 52 % precision using a gold-standard of optimal placement ("optimality precision"). 
High correctness precision meant that codes were placed in a reasonable hierarchical position that a reviewer can quickly validate. Lower optimality precision meant that codes were often not placed in the optimal hierarchical subfolder. The seven sites encountered few occurrences of codes outside our ontology, 93% of which comprised just four codes. Our hierarchical approach correctly grouped retired and non-retired codes in most cases and extended the temporal reach of several important phenotyping algorithms. We developed a simple, easily validated, automated method to place retired CPT codes into the BioPortal CPT hierarchy. This complements existing hierarchical terminologies, which do not include retired codes. The approach's utility is confirmed by the high correctness precision and the successful grouping of retired with non-retired codes.
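The numeric-similarity placement described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the grouper ranges, their labels, and the width-based notion of "most specific" are assumptions made for the example.

```python
# Illustrative reconstruction of numeric-similarity placement: a missing
# (e.g. retired) CPT code is assigned to the narrowest "grouper" whose
# numeric range contains it. The ranges and labels below are invented for
# the example and are not the BioPortal hierarchy.

def place_code(code, groupers):
    """Return the most specific grouper whose inclusive range contains `code`."""
    containing = [(hi - lo, label)
                  for label, (lo, hi) in groupers.items()
                  if lo <= code <= hi]
    if not containing:
        return None                      # falls outside every grouper
    return min(containing)[1]            # narrowest containing range wins

groupers = {
    "Surgery (10000-69999)": (10000, 69999),
    "Musculoskeletal (20000-29999)": (20000, 29999),
    "Spine (22000-22899)": (22000, 22899),
}

print(place_code(22305, groupers))       # most specific match: the Spine grouper
```

A real implementation would walk the imported i2b2 ontology table rather than a hand-built dictionary, but the selection rule is the same.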
Overview of the CHarring Ablator Response (CHAR) Code
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Oliver, A. Brandon; Kirk, Benjamin S.; Salazar, Giovanni; Droba, Justin
2016-01-01
An overview of the capabilities of the CHarring Ablator Response (CHAR) code is presented. CHAR is a one-, two-, and three-dimensional unstructured continuous Galerkin finite-element heat conduction and ablation solver with both direct and inverse modes. Additionally, CHAR includes a coupled linear thermoelastic solver for determination of internal stresses induced from the temperature field and surface loading. Background on the development process, governing equations, material models, discretization techniques, and numerical methods is provided. Special focus is put on the available boundary conditions including thermochemical ablation and contact interfaces, and example simulations are included. Finally, a discussion of ongoing development efforts is presented.
Overview of the CHarring Ablator Response (CHAR) Code
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Oliver, A. Brandon; Kirk, Benjamin S.; Salazar, Giovanni; Droba, Justin
2016-01-01
An overview of the capabilities of the CHarring Ablator Response (CHAR) code is presented. CHAR is a one-, two-, and three-dimensional unstructured continuous Galerkin finite-element heat conduction and ablation solver with both direct and inverse modes. Additionally, CHAR includes a coupled linear thermoelastic solver for determination of internal stresses induced from the temperature field and surface loading. Background on the development process, governing equations, material models, discretization techniques, and numerical methods is provided. Special focus is put on the available boundary conditions including thermochemical ablation, surface-to-surface radiation exchange, and flowfield coupling. Finally, a discussion of ongoing development efforts is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharr, S.; Peckham, C.J.; Sharp, D.G.
1995-11-01
Coded wire tags applied to pink salmon fry in 1992 at four hatcheries in Prince William Sound were recovered in the commercial catch of 1993 and used to provide inseason estimates of hatchery contributions. These estimates were used by fishery managers to target the numerically superior hatchery returns, and reduce the pressure on oil-damaged wild stocks. Inseason estimates were made in two stages. The postseason analysis revealed that of a catch of 3.51 million pink salmon, 1.12 million were estimated to be of wild origin.
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
NASA Technical Reports Server (NTRS)
Carpenter, M. H.
1988-01-01
The generalized chemistry version of the computer code SPARK is extended to include two higher-order numerical schemes, yielding fourth-order spatial accuracy for the inviscid terms. The new and old formulations are used to study the influences of finite rate chemical processes on nozzle performance. A determination is made of the computationally optimum reaction scheme for use in high-enthalpy nozzles. Finite rate calculations are compared with the frozen and equilibrium limits to assess the validity of each formulation. In addition, the finite rate SPARK results are compared with the constant ratio of specific heats (gamma) SEAGULL code, to determine its accuracy in variable gamma flow situations. Finally, the higher-order SPARK code is used to calculate nozzle flows having species stratification. Flame quenching occurs at low nozzle pressures, while for high pressures, significant burning continues in the nozzle.
Dusty Plasmas in Planetary Magnetospheres Award
NASA Technical Reports Server (NTRS)
Horanyi, Mihaly
2005-01-01
This is my final report for the grant Dusty Plasmas in Planetary Magnetospheres. The funding from this grant supported our research on dusty plasmas to study: a) dust plasma interactions in general plasma environments, and b) dusty plasma processes in planetary magnetospheres (Earth, Jupiter and Saturn). We have developed a general purpose transport code in order to follow the spatial and temporal evolution of dust density distributions in magnetized plasma environments. The code allows the central body to be represented by a multipole expansion of its gravitational and magnetic fields. The density and the temperature of the possibly many-component plasma environment can be pre-defined as a function of coordinates and, if necessary, the time as well. The code simultaneously integrates the equations of motion with the equations describing the charging processes. The charging currents are dependent not only on the instantaneous plasma parameters but on the velocity, as well as on the previous charging history of the dust grains.
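The simultaneous integration of the equations of motion with the charging equations can be illustrated with a minimal explicit-Euler sketch, so that the Lorentz force always uses the instantaneous grain charge. The relaxation charging law and every parameter value below are schematic placeholders, not the transport code described in the report.

```python
import numpy as np

# Minimal sketch of co-integrating a dust grain's equation of motion with
# its charging equation. The charging law (relaxation toward a fixed
# equilibrium charge q_eq) and all parameters are illustrative assumptions.

def step(x, v, q, dt, m=1e-15, B=np.array([0.0, 0.0, 1e-5]), E=np.zeros(3)):
    q_eq, tau = -1e-17, 10.0              # schematic equilibrium charge, time
    dq = (q_eq - q) / tau                 # net charging current
    a = (q / m) * (E + np.cross(v, B))    # Lorentz acceleration, current charge
    return x + dt * v, v + dt * a, q + dt * dq

x, v, q = np.zeros(3), np.array([1.0, 0.0, 0.0]), 0.0
for _ in range(1000):                     # advance 100 s in 0.1 s steps
    x, v, q = step(x, v, q, dt=0.1)
# by now the charge has relaxed to ~q_eq while the grain drifted along x
```

In the actual code the plasma density and temperature, and hence the charging currents, vary with position and time, which is why the two systems must be advanced together.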
The ethically trained physician: myth or reality?
Balkos, G. K.
1983-01-01
Through a questionnaire distributed to 300 physicians in the Toronto area, three aspects of their ethical awareness were examined: the formal codes, the need for consultation in making decisions and the need for training in medical ethics. Most of the physicians (81%) felt that they were facing ethical problems in their daily practice. A majority of these would try to solve the problems either themselves (30%) or through discussion with a colleague (43%). When they turned outside the profession it was sometimes to a lawyer (12%), which suggests concern with the legalities of some situations. Only a small proportion of the respondents were found to be familiar with two of the established codes of ethics, yet 13% would still turn to the code of the Canadian Medical Association for guidance. Finally, there was widespread recognition of the need for proper training in medical ethics and for the establishment of a specialty in this field. PMID:6825034
Creating reference gene annotation for the mouse C57BL6/J genome assembly.
Mudge, Jonathan M; Harrow, Jennifer
2015-10-01
Annotation on the reference genome of the C57BL6/J mouse has been an ongoing project ever since the draft genome was first published. Initially, the principal focus was on the identification of all protein-coding genes, although today the importance of describing long non-coding RNAs, small RNAs, and pseudogenes is recognized. Here, we describe the progress of the GENCODE mouse annotation project, which combines manual annotation from the HAVANA group with Ensembl computational annotation, alongside experimental and in silico validation pipelines from other members of the consortium. We discuss the more recent incorporation of next-generation sequencing datasets into this workflow, including the usage of mass-spectrometry data to potentially identify novel protein-coding genes. Finally, we will outline how the C57BL6/J genebuild can be used to gain insights into the variant sites that distinguish different mouse strains and species.
NASA Technical Reports Server (NTRS)
Thompson, David E.
2005-01-01
Procedures and methods for verification of coding algebra and for validation of models and calculations used in the aerospace computational fluid dynamics (CFD) community would be efficacious if used by the glacier dynamics modeling community. This paper presents some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modeling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modeling community, and establishes a context for these within an overall solution quality assessment. Finally, a vision of a new information architecture and interactive scientific interface is introduced and advocated.
Numerical simulation of turbulent jet noise, part 2
NASA Technical Reports Server (NTRS)
Metcalfe, R. W.; Orszag, S. A.
1976-01-01
Results on the numerical simulation of jet flow fields were used to study the radiated sound field, and in addition, to extend and test the capabilities of the turbulent jet simulation codes. The principal result of the investigation was the computation of the radiated sound field from a turbulent jet. In addition, the computer codes were extended to account for the effects of compressibility and eddy viscosity, and the treatment of the nonlinear terms of the Navier-Stokes equations was modified so that they can be computed in a semi-implicit way. A summary of the flow model and a description of the numerical methods used for its solution are presented. Calculations of the radiated sound field are reported. In addition, the extensions that were made to the fundamental dynamical codes are described. Finally, the current state-of-the-art for computer simulation of turbulent jet noise is summarized.
Tartarus: A relativistic Green's function quantum average atom code
Gill, Nathanael Matthew; Starrett, Charles Edward
2017-06-28
A relativistic Green’s Function quantum average atom model is implemented in the Tartarus code for the calculation of equation of state data in dense plasmas. We first present the relativistic extension of the quantum Green’s Function average atom model described by Starrett [1]. The Green’s Function approach addresses the numerical challenges arising from resonances in the continuum density of states without the need for resonance tracking algorithms or adaptive meshes, though there are still numerical challenges inherent to this algorithm. We discuss how these challenges are addressed in the Tartarus algorithm. The outputs of the calculation are shown in comparison to PIMC/DFT-MD simulations of the Principal Shock Hugoniot in Silicon. Finally, we also present the calculation of the Hugoniot for Silver coming from both the relativistic and nonrelativistic modes of the Tartarus code.
The ePLAS code for Ignition Studies
NASA Astrophysics Data System (ADS)
Faehl, R. J.; Mason, R. J.; Kirkpatrick, R. C.
2012-10-01
The ePLAS code is a multi-fluid/PIC hybrid developing self-consistent E & B-fields by the Implicit Moment Method for stable calculations of high density plasma problems with voids on the electron Courant time scale. See: http://www.researchapplicationscorp.com. Here, we outline typical applications to: 1) short pulse driven electron transport along void (or high Z) insulated wires, and 2) the 2D development of shock ignition pressure peaks with B-fields. We outline the code's recent inclusion of SESAME EOS data, a DT/DD burn capability, a new option for K-alpha imaging of modeling output, and demonstrate a foil expansion tracked with either fluid or particle ions. Also, we describe a new super-hybrid extension of our implicit solver that permits full target dynamics studies on the ion Courant scale. Finally, we will touch on the very recent application of ePLAS to possible non-local/kinetic hydro effects in NIF capsules.
Assessing Attentional Prioritization of Front-of-Pack Nutrition Labels using Change Detection
Becker, Mark W.; Sundar, Raghav Prashant; Bello, Nora; Alzahabi, Reem; Weatherspoon, Lorraine; Bix, Laura
2015-01-01
We used a change detection method to evaluate attentional prioritization of nutrition information that appears in the traditional “Nutrition Facts Panel” and in front-of-pack nutrition labels. Results provide compelling evidence that front-of-pack labels attract attention more readily than the Nutrition Facts Panel, even when participants are not specifically tasked with searching for nutrition information. Further, color-coding the relative nutritional value of key nutrients within the front-of-pack label resulted in increased attentional prioritization of nutrition information, but coding using facial icons did not significantly increase attention to the label. Finally, the general pattern of attentional prioritization across front-of-pack designs was consistent across a diverse sample of participants. Our results indicate that color-coded, front-of-pack nutrition labels increase attention to the nutrition information of packaged food, a finding that has implications for current policy discussions regarding labeling change. PMID:26851468
Load management strategy for Particle-In-Cell simulations in high energy particle acceleration
NASA Astrophysics Data System (ADS)
Beck, A.; Frederiksen, J. T.; Dérouillat, J.
2016-09-01
In the wake of the intense effort made for the experimental CILEX project, numerical simulation campaigns have been carried out in order to finalize the design of the facility and to identify optimal laser and plasma parameters. These simulations bring, of course, important insight into the fundamental physics at play. As a by-product, they also characterize the quality of our theoretical and numerical models. In this paper, we compare the results given by different codes and point out algorithmic limitations both in terms of physical accuracy and computational performances. These limitations are illustrated in the context of electron laser wakefield acceleration (LWFA). The main limitation we identify in state-of-the-art Particle-In-Cell (PIC) codes is computational load imbalance. We propose an innovative algorithm to deal with this specific issue as well as milestones towards a modern, accurate high-performance PIC code for high energy particle acceleration.
Sparse representation-based image restoration via nonlocal supervised coding
NASA Astrophysics Data System (ADS)
Li, Ao; Chen, Deyun; Sun, Guanglu; Lin, Kezheng
2016-10-01
Sparse representation (SR) and the nonlocal technique (NLT) have shown great potential in low-level image processing. However, due to the degradation of the observed image, SR and NLT may not be accurate enough to obtain faithful restoration results when they are used independently. To improve performance, in this paper, a nonlocal supervised coding strategy-based NLT for image restoration is proposed. The novel method has three main contributions. First, to exploit the useful nonlocal patches, a nonnegative sparse representation is introduced, whose coefficients can be utilized as the supervised weights among patches. Second, a novel objective function is proposed, which integrates supervised weight learning and nonlocal sparse coding to guarantee a more promising solution. Finally, to make the minimization tractable and convergent, a numerical scheme based on iterative shrinkage thresholding is developed to solve the underdetermined inverse problem. Extensive experiments validate the effectiveness of the proposed method.
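The iterative shrinkage-thresholding scheme named in the abstract can be sketched in its standard textbook form for the generic l1-regularized least-squares problem. This is the plain ISTA iteration on a synthetic problem, not the authors' full nonlocal supervised objective, whose supervised weights are omitted here.

```python
import numpy as np

# Standard ISTA sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
# The synthetic sparse-recovery problem below is illustrative only.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=1000):
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)         # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 40))
x_true = np.zeros(40)
x_true[[3, 17]] = [1.5, -2.0]            # a 2-sparse ground truth
b = A @ x_true                           # noiseless observations
x_hat = ista(A, b, lam=0.05)             # recovers a sparse solution
```

Each iteration is a gradient step on the smooth data-fit term followed by a soft-thresholding step that enforces sparsity, which is what makes the otherwise underdetermined inverse problem well behaved.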
NASA Astrophysics Data System (ADS)
Bogdanchikov, A.; Zhaparov, M.; Suliyev, R.
2013-04-01
Today we have many programming languages that can meet our needs, but the most important question is how to teach programming to beginning students. In this paper we suggest using Python for this purpose, because it is a programming language with neatly organized syntax and powerful tools to solve any task. Moreover, it is very close to simple mathematical thinking. Python has been chosen as the primary programming language for freshmen at most leading universities. Writing code in Python is easy. In this paper we give examples of program code written in Java, C++, and Python, and we make a comparison between them. First, this paper presents the advantages of Python in relation to C++ and Java. Then it shows the results of a comparison of short program codes written in the three languages, followed by a discussion of how students understand programming. Finally, experimental results on students' success in programming courses are shown.
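As an illustration of the kind of comparison the paper makes (its actual example programs are not reproduced here), a complete Python program for a small task stays close to the underlying arithmetic, with none of the class or main-method scaffolding that the equivalent Java or C++ program would require:

```python
# A complete program: average a list of numbers and print the result.
# In Java or C++ the same task needs a class/main() scaffold, explicit
# types, and I/O stream setup; here the code mirrors the math directly.

def average(numbers):
    return sum(numbers) / len(numbers)

if __name__ == "__main__":
    print(average([4, 8, 15, 16, 23, 42]))  # -> 18.0
```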
NASA Technical Reports Server (NTRS)
Mularz, Edward J.; Sockol, Peter M.
1987-01-01
Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal air stream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at Lewis to better understand chemical reacting flows, with the long-term goal of establishing these reliable computer codes. The approach to understanding chemical reacting flows is to look at separate, simple parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, research on the fluid mechanics associated with chemical reacting flows was initiated. The chemistry of fuel-air combustion is also being studied. Finally, the phenomenon of turbulence-combustion interaction is being investigated. This presentation will highlight research, both experimental and analytical, in each of these three major areas.
NASA Technical Reports Server (NTRS)
Mularz, Edward J.; Sockol, Peter M.
1990-01-01
Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal airstream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at LeRC to better understand chemical reacting flows with the long-term goal of establishing these reliable computer codes. Our approach to understand chemical reacting flows is to look at separate, more simple parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, we are engaged in research on the fluid mechanics associated with chemical reacting flows. We are also studying the chemistry of fuel-air combustion. Finally, we are investigating the phenomenon of turbulence-combustion interaction. Research, both experimental and analytical, is highlighted in each of these three major areas.
LTCP 2D Graphical User Interface. Application Description and User's Guide
NASA Technical Reports Server (NTRS)
Ball, Robert; Navaz, Homayun K.
1996-01-01
A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) two-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.
Region-Based Prediction for Image Compression in the Cloud.
Begaint, Jean; Thoreau, Dominique; Guillotel, Philippe; Guillemot, Christine
2018-04-01
Thanks to the increasing number of images stored in the cloud, external image similarities can be leveraged to efficiently compress images by exploiting inter-images correlations. In this paper, we propose a novel image prediction scheme for cloud storage. Unlike current state-of-the-art methods, we use a semi-local approach to exploit inter-image correlation. The reference image is first segmented into multiple planar regions determined from matched local features and super-pixels. The geometric and photometric disparities between the matched regions of the reference image and the current image are then compensated. Finally, multiple references are generated from the estimated compensation models and organized in a pseudo-sequence to differentially encode the input image using classical video coding tools. Experimental results demonstrate that the proposed approach yields significant rate-distortion performance improvements compared with the current image inter-coding solutions such as high efficiency video coding.
A Simple Secure Hash Function Scheme Using Multiple Chaotic Maps
NASA Astrophysics Data System (ADS)
Ahmad, Musheer; Khurana, Shruti; Singh, Sushmita; AlSharari, Hamed D.
2017-06-01
Chaotic maps possess high parameter sensitivity, random-like behavior, and one-way computation, which favor the construction of cryptographic hash functions. In this paper, we present a novel hash function scheme which uses multiple chaotic maps to generate efficient variable-sized hash functions. The message is divided into four parts; each part is processed by a different 1D chaotic map unit, yielding an intermediate hash code. The four codes are concatenated into two blocks, then each block is processed through a 2D chaotic map unit separately. The final hash value is generated by combining the two partial hash codes. Simulation analyses such as distribution of hashes, statistical properties of confusion and diffusion, message and key sensitivity, collision resistance, and flexibility are performed. The results reveal that the proposed hash scheme is simple and efficient, and holds comparable capabilities when compared with some recent chaos-based hash algorithms.
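The four-part pipeline can be illustrated with a deliberately simplified toy that uses the logistic map as the 1D chaotic unit. The seeding rule, the iteration count, and the use of pairwise products in place of a true 2D chaotic map are all simplifying assumptions for this sketch; the authors' actual maps and parameters differ.

```python
# Toy illustration of the pipeline: split the message into four parts,
# drive a 1D chaotic (logistic) map with each part, then mix the
# intermediate codes pairwise into a final hash value. Not the authors'
# construction; a sketch of the structure only.

def logistic_iterate(x, r=3.99, n=64):
    for _ in range(n):
        x = r * x * (1.0 - x)            # logistic map, chaotic for r ~ 3.99
    return x

def toy_chaotic_hash(message: bytes, bits=32) -> int:
    parts = [message[i::4] for i in range(4)]      # four interleaved parts
    codes = []
    for part in parts:
        # seed the map in (0, 1) from the part's bytes and their positions
        seed = (sum((b + 1) * (i + 1) for i, b in enumerate(part)) % 9973 + 1) / 9975.0
        codes.append(logistic_iterate(seed))
    left = int(codes[0] * codes[1] * 2 ** bits)    # pairwise combination
    right = int(codes[2] * codes[3] * 2 ** bits)
    return (left ^ right) % 2 ** bits              # fold into a bits-bit value

h = toy_chaotic_hash(b"hello world")
```

The chaotic iteration supplies the one-way, high-sensitivity behavior: a one-byte change in the message perturbs a seed, and the trajectories diverge long before the final mixing step.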
Portable multi-node LQCD Monte Carlo simulations using OpenACC
NASA Astrophysics Data System (ADS)
Bonati, Claudio; Calore, Enrico; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Sanfilippo, Francesco; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele
This paper describes a state-of-the-art parallel Lattice QCD Monte Carlo code for staggered fermions, purposely designed to be portable across different computer architectures, including GPUs and commodity CPUs. Portability is achieved using the OpenACC parallel programming model, used to develop a code that can be compiled for several processor architectures. The paper focuses on parallelization across multiple computing nodes, using OpenACC to manage parallelism within the node and OpenMPI to manage parallelism among the nodes. We first discuss the available strategies to be adopted to maximize performance, we then describe selected relevant details of the code, and finally measure the performance and scaling that we are able to achieve. The work focuses mainly on GPUs, which offer a significantly higher level of performance for this application, but also compares with results measured on other processors.
NASA Technical Reports Server (NTRS)
Tew, Roy; Ibrahim, Mounir; Simon, Terry; Mantell, Susan; Gedeon, David; Qiu, Songgang; Wood, Gary
2004-01-01
This paper will report on continuation through the third year of a NASA grant for multi-dimensional Stirling CFD code development and validation; continuation through the third and final year of a Department of Energy, Golden Field Office (DOE), regenerator research effort and a NASA grant for continuation of the effort through two additional years; and a new NASA Research Award for design, microfabrication, and testing of a "Next Generation Stirling Engine Regenerator." Cleveland State University (CSU) is the lead organization for all three efforts, with the University of Minnesota (UMN) and Gedeon Associates as subcontractors. The Stirling Technology Company and Sunpower, Inc. acted as unfunded consultants or participants through the third years of both the NASA multi-D code development and DOE regenerator research efforts; they will both be subcontractors on the new regenerator microfabrication contract.
KEWPIE2: A cascade code for the study of dynamical decay of excited nuclei
NASA Astrophysics Data System (ADS)
Lü, Hongliang; Marchix, Anthony; Abe, Yasuhisa; Boilley, David
2016-03-01
KEWPIE, a cascade code devoted to investigating the dynamical decay of excited nuclei, specially designed for treating the very-low-probability events related to the synthesis of super-heavy nuclei formed in fusion-evaporation reactions, has been improved and rewritten in the C++ programming language to become KEWPIE2. The current version of the code comprises various nuclear models concerning light-particle emission, the fission process, and the statistical properties of excited nuclei. General features of the code, such as the numerical scheme and the main physical ingredients, are described in detail. Typical calculations performed in the present paper show that theoretical predictions are generally in accordance with experimental data. Furthermore, since the values of some input parameters cannot be determined either theoretically or experimentally, a sensitivity analysis is presented. To this end, we systematically investigate the effects of using different parameter values and reaction models on the final results. As expected, in the case of heavy nuclei, the fission process plays the most crucial role in theoretical predictions. This work would be essential for numerical modeling of fusion-evaporation reactions.
Quick reproduction of blast-wave flow-field properties of nuclear, TNT, and ANFO explosions
NASA Astrophysics Data System (ADS)
Groth, C. P. T.
1986-04-01
In many instances, extensive blast-wave flow-field properties are required in gasdynamics research studies of blast-wave loading and structure response, and in evaluating the effects of explosions on their environment. This report provides a very useful computer code, which can be used in conjunction with the DNA Nuclear Blast Standard subroutines and code, to quickly reconstruct complete and fairly accurate blast-wave data for almost any free-air (spherical) and surface-burst (hemispherical) nuclear, trinitrotoluene (TNT), or ammonium nitrate-fuel oil (ANFO) explosion. This code is capable of computing all of the main flow properties as functions of radius and time, as well as providing additional information regarding air viscosity, reflected shock-wave properties, and the initial decay of the flow properties just behind the shock front. Both spatial and temporal distributions of the major blast-wave flow properties are also made readily available. Finally, provisions are also included in the code to provide additional information regarding the peak or shock-front flow properties over a range of radii, for a specific explosion of interest.
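One relationship underlying this kind of yield-generalized reconstruction is Hopkinson-Cranz cube-root scaling, sketched below. It is a generic textbook relation, included only to make the yield generalization concrete, and is not code from the report.

```python
# Hopkinson-Cranz cube-root scaling: two explosions of yields W1 and W2
# produce the same peak blast properties at ranges with equal scaled
# distance Z = R / W**(1/3). Generic illustration, not the report's code.

def scaled_distance(radius_m, yield_kt):
    return radius_m / yield_kt ** (1.0 / 3.0)

def equivalent_range(radius_ref_m, yield_ref_kt, yield_new_kt):
    """Range at which `yield_new_kt` reproduces the reference conditions."""
    return radius_ref_m * (yield_new_kt / yield_ref_kt) ** (1.0 / 3.0)

# conditions found 1000 m from a 1-kt burst occur near 2000 m for 8 kt
print(equivalent_range(1000.0, 1.0, 8.0))
```

Scaling of this kind is what lets a single set of reference curves (such as the DNA Nuclear Blast Standard) be stretched to almost any free-air or surface-burst yield.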
Low-rate image coding using vector quantization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makur, A.
1990-01-01
This thesis deals with the development and analysis of a computationally simple vector quantization image compression system for coding monochrome images at a low bit rate. Vector quantization has been known to be an effective compression scheme when a low bit rate is desirable, but the intensive computation required in a vector quantization encoder has been a handicap in using it for low-rate image coding. The present work shows that, without substantially increasing the coder complexity, it is indeed possible to achieve acceptable picture quality while attaining a high compression ratio. Several modifications to the conventional vector quantization coder are proposed in the thesis. These modifications are shown to offer better subjective quality when compared to the basic coder. Distributed blocks are used instead of spatial blocks to construct the input vectors. A class of input-dependent weighted distortion functions is used to incorporate psychovisual characteristics in the distortion measure. Computationally simple filtering techniques are applied to further improve the decoded image quality. Finally, unique designs of the vector quantization coder using electronic neural networks are described, so that the coding delay is reduced considerably.
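The core encoder operation of any vector quantization system, nearest-codeword search, can be sketched as follows. The random codebook and plain squared-error distortion are placeholders for the thesis's trained codebooks, distributed blocks, and input-dependent weighted distortions.

```python
import numpy as np

# Nearest-codeword VQ encode/decode sketch with a placeholder codebook.
# This brute-force search is exactly the computation whose cost the
# thesis works to reduce.

def vq_encode(vectors, codebook):
    # squared distance from every input vector to every codeword
    d2 = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)                # index of the nearest codeword

def vq_decode(indices, codebook):
    return codebook[indices]                # lookup reconstruction

rng = np.random.default_rng(1)
codebook = rng.standard_normal((256, 16))   # 256 codewords for 4x4 blocks
blocks = rng.standard_normal((1000, 16))    # flattened image blocks
indices = vq_encode(blocks, codebook)       # 8 bits per 16 pixels = 0.5 bpp
recon = vq_decode(indices, codebook)
```

The rate is set by the codebook size (log2 of the number of codewords per block), which is why VQ is attractive at low bit rates despite the search cost.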
NASA Astrophysics Data System (ADS)
Kim, Seong-Whan; Suthaharan, Shan; Lee, Heung-Kyu; Rao, K. R.
2001-01-01
Quality of Service (QoS) guarantees in real-time communication for multimedia applications are significantly important. An architectural framework for multimedia networks based on substreams or flows is effectively exploited for combining source and channel coding for multimedia data. However, the existing frame-by-frame approach, which includes the Moving Picture Experts Group (MPEG) standards, cannot be neglected because it is a standard. In this paper, first, we designed an MPEG transcoder which converts an MPEG coded stream into variable-rate packet sequences to be used for our joint source/channel coding (JSCC) scheme. Second, we designed a classification scheme to partition the packet stream into multiple substreams which have their own QoS requirements. Finally, we designed a management (reservation and scheduling) scheme for substreams to support better perceptual video quality, such as a bound on end-to-end jitter. We have shown that our JSCC scheme is better than two other popular techniques by simulation and real video experiments in a TCP/IP environment.
Modeling the Martian neutron and gamma-ray leakage fluxes using Geant4
NASA Astrophysics Data System (ADS)
Pirard, Benoit; Desorgher, Laurent; Diez, Benedicte; Gasnault, Olivier
A new evaluation of the Martian neutron and gamma-ray (continuum and line) leakage fluxes has been performed using the Geant4 code. Although numerous studies have recently been carried out with Monte Carlo methods to characterize planetary radiation environments, only a few have been able to reproduce in detail the neutron and gamma-ray spectra observed in orbit. We report on the efforts performed to adapt and validate the Geant4-based PLANETOCOSMICS code for use in planetary neutron and gamma-ray spectroscopy data analysis. Besides the advantages of high transparency and modularity common to Geant4 applications, the new code uses reviewed nuclear cross section data, realistic atmospheric profiles and soil layering, as well as specific effects such as gravitational acceleration for low energy neutrons. Results from first simulations are presented for some Martian reference compositions and show high consistency with the corresponding neutron and gamma-ray spectra measured on board Mars Odyssey. Finally, we discuss the advantages and perspectives of the improved code for precise simulation of planetary radiation environments.
Vector quantization for efficient coding of upper subbands
NASA Technical Reports Server (NTRS)
Zeng, W. J.; Huang, Y. F.
1994-01-01
This paper examines the application of vector quantization (VQ) to exploit both intra-band and inter-band redundancy in subband coding. The focus here is on the exploitation of inter-band dependency. It is shown that VQ is particularly suitable and effective for coding the upper subbands. Three subband decomposition-based VQ coding schemes are proposed here to exploit the inter-band dependency by making full use of the extra flexibility of the VQ approach over scalar quantization. A quadtree-based variable rate VQ (VRVQ) scheme which takes full advantage of the intra-band and inter-band redundancy is first proposed. Then, a more easily implementable alternative based on an efficient block-based edge estimation technique is employed to overcome the implementation barriers of the first scheme. Finally, a predictive VQ scheme formulated in the context of finite state VQ is proposed to further exploit the dependency among different subbands. A VRVQ scheme proposed elsewhere is extended to provide an efficient bit allocation procedure. Simulation results show that these three hybrid techniques have advantages, in terms of peak signal-to-noise ratio (PSNR) and complexity, over other existing subband-VQ approaches.
Particle model of a cylindrical inductively coupled ion source
NASA Astrophysics Data System (ADS)
Ippolito, N. D.; Taccogna, F.; Minelli, P.; Cavenago, M.; Veltri, P.
2017-08-01
In spite of the wide use of RF sources, a complete understanding of the mechanisms regulating the RF coupling of the plasma is still lacking, so self-consistent simulations of the physics involved are highly desirable. For this reason we are developing a 2.5D fully kinetic Particle-In-Cell Monte-Carlo-Collision (PIC-MCC) model of a cylindrical ICP-RF source, keeping the time step of the simulation small enough to resolve the plasma frequency scale. The grid cell dimension is currently about seven times larger than the average Debye length because of the large computational demand of the code; it will be scaled down in the next phase of development. The filling gas is xenon, chosen to minimize the time spent in the MCC collision module in this first stage of development. The results presented here are preliminary, with the code already showing good robustness. The final goal is the modeling of the NIO1 (Negative Ion Optimization phase 1) source, operating in Padua at Consorzio RFX.
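The time-step and grid constraints mentioned above follow from two standard plasma scales, the electron plasma frequency and the Debye length. A quick sketch (the density and temperature values are illustrative, not NIO1 parameters):

```python
import math

# SI constants
E = 1.602176634e-19       # elementary charge, C
ME = 9.1093837015e-31     # electron mass, kg
EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m

def plasma_frequency(n_e):
    """Electron plasma frequency omega_pe (rad/s) for density n_e (m^-3)."""
    return math.sqrt(n_e * E**2 / (EPS0 * ME))

def debye_length(n_e, t_e_ev):
    """Electron Debye length (m) for density n_e (m^-3), temperature in eV."""
    return math.sqrt(EPS0 * t_e_ev * E / (n_e * E**2))

n_e, t_e = 1e17, 5.0                    # illustrative ICP plasma values
dt_max = 0.2 / plasma_frequency(n_e)    # time step resolving omega_pe
dx_ref = debye_length(n_e, t_e)         # cell size ideally <= lambda_D
```

A PIC grid cell several times larger than `dx_ref`, as in the preliminary runs described above, trades spatial resolution for runtime.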
Status of the Space Radiation Monte Carlo Simulation Based on FLUKA and ROOT
NASA Technical Reports Server (NTRS)
Andersen, Victor; Carminati, Federico; Empl, Anton; Ferrari, Alfredo; Pinsky, Lawrence; Sala, Paola; Wilson, Thomas L.
2002-01-01
The NASA-funded project, reported on at the first IWSSRR in Arona, to develop a Monte Carlo simulation program for use in simulating the space radiation environment based on the FLUKA and ROOT codes is well into its second year of development, and considerable progress has been made. The general tasks required to achieve the final goals include the addition of heavy-ion interactions into the FLUKA code and the provision of a ROOT-based interface to FLUKA. The most significant progress to date includes the incorporation of the DPMJET event generator code within FLUKA to handle heavy-ion interactions for incident projectile energies greater than 3 GeV/A. The ongoing effort intends to extend the treatment of these interactions down to 10 MeV, and at present two alternative approaches are being explored. The ROOT interface is being pursued in conjunction with the CERN LHC ALICE software team through an adaptation of their existing AliROOT software. As a check on the validity of the code, a simulation of the recent data taken by the ATIC experiment is underway.
Electro-Thermal-Mechanical Simulation Capability Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, D
This is the Final Report for LDRD 04-ERD-086, 'Electro-Thermal-Mechanical Simulation Capability'. The accomplishments are well documented in five peer-reviewed publications and six conference presentations and hence will not be detailed here. The purpose of this LDRD was to research and develop numerical algorithms for three-dimensional (3D) Electro-Thermal-Mechanical simulations. LLNL has long been a world leader in the area of computational mechanics, and recently several mechanics codes have become 'multiphysics' codes with the addition of fluid dynamics, heat transfer, and chemistry. However, these multiphysics codes do not incorporate the electromagnetics that is required for a coupled Electro-Thermal-Mechanical (ETM) simulation. There are numerous applications for an ETM simulation capability, such as explosively-driven magnetic flux compressors, electromagnetic launchers, inductive heating and mixing of metals, and MEMS. A robust ETM simulation capability will enable LLNL physicists and engineers to better support current DOE programs, and will prepare LLNL for some very exciting long-term DoD opportunities. We define a coupled Electro-Thermal-Mechanical (ETM) simulation as a simulation that solves, in a self-consistent manner, the equations of electromagnetics (primarily statics and diffusion), heat transfer (primarily conduction), and non-linear mechanics (elastic-plastic deformation, and contact with friction). There is no existing parallel 3D code for simulating ETM systems at LLNL or elsewhere. While there are numerous magnetohydrodynamic codes, these codes are designed for astrophysics, magnetic fusion energy, laser-plasma interaction, etc. and do not attempt to accurately model electromagnetically driven solid mechanics.
This project responds to the Engineering R&D Focus Areas of Simulation and Energy Manipulation, and addresses the specific problem of Electro-Thermal-Mechanical simulation for design and analysis of energy manipulation systems such as magnetic flux compression generators and railguns. This project complements ongoing DNT projects that have an experimental emphasis. Our research efforts have been encapsulated in the Diablo and ALE3D simulation codes. This new ETM capability already has both internal and external users, and has spawned additional research in plasma railgun technology. By developing this capability Engineering has become a world leader in ETM design, analysis, and simulation. This research has positioned LLNL to be able to compete for new business opportunities with the DoD in the area of railgun design. We currently have a three-year $1.5M project with the Office of Naval Research to apply our ETM simulation capability to railgun bore life issues and we expect to be a key player in the railgun community.
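As a toy illustration of one direction of the ETM coupling defined above, the sketch below feeds Joule heating from a prescribed current density into explicit 1D heat conduction. This is a hedged sketch with illustrative copper-like material values; it is not the Diablo/ALE3D formulation:

```python
# One-way electro-thermal coupling sketch:
#   dT/dt = (kappa * d2T/dx2 + j^2 / sigma) / (rho * c)
def step_temperature(T, j, sigma, rho_c, kappa, dx, dt):
    """One explicit finite-difference step; boundary temperatures held fixed."""
    Tn = T[:]
    for i in range(1, len(T) - 1):
        lap = (T[i - 1] - 2 * T[i] + T[i + 1]) / dx**2
        joule = j**2 / sigma                 # volumetric Joule heating, W/m^3
        Tn[i] = T[i] + dt * (kappa * lap + joule) / rho_c
    return Tn

# Illustrative copper-like properties on an 11-point, 1 cm rod.
T = [300.0] * 11
for _ in range(100):
    T = step_temperature(T, j=1e7, sigma=5.8e7, rho_c=3.45e6,
                         kappa=400.0, dx=1e-3, dt=1e-4)
```

A self-consistent ETM code would additionally let the heated, deforming conductor feed back into the electromagnetic solve; the one-way version above only shows the data flow.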
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arnold, Bill Walter; Chang, Fu-lin; Mattie, Patrick D.
2006-02-01
Sandia National Laboratories (SNL) and Taiwan's Institute for Nuclear Energy Research (INER) have teamed together to evaluate several candidate sites for Low-Level Radioactive Waste (LLW) disposal in Taiwan. Taiwan currently has three nuclear power plants, with another under construction. Taiwan also has a research reactor, as well as medical and industrial wastes to contend with. Eventually the reactors will be decommissioned. Operational and decommissioning wastes will need to be disposed in a licensed disposal facility starting in 2014. Taiwan has adopted regulations similar to the US Nuclear Regulatory Commission's (NRC's) low-level radioactive waste rules (10 CFR 61) to govern the disposal of LLW. Taiwan has proposed several potential sites for the final disposal of LLW that is now in temporary storage on Lanyu Island and on-site at operating nuclear power plants, and for waste generated in the future through 2045. The planned final disposal facility will have a capacity of approximately 966,000 55-gallon drums. Taiwan is in the process of evaluating the best candidate site to pursue for licensing. Among these proposed sites there are basically two disposal concepts: shallow land burial and cavern disposal. A representative potential site for shallow land burial is located on a small island in the Taiwan Strait with basalt bedrock and interbedded sedimentary rocks. An engineered cover system would be constructed to limit infiltration for shallow land burial. A representative potential site for cavern disposal is located along the southeastern coast of Taiwan in a tunnel system that would be about 500 to 800 m below the surface. Bedrock at this site consists of argillite and meta-sedimentary rocks. Performance assessment analyses will be performed to evaluate future performance of the facility and the potential dose/risk to exposed populations.
Preliminary performance assessment analyses will be used in the site-selection process and to aid in design of the disposal system. Final performance assessment analyses will be used in the regulatory process of licensing a site. The SNL/INER team has developed a performance assessment methodology that is used to simulate processes associated with the potential release of radionuclides to evaluate these sites. The following software codes are utilized in the performance assessment methodology: GoldSim (to implement a probabilistic analysis that will explicitly address uncertainties); the NRC's Breach, Leach, and Transport - Multiple Species (BLT-MS) code (to simulate waste-container degradation, waste-form leaching, and transport through the host rock); the Finite Element Heat and Mass Transfer code (FEHM) (to simulate groundwater flow and estimate flow velocities); the Hydrologic Evaluation of Landfill performance Model (HELP) code (to evaluate infiltration through the disposal cover); the AMBER code (to evaluate human health exposures); and the NRC's Disposal Unit Source Term - Multiple Species (DUST-MS) code (to screen applicable radionuclides). Preliminary results of the evaluations of the two disposal concept sites are presented.
Adaptive software-defined coded modulation for ultra-high-speed optical transport
NASA Astrophysics Data System (ADS)
Djordjevic, Ivan B.; Zhang, Yequn
2013-10-01
In optically-routed networks, different wavelength channels carrying the traffic to different destinations can have quite different optical signal-to-noise ratios (OSNRs), and the signal is differently impacted by various channel impairments. Regardless of the data destination, an optical transport system (OTS) must provide the target bit-error rate (BER) performance. To provide the target BER regardless of the data destination, we adjust the forward error correction (FEC) strength. Depending on the information obtained from the monitoring channels, we select the code rate matched to the OSNR range into which the current channel OSNR falls. To avoid frame synchronization issues, we keep the codeword length fixed regardless of the FEC code being employed. The common denominator is the employment of quasi-cyclic (QC) LDPC codes for FEC. For high-speed implementation, low-complexity LDPC decoding algorithms are needed, and some of them are described in this invited paper. Instead of conventional QAM-based modulation schemes, we employ the signal constellations obtained by the optimum signal constellation design (OSCD) algorithm. To improve the spectral efficiency, we perform simultaneous rate adaptation and signal constellation size selection so that the product of the number of bits per symbol and the code rate is closest to the channel capacity. Further, we describe the advantages of using 4D signaling instead of polarization-division multiplexed (PDM) QAM, by using 4D MAP detection combined with LDPC coding in a turbo equalization fashion. Finally, to solve the problems related to the limited bandwidth of the information infrastructure, high energy consumption, and the heterogeneity of optical networks, we describe an adaptive energy-efficient hybrid coded-modulation scheme which, in addition to amplitude, phase, and polarization state, employs the spatial modes as additional basis functions for multidimensional coded modulation.
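The rate-adaptation step described above, choosing an FEC code rate from the monitored OSNR while keeping the codeword length fixed, can be sketched as a simple lookup. The OSNR thresholds and rates below are illustrative, not the paper's:

```python
# Hedged sketch of OSNR-driven FEC rate adaptation with a fixed codeword
# length. Thresholds and rates are illustrative values.
RATE_TABLE = [          # (minimum OSNR in dB, code rate)
    (20.0, 0.9375),
    (16.0, 0.875),
    (12.0, 0.8),
    (float("-inf"), 0.75),
]

CODEWORD_LEN = 16384    # bits, fixed regardless of rate

def select_rate(osnr_db):
    """Pick the highest code rate whose OSNR threshold is met."""
    for threshold, rate in RATE_TABLE:
        if osnr_db >= threshold:
            return rate

def info_bits(rate):
    """Information bits per fixed-length codeword at the given rate."""
    return int(CODEWORD_LEN * rate)
```

Because `CODEWORD_LEN` never changes, the receiver's frame synchronization is unaffected when the rate switches; only the information/parity split moves.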
Henry, Kenneth S.; Kale, Sushrut; Heinz, Michael G.
2014-01-01
While changes in cochlear frequency tuning are thought to play an important role in the perceptual difficulties of people with sensorineural hearing loss (SNHL), the possible role of temporal processing deficits remains less clear. Our knowledge of temporal envelope coding in the impaired cochlea is limited to two studies that examined auditory-nerve fiber responses to narrowband amplitude modulated stimuli. In the present study, we used Wiener-kernel analyses of auditory-nerve fiber responses to broadband Gaussian noise in anesthetized chinchillas to quantify changes in temporal envelope coding with noise-induced SNHL. Temporal modulation transfer functions (TMTFs) and temporal windows of sensitivity to acoustic stimulation were computed from 2nd-order Wiener kernels and analyzed to estimate the temporal precision, amplitude, and latency of envelope coding. Noise overexposure was associated with slower (less negative) TMTF roll-off with increasing modulation frequency and reduced temporal window duration. The results show that at equal stimulus sensation level, SNHL increases the temporal precision of envelope coding by 20–30%. Furthermore, SNHL increased the amplitude of envelope coding by 50% in fibers with CFs from 1–2 kHz and decreased mean response latency by 0.4 ms. While a previous study of envelope coding demonstrated a similar increase in response amplitude, the present study is the first to show enhanced temporal precision. This new finding may relate to the use of a more complex stimulus with broad frequency bandwidth and a dynamic temporal envelope. Exaggerated neural coding of fast envelope modulations may contribute to perceptual difficulties in people with SNHL by acting as a distraction from more relevant acoustic cues, especially in fluctuating background noise. Finally, the results underscore the value of studying sensory systems with more natural, real-world stimuli. PMID:24596545
Increased length of inpatient stay and poor clinical coding: audit of patients with diabetes.
Daultrey, Harriet; Gooday, Catherine; Dhatariya, Ketan
2011-11-01
People with diabetes stay in hospital for longer than those without diabetes for similar conditions. Clinical coding is poor across all specialties. Inpatients with diabetes often have unrecognized foot problems. We wanted to look at the relationships between these factors. A single day audit, looking at the prevalence of diabetes in all adult inpatients, and at their feet to find out how many were high-risk or had existing problems. A 998-bed university teaching hospital. All adult inpatients. (a) To see if patients with diabetes and foot problems were in hospital for longer than the national average length of stay compared with national data; (b) to see if there were people in hospital with acute foot problems who were not known to the specialist diabetic foot team; and (c) to assess the accuracy of clinical coding. We identified 110 people with diabetes. However, discharge coding data for inpatients on that day showed 119 people with diabetes. Mean length of stay (LOS) (± SD) was substantially higher for those with diabetes than for those without, at 22.39 (22.26) days vs. 11.68 (6.46) days (P < 0.001). Finally, clinical coding was poor, with some people who had been identified as having diabetes on the audit not coded as such on discharge. Clinical coding, which is dependent on discharge summaries, poorly reflects diagnoses. Additionally, length of stay is significantly longer than previous estimates. The discrepancy between coding and diagnosis needs addressing by increasing the levels of awareness and education of coders and physicians. We suggest that our data be used by healthcare planners when deciding on future tariffs.
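For the LOS comparison above, the large and unequal standard deviations suggest a Welch-type comparison of means. A sketch from the reported summary statistics; the audit reports n = 110 for the diabetes group, while the non-diabetes group size used here is a hypothetical value for illustration:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic for two groups given mean, SD, and size."""
    se = math.sqrt(s1**2 / n1 + s2**2 / n2)
    return (m1 - m2) / se

# Mean (SD) LOS from the audit; n2 = 800 is a hypothetical group size.
t = welch_t(22.39, 22.26, 110, 11.68, 6.46, 800)
```

With any plausible non-diabetes group size the statistic stays far above conventional critical values, consistent with the reported P < 0.001.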
Cenik, Can; Chua, Hon Nian; Singh, Guramrit; Akef, Abdalla; Snyder, Michael P; Palazzo, Alexander F; Moore, Melissa J; Roth, Frederick P
2017-03-01
Introns are found in 5' untranslated regions (5'UTRs) for 35% of all human transcripts. These 5'UTR introns are not randomly distributed: Genes that encode secreted, membrane-bound and mitochondrial proteins are less likely to have them. Curiously, transcripts lacking 5'UTR introns tend to harbor specific RNA sequence elements in their early coding regions. To model and understand the connection between coding-region sequence and 5'UTR intron status, we developed a classifier that can predict 5'UTR intron status with >80% accuracy using only sequence features in the early coding region. Thus, the classifier identifies transcripts with 5' proximal-intron-minus-like coding regions ("5IM" transcripts). Unexpectedly, we found that the early coding sequence features defining 5IM transcripts are widespread, appearing in 21% of all human RefSeq transcripts. The 5IM class of transcripts is enriched for non-AUG start codons, more extensive secondary structure both preceding the start codon and near the 5' cap, greater dependence on eIF4E for translation, and association with ER-proximal ribosomes. 5IM transcripts are bound by the exon junction complex (EJC) at noncanonical 5' proximal positions. Finally, N1-methyladenosines are specifically enriched in the early coding regions of 5IM transcripts. Taken together, our analyses point to the existence of a distinct 5IM class comprising ∼20% of human transcripts. This class is defined by depletion of 5' proximal introns, presence of specific RNA sequence features associated with low translation efficiency, N1-methyladenosines in the early coding region, and enrichment for noncanonical binding by the EJC. © 2017 Cenik et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
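Classifiers like the one described typically work from sequence-derived features of the early coding region. A minimal sketch of one such feature extractor (k-mer counts over the first 99 nt); the k-mer length and window are illustrative, not the paper's actual feature set:

```python
from collections import Counter

def kmer_features(seq, k=3, region=99):
    """Count overlapping k-mers in the first `region` nt of a coding sequence."""
    s = seq[:region].upper()
    return Counter(s[i:i + k] for i in range(len(s) - k + 1))

features = kmer_features("atgGCC")
```

Such count vectors would then feed a standard classifier (e.g. logistic regression) trained on transcripts with known 5'UTR intron status.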
Location Based Service in Indoor Environment Using Quick Response Code Technology
NASA Astrophysics Data System (ADS)
Hakimpour, F.; Zare Zardiny, A.
2014-10-01
Today, with the extensive use of smart mobile phones, larger screens, and phones enriched with Global Positioning System (GPS) technology, location based services (LBS) are being considered by public users more than ever. Based on the position of users, they can receive the desired information from different LBS providers. Any LBS system generally includes five main parts: mobile devices, a communication network, a positioning system, a service provider and a data provider. Many advances have been made in each of these parts; however, positioning users, especially in indoor environments, remains an essential and critical issue in LBS. It is well known that GPS performs too poorly inside buildings to provide usable indoor positioning. On the other hand, current indoor positioning technologies such as RFID or WiFi networks need different hardware and software infrastructures. In this paper, we propose a new method to overcome these challenges using Quick Response (QR) Code technology. A QR Code is a 2D barcode with a matrix structure consisting of black modules arranged in a square grid. Scanning and retrieving data from a QR Code is possible with any camera-enabled mobile phone simply by installing barcode reader software. This paper reviews the capabilities of QR Code technology and then discusses the advantages of using QR Codes in an Indoor LBS (ILBS) system in comparison to other technologies. Finally, some prospects of using QR Codes are illustrated through the implementation of a scenario. The most important advantages of this new technology in ILBS are easy implementation, low cost, quick data retrieval, the possibility of printing QR Codes on different products, and no need for complicated hardware and software infrastructures.
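In an ILBS of this kind, each QR Code placed in the building encodes the position at which it is mounted, so a scan resolves the user's location with no extra positioning infrastructure. A minimal sketch; the payload format is an assumption for illustration:

```python
def parse_qr_position(payload):
    """Parse a hypothetical 'floor=2;x=12.5;y=30.0' QR payload into a position."""
    fields = dict(item.split("=", 1) for item in payload.split(";"))
    return {"floor": int(fields["floor"]),
            "x": float(fields["x"]),
            "y": float(fields["y"])}

pos = parse_qr_position("floor=2;x=12.5;y=30.0")
```

The barcode reader supplies the decoded string; everything after that is ordinary application logic, which is why the approach needs no dedicated hardware.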
Increased length of inpatient stay and poor clinical coding: audit of patients with diabetes
Daultrey, Harriet; Gooday, Catherine; Dhatariya, Ketan
2011-01-01
Objectives People with diabetes stay in hospital for longer than those without diabetes for similar conditions. Clinical coding is poor across all specialties. Inpatients with diabetes often have unrecognized foot problems. We wanted to look at the relationships between these factors. Design A single day audit, looking at the prevalence of diabetes in all adult inpatients. Also looking at their feet to find out how many were high-risk or had existing problems. Setting A 998-bed university teaching hospital. Participants All adult inpatients. Main outcome measures (a) To see if patients with diabetes and foot problems were in hospital for longer than the national average length of stay compared with national data; (b) to see if there were people in hospital with acute foot problems who were not known to the specialist diabetic foot team; and (c) to assess the accuracy of clinical coding. Results We identified 110 people with diabetes. However, discharge coding data for inpatients on that day showed 119 people with diabetes. Length of stay (LOS) was substantially higher for those with diabetes compared to those without (± SD) at 22.39 (22.26) days, vs. 11.68 (6.46) (P < 0.001). Finally, clinical coding was poor with some people who had been identified as having diabetes on the audit, who were not coded as such on discharge. Conclusion Clinical coding – which is dependent on discharge summaries – poorly reflects diagnoses. Additionally, length of stay is significantly longer than previous estimates. The discrepancy between coding and diagnosis needs addressing by increasing the levels of awareness and education of coders and physicians. We suggest that our data be used by healthcare planners when deciding on future tariffs. PMID:22140609
NASA Astrophysics Data System (ADS)
Gao, Shanghua; Fu, Guangyu; Liu, Tai; Zhang, Guoqing
2017-03-01
Tanaka et al. (Geophys J Int 164:273-289, 2006, Geophys J Int 170:1031-1052, 2007) proposed the spherical dislocation theory (SDT) in a spherically symmetric, self-gravitating visco-elastic earth model. However, to date there have been no reports of easily adopted, widely used software that utilizes Tanaka's theory. In this study we introduce a new code to compute post-seismic deformations (PSD), including displacements as well as geoid and gravity changes, caused by a seismic source at any position. This new code is based on the above-mentioned SDT. The code consists of two parts. The first part is the numerical frame of the dislocation Green function (DGF), which contains a set of two-dimensional discrete numerical frames of DGFs on a symmetric earth model. The second part is an integration function, which performs bi-quadratic spline interpolation operations on the frame of DGFs. The inputs are the information on the seismic fault models and on the observation points. After the user prepares the inputs in a file with the given format, the code automatically computes the PSD. As an example, we use the new code to calculate the co-seismic displacements caused by the Tohoku-Oki Mw 9.0 earthquake. We compare the result with observations and with the result from a fully elastic SDT, and find that the Root Mean Square error between the calculated and observed results is 7.4 cm. This verifies the suitability of our new code. Finally, we discuss several issues that require attention when using the code, which should be helpful for users.
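The second part of the code interpolates within the precomputed grid of DGFs. The sketch below uses bilinear interpolation to show the lookup-and-interpolate idea on a unit-spaced grid; the actual code uses bi-quadratic splines, and the grid values here are illustrative:

```python
def bilinear(grid, x, y):
    """Bilinear interpolation on a unit-spaced 2D grid.

    Assumes 0 <= x < len(grid) - 1 and 0 <= y < len(grid[0]) - 1,
    so the four surrounding nodes exist.
    """
    i, j = int(x), int(y)
    fx, fy = x - i, y - j
    return ((1 - fx) * (1 - fy) * grid[i][j]
            + fx * (1 - fy) * grid[i + 1][j]
            + (1 - fx) * fy * grid[i][j + 1]
            + fx * fy * grid[i + 1][j + 1])

# Illustrative 2x2 patch of precomputed Green-function values.
patch = [[0.0, 1.0], [2.0, 3.0]]
center = bilinear(patch, 0.5, 0.5)   # average of the four corners
```

Precomputing the DGF frame once and interpolating for each fault/observation pair is what makes the PSD evaluation cheap at run time.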
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Guoyong; Budny, Robert; Gorelenkov, Nikolai
We report here the work done for the FY14 OFES Theory Performance Target as given below: "Understanding alpha particle confinement in ITER, the world's first burning plasma experiment, is a key priority for the fusion program. In FY 2014, determine linear instability trends and thresholds of energetic particle-driven shear Alfven eigenmodes in ITER for a range of parameters and profiles using a set of complementary simulation models (gyrokinetic, hybrid, and gyrofluid). Carry out initial nonlinear simulations to assess the effects of the unstable modes on energetic particle transport". In the past year (FY14), a systematic study of the alpha-driven Alfven modes in ITER has been carried out jointly by researchers from six institutions involving seven codes including the transport simulation code TRANSP (R. Budny and F. Poli, PPPL), three gyrokinetic codes: GEM (Y. Chen, Univ. of Colorado), GTC (J. McClenaghan, Z. Lin, UCI), and GYRO (E. Bass, R. Waltz, UCSD/GA), the hybrid code M3D-K (G.Y. Fu, PPPL), the gyro-fluid code TAEFL (D. Spong, ORNL), and the linear kinetic stability code NOVA-K (N. Gorelenkov, PPPL). A range of ITER parameters and profiles are specified by TRANSP simulation of a hybrid scenario case and a steady-state scenario case. Based on the specified ITER equilibria, linear stability calculations are done to determine the stability boundary of alpha-driven high-n TAEs using the five initial value codes (GEM, GTC, GYRO, M3D-K, and TAEFL) and the kinetic stability code (NOVA-K). Both the effects of alpha particles and beam ions have been considered. Finally, the effects of the unstable modes on energetic particle transport have been explored using GEM and M3D-K.
Analysis of impact melt and vapor production in CTH for planetary applications
Quintana, S. N.; Crawford, D. A.; Schultz, P. H.
2015-05-19
This study explores impact melt and vapor generation for a variety of impact speeds and materials using the shock physics code CTH. The study first compares the results of two common methods of impact melt and vapor generation to demonstrate that both the peak pressure method and final temperature method are appropriate for high-speed impact models (speeds greater than 10 km/s). However, for low-speed impact models (speeds less than 10 km/s), only the final temperature method is consistent with laboratory analyses to yield melting and vaporization. Finally, a constitutive model for material strength is important for low-speed impacts because strength can cause an increase in melting and vaporization.
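The final temperature method flags material as melted or vaporized by comparing its post-shock temperature against phase-change thresholds. A hedged sketch; the threshold temperatures below are illustrative, not CTH's internal material data:

```python
def phase_state(t_final, t_melt=1400.0, t_vap=3100.0):
    """Classify material by final temperature (K); thresholds are illustrative."""
    if t_final >= t_vap:
        return "vapor"
    if t_final >= t_melt:
        return "melt"
    return "solid"

states = [phase_state(t) for t in (500.0, 2000.0, 4000.0)]
```

The peak pressure method instead compares the maximum shock pressure a tracer experienced against critical pressures for melting and vaporization; at low impact speeds that criterion under-predicts melt, which is the study's point.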
2005-03-25
This interim final rule with comment period adopts the substance of the April 15, 2004 temporary interim amendment (TIA) 00-1 (101), Alcohol Based Hand Rub Solutions, an amendment to the 2000 edition of the Life Safety Code, published by the National Fire Protection Association (NFPA). This amendment will allow certain health care facilities to place alcohol-based hand rub dispensers in egress corridors under specified conditions. This interim final rule with comment period also requires that nursing facilities install smoke detectors in resident rooms and public areas if they do not have a sprinkler system installed throughout the facility or a hard-wired smoke detection system in those areas.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-06
...The Commodity Futures Trading Commission (the ``Commission'') is issuing final rules implementing new statutory provisions enacted by Title VII of the Dodd-Frank Wall Street Reform and Consumer Protection Act (the ``Dodd-Frank Act''). Specifically, the final rule contained herein imposes requirements on swap dealers (``SDs'') and major swap participants (``MSPs'') with respect to the treatment of collateral posted by their counterparties to margin, guarantee, or secure uncleared swaps. Additionally, the final rule includes revisions to ensure that, for purposes of subchapter IV of chapter 7 of the Bankruptcy Code, securities held in a portfolio margining account that is a futures account or a Cleared Swaps Customer Account constitute ``customer property''; and owners of such account constitute ``customers.''
Sajad, Amirsaman; Sadeh, Morteza; Keith, Gerald P.; Yan, Xiaogang; Wang, Hongying; Crawford, John Douglas
2015-01-01
A fundamental question in sensorimotor control concerns the transformation of spatial signals from the retina into eye and head motor commands required for accurate gaze shifts. Here, we investigated these transformations by identifying the spatial codes embedded in visually evoked and movement-related responses in the frontal eye fields (FEFs) during head-unrestrained gaze shifts. Monkeys made delayed gaze shifts to the remembered location of briefly presented visual stimuli, with delay serving to dissociate visual and movement responses. A statistical analysis of nonparametric model fits to response field data from 57 neurons (38 with visual and 49 with movement activities) eliminated most effector-specific, head-fixed, and space-fixed models, but confirmed the dominance of eye-centered codes observed in head-restrained studies. More importantly, the visual response encoded target location, whereas the movement response mainly encoded the final position of the imminent gaze shift (including gaze errors). This spatiotemporal distinction between target and gaze coding was present not only at the population level, but even at the single-cell level. We propose that an imperfect visual–motor transformation occurs during the brief memory interval between perception and action, and further transformations from the FEF's eye-centered gaze motor code to effector-specific codes in motor frames occur downstream in the subcortical areas. PMID:25491118
On Flowfield Periodicity in the NASA Transonic Flutter Cascade. Part 2; Numerical Study
NASA Technical Reports Server (NTRS)
Chima, Rodrick V.; McFarland, Eric R.; Wood, Jerry R.; Lepicovsky, Jan
2000-01-01
The transonic flutter cascade facility at NASA Glenn Research Center was redesigned based on a combined program of experimental measurements and numerical analyses. The objectives of the redesign were to improve the periodicity of the cascade in steady operation, and to better quantify the inlet and exit flow conditions needed for CFD predictions. Part I of this paper describes the experimental measurements, which included static pressure measurements on the blade and endwalls made using both static taps and pressure sensitive paints, cobra probe measurements of the endwall boundary layers and blade wakes, and shadowgraphs of the wave structure. Part II of this paper describes three CFD codes used to analyze the facility, including a multibody panel code, a quasi-three-dimensional viscous code, and a fully three-dimensional viscous code. The measurements and analyses both showed that the operation of the cascade was heavily dependent on the configuration of the sidewalls. Four configurations of the sidewalls were studied and the results are described. For the final configuration, the quasi-three-dimensional viscous code was used to predict the location of mid-passage streamlines for a perfectly periodic cascade. By arranging the tunnel sidewalls to approximate these streamlines, sidewall interference was minimized and excellent periodicity was obtained.
Quantum Error Correction Protects Quantum Search Algorithms Against Decoherence
Botsinis, Panagiotis; Babar, Zunaira; Alanis, Dimitrios; Chandra, Daryus; Nguyen, Hung; Ng, Soon Xin; Hanzo, Lajos
2016-01-01
When quantum computing becomes a widespread commercial reality, Quantum Search Algorithms (QSA), and especially Grover's QSA, will inevitably be among their main applications, constituting their cornerstone. Most of the literature assumes that the quantum circuits are free from decoherence. In practice, decoherence will remain as unavoidable as the Gaussian noise that the Brownian motion of electrons imposes on classical circuits, hence it may have to be mitigated. In this contribution, we investigate the effect of quantum noise on the performance of QSAs, in terms of their success probability as a function of the database size to be searched, when decoherence is modelled by the deleterious effects of depolarizing channels imposed on the quantum gates. Moreover, we employ quantum error correction codes for limiting the effects of quantum noise and for correcting quantum flips. More specifically, we demonstrate that, when we search for a single solution in a database having 4096 entries using Grover's QSA at an aggressive depolarizing probability of 10^-3, the success probability of the search is 0.22 when no quantum coding is used, which is improved to 0.96 when Steane's quantum error correction code is employed. Finally, apart from Steane's code, the employment of Quantum Bose-Chaudhuri-Hocquenghem (QBCH) codes is also considered. PMID:27924865
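The noiseless baseline against which the abstract's 0.22 and 0.96 figures can be judged follows from the standard textbook analysis of Grover's algorithm: with one marked item among N, each Grover iteration rotates the state by an angle θ = arcsin(1/√N), and the optimal iteration count is roughly (π/4)√N. A minimal sketch (the function name `grover_stats` is illustrative, not from the paper):

```python
import math

def grover_stats(N, M=1):
    """Ideal (decoherence-free) Grover iteration count and success
    probability for finding one of M marked items among N entries."""
    theta = math.asin(math.sqrt(M / N))       # rotation angle per Grover iteration
    k = round(math.pi / (4 * theta) - 0.5)    # optimal number of iterations
    p = math.sin((2 * k + 1) * theta) ** 2    # success probability after k iterations
    return k, p

k, p = grover_stats(4096)   # the database size used in the paper
```

For N = 4096 this gives k = 50 iterations and a success probability above 0.999, which illustrates how far the uncoded noisy figure of 0.22 falls below the ideal, and how close Steane-coded operation (0.96) comes to recovering it.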
43 CFR 35.42 - Judicial review.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Judicial review. 35.42 Section 35.42... CLAIMS AND STATEMENTS § 35.42 Judicial review. Section 3805 of title 31, U.S. Code, authorizes judicial review by an appropriate U.S. District Court of a final decision of the Secretary imposing penalties or...
14 CFR 1264.141 - Judicial review.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Judicial review. 1264.141 Section 1264.141... PENALTIES ACT OF 1986 § 1264.141 Judicial review. Section 3805 of Title 31, United States Code, authorizes judicial review by an appropriate United States District Court of a final decision of the authority head...
31 CFR 16.42 - Judicial review.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Judicial review. 16.42 Section 16.42... FRAUD CIVIL REMEDIES ACT OF 1986 § 16.42 Judicial review. Section 3805 of title 31, United States Code, authorizes judicial review by an appropriate United States District Court of a final decision of the...
77 FR 17333 - Bylaws of the Board of Governors
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-26
... POSTAL SERVICE 39 CFR Parts 4, 6, and 7 Bylaws of the Board of Governors AGENCY: Postal Service. ACTION: Final rule. SUMMARY: On March 24, 2010, the Board of Governors of the United States Postal... the Code of Federal Regulations are effective March 26, 2012. FOR FURTHER INFORMATION CONTACT: Julie S...
2017-05-23
Systems and the NRL Code 5763 Radio Frequency (RF) Stimulator. It covers system descriptions, setup, data collection, and test goals that... Contents include Test Asset Descriptions and a Description of the FOXTROT Anti-ship Missile (ASM) Simulator.
Use of a Computer Language in Teaching Dynamic Programming. Final Report.
ERIC Educational Resources Information Center
Trimble, C. J.; And Others
Most optimization problems of any degree of complexity must be solved using a computer. In the teaching of dynamic programming courses, it is often desirable to use a computer in problem solution. The solution process involves conceptual formulation and computational solution. Generalized computer codes for dynamic programming problem solution…
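The two steps the abstract names, conceptual formulation (defining states and the recurrence) and computational solution (tabulating it), can be illustrated with the classic 0/1 knapsack problem. This is a generic teaching example, not one of the report's actual codes:

```python
def knapsack(values, weights, capacity):
    """Bottom-up dynamic program: best[c] holds the maximum total value
    achievable with knapsack capacity c using the items seen so far."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # Sweep capacities downward so each item is used at most once.
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Textbook instance: items worth 60, 100, 120 with weights 10, 20, 30.
result = knapsack([60, 100, 120], [10, 20, 30], 50)  # → 220
```

The recurrence best[c] = max(best[c], best[c - w] + v) is the conceptual formulation; the nested loops that fill the table are the computational solution.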