Knowledge and Processes in Design
1992-09-03
...statement codings were then organized into larger control-flow structures centered around design components called modules. The general assumption was...
1980-04-01
greatly intensify the thermoelectric effect by providing a source of warm particles in a freezing environment. The presence of contaminants can... Performing organization: NASA Langley Research Center. Sponsoring organizations: National Aeronautics and Space Administration, Washington, DC 20546, and Florida Institute of Technology, Melbourne, FL.
1987-06-01
Distribution is unlimited. Performing organization: Naval Postgraduate School, Code 74; monitoring organization: Naval Postgraduate School.
1988-01-21
Approved for public release; distribution unlimited. Report number: AFGL-TR-88-0016. Air Force Geophysics Laboratory.
1993-01-01
Report number: Research Report No. 9. Performing organization: Markman & Associates, Inc.
1988-04-01
Distribution is unlimited. Report number: 88-0825. Organization: ACSC/EDC, Maxwell AFB, AL 36112-5542.
Mechanism of Cytotoxicity of the AIDS Virus, HTLV-III/LAV
1989-05-21
Distribution unlimited. Report number: 143-065-3611-A1. Performing organization: Washington University. Sponsoring organization: U.S. Army Medical Research & Development Command, contract DAMD17-87-C-7101.
Verifying the Chemical Weapons Convention: The Case for a United Nations Verification Agency
1991-12-01
Performing and monitoring organization: Naval Postgraduate School, Monterey, CA 93943-5000. Subject terms: Chemical...
Return with Honor: Code of Conduct Training in the National Military Strategy Security Environment
2004-09-01
maximize the number of deaths and injuries among the most vulnerable civilians, such as children, women and the elderly... The terrorist leaders - who do... Author: Major Laura M. Ryan. Performing organization: Naval Postgraduate School, Monterey, CA 93943-5000.
Bahadori, Amir A; Sato, Tatsuhiko; Slaba, Tony C; Shavers, Mark R; Semones, Edward J; Van Baalen, Mary; Bolch, Wesley E
2013-10-21
NASA currently uses one-dimensional deterministic transport to generate values of the organ dose equivalent needed to calculate stochastic radiation risk following crew space exposures. In this study, organ absorbed doses and dose equivalents are calculated for 50th percentile male and female astronaut phantoms using both the NASA High Charge and Energy Transport Code to perform one-dimensional deterministic transport and the Particle and Heavy Ion Transport Code System to perform three-dimensional Monte Carlo transport. Two measures of radiation risk, effective dose and risk of exposure-induced death (REID) are calculated using the organ dose equivalents resulting from the two methods of radiation transport. For the space radiation environments and simplified shielding configurations considered, small differences (<8%) in the effective dose and REID are found. However, for the galactic cosmic ray (GCR) boundary condition, compensating errors are observed, indicating that comparisons between the integral measurements of complex radiation environments and code calculations can be misleading. Code-to-code benchmarks allow for the comparison of differential quantities, such as secondary particle differential fluence, to provide insight into differences observed in integral quantities for particular components of the GCR spectrum.
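The effective dose referred to above is, in ICRP-style dosimetry, a tissue-weighted sum of organ dose equivalents. The minimal Python sketch below illustrates how such a weighted sum and a code-to-code percent difference could be computed; the tissue weights, organ list, and dose values are placeholders for illustration, not the weights, codes, or results used in the study.

    # Illustrative only: effective dose as a tissue-weighted sum of organ dose
    # equivalents. The weights and doses below are placeholders, not study values.
    TISSUE_WEIGHTS = {"lung": 0.12, "stomach": 0.12, "colon": 0.12, "bone_marrow": 0.12,
                      "breast": 0.12, "gonads": 0.08, "remainder": 0.32}

    def effective_dose(organ_dose_equivalents_sv):
        """Weighted sum of organ dose equivalents (Sv)."""
        return sum(TISSUE_WEIGHTS[organ] * h for organ, h in organ_dose_equivalents_sv.items())

    def percent_difference(a, b):
        """Relative difference between two transport-code results, in percent."""
        return 100.0 * abs(a - b) / ((a + b) / 2.0)

    deterministic = {"lung": 0.010, "stomach": 0.009, "colon": 0.011, "bone_marrow": 0.008,
                     "breast": 0.009, "gonads": 0.010, "remainder": 0.010}
    monte_carlo = {organ: dose * 1.05 for organ, dose in deterministic.items()}
    print(percent_difference(effective_dose(deterministic), effective_dose(monte_carlo)))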
NASA Astrophysics Data System (ADS)
Bahadori, Amir A.; Sato, Tatsuhiko; Slaba, Tony C.; Shavers, Mark R.; Semones, Edward J.; Van Baalen, Mary; Bolch, Wesley E.
2013-10-01
Spare a Little Change? Towards a 5-Nines Internet in 250 Lines of Code
2011-05-01
Performing organization: Carnegie Mellon University, School of Computer Science, Pittsburgh, PA 15213. Keywords: Internet reliability, BGP performance, Quagga. This document includes excerpts of the source code for the Linux operating system...
1990-12-01
Performing organization: Naval Postgraduate School, Code 33. ...system's individual components. Then one derives the overall system reliability from that information, using a simple mathematical model, to be...
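As a minimal sketch of the kind of simple mathematical model the excerpt alludes to (assuming a series system, which is one common choice), overall reliability can be taken as the product of the component reliabilities; the numbers below are made up for illustration, not values from the thesis.

    # Series-system reliability: the system works only if every component works.
    # Component reliabilities here are illustrative placeholders.
    from math import prod

    def series_reliability(component_reliabilities):
        return prod(component_reliabilities)

    print(series_reliability([0.99, 0.97, 0.995]))  # about 0.955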
Identifying Pediatric Severe Sepsis and Septic Shock: Accuracy of Diagnosis Codes.
Balamuth, Fran; Weiss, Scott L; Hall, Matt; Neuman, Mark I; Scott, Halden; Brady, Patrick W; Paul, Raina; Farris, Reid W D; McClead, Richard; Centkowski, Sierra; Baumer-Mouradian, Shannon; Weiser, Jason; Hayes, Katie; Shah, Samir S; Alpern, Elizabeth R
2015-12-01
To evaluate accuracy of 2 established administrative methods of identifying children with sepsis using a medical record review reference standard. Multicenter retrospective study at 6 US children's hospitals. Subjects were children >60 days to <19 years of age and identified in 4 groups based on International Classification of Diseases, Ninth Revision, Clinical Modification codes: (1) severe sepsis/septic shock (sepsis codes); (2) infection plus organ dysfunction (combination codes); (3) subjects without codes for infection, organ dysfunction, or severe sepsis; and (4) infection but not severe sepsis or organ dysfunction. Combination codes were allowed, but not required, within the sepsis codes group. We determined the presence of reference standard severe sepsis according to consensus criteria. Logistic regression was performed to determine whether addition of codes for sepsis therapies improved case identification. A total of 130 out of 432 subjects met the reference standard for severe sepsis. Sepsis codes had sensitivity 73% (95% CI 70-86), specificity 92% (95% CI 87-95), and positive predictive value 79% (95% CI 70-86). Combination codes had sensitivity 15% (95% CI 9-22), specificity 71% (95% CI 65-76), and positive predictive value 18% (95% CI 11-27). Slight improvements in model characteristics were observed when codes for vasoactive medications and endotracheal intubation were added to sepsis codes (c-statistic 0.83 vs 0.87, P = .008). Sepsis-specific International Classification of Diseases, Ninth Revision, Clinical Modification codes identify pediatric patients with severe sepsis in administrative data more accurately than a combination of codes for infection plus organ dysfunction. Copyright © 2015 Elsevier Inc. All rights reserved.
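As a small illustration of the test characteristics reported above, the sketch below computes sensitivity, specificity, and positive predictive value from a 2x2 confusion matrix; the counts are hypothetical values chosen only to be roughly consistent with the reported totals, not the study's actual tabulation.

    # Hypothetical 2x2 counts: code-flagged cases vs. the chart-review reference standard.
    def test_characteristics(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)
        return sensitivity, specificity, ppv

    sens, spec, ppv = test_characteristics(tp=95, fp=25, fn=35, tn=277)
    print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f}")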
Thermospray Liquid Chromatography/Mass Spectrometry of Mustard and Its Metabolites
1989-05-01
Report number: CRDEC-TR-066. References cited include "Äther und Thioäther in Dioxan-Wasser-Gemischen," Chem. Ber., Vol. 81, p. 123 (1948), and Capon, B., and McManus, S. P., Neighboring Group Participation.
1991-09-01
Naval Postgraduate School, Monterey, California (AD-A246 188). Thesis: The Telecommunications Emergency Decision Support... Performing and monitoring organization: Naval Postgraduate School, Monterey, CA 93943-5000.
The Architecture of a Cooperative Respondent (Dissertation Proposal)
1989-02-01
Approved for public release; distribution unlimited. Performing organization: Center of Excellence in AI, University of Pennsylvania, Dept. of Computer & Information... Monitoring organization: U.S. Army Research Office.
A Three Dimensional Electronic Retina Architecture.
1987-12-01
not guarantee that a biological entity is in fact the best design because of the unique constraining factors of a biological organism and the associated... Report number: AFIT/GCS/ENG/87D-23. Performing organization: School of Engineering (AFIT/ENG).
Cost Metric Algorithms for Internetwork Applications
1989-04-01
Approved for public release; distribution unlimited. Report number: NOSC TR 1284. Performing organization: Naval Ocean Systems Center, Code 854.
Macrocognition in Complex Team Problem Solving
2007-06-01
Sponsoring organization: Office of Naval Research, Life Sciences Department, Code 341 (Dr. Michael Letsky), Rm 1051, 875... Distribution unlimited. Supplementary note: Twelfth International Command and Control Research and Technology Symposium (12th ICCRTS), 19-21 June.
Simulated Raman Spectral Analysis of Organic Molecules
NASA Astrophysics Data System (ADS)
Lu, Lu
The advent of laser technology in the 1960s solved the main difficulty of Raman spectroscopy, resulting in simplified Raman instruments and a large boost in the sensitivity of the technique. Today, Raman spectroscopy is commonly used in chemistry and biology. Because vibrational information is specific to the chemical bonds present, Raman spectroscopy provides a fingerprint for identifying the types of molecules in a sample. In this thesis, we simulate the Raman spectra of organic and inorganic materials with the General Atomic and Molecular Electronic Structure System (GAMESS) and Gaussian, two computational codes that perform a range of general chemistry calculations. We run these codes on our CPU-based high-performance cluster (HPC). Through the Message Passing Interface (MPI), a standardized and portable message-passing system that lets the codes run in parallel, we are able to reduce computation time and increase the size and complexity of the systems we can simulate. From our simulations, we will build a database that allows a search algorithm to quickly identify N-H and O-H bonds in different materials. Our ultimate goal is to analyze and identify the spectra of organic matter from meteorites and compare these spectra with terrestrial, biologically produced amino acids and residues.
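A rough sketch of the MPI-style job distribution described above is shown below; it assumes the mpi4py package is available, and the molecule list and launch step are placeholders rather than the actual GAMESS/Gaussian workflow.

    # Distribute independent Raman simulations across MPI ranks (round-robin).
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    molecules = ["glycine", "alanine", "water", "methane", "ammonia"]  # placeholders

    for name in molecules[rank::size]:
        # In the real workflow this step would launch a GAMESS or Gaussian Raman job.
        print(f"rank {rank} of {size}: simulating {name}")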
An Efficient Method for Verifying Gyrokinetic Microstability Codes
NASA Astrophysics Data System (ADS)
Bravenec, R.; Candy, J.; Dorland, W.; Holland, C.
2009-11-01
Benchmarks for gyrokinetic microstability codes can be developed through successful ``apples-to-apples'' comparisons among them. Unlike previous efforts, we perform the comparisons for actual discharges, rendering the verification efforts relevant to existing experiments and future devices (ITER). The process requires i) assembling the experimental analyses at multiple times, radii, discharges, and devices, ii) creating the input files ensuring that the input parameters are faithfully translated code-to-code, iii) running the codes, and iv) comparing the results, all in an organized fashion. The purpose of this work is to automate this process as much as possible: At present, a python routine is used to generate and organize GYRO input files from TRANSP or ONETWO analyses. Another routine translates the GYRO input files into GS2 input files. (Translation software for other codes has not yet been written.) Other python codes submit the multiple GYRO and GS2 jobs, organize the results, and collect them into a table suitable for plotting. (These separate python routines could easily be consolidated.) An example of the process -- a linear comparison between GYRO and GS2 for a DIII-D discharge at multiple radii -- will be presented.
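A minimal sketch of the translation and bookkeeping step described above is given below; the parameter names and the name mapping are hypothetical placeholders, not the actual GYRO or GS2 input schemas.

    # Rename one code's input parameters to another code's conventions and write
    # both input files. All parameter names here are hypothetical placeholders.
    NAME_MAP = {"safety_factor": "q", "magnetic_shear": "shat", "temp_gradient": "tprim"}

    def translate(params, name_map):
        """Rename keys according to name_map, leaving unmapped keys unchanged."""
        return {name_map.get(key, key): value for key, value in params.items()}

    def write_input(path, params):
        with open(path, "w") as f:
            for key, value in sorted(params.items()):
                f.write(f"{key} = {value}\n")

    gyro_params = {"safety_factor": 1.4, "magnetic_shear": 0.8, "temp_gradient": 2.5}
    write_input("gyro.in", gyro_params)
    write_input("gs2.in", translate(gyro_params, NAME_MAP))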
Lubricating Oil Burn-Off in Coast Guard Power Plants
1975-02-01
of the in-line fuel filters and wear of the fuel pumps. The overwhelming majority of tests performed by others in lube oil burn-off programs... Authors: J. R. Hobbs and R. A. Walter. Performing organization: U.S. Department of Transportation.
1987-01-01
Approved for public release. Report number: DIOR/ST11-87. Performing organization: WHS, Directorate for Information...
307TH Engineer Battalion Prop Blast - An Airborne Tradition
1991-04-24
American servicemen and women have trained and fought gallantly as airborne soldiers. Most officers who have been honored to serve with the airborne... Distribution is unlimited. Performing organization address: Carlisle Barracks, PA 17013-5002.
Quality Assurance System. Volume 1. Report (Technology Transfer Program)
1980-03-03
Performing organization: Naval Surface Warfare Center CD, Code 2230 - Design Integration Tools, Building 192, Room 128, 9500 MacArthur Blvd, Bethesda, MD 20817-5700. Table of contents, Volume I - Findings and Conclusions: Section 1, Introduction; 1.1 Purpose and Scope; 1.2 Organization of...
2008-09-01
Authors: Prinzo OV, Campbell A. ...Data Com Human Factors Working Group. We thank all the people at American, Continental, Delta, and United Airlines who were instrumental in the... (7) Language Experiences in Native English-Speaking Airspace/Airports, (8) Native English-Speaking Controllers Communicating With Non...
Understanding Patterns of Team Collaboration Employed To Solve Unique Problems
2008-06-01
Performing organization: Naval Postgraduate School, Code IS/Hs, 589 Dyer Road, Monterey, CA 93943. ...organizations, systems, infrastructure, and processes to create and share data, information, and knowledge that is needed for the team to plan...
2004-10-01
Gas Chromatographic/Mass Spectrometric Differentiation of Atenolol, Metoprolol, Propranolol, and an Interfering Metabolite Product of Metoprolol. Report date: October 2004. Author: Angier MK.
Independent Assessment Plan: LAV-25
1989-06-27
Advances in Engineering Software for Lift Transportation Systems
NASA Astrophysics Data System (ADS)
Kazakoff, Alexander Borisoff
2012-03-01
In this paper an attempt is performed at computer modelling of ropeway ski lift systems. The logic in these systems is based on a travel form between the two terminals, which operates with high capacity cabins, chairs, gondolas or draw-bars. Computer codes AUTOCAD, MATLAB and Compaq-Visual Fortran - version 6.6 are used in the computer modelling. The rope systems computer modelling is organized in two stages in this paper. The first stage is organization of the ground relief profile and a design of the lift system as a whole, according to the terrain profile and the climatic and atmospheric conditions. The ground profile is prepared by the geodesists and is presented in an AUTOCAD view. The next step is the design of the lift itself which is performed by programmes using the computer code MATLAB. The second stage of the computer modelling is performed after the optimization of the co-ordinates and the lift profile using the computer code MATLAB. Then the co-ordinates and the parameters are inserted into a program written in Compaq Visual Fortran - version 6.6., which calculates 171 lift parameters, organized in 42 tables. The objective of the work presented in this paper is an attempt at computer modelling of the design and parameters derivation of the rope way systems and their computer variation and optimization.
Variational Formulation and Finite Element Implementation of Pagano’s Theory of Laminated Plates
1991-07-12
Contract No. F33615-85-C-3213. Report numbers: RF Project 764779/717297, WL-TR-91-3016. Wright-Patterson Air Force Base, Ohio 45433-6553.
Benchmarking: your performance measurement and improvement tool.
Senn, G F
2000-01-01
Many respected professional healthcare organizations and societies today are seeking to establish data-driven performance measurement strategies such as benchmarking. Clinicians are, however, resistant to "benchmarking" that is based on financial data alone, concerned that it may be adverse to the patients' best interests. Benchmarking of clinical procedures that uses physician's codes such as Current Procedural Terminology (CPTs) has greater credibility with practitioners. Better Performers, organizations that can perform procedures successfully at lower cost and in less time, become the "benchmark" against which other organizations can measure themselves. The Better Performers' strategies can be adopted by other facilities to save time or money while maintaining quality patient care.
48 CFR 52.226-6 - Promoting excess food donation to nonprofit organizations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... quality and labeling standards imposed by Federal, State, and local laws and regulations even though the... the Internal Revenue Code of 1986; and (2) Exempt from tax under section 501(a) of that Code. (b) In... tier, who will perform, under this contract, the provision, service, or sale of food in the United...
1981-07-01
Performing and controlling organization: Naval Facilities Engineering Command (Code 0453), 200 Stovall Street, Alexandria, VA 22332. Report date: July 1981.
Geographic Information Systems: A Primer
1990-10-01
Approved for public release; distribution unlimited. ...utilizing sophisticated integrated databases (usually vector-based), avoid the indirect value coding scheme by recognizing names or direct magnitudes... intricate involvement required by the operator in order to establish a functional coding scheme. A simple raster system, in which cell values indicate...
Glycopeptides as Analgesics: Non-Toxic Alternatives to Morphine for Combat Casualty Care
2013-12-05
Report date: 5 December 2013. Performing organization: The University of Arizona, Tucson, Arizona 85722-3308.
Role of Cyclin E as an Early Event in Ovarian Carcinogenesis
2010-04-01
Performing organization: Cedars-Sinai Medical Center, Los Angeles, CA 90048-9004. Sponsoring/monitoring agency: U.S. Army Medical Research...
Flight Attendant Fatigue. Part IV. Analysis of Incident Reports
2009-12-01
Flight Attendant Fatigue, Part IV: Analysis of Incident Reports. Authors: Kali Holcomb, Katrina Avers, Lena Dobbins, Joy Banks, Lauren Blackwell, Thomas Nesthus. ...observed by ERC members of the flight attendant ASAP programs, a survey was developed. Surveys were distributed via e-mail to 23 participants for...
Cognitive and Neural Sciences Division 1991 Programs
1991-08-01
Cognitive and Neural Sciences Division 1991 Programs (PE 61153N), edited by Willard S. Vaughan. Performing organization: Office of Naval Research, Cognitive and Neural Sciences Division, Code 1142. This is a compilation of abstracts representing R&D sponsored by the ONR Cognitive and Neural Sciences Division.
Three-Dimensional Shallow Water Acoustics
2016-03-30
Woods Hole Oceanographic Institution, Applied Ocean Physics and Engineering Department, MS#12, Woods Hole, MA 02543 USA. Letter of March 30, 2016, to Dr. Kyle Becker, Office of Naval Research, Code... Monitoring organization: Naval Research Laboratory.
Helical Explosive Flux Compression Generator Research at the Air Force Research Laboratory
1999-06-01
Performing organization: Directed Energy Directorate, Air Force Research Laboratory, Kirtland AFB, NM. ...in support of the Air Force Research Laboratory (AFRL) explosive pulsed power program. These include circuit codes such as Microcap and...
FRAGS: estimation of coding sequence substitution rates from fragmentary data
Swart, Estienne C; Hide, Winston A; Seoighe, Cathal
2004-01-01
Background Rates of substitution in protein-coding sequences can provide important insights into evolutionary processes that are of biomedical and theoretical interest. Increased availability of coding sequence data has enabled researchers to estimate more accurately the coding sequence divergence of pairs of organisms. However the use of different data sources, alignment protocols and methods to estimate substitution rates leads to widely varying estimates of key parameters that define the coding sequence divergence of orthologous genes. Although complete genome sequence data are not available for all organisms, fragmentary sequence data can provide accurate estimates of substitution rates provided that an appropriate and consistent methodology is used and that differences in the estimates obtainable from different data sources are taken into account. Results We have developed FRAGS, an application framework that uses existing, freely available software components to construct in-frame alignments and estimate coding substitution rates from fragmentary sequence data. Coding sequence substitution estimates for human and chimpanzee sequences, generated by FRAGS, reveal that methodological differences can give rise to significantly different estimates of important substitution parameters. The estimated substitution rates were also used to infer upper-bounds on the amount of sequencing error in the datasets that we have analysed. Conclusion We have developed a system that performs robust estimation of substitution rates for orthologous sequences from a pair of organisms. Our system can be used when fragmentary genomic or transcript data is available from one of the organisms and the other is a completely sequenced genome within the Ensembl database. As well as estimating substitution statistics our system enables the user to manage and query alignment and substitution data. PMID:15005802
Genomics dataset of unidentified disclosed isolates.
Rekadwad, Bhagwan N
2016-09-01
Analysis of DNA sequences is necessary for higher hierarchical classification of organisms. It gives clues about the characteristics of organisms and their taxonomic position. This dataset was chosen to find complexities in the unidentified DNA disclosed in patents. A total of 17 unidentified DNA sequences were thoroughly analyzed. Quick response (QR) codes were generated, and the AT/GC content of the DNA sequences was calculated. The QR codes are helpful for quick identification of isolates, and the AT/GC content is helpful for studying their stability at different temperatures. Additionally, a dataset on cleavage codes and enzyme codes from a restriction digestion study, which is helpful for work with short DNA sequences, is reported. The dataset disclosed here is new revelatory data for the exploration of unique DNA sequences for evaluation, identification, comparison and analysis.
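A minimal Python sketch of the AT/GC content calculation described above is shown below; the sequence is a made-up example, not one of the 17 patent sequences, and quick-response codes could be generated separately with a third-party package such as qrcode.

    # AT/GC content of a DNA sequence (example sequence only).
    def at_gc_content(seq):
        seq = seq.upper()
        at = sum(seq.count(base) for base in "AT")
        gc = sum(seq.count(base) for base in "GC")
        total = at + gc
        return at / total, gc / total

    at_frac, gc_frac = at_gc_content("ATGCGCGTATATCCGGAT")
    print(f"AT: {at_frac:.1%}, GC: {gc_frac:.1%}")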
32 CFR Appendix A to Part 169a - Codes and Definitions of Functional Areas
Code of Federal Regulations, 2011 CFR
2011-07-01
...) intermediate/direct/general maintenance performed by fixed activities that are not designed for deployment to combat areas and that provide direct support of organizations performing or designed to perform combat... commercial activities that are especially designed and constructed for the low-cost and efficient storage and...
32 CFR Appendix A to Part 169a - Codes and Definitions of Functional Areas
Code of Federal Regulations, 2010 CFR
2010-07-01
...) intermediate/direct/general maintenance performed by fixed activities that are not designed for deployment to combat areas and that provide direct support of organizations performing or designed to perform combat... commercial activities that are especially designed and constructed for the low-cost and efficient storage and...
32 CFR Appendix A to Part 169a - Codes and Definitions of Functional Areas
Code of Federal Regulations, 2012 CFR
2012-07-01
...) intermediate/direct/general maintenance performed by fixed activities that are not designed for deployment to combat areas and that provide direct support of organizations performing or designed to perform combat... commercial activities that are especially designed and constructed for the low-cost and efficient storage and...
32 CFR Appendix A to Part 169a - Codes and Definitions of Functional Areas
Code of Federal Regulations, 2014 CFR
2014-07-01
...) intermediate/direct/general maintenance performed by fixed activities that are not designed for deployment to combat areas and that provide direct support of organizations performing or designed to perform combat... commercial activities that are especially designed and constructed for the low-cost and efficient storage and...
32 CFR Appendix A to Part 169a - Codes and Definitions of Functional Areas
Code of Federal Regulations, 2013 CFR
2013-07-01
...) intermediate/direct/general maintenance performed by fixed activities that are not designed for deployment to combat areas and that provide direct support of organizations performing or designed to perform combat... commercial activities that are especially designed and constructed for the low-cost and efficient storage and...
Initial Design and Experimental Implementation of the Traffic Advisory Service of ATARS
1980-11-03
Author: Jeffrey L. Gertz. Report number: ATC-101. ...and Resolution Service (ATARS) is a ground-based collision avoidance system which utilizes surveillance data from the Discrete Address Beacon System... to aircraft via the DABS data link. ATARS provides both a traffic advisory and a resolution (collision avoidance) service to aircraft equipped with a...
2010-04-01
Layer Interaction, Real Gas, Radiation and Plasma Phenomena in Contemporary CFD Codes. Michael S. Holden, PhD, CUBRC, Inc., 4455 Genesee Street, Buffalo, NY 14225, USA. Figure 17: Transition in Hypervelocity Flows: CUBRC Focus - Fully Duplicated Ground Test.
Spontaneous Analogy by Piggybacking on a Perceptual System
2013-08-01
Cited: (1992). High-level Perception, Representation, and Analogy: A Critique of Artificial Intelligence Methodology. J. Exp. Theor. Artif. Intell., 4(3)... Author affiliation: David W. Aha, Navy Center for Applied Research in Artificial Intelligence, Naval Research Laboratory (Code 5510), 4555 Overlook Ave., SW, Washington, DC 20375.
Digital microarray analysis for digital artifact genomics
NASA Astrophysics Data System (ADS)
Jaenisch, Holger; Handley, James; Williams, Deborah
2013-06-01
We implement a Spatial Voting (SV) based analogy of microarray analysis for digital gene marker identification in malware code sections. We examine a well-known set of malware formally analyzed by Mandiant and code named Advanced Persistent Threat 1 (APT1). APT1 is a Chinese organization formed with the specific intent to infiltrate and exploit US resources. Mandiant provided a detailed behavior and string analysis report for the 288 malware samples available. We performed an independent analysis using a new alternative to traditional dynamic analysis and static analysis that we call Spatial Analysis (SA). We perform unsupervised SA on the APT1 malware code sections and report our findings. We also show the results of SA performed on some members of the families associated by Mandiant. We conclude that SV-based SA is a practical, fast alternative to dynamic analysis and static analysis.
Results of DATAS Investigation of Illegal Mode S ID’s at JFK Airport
1992-12-01
Results of DATAS Investigation of Illegal Mode S ID's at JFK Airport, December 1992. Performing organization code: ACD-320. ...collection effort with JFK Airport engineers and Air Traffic personnel. The deployment of the TCAS monitor at the JFK Airport would not have been possible... Surveillance Radar (ASR)-9. The TCAS monitor was deployed at the John F. Kennedy (JFK) Airport. Since the TCAS monitor provides only directional...
1999-10-01
Performing organization: Naval Surface Warfare Center CD, Code 2230 - Design Integration Tools, Bldg 192, Room 128, 9500 MacArthur Blvd, Bethesda, MD 20817-5700. ...Implementation of Task 2.4; Task 7.0 Conduct Workshops; Task 8.0 Final Report. To ensure success with the project, the research needed to be performed at the...
An Economical Multifactor within-Subject Design Robust against Trend and Carryover Effects.
1985-10-17
Performing organization: Essex..., Orlando, FL 32813.
Lack of harmonization in sweat testing for cystic fibrosis - a national survey.
Christiansen, Anne Lindegaard; Nybo, Mads
2014-11-01
Sweat testing is used in the diagnosis of cystic fibrosis. Interpretation of the sweat test depends, however, on the method performed, since conductivity, osmolality and chloride concentration can all be measured as part of a sweat test. The aim of this study was to investigate how performance of the test is organized in Denmark. Departments conducting the sweat test were contacted and interviewed following a premade questionnaire. They were asked about the methods performed, the applied NPU (Nomenclature for Properties and Units) code, the reference interval, the recommended interpretation and the referred literature. Fourteen departments performed the sweat test. One department measured chloride and sodium concentration, while 13 departments measured conductivity. One department used a non-existing NPU code, two departments applied NPU codes inconsistent with the method performed, four departments applied no NPU code and seven applied a correct NPU code. Ten of the departments measuring conductivity applied reference intervals. Nine departments measuring conductivity had recommendations of a normal area, a grey zone and a pathological value, while four departments applied only a normal and grey zone or a pathological value. Cut-off values for the normal, grey and pathological areas were, like the reference intervals, inconsistent. There is inconsistent use of NPU codes, reference intervals and interpretation of sweat conductivity in the process of diagnosing cystic fibrosis. Because diagnosing cystic fibrosis is a combined effort between local pediatric departments, biochemical and genetic departments and cystic fibrosis centers, national harmonization is necessary to assure correct clinical use.
A comparison of TSS and TRASYS in form factor calculation
NASA Technical Reports Server (NTRS)
Golliher, Eric
1993-01-01
As the workstation and personal computer become more popular than a centralized mainframe to perform thermal analysis, the methods for space vehicle thermal analysis will change. Already, many thermal analysis codes are now available for workstations, which were not in existence just five years ago. As these changes occur, some organizations will adopt the new codes and analysis techniques, while others will not. This might lead to misunderstandings between thermal shops in different organizations. If thermal analysts make an effort to understand the major differences between the new and old methods, a smoother transition to a more efficient and more versatile thermal analysis environment will be realized.
2010-12-01
Flight Attendant Fatigue Recommendation II: Flight Attendant Work/Rest Patterns, Alertness, and Performance Assessment. DOT/FAA/AM-10/22, Office of Aerospace Medicine, Washington, DC. Report date: December 2010.
Validity of Empirical Studies of Information System Effectiveness
1989-06-01
Cited: "The Methodologies of Symbolic Interactionism: A Critical Review of Research Techniques" (Preprint), 1968; Durand, Douglas E., Rex O. Bennet and Samuel... Distribution is unlimited. Address: Monterey, California 93943-5000.
1993-04-01
to failure at 1.25 mm/min. (.05 in./min.) by a hydraulic, 267 kN (60,000 lb.) capacity, Satec testing machine. Strain output was conditioned through... Author: D. Hoyns. Performing organization: Naval Surface Warfare Center, Carderock Division, Annapolis Detachment, Code 2844/644. Report number: CRDKNSWC-SSM-64-92/22.
Prostate Cancer in African-American Men: Serum Biomarkers for Early Detection Using Nanoparticles
2009-11-01
...mapping; 8. Gel electrophoresis; and 9. 'Home-made' QDs. We published a paper on the application of the bio-conjugated quantum dots (QDs) for...
2003-03-01
organizations . Reducing attrition rates through optimal selection decisions can “reduce training cost, improve job performance, and enhance...capturing the weights for use in the SNR method is not straightforward. A special VBA application had to be written to capture and organize the network...before the VBA application can be used. Appendix D provides the VBA code used to import and organize the network weights and input standardization
Anisotropic Effects on Constitutive Model Parameters of Aluminum Alloys
2012-01-01
constants are required input to computer codes (LS-DYNA, DYNA3D or SPH) to accurately simulate fragment impact on structural components made of high... different temperatures. Performing organization: Naval Surface Warfare Center, 4104 Evans Way Suite 102, Indian Head, MD 20640.
Performance and Architecture Lab Modeling Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-06-19
Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into sub-problems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model, an executable program, is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.
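Purely as an illustration of the idea of a hierarchical analytical model whose parts mirror the application, the sketch below expresses a model as ordinary Python functions; it is not Palm's annotation language or its generated output, and the cost expressions and constants are invented.

    # Toy hierarchical analytical model: a timestep's predicted cost is composed
    # from sub-models, mirroring the structure of the modeled application.
    def model_compute(n, t_flop=1e-9):
        """Predicted time for an O(n^2) compute kernel."""
        return n * n * t_flop

    def model_exchange(n, latency=1e-6, t_byte=1e-10):
        """Predicted time for exchanging n double-precision values."""
        return latency + 8 * n * t_byte

    def model_timestep(n):
        """Composition: one timestep = compute + exchange."""
        return model_compute(n) + model_exchange(n)

    print(f"predicted time for n=4096: {model_timestep(4096):.6f} s")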
Learn about the NSPS regulation for equipment leaks of Volatile Organic Compounds (VOC) from onshore natural gas processing plants by reading the rule summary, rule history, federal register citations, and the code of federal regulations
Reformation of Regulatory Technical Standards for Nuclear Power Generation Equipments in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikio Kurihara; Masahiro Aoki; Yu Maruyama
2006-07-01
Comprehensive reformation of the regulatory system has been introduced in Japan in order to apply recent technical progress in a timely manner. 'The Technical Standards for Nuclear Power Generation Equipments', known as Ordinance No. 62 of the Ministry of International Trade and Industry and used in the detailed design, construction and operating stages of nuclear power plants, was modified to performance specifications, with consensus codes and standards used as the prescriptive specifications, in order to facilitate prompt review of the Ordinance in response to technological innovation. The modification work was performed by the Nuclear and Industrial Safety Agency (NISA), the regulatory body in Japan, with support from the Japan Nuclear Energy Safety Organization (JNES), a technical support organization. The revised Ordinance No. 62 was issued on July 1, 2005 and has been enforced since January 1, 2006. During the period from issuance to enforcement, JNES prepared an enforceable regulatory guide that complies with each provision of Ordinance No. 62, and also made technical assessments to endorse the applicability of consensus codes and standards, in response to NISA's request. Some consensus codes and standards were re-assessed since they were already used in regulatory review of construction plans submitted by licensees. Other consensus codes and standards were newly assessed for endorsement. Where proper consensus codes or standards were not available, details of the regulatory requirements were described in the regulatory guide as an immediate measure, and appropriate standards-developing bodies were requested to prepare those consensus codes or standards. A supplementary note providing background information on the modification, applicable examples, etc., was prepared for the convenience of users of Ordinance No. 62. This paper describes the modification activities and their results, following NISA's presentation at ICONE-13 that introduced the framework of the performance specifications and the modification process of Ordinance No. 62.
Refactoring the Genetic Code for Increased Evolvability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pines, Gur; Winkler, James D.; Pines, Assaf
ABSTRACT The standard genetic code is robust to mutations during transcription and translation. Point mutations are likely to be synonymous or to preserve the chemical properties of the original amino acid. Saturation mutagenesis experiments suggest that in some cases the best-performing mutant requires replacement of more than a single nucleotide within a codon. These replacements are essentially inaccessible to common error-based laboratory engineering techniques that alter a single nucleotide per mutation event, due to the extreme rarity of adjacent mutations. In this theoretical study, we suggest a radical reordering of the genetic code that maximizes the mutagenic potential of single nucleotide replacements. We explore several possible genetic codes that allow a greater degree of accessibility to the mutational landscape and may result in a hyperevolvable organism that could serve as an ideal platform for directed evolution experiments. We then conclude by evaluating the challenges of constructing such recoded organisms and their potential applications within the field of synthetic biology. IMPORTANCE The conservative nature of the genetic code prevents bioengineers from efficiently accessing the full mutational landscape of a gene via common error-prone methods. Here, we present two computational approaches to generate alternative genetic codes with increased accessibility. These new codes allow mutational transitions to a larger pool of amino acids and with a greater extent of chemical differences, based on a single nucleotide replacement within the codon, thus increasing evolvability both at the single-gene and at the genome levels. Given the widespread use of these techniques for strain and protein improvement, along with more fundamental evolutionary biology questions, the use of recoded organisms that maximize evolvability should significantly improve the efficiency of directed evolution, library generation, and fitness maximization.
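The accessibility notion described above can be made concrete with a small sketch: enumerate the single-nucleotide neighbors of a codon under a given code table and collect the amino acids they encode. The table below covers only a few standard-code codons for brevity; it is an illustration of the idea, not the paper's algorithm.

    # Amino acids reachable from a codon by one nucleotide replacement.
    STANDARD_CODE = {"TTT": "F", "TTC": "F", "TTA": "L", "TTG": "L",
                     "CTT": "L", "ATT": "I", "GTT": "V", "TCT": "S",
                     "TAT": "Y", "TGT": "C"}  # partial table, for illustration only

    def single_nucleotide_neighbors(codon):
        for i, original in enumerate(codon):
            for base in "ACGT":
                if base != original:
                    yield codon[:i] + base + codon[i + 1:]

    def accessible_amino_acids(codon, code):
        return {code[n] for n in single_nucleotide_neighbors(codon) if n in code}

    print(accessible_amino_acids("TTT", STANDARD_CODE))  # amino acids one mutation away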
Refactoring the Genetic Code for Increased Evolvability
Pines, Gur; Winkler, James D.; Pines, Assaf; ...
2017-11-14
Family medicine practice performance and knowledge management.
Orzano, A John; McInerney, Claire R; Tallia, Alfred F; Scharf, Davida; Crabtree, Benjamin F
2008-01-01
Knowledge management (KM) is the process by which people in organizations find, share, and develop knowledge for action. KM affects performance by influencing work relationships to enhance learning and decision making. To identify how family medicine practices exhibit KM. A model and a template of KM concepts were derived from a comprehensive organizational literature review. Two higher and two lower performing family medicine practices were purposefully selected from existing comparative case studies based on prevention delivery rates and innovation. Interviews, fieldnotes of operations, and clinical encounters were coded independently using the template. Face-to-face discussions resolved coding differences. All practices had processes and tools for finding, sharing, and developing knowledge; however, KM overall was limited despite implementation of expensive technologies like an electronic medical record. Where present, KM processes and tools were used by individuals but not integrated throughout the organization. Loss of information was prominent, and finding knowledge was underdeveloped. The use of technical tools and developing knowledge by reconfiguration and measurement were particularly limited. Socially related tools, such as face-to-face-communication for sharing and developing knowledge, were more developed. As in other organizations, tool use was tailored for specific outcomes and leveraged by other organizational capacities. Differences in KM occur within family practices and between family practices and other organizations and may have implications for improving practice performance. Understanding interaction patterns of work relationships and KM may explain why costly technical or externally imposed "one size fits all" practice organizational interventions have had mixed results and limited sustainability.
Cohen, Aaron M
2008-01-01
We participated in the i2b2 smoking status classification challenge task. The purpose of this task was to evaluate the ability of systems to automatically identify patient smoking status from discharge summaries. Our submission included several techniques that we compared and studied, including hot-spot identification, zero-vector filtering, inverse class frequency weighting, error-correcting output codes, and post-processing rules. We evaluated our approaches using the same methods as the i2b2 task organizers, using micro- and macro-averaged F1 as the primary performance metric. Our best performing system achieved a micro-F1 of 0.9000 on the test collection, equivalent to the best performing system submitted to the i2b2 challenge. Hot-spot identification, zero-vector filtering, classifier weighting, and error correcting output coding contributed additively to increased performance, with hot-spot identification having by far the largest positive effect. High performance on automatic identification of patient smoking status from discharge summaries is achievable with the efficient and straightforward machine learning techniques studied here.
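A minimal sketch of two of the techniques named above, error-correcting output codes and micro/macro-averaged F1 scoring, is shown below using scikit-learn; the toy documents and labels are invented stand-ins, not the i2b2 discharge summaries, and the other techniques (hot-spot identification, zero-vector filtering, class weighting, post-processing rules) are omitted.

    # Error-correcting output codes over TF-IDF features, scored with micro/macro F1.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.multiclass import OutputCodeClassifier
    from sklearn.svm import LinearSVC
    from sklearn.metrics import f1_score

    docs = ["patient quit smoking last year", "denies tobacco use",
            "smokes one pack per day", "current smoker, counseled to quit",
            "never smoked", "former smoker, quit in 2001"]
    labels = ["past", "non", "current", "current", "non", "past"]

    X = TfidfVectorizer().fit_transform(docs)
    clf = OutputCodeClassifier(LinearSVC(), code_size=2, random_state=0).fit(X, labels)
    predictions = clf.predict(X)
    print("micro-F1:", f1_score(labels, predictions, average="micro"))
    print("macro-F1:", f1_score(labels, predictions, average="macro"))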
Effective Identification of Similar Patients Through Sequential Matching over ICD Code Embedding.
Nguyen, Dang; Luo, Wei; Venkatesh, Svetha; Phung, Dinh
2018-04-11
Evidence-based medicine often involves the identification of patients with similar conditions, which are often captured in ICD (International Classification of Diseases (World Health Organization 2013)) code sequences. With no satisfying prior solutions for matching ICD-10 code sequences, this paper presents a method which effectively captures the clinical similarity among routine patients who have multiple comorbidities and complex care needs. Our method leverages the recent progress in representation learning of individual ICD-10 codes, and it explicitly uses the sequential order of codes for matching. Empirical evaluation on a state-wide cancer data collection shows that our proposed method achieves significantly higher matching performance compared with state-of-the-art methods ignoring the sequential order. Our method better identifies similar patients in a number of clinical outcomes including readmission and mortality outlook. Although this paper focuses on ICD-10 diagnosis code sequences, our method can be adapted to work with other codified sequence data.
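As an illustration of the general idea (not the paper's actual method), the sketch below aligns two ICD code sequences with a simple global dynamic-programming alignment whose match score is the cosine similarity of code embeddings; the embeddings are random stand-ins for learned representations, and the codes and gap penalty are arbitrary.

    # Order-aware similarity of two ICD code sequences via embedding-based alignment.
    import numpy as np

    rng = np.random.default_rng(0)
    embed = {code: rng.normal(size=16) for code in ["C50", "E11", "I10", "N18", "Z51"]}

    def cosine(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    def sequence_similarity(seq_a, seq_b, gap=-0.5):
        """Needleman-Wunsch-style global alignment score over embedded codes."""
        n, m = len(seq_a), len(seq_b)
        score = np.zeros((n + 1, m + 1))
        score[:, 0] = gap * np.arange(n + 1)
        score[0, :] = gap * np.arange(m + 1)
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                match = score[i - 1, j - 1] + cosine(embed[seq_a[i - 1]], embed[seq_b[j - 1]])
                score[i, j] = max(match, score[i - 1, j] + gap, score[i, j - 1] + gap)
        return score[n, m]

    print(sequence_similarity(["C50", "Z51", "E11"], ["C50", "E11", "I10"]))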
Redundant disk arrays: Reliable, parallel secondary storage. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Gibson, Garth Alan
1990-01-01
During the past decade, advances in processor and memory technology have given rise to increases in computational performance that far outstrip increases in the performance of secondary storage technology. Coupled with emerging small-disk technology, disk arrays provide the cost, volume, and capacity of current disk subsystems but, by leveraging parallelism, many times their performance. Unfortunately, arrays of small disks may have much higher failure rates than the single large disks they replace. Redundant arrays of inexpensive disks (RAID) use simple redundancy schemes to provide high data reliability. The data encoding, performance, and reliability of redundant disk arrays are investigated. Organizing redundant data into a disk array is treated as a coding problem. Among alternatives examined, codes as simple as parity are shown to effectively correct single, self-identifying disk failures.
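The parity idea mentioned above can be sketched in a few lines: with one parity block per stripe, any single failed block whose identity is known can be rebuilt by XOR-ing the surviving blocks. The block contents below are arbitrary examples.

    # Single-failure recovery with a parity block (RAID-style XOR parity).
    from functools import reduce

    def parity(blocks):
        """Byte-wise XOR of equally sized blocks."""
        return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

    data = [b"\x01\x02\x03", b"\x10\x20\x30", b"\xaa\xbb\xcc"]  # three data disks
    p = parity(data)                                            # parity disk

    failed = 1                                                  # disk 1 is lost (self-identifying failure)
    survivors = [blk for i, blk in enumerate(data) if i != failed] + [p]
    rebuilt = parity(survivors)
    assert rebuilt == data[failed]
    print("rebuilt block:", rebuilt.hex())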
A rat RNA-Seq transcriptomic BodyMap across 11 organs and 4 developmental stages
Yu, Ying; Fuscoe, James C.; Zhao, Chen; Guo, Chao; Jia, Meiwen; Qing, Tao; Bannon, Desmond I.; Lancashire, Lee; Bao, Wenjun; Du, Tingting; Luo, Heng; Su, Zhenqiang; Jones, Wendell D.; Moland, Carrie L.; Branham, William S.; Qian, Feng; Ning, Baitang; Li, Yan; Hong, Huixiao; Guo, Lei; Mei, Nan; Shi, Tieliu; Wang, Kevin Y.; Wolfinger, Russell D.; Nikolsky, Yuri; Walker, Stephen J.; Duerksen-Hughes, Penelope; Mason, Christopher E.; Tong, Weida; Thierry-Mieg, Jean; Thierry-Mieg, Danielle; Shi, Leming; Wang, Charles
2014-01-01
The rat has been used extensively as a model for evaluating chemical toxicities and for understanding drug mechanisms. However, its transcriptome across multiple organs, or developmental stages, has not yet been reported. Here we show, as part of the SEQC consortium efforts, a comprehensive rat transcriptomic BodyMap created by performing RNA-Seq on 320 samples from 11 organs of both sexes of juvenile, adolescent, adult and aged Fischer 344 rats. We catalogue the expression profiles of 40,064 genes, 65,167 transcripts, 31,909 alternatively spliced transcript variants and 2,367 non-coding genes/non-coding RNAs (ncRNAs) annotated in AceView. We find that organ-enriched, differentially expressed genes reflect the known organ-specific biological activities. A large number of transcripts show organ-specific, age-dependent or sex-specific differential expression patterns. We create a web-based, open-access rat BodyMap database of expression profiles with crosslinks to other widely used databases, anticipating that it will serve as a primary resource for biomedical research using the rat model. PMID:24510058
Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine
2014-03-01
Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis into their effectiveness. A hazard and operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.
The complete mitochondrial genome of Papilio glaucus and its phylogenetic implications.
Shen, Jinhui; Cong, Qian; Grishin, Nick V
2015-09-01
Due to the intriguing morphology, lifecycle, and diversity of butterflies and moths, Lepidoptera are emerging as model organisms for the study of genetics, evolution and speciation. The progress of these studies relies on decoding Lepidoptera genomes, both nuclear and mitochondrial. Here we describe a protocol to obtain mitogenomes from Next Generation Sequencing reads performed for whole-genome sequencing and report the complete mitogenome of Papilio (Pterourus) glaucus. The circular mitogenome is 15,306 bp in length and rich in A and T. It contains 13 protein-coding genes (PCGs), 22 transfer-RNA-coding genes (tRNA), and 2 ribosomal-RNA-coding genes (rRNA), with a gene order typical for mitogenomes of Lepidoptera. We performed phylogenetic analyses based on PCG and RNA-coding genes or protein sequences using Bayesian Inference and Maximum Likelihood methods. The phylogenetic trees consistently show that among species with available mitogenomes Papilio glaucus is the closest to Papilio (Agehana) maraho from Asia.
Service Wear Test Evaluation of Structural/Proximity Firefighters Gloves
1991-06-05
Navy Clothing and Textile Research Facility (NCTRF), P.O. Box 59, Natick, Massachusetts. Technical Report No. NCTRF 188. Approved for public release; distribution unlimited.
The Continual Intercomparison of Radiation Codes: Results from Phase I
NASA Technical Reports Server (NTRS)
Oreopoulos, Lazaros; Mlawer, Eli; Delamere, Jennifer; Shippert, Timothy; Cole, Jason; Iacono, Michael; Jin, Zhonghai; Li, Jiangnan; Manners, James; Raisanen, Petri;
2011-01-01
The computer codes that calculate the energy budget of solar and thermal radiation in Global Climate Models (GCMs), our most advanced tools for predicting climate change, have to be computationally efficient in order to not impose undue computational burden on climate simulations. By using approximations to gain execution speed, these codes sacrifice accuracy compared to more accurate, but also much slower, alternatives. International efforts to evaluate the approximate schemes have taken place in the past, but they have suffered from the drawback that the accurate standards were not validated themselves for performance. The manuscript summarizes the main results of the first phase of an effort called "Continual Intercomparison of Radiation Codes" (CIRC) where the cases chosen to evaluate the approximate models are based on observations and where we have ensured that the accurate models perform well when compared to solar and thermal radiation measurements. The effort is endorsed by international organizations such as the GEWEX Radiation Panel and the International Radiation Commission and has a dedicated website (i.e., http://circ.gsfc.nasa.gov) where interested scientists can freely download data and obtain more information about the effort's modus operandi and objectives. In a paper published in the March 2010 issue of the Bulletin of the American Meteorological Society only a brief overview of CIRC was provided with some sample results. In this paper the analysis of submissions of 11 solar and 13 thermal infrared codes relative to accurate reference calculations obtained by so-called "line-by-line" radiation codes is much more detailed. We demonstrate that, while performance of the approximate codes continues to improve, significant issues still remain to be addressed for satisfactory performance within GCMs. We hope that by identifying and quantifying shortcomings, the paper will help establish performance standards to objectively assess radiation code quality, and will guide the development of future phases of CIRC.
Index to FAA Office of Aerospace Medicine Reports: 1961 through 2004
2005-01-01
Index to FAA Office of Aerospace Medicine Reports: 1961 Through 2004, January 2005. The report describes the index's sections, explains how to obtain copies of published Office of Aerospace Medicine technical reports, and includes a historical vignette.
High-Performance Design Patterns for Modern Fortran
Haveraaen, Magne; Morris, Karla; Rouson, Damian; ...
2015-01-01
This paper presents ideas for using coordinate-free numerics in modern Fortran to achieve code flexibility in the partial differential equation (PDE) domain. We also show how Fortran, over the last few decades, has changed to become a language well-suited for state-of-the-art software development. Fortran’s new coarray distributed data structure, the language’s class mechanism, and its side-effect-free, pure procedure capability provide the scaffolding on which we implement HPC software. These features empower compilers to organize parallel computations with efficient communication. We present some programming patterns that support asynchronous evaluation of expressions comprised of parallel operations on distributed data. We implemented these patterns using coarrays and the message passing interface (MPI). We compared the codes’ complexity and performance. The MPI code is much more complex and depends on external libraries. The MPI code on Cray hardware using the Cray compiler is 1.5–2 times faster than the coarray code on the same hardware. The Intel compiler implements coarrays atop Intel’s MPI library with the result apparently being 2–2.5 times slower than manually coded MPI despite exhibiting nearly linear scaling efficiency. As compilers mature and further improvements to coarrays come in Fortran 2015, we expect this performance gap to narrow.
Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study.
Weems, Shelley; Heller, Pamela; Fenton, Susan H
2015-01-01
The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedural Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research purposes, and reimbursement. The data collection for this study occurred in multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: 1. Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. 2. Coder training and type of record (inpatient versus outpatient) affect coding productivity. 3. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity.
Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study
Weems, Shelley; Heller, Pamela; Fenton, Susan H.
2015-01-01
The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedural Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research purposes, and reimbursement. The data collection for this study occurred in multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. Coder training and type of record (inpatient versus outpatient) affect coding productivity. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity. PMID:26396553
Conducting Retrospective Ontological Clinical Trials in ICD-9-CM in the Age of ICD-10-CM.
Venepalli, Neeta K; Shergill, Ardaman; Dorestani, Parvaneh; Boyd, Andrew D
2014-01-01
The aim of this study was to quantify the impact of International Classification of Disease 10th Revision Clinical Modification (ICD-10-CM) transition in cancer clinical trials by comparing coding accuracy and data discontinuity in backward ICD-10-CM to ICD-9-CM mapping via two tools, and to develop a standard ICD-9-CM and ICD-10-CM bridging methodology for retrospective analyses. While the transition to ICD-10-CM has been delayed until October 2015, its impact on cancer-related studies utilizing ICD-9-CM diagnoses has been inadequately explored. Three high impact journals with broad national and international readerships were reviewed for cancer-related studies utilizing ICD-9-CM diagnoses codes in study design, methods, or results. Forward ICD-9-CM to ICD-10-CM mapping was performed using a translational methodology with the Motif web portal ICD-9-CM conversion tool. Backward mapping from ICD-10-CM to ICD-9-CM was performed using both Centers for Medicare and Medicaid Services (CMS) general equivalence mappings (GEMs) files and the Motif web portal tool. Generated ICD-9-CM codes were compared with the original ICD-9-CM codes to assess data accuracy and discontinuity. While both methods yielded additional ICD-9-CM codes, the CMS GEMs method provided incomplete coverage with 16 of the original ICD-9-CM codes missing, whereas the Motif web portal method provided complete coverage. Of these 16 codes, 12 ICD-9-CM codes were present in 2010 Illinois Medicaid data, and accounted for 0.52% of patient encounters and 0.35% of total Medicaid reimbursements. Extraneous ICD-9-CM codes from both methods (Centers for Medicare and Medicaid Services general equivalent mapping [CMS GEMs, n = 161; Motif web portal, n = 246]) in excess of original ICD-9-CM codes accounted for 2.1% and 2.3% of total patient encounters and 3.4% and 4.1% of total Medicaid reimbursements from the 2010 Illinois Medicare database. Longitudinal data analyses post-ICD-10-CM transition will require backward ICD-10-CM to ICD-9-CM coding, and data comparison for accuracy. Researchers must be aware that all methods for backward coding are not comparable in yielding original ICD-9-CM codes. The mandated delay is an opportunity for organizations to better understand areas of financial risk with regards to data management via backward coding. Our methodology is relevant for all healthcare-related coding data, and can be replicated by organizations as a strategy to mitigate financial risk.
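A schematic of the comparison logic, with entirely hypothetical code lists and mapping tables standing in for the real GEMs and web-portal outputs: map ICD-10-CM codes back to ICD-9-CM with each table, then report which original codes are missing and which extraneous codes appear.

# Sketch of the backward-mapping comparison.  The codes and mapping tables are
# placeholders, not real GEMs or web-portal content.
original_icd9 = {"153.9", "197.7", "401.9"}

gems_backward = {"C18.9": {"153.9"}, "C78.7": {"197.7"}}          # hypothetical
portal_backward = {"C18.9": {"153.9"}, "C78.7": {"197.7"},
                   "I10": {"401.9", "401.1"}}                      # hypothetical

def backmap(icd10_codes, table):
    out = set()
    for code in icd10_codes:
        out |= table.get(code, set())
    return out

icd10_codes = ["C18.9", "C78.7", "I10"]
for name, table in [("GEMs", gems_backward), ("portal", portal_backward)]:
    mapped = backmap(icd10_codes, table)
    print(name,
          "missing:", sorted(original_icd9 - mapped),
          "extraneous:", sorted(mapped - original_icd9))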
1983-08-01
Technical Report #11. Performing organization: MicroExpert Systems, Inc., 24007 Ventura Blvd., Suite 210, Calabasas, CA 91302. Monitoring organization: NWSC, Crane, IN.
Single Crystal Fibers of MgO:LiNbO3
1990-08-07
Contract No. F49620-88-C-0084; funding/sponsoring organization: USAF, AFSC; monitoring organization: AFOSR.
General 3D Airborne Antenna Radiation Pattern Code Users Manual.
1983-02-01
The Ohio State University ElectroScience Laboratory, Columbus, Ohio; H. H. Chung et al., February 1983; contract F30602-79-C-0068 (RADC). The report describes a computer program for computing general three-dimensional airborne antenna radiation patterns and how it may be used.
Neural network for image compression
NASA Astrophysics Data System (ADS)
Panchanathan, Sethuraman; Yeap, Tet H.; Pilache, B.
1992-09-01
In this paper, we propose a new scheme for image compression using neural networks. Image data compression deals with minimization of the amount of data required to represent an image while maintaining an acceptable quality. Several image compression techniques have been developed in recent years. We note that the coding performance of these techniques may be improved by employing adaptivity. Over the last few years, neural networks have emerged as an effective tool for solving a wide range of problems involving adaptivity and learning. A multilayer feed-forward neural network trained using the backward error propagation algorithm is used in many applications. However, this model is not suitable for image compression because of its poor coding performance. Recently, a self-organizing feature map (SOFM) algorithm has been proposed which yields a good coding performance. However, this algorithm requires a long training time because the network starts with random initial weights. In this paper we have used the backward error propagation algorithm (BEP) to quickly obtain the initial weights, which are then used to speed up the training time required by the SOFM algorithm. The proposed approach (BEP-SOFM) combines the advantages of the two techniques and, hence, achieves a good coding performance in a shorter training time. Our simulation results demonstrate the potential gains using the proposed technique.
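A rough sketch of the vector-quantization side of such a scheme: a 1-D self-organizing feature map trained on image blocks. In the BEP-SOFM approach described above the codebook would be initialized from weights obtained by backward error propagation rather than the random initialization used here; the data are synthetic.

# Minimal 1-D self-organizing feature map (SOFM) for vector-quantizing 4x4
# image blocks.  Illustrative only: random data and random codebook init.
import numpy as np

rng = np.random.default_rng(0)
blocks = rng.random((1000, 16))          # stand-in for 4x4 image blocks
codebook = rng.random((64, 16))          # 64 code vectors (BEP weights would go here)

def train_sofm(blocks, codebook, epochs=10, lr0=0.5, sigma0=8.0):
    idx = np.arange(len(codebook))
    for e in range(epochs):
        lr = lr0 * (1 - e / epochs)                     # decaying learning rate
        sigma = max(sigma0 * (1 - e / epochs), 1.0)     # shrinking neighborhood
        for x in blocks:
            winner = np.argmin(np.linalg.norm(codebook - x, axis=1))
            h = np.exp(-((idx - winner) ** 2) / (2 * sigma ** 2))
            codebook += lr * h[:, None] * (x - codebook)
    return codebook

codebook = train_sofm(blocks, codebook)
# Encoding: each block is replaced by the index of its nearest code vector.
codes = np.argmin(np.linalg.norm(blocks[:, None, :] - codebook, axis=2), axis=1)
print(codes[:10])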
Matsumoto, Masaki; Yamanaka, Tsuneyasu; Hayakawa, Nobuhiro; Iwai, Satoshi; Sugiura, Nobuyuki
2015-03-01
This paper describes the Basic Radionuclide vAlue for Internal Dosimetry (BRAID) code, which was developed to calculate the time-dependent activity distribution in each organ and tissue characterised by the biokinetic compartmental models provided by the International Commission on Radiological Protection (ICRP). Translocation from one compartment to the next is taken to be governed by first-order kinetics, which is formulated by first-order differential equations. In the source program of this code, the conservation equations are solved for the mass balance that describes the transfer of a radionuclide between compartments. This code is applicable to the evaluation of the radioactivity of nuclides in an organ or tissue without modification of the source program. It is also possible to easily handle cases in which the biokinetic model is revised or a user applies a uniquely defined model, because this code is designed so that all information on the biokinetic model structure is imported from an input file. Sample calculations are performed with the ICRP model, and the results are compared with analytic solutions using simple models. It is suggested that this code provides sufficient results for dose estimation and the interpretation of monitoring data. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
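A minimal sketch of the kind of computation described, assuming a made-up three-compartment model: first-order transfers give a linear system dA/dt = K A whose rate matrix conserves mass, integrated here with SciPy. The rate constants are illustrative, not ICRP values.

# Sketch of first-order compartmental biokinetics: dA/dt = K @ A, where K holds
# transfer rate constants (per day) between compartments.  Hypothetical model;
# radioactive decay would add a further loss term in a real calculation.
import numpy as np
from scipy.integrate import solve_ivp

# Compartments: 0 = blood, 1 = organ, 2 = excretion (cumulative).
k_blood_to_organ = 0.8
k_organ_to_blood = 0.1
k_blood_to_excr = 0.3

K = np.array([
    [-(k_blood_to_organ + k_blood_to_excr),  k_organ_to_blood, 0.0],
    [  k_blood_to_organ,                    -k_organ_to_blood, 0.0],
    [  k_blood_to_excr,                      0.0,              0.0],
])   # each column sums to zero, so total activity (mass balance) is conserved

def rhs(t, A):
    return K @ A

A0 = np.array([1.0, 0.0, 0.0])         # unit intake into blood at t = 0
sol = solve_ivp(rhs, (0.0, 30.0), A0, t_eval=np.linspace(0, 30, 7))
print(sol.y.round(4))                  # activity in each compartment over time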
Flow Coupling between a Rotor and a Stator in Turbomachinery
1990-04-01
This report discusses potential-flow effects, i.e. those which would occur if the working fluid were perfectly inviscid, and notes that all observations made in practical situations represent a combination... Performing organization report number DTRC-PAS-90/15; approved for public release, distribution is unlimited.
Bosse, Stefan
2015-01-01
Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks in simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strong heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550
Bosse, Stefan
2015-02-16
Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks in simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strong heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.
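A toy sketch of the activity-transition-graph idea only: activities are functions over the agent's data state and transitions are guarded by predicates. It does not reflect the platform's stack-machine code format, code morphing, or migration.

# Toy activity-transition-graph (ATG) agent, purely illustrative.
class ATGAgent:
    def __init__(self, data):
        self.data = data
        self.activity = "sense"

    def sense(self):
        self.data["reading"] = self.data["sensor"]()

    def decide(self):
        self.data["alarm"] = self.data["reading"] > self.data["threshold"]

    def act(self):
        if self.data["alarm"]:
            print("raise event at node", self.data["node"])

    # transition table: current activity -> list of (guard, next activity)
    transitions = {
        "sense":  [(lambda d: True, "decide")],
        "decide": [(lambda d: True, "act")],
        "act":    [(lambda d: True, "sense")],
    }

    def step(self):
        getattr(self, self.activity)()          # run the current activity
        for guard, nxt in self.transitions[self.activity]:
            if guard(self.data):                # take the first enabled transition
                self.activity = nxt
                break

agent = ATGAgent({"sensor": lambda: 0.7, "threshold": 0.5, "node": 3})
for _ in range(3):
    agent.step()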
Informational structure of genetic sequences and nature of gene splicing
NASA Astrophysics Data System (ADS)
Trifonov, E. N.
1991-10-01
Only about 1/20 of the DNA of higher organisms codes for proteins, by means of the classical triplet code. The rest of the DNA sequences is largely silent, with unclear functions, if any. The triplet code is not the only code (message) carried by the sequences. There are three levels of molecular communication, where the same sequence "talks" to various biomolecules, while having, respectively, three different appearances: DNA, RNA and protein. Since the molecular structures and, hence, sequence specific preferences of these are substantially different, the original DNA sequence has to carry simultaneously three types of sequence patterns (codes, messages), thus being a composite structure in which one and the same letter (nucleotide) is frequently involved in several overlapping codes of different nature. This multiplicity and overlapping of the codes is a unique feature of Gnomic, the language of genetic sequences. The coexisting codes have to be degenerate in various degrees to allow an optimal and concerted performance of all the encoded functions. There is an obvious conflict between the best possible performance of a given function and the necessity to compromise the quality of a given sequence pattern in favor of other patterns. It appears that the major role of various changes in the sequences on their "ontogenetic" way from DNA to RNA to protein, like RNA editing and splicing, or protein post-translational modifications, is to resolve such conflicts. New data are presented strongly indicating that gene splicing is such a device to resolve the conflict between the code of DNA folding in chromatin and the triplet code for protein synthesis.
[Comparative study of three Western models of deontological codes for dentists].
Macpherson Mayol, Ignacio; Roqué Sánchez, María Victoria; Gonzalvo-Cirac, Margarita; de Ribot, Eduard
2013-01-01
We performed a comparative analysis of the codes of ethics of three official organizations in dental professional ethics: the Code of Ethics for Dentists in the European Union, drawn up by the Council of European Dentists (CED); the Código Español de Ética y Deontología Dental, published by the Consejo General de Colegios de Odontólogos y Estomatólogos de España (CGCOE); and the Principles of Ethics and Code of Professional Conduct of the American Dental Association (ADA). The analysis of the structure of the codes allows the discovery of different approaches governing professional ethics according to the ethical and legislative tradition from which they derive. While there are common elements inherent in Western culture, there are nuances in the grounds, layout and wording of the articles that allow one to deduce the ethical foundations underlying each code and reflect the real problems encountered by dentists in the practice of their profession.
1 CFR 21.14 - Deviations from standard organization of the Code of Federal Regulations.
Code of Federal Regulations, 2010 CFR
2010-01-01
Title 1 (General Provisions), 2010-01-01. CODIFICATION, General Numbering, § 21.14 Deviations from standard organization of the Code of Federal Regulations: (a) Any deviation from standard Code of Federal Regulations designations must be approved in advance...
Total Quality Management Implementation Plan: Defense Depot, Ogden
1989-07-01
Total Quality Management Implementation Plan, Defense Depot Ogden. Subject terms: TQM (Total Quality Management), Continuous Process Improvement, Depot Operations, Process Action Teams. From the commander's message: "I fully support the DLA approach to Total Quality Management. As stated by General..."
Self-Shielded Flux Cored Wire Evaluation
1980-12-01
Performing organization: Naval Surface Warfare Center CD, Code 2230 (Design Integration Tools). Approved for public release; distribution unlimited. Tensile and yield strength, percent elongation, and percent reduction of area were reported; this testing was performed with a Satec 400 WHVP tensile...
Solving Kinetic Equations on GPU’s
2011-01-01
Includes an appendix of CUDA pseudo-codes. Performing organization: Dipartimento di Matematica del Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano, Italy.
Interface Superconductivity in Graphite- and CuCl-Based Heterostructures
2015-01-22
Performing organization: University of Campinas (UNICAMP), Rua Sergio Buarque de Holanda 777, Cidade Universitária Zeferino Vaz, Campinas, São Paulo, Brazil (ZIP code 13083-859). Sponsoring/monitoring agency: Air Force Office of Scientific Research.
Pritt, Stacy L; Mackta, Jayne
2010-05-01
Business models for transnational organizations include linking different geographies through common codes of conduct, policies, and virtual teams. Global companies with laboratory animal science activities (whether outsourced or performed inhouse) often see the need for these business activities in relation to animal-based research and benefit from them. Global biomedical research organizations can learn how to better foster worldwide cooperation and teamwork by understanding and working with sociocultural differences in ethics and by knowing how to facilitate appropriate virtual team actions. Associated practices include implementing codes and policies that transcend cultural, ethnic, or other boundaries and equipping virtual teams with the needed technology, support, and rewards to ensure timely and productive work that ultimately promotes good science and patient safety in drug development.
Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana
2015-01-01
Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology's Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common in professional organizations in social sciences (82%), mental health (71%), sciences (61%), other organizations had no statements (construction trades, fraternal social organizations, real estate) or a few of them (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research significantly more often had statements on research integrity/ethics terminology than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4-23.5%) on any of the 27 research integrity/ethics terms compared to 3.3% (95% CI = 2.1-4.6%), respectively (P<0.001). Overall, 62% of all statements addressing research integrity/ethics concepts used prescriptive language in describing the standard of practice. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet contemporary needs of their communities.
Selected DOE headquarters publications
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1979-07-01
Selected DOE Headquarters Publications provides cumulative listings, from October 1, 1977 onward, of two groups of publications issued by headquarters organizations of the Department of Energy, and an index to their title keywords. The two groups consist of publications assigned a DOE/XXX-type report number code and headquarters contractor publications, prepared by contractors (and published by DOE) to describe research and development work they have performed for the Department. Publications such as pamphlets, fact sheets, bulletins, newsletters, and telephone directories, are omitted, as are publications issued under the DOE-tr, CONF, DOE/JPL, and DOE/NASA codes. (RWR)
PHITS simulations of the Matroshka experiment
NASA Astrophysics Data System (ADS)
Gustafsson, Katarina; Sihver, Lembit; Mancusi, Davide; Sato, Tatsuhiko
In order to design safer space exploration, radiation exposure estimations are necessary; the radiation environment in space is very different from the one on Earth, and it is harmful to humans and to electronic equipment. The threat originates from two sources: Galactic Cosmic Rays and Solar Particle Events. It is important to understand what happens when these particles strike matter such as space vehicle walls, human organs and electronics. We are therefore developing a tool able to estimate the radiation exposure to both humans and electronics. The tool will be based on PHITS, the Particle and Heavy-Ion Transport code System, a three-dimensional Monte Carlo code which can calculate interactions and transport of particles and heavy ions in matter. PHITS is developed by a collaboration between RIST (Research Organization for Information Science & Technology), JAEA (Japan Atomic Energy Agency) and KEK (High Energy Accelerator Research Organization), Japan, and Chalmers University of Technology, Sweden. A method for benchmarking and developing the code is to simulate experiments performed in space or on Earth. We have carried out simulations of the Matroshka experiment, which focuses on determining the radiation load on astronauts inside and outside the International Space Station by using a torso of a tissue-equivalent human phantom, filled with active and passive detectors located in the positions of critical tissues and organs. We will present the status and results of our simulations.
Dosimetric evaluation of nanotargeted (188)Re-liposome with the MIRDOSE3 and OLINDA/EXM programs.
Chang, Chih-Hsien; Chang, Ya-Jen; Lee, Te-Wei; Ting, Gann; Chang, Kwo-Ping
2012-06-01
The OLINDA/EXM computer code was created as a replacement for the widely used MIRDOSE3 code for radiation dosimetry in nuclear medicine. A dosimetric analysis with these codes was performed to evaluate nanoliposomes as carriers of radionuclides ((188)Re-liposomes) in colon carcinoma-bearing mice. Pharmacokinetic data for (188)Re-N, N-bis (2-mercaptoethyl)-N',N'-diethylethylenediamine ((188)Re-BMEDA) and (188)Re-liposome were obtained for estimation of absorbed doses in normal organs. Radiation dose estimates for normal tissues were calculated using the MIRDOSE3 and OLINDA/EXM programs for a colon carcinoma solid tumor mouse model. Mean absorbed doses derived from (188)Re-BMEDA and (188)Re-liposome in normal tissues were generally similar as calculated by MIRDOSE3 and OLINDA/EXM programs. One notable exception to this was red marrow, wherein MIRDOSE3 resulted in higher absorbed doses than OLINDA/EXM (1.53- and 1.60-fold for (188)Re-BMEDA and (188)Re-liposome, respectively). MIRDOSE3 and OLINDA have very similar residence times and organ doses. Bone marrow doses were estimated by designating cortical bone rather than bone marrow as a source organ. The bone marrow doses calculated by MIRDOSE3 are higher than those by OLINDA. If the bone marrow is designated as a source organ, the doses estimated by MIRDOSE3 and OLINDA programs will be very similar.
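Both programs implement the MIRD schema, in which the absorbed dose to a target organ is the sum over source organs of cumulated activity times an S value, D(T) = sum over S of A_tilde(S) * S(T <- S). A sketch with placeholder residence times and S values, not actual (188)Re data:

# MIRD-style organ dose sketch.  All numbers are hypothetical placeholders.
residence_time_h = {"liver": 1.2, "kidneys": 0.4, "red_marrow": 0.1}
admin_activity_MBq = 37.0
cumulated_MBq_h = {s: t * admin_activity_MBq for s, t in residence_time_h.items()}

# Hypothetical S values (mGy per MBq-h) from each source organ to two targets.
S = {
    ("liver", "liver"): 1.1e-1, ("kidneys", "liver"): 4.0e-3, ("red_marrow", "liver"): 1.0e-3,
    ("liver", "red_marrow"): 2.0e-3, ("kidneys", "red_marrow"): 3.0e-3,
    ("red_marrow", "red_marrow"): 9.0e-2,
}

for target in ["liver", "red_marrow"]:
    dose = sum(cumulated_MBq_h[src] * S[(src, target)] for src in residence_time_h)
    print(f"{target}: {dose:.3f} mGy")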
Rangachari, Pavani
2008-01-01
CONTEXT/PURPOSE: With the growing momentum toward hospital quality measurement and reporting by public and private health care payers, hospitals face increasing pressures to improve their medical record documentation and administrative data coding accuracy. This study explores the relationship between the organizational knowledge-sharing structure related to quality and hospital coding accuracy for quality measurement. Simultaneously, this study seeks to identify other leadership/management characteristics associated with coding for quality measurement. Drawing upon complexity theory, the literature on "professional complex systems" has put forth various strategies for managing change and turnaround in professional organizations. In so doing, it has emphasized the importance of knowledge creation and organizational learning through interdisciplinary networks. This study integrates complexity, network structure, and "subgoals" theories to develop a framework for knowledge-sharing network effectiveness in professional complex systems. This framework is used to design an exploratory and comparative research study. The sample consists of 4 hospitals, 2 showing "good coding" accuracy for quality measurement and 2 showing "poor coding" accuracy. Interviews and surveys are conducted with administrators and staff in the quality, medical staff, and coding subgroups in each facility. Findings of this study indicate that good coding performance is systematically associated with a knowledge-sharing network structure rich in brokerage and hierarchy (with leaders connecting different professional subgroups to each other and to the external environment), rather than in density (where everyone is directly connected to everyone else). It also implies that for the hospital organization to adapt to the changing environment of quality transparency, senior leaders must undertake proactive and unceasing efforts to coordinate knowledge exchange across physician and coding subgroups and connect these subgroups with the changing external environment.
Imitation learning based on an intrinsic motivation mechanism for efficient coding
Triesch, Jochen
2013-01-01
A hypothesis regarding the development of imitation learning is presented that is rooted in intrinsic motivations. It is derived from a recently proposed form of intrinsically motivated learning (IML) for efficient coding in active perception, wherein an agent learns to perform actions with its sense organs to facilitate efficient encoding of the sensory data. To this end, actions of the sense organs that improve the encoding of the sensory data trigger an internally generated reinforcement signal. Here it is argued that the same IML mechanism might also support the development of imitation when general actions beyond those of the sense organs are considered: The learner first observes a tutor performing a behavior and learns a model of the behavior's sensory consequences. The learner then acts itself and receives an internally generated reinforcement signal reflecting how well the sensory consequences of its own behavior are encoded by the sensory model. Actions that are more similar to those of the tutor will lead to sensory signals that are easier to encode and produce a higher reinforcement signal. Through this, the learner's behavior is progressively tuned to make the sensory consequences of its actions match the learned sensory model. I discuss this mechanism in the context of human language acquisition and bird song learning where similar ideas have been proposed. The suggested mechanism also offers an account for the development of mirror neurons and makes a number of predictions. Overall, it establishes a connection between principles of efficient coding, intrinsic motivations and imitation. PMID:24204350
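A toy numerical illustration of the intrinsic reward idea, with invented actions and observations: the learner keeps a simple sensory model per action, and the internal reinforcement signal is the negative encoding error of the observation that the chosen action produces. This is not the model used in the paper.

# Toy sketch: actions whose sensory consequences are easy to encode win out.
import random

actions = ["a", "b"]
def observe(action):                        # hypothetical sensory consequences
    return 1.0 if action == "a" else random.uniform(-3, 3)

model = {a: 0.0 for a in actions}           # learned sensory model per action
value = {a: 0.0 for a in actions}           # action preferences
for _ in range(500):
    if random.random() < 0.1:               # epsilon-greedy action choice
        act = random.choice(actions)
    else:
        act = max(actions, key=lambda a: value[a])
    obs = observe(act)
    reward = -(obs - model[act]) ** 2       # high when the model encodes obs well
    model[act] += 0.1 * (obs - model[act])  # improve the sensory model
    value[act] += 0.1 * (reward - value[act])
print(value)   # action "a", whose consequences are predictable, is preferred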
Aviation Human-in-the-Loop Simulation Studies: Experimental Planning, Design, and Data Management
2014-01-01
Aviation Human-in-the-Loop Simulation Studies: Experimental Planning, Design, and Data Management. Kevin W. Williams, Bonny Christopher, Gena... January 2014. The report describes the process by which the authors designed their human-in-the-loop (HITL) simulation study and the methodology used to collect and analyze the results.
Connaughton, Veronica M; Amiruddin, Azhani; Clunies-Ross, Karen L; French, Noel; Fox, Allison M
2017-05-01
A major model of the cerebral circuits that underpin arithmetic calculation is the triple-code model of numerical processing. This model proposes that the lateralization of mathematical operations is organized across three circuits: a left-hemispheric dominant verbal code; a bilateral magnitude representation of numbers and a bilateral Arabic number code. This study simultaneously measured the blood flow of both middle cerebral arteries using functional transcranial Doppler ultrasonography to assess hemispheric specialization during the performance of both language and arithmetic tasks. The propositions of the triple-code model were assessed in a non-clinical adult group by measuring cerebral blood flow during the performance of multiplication and subtraction problems. Participants were 17 adults aged between 18-27 years. We obtained laterality indices for each type of mathematical operation and compared these in participants with left-hemispheric language dominance. It was hypothesized that blood flow would lateralize to the left hemisphere during the performance of multiplication operations, but would not lateralize during the performance of subtraction operations. Hemispheric blood flow was significantly left lateralized during the multiplication task, but was not lateralized during the subtraction task. Compared to high spatial resolution neuroimaging techniques previously used to measure cerebral lateralization, functional transcranial Doppler ultrasonography is a cost-effective measure that provides a superior temporal representation of arithmetic cognition. These results provide support for the triple-code model of arithmetic processing and offer complementary evidence that multiplication operations are processed differently in the adult brain compared to subtraction operations. Copyright © 2017 Elsevier B.V. All rights reserved.
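One common way to express an fTCD laterality index is as the difference between the task-related percentage changes in blood-flow velocity of the left and right middle cerebral arteries; the sketch below uses that convention with invented velocities and is not the study's exact procedure.

# Laterality index (LI) sketch: LI > 0 suggests left-hemisphere lateralization.
def percent_change(task, rest):
    return 100.0 * (task - rest) / rest

def laterality_index(left_task, left_rest, right_task, right_rest):
    dL = percent_change(left_task, left_rest)
    dR = percent_change(right_task, right_rest)
    return dL - dR          # one common convention: difference of relative changes

# Hypothetical mean velocities (cm/s) during a multiplication task vs. rest.
print(laterality_index(left_task=62.0, left_rest=58.0,
                       right_task=60.5, right_rest=59.0))   # positive value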
Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana
2015-01-01
Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology’s Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common in professional organizations in social sciences (82%), mental health (71%), sciences (61%), other organizations had no statements (construction trades, fraternal social organizations, real estate) or a few of them (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research significantly more often had statements on research integrity/ethics terminology than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4–23.5%) on any of the 27 research integrity/ethics terms compared to 3.3% (95% CI = 2.1–4.6%), respectively (P<0.001). Overall, 62% of all statements addressing research integrity/ethics concepts used prescriptive language in describing the standard of practice. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet contemporary needs of their communities. PMID:26192805
Developing and Implementing the Data Mining Algorithms in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea
The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. To post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e. recognizing patterns in the data. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.
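A generic illustration of the post-processing idea, not RAVEN's actual API: cluster a large set of sampled scenario outcomes (synthetic features here) to group scenarios with similar behavior.

# Sketch: cluster scenario outcomes with scikit-learn.  The scenario features
# (e.g. peak temperature and time of peak) are synthetic placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
scenarios = np.vstack([
    rng.normal([900.0, 120.0], [20.0, 10.0], size=(200, 2)),   # regime 1
    rng.normal([1200.0, 60.0], [30.0, 5.0], size=(200, 2)),    # regime 2
])

X = StandardScaler().fit_transform(scenarios)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for k in range(2):
    print(f"cluster {k}: {np.sum(labels == k)} scenarios, "
          f"mean features {scenarios[labels == k].mean(axis=0).round(1)}")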
Multiphysics Code Demonstrated for Propulsion Applications
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Melis, Matthew E.
1998-01-01
The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.
Di Giulio, Massimo
2017-11-07
The coevolution theory of the origin of the genetic code suggests that the organization of the genetic code coevolved with the biosynthetic relationships between amino acids. The mechanism that allowed this coevolution was based on tRNA-like molecules on which, according to this theory, the biosynthetic transformations between amino acids occurred. This mechanism makes a prediction about the role the aminoacyl-tRNA synthetases (ARSs) should have played in the origin of the genetic code. Indeed, if the biosynthetic transformations between amino acids occurred on tRNA-like molecules, then there was no need to link amino acids to these molecules because amino acids were already charged on tRNA-like molecules, as the coevolution theory suggests. In spite of the fact that ARSs make the genetic code responsible for the first interaction between a component of nucleic acids and one of proteins, for the coevolution theory the role of ARSs should have been entirely marginal in the origin of the genetic code. Therefore, I have conducted a further analysis of the distribution of the two classes of ARSs and of their subclasses in the genetic code table in order to perform a falsification test of the coevolution theory. Indeed, if the distribution of ARSs within the genetic code were highly significant, then the coevolution theory would be falsified, since the mechanism on which it is based does not predict a fundamental role of ARSs in the origin of the genetic code. I found that the statistical significance of the distribution of the two classes of ARSs in the table of the genetic code is low or marginal, whereas that of the subclasses of ARSs is statistically significant. However, this is in perfect agreement with the postulates of the coevolution theory. Indeed, the only case of statistical significance regarding the classes of ARSs is appreciable for the CAG code, whereas for its complement, the UNN/NUN code, only a marginal significance is measurable. These two codes code roughly for the two ARS classes, in particular the CAG code for class II and the UNN/NUN code for class I. Furthermore, the subclasses of ARSs show a statistically significant distribution in the genetic code table. Nevertheless, the most sensible explanation for these observations would be the following. The observation that would link the two classes of ARSs to the CAG and UNN/NUN codes, and the statistical significance of the distribution of the subclasses of ARSs in the genetic code table, would be only a secondary effect due to the highly significant distribution of the polarity of amino acids and their biosynthetic relationships in the genetic code. That is to say, the polarity of amino acids and their biosynthetic relationships would have conditioned the evolution of ARSs so that their presence in the genetic code would have been detectable, even if the ARSs had not, on their own, directly influenced the evolutionary organization of the genetic code. In other words, the role that ARSs had in the origin of the genetic code would have been entirely marginal. This conclusion would be in perfect accord with the predictions of the coevolution theory. Conversely, this conclusion would be in contrast, at least partially, with the physicochemical theories of the origin of the genetic code because they would foresee a much more active role of ARSs in the origin of the organization of the genetic code. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, Myra L.; LaFleur, Chris Bensdotter; Muna, Alice Baca
Safety standards development for maintenance facilities of liquid and compressed natural gas fueled vehicles is required to ensure proper facility design and operating procedures. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase II work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis into their effectiveness. A hazard and operability (HAZOP) study was performed to identify key scenarios of interest using risk ranking. Detailed simulations and modeling were performed to estimate the location and behavior of natural gas releases based on these scenarios. Specific code conflicts were identified, and ineffective code requirements were highlighted and resolutions proposed. These include ventilation rate basis on area or volume, as well as a ceiling offset which seems ineffective at protecting against flammable gas concentrations. Acknowledgements: The authors gratefully acknowledge Bill Houf (SNL -- Retired) for his assistance with the set-up and post-processing of the numerical simulations. The authors also acknowledge Doug Horne (retired) for his helpful discussions. We would also like to acknowledge the support from the Clean Cities program of DOE's Vehicle Technology Office.
Lindholm, Henrik; Egels-Zandén, Niklas; Rudén, Christina
2016-10-01
In managing chemical risks to the environment and human health in supply chains, voluntary corporate social responsibility (CSR) measures, such as auditing code of conduct compliance, play an important role. The objective of this study was to examine how well suppliers' chemical health and safety performance complies with buyers' CSR policies and whether audited factories improve their performance. CSR audits (n = 288) of garment factories conducted by Fair Wear Foundation (FWF), an independent non-profit organization, were analyzed using descriptive statistics and statistical modeling. Forty-three per cent of factories did not comply with the FWF code of conduct, i.e. received remarks on chemical safety. Only among factories audited 10 or more times was there a significant increase in the number of factories receiving no remarks. Compliance with chemical safety requirements in garment supply chains is low and auditing is statistically correlated with improvements only at factories that have undergone numerous audits.
2016-01-01
Background In managing chemical risks to the environment and human health in supply chains, voluntary corporate social responsibility (CSR) measures, such as auditing code of conduct compliance, play an important role. Objectives To examine how well suppliers’ chemical health and safety performance complies with buyers’ CSR policies and whether audited factories improve their performance. Methods CSR audits (n = 288) of garment factories conducted by Fair Wear Foundation (FWF), an independent non-profit organization, were analyzed using descriptive statistics and statistical modeling. Results Forty-three per cent of factories did not comply with the FWF code of conduct, i.e. received remarks on chemical safety. Only among factories audited 10 or more times was there a significant increase in the number of factories receiving no remarks. Conclusions Compliance with chemical safety requirements in garment supply chains is low and auditing is statistically correlated with improvements only at factories that have undergone numerous audits. PMID:27611103
A generic framework for individual-based modelling and physical-biological interaction
2018-01-01
The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with a realistic 3D oceanographic model of physics and biogeochemistry describing the environment of the organisms, without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions, comparison of physical circulation models, model ensemble runs and, recently, posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progress in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
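A conceptual sketch only (IBMlib itself is a Fortran framework): particles are advected through a prescribed velocity field while a trivial individual-level biology rule updates each organism, illustrating the physics-biology coupling that the framework formalizes. The flow field, growth rule, and units are invented.

# Minimal Lagrangian individual-based sketch: advection plus a toy growth rule.
import numpy as np

def velocity(x, y):
    """Hypothetical steady circular flow (m/s)."""
    return -0.1 * y, 0.1 * x

def grow(length_mm, temp_C, dt_s):
    """Toy temperature-dependent growth rule for the individuals."""
    return length_mm + 1e-6 * temp_C * dt_s

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 100)          # particle positions (km)
y = rng.uniform(-1.0, 1.0, 100)
length = np.full(100, 5.0)               # larval length in mm

dt = 600.0                               # 10-minute time step
for _ in range(144):                     # one simulated day
    u, v = velocity(x, y)
    x, y = x + u * dt / 1000.0, y + v * dt / 1000.0   # Euler step, m converted to km
    length = grow(length, temp_C=12.0, dt_s=dt)

print(x[:3].round(3), y[:3].round(3), length[:3].round(2))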
1978-03-01
...in product assurance applications and for locating adhesive bond fractures. Carolyn A. L. Westerdahl and J. Richard Hall; March 1978; US Army Armament...; AMCMS Code 6121.05.I1H8.4.
Proficiency Verification Systems (PVS): Skills Indices for Language Arts. Technical Note.
ERIC Educational Resources Information Center
Humes, Ann
The procedures undertaken in developing and organizing skills indexes for use in coding elementary school language arts textbooks to determine what is actually taught are presented in this paper. The outlined procedures included performing a preliminary analysis on four language arts textbooks to compile an extensive list of skills and performance…
Development and Validation of Rapid In Situ Assays of Environmental Mutagenesis
1990-10-31
…has also been suggested (12). Previous work has indicated that wild rodents can be effectively used as in situ genetic biomonitors. McBee et al. (13)… Performing organization: Oklahoma State University; sponsor address: Building 410, Bolling AFB, DC 20332-6448.
Characterization of Explosives Processing Waste Decomposition Due to Composting. Phase 1
1990-01-31
Contributors include Caldwell, G. S. Fleming, R. M. Edwards, and E. T. Maestas of the Analytical Chemistry Division; L. A. Kszos, L. F. Wicker, P. W. Braden, R. D. Bailey… Distribution unlimited. Performing organization report number: ORNL/TM-11573. Sponsor: U.S. Army Medical Research and Development Command, Project Order No. 89PP9921.
Shared service alternatives offer flexibility and tax benefits.
Danehy, L J; Scutt, R C; Stonehill, E
1985-05-01
Because the performance of shared services and tax-exempt status under Section 501(c)(3) of the Internal Revenue Code can be incompatible, hospitals planning to provide services to each other or to other organizations on a fee-for-service basis may wish to do so through a separate corporate entity. By using either a Section 501(e) shared service organization, a Subchapter T cooperative, or a taxable business corporation, a compromise can be reached between operational flexibility and tax benefits.
Quantification of Noise Sources in EMI Surveys
2012-04-09
Naval Research Laboratory, Washington, DC 20375-5320. Report NRL/MR/6110--12-9400, "Quantification of Noise Sources in EMI Surveys," ESTCP MR-0508 Final Guidance… Authors include …Barrow, Jonathan T. Miller, and Thomas H. Bell, Naval Research Laboratory, Code 6110, 4555 Overlook Avenue SW, Washington, DC 20375-5320.
Biosemiotics: a new understanding of life.
Barbieri, Marcello
2008-07-01
Biosemiotics is the idea that life is based on semiosis, i.e., on signs and codes. This idea has been strongly suggested by the discovery of the genetic code, but so far it has made little impact in the scientific world and is largely regarded as a philosophy rather than a science. The main reason for this is that modern biology assumes that signs and meanings do not exist at the molecular level, and that the genetic code was not followed by any other organic code for almost four billion years, which implies that it was an utterly isolated exception in the history of life. These ideas have effectively ruled out the existence of semiosis in the organic world, and yet there are experimental facts against all of them. If we look at the evidence of life without the preconditions of the present paradigm, we discover that semiosis is there, in every single cell, and that it has been there since the very beginning. This is what biosemiotics is really about. It is not a philosophy. It is a new scientific paradigm that is rigorously based on experimental facts. Biosemiotics claims that the genetic code (1) is a real code and (2) has been the first of a long series of organic codes that have shaped the history of life on our planet. The reality of the genetic code and the existence of other organic codes imply that life is based on two fundamental processes--copying and coding--and this in turn implies that evolution took place by two distinct mechanisms, i.e., by natural selection (based on copying) and by natural conventions (based on coding). It also implies that the copying of genes works on individual molecules, whereas the coding of proteins operates on collections of molecules, which means that different mechanisms of evolution exist at different levels of organization. This review intends to underline the scientific nature of biosemiotics, and to this purpose, it aims to prove (1) that the cell is a real semiotic system, (2) that the genetic code is a real code, (3) that evolution took place by natural selection and by natural conventions, and (4) that it was natural conventions, i.e., organic codes, that gave origin to the great novelties of macroevolution. Biological semiosis, in other words, is a scientific reality because the codes of life are experimental realities. The time has come, therefore, to acknowledge this fact of life, even if that means abandoning the present theoretical framework in favor of a more general one where biology and semiotics finally come together and become biosemiotics.
A new theory of development: the generation of complexity in ontogenesis.
Barbieri, Marcello
2016-03-13
Today there is a very wide consensus on the idea that embryonic development is the result of a genetic programme and of epigenetic processes. Many models have been proposed in this theoretical framework to account for the various aspects of development, and virtually all of them have one thing in common: they do not acknowledge the presence of organic codes (codes between organic molecules) in ontogenesis. Here it is argued instead that embryonic development is a convergent increase in complexity that necessarily requires organic codes and organic memories, and a few examples of such codes are described. This is the code theory of development, a theory that was originally inspired by an algorithm that is capable of reconstructing structures from incomplete information, an algorithm that here is briefly summarized because it makes it intuitively appealing how a convergent increase in complexity can be achieved. The main thesis of the new theory is that the presence of organic codes in ontogenesis is not only a theoretical necessity but, first and foremost, an idea that can be tested and that has already been found to be in agreement with the evidence. © 2016 The Author(s).
NASA Astrophysics Data System (ADS)
Lee, Choonik; Jung, Jae Won; Pelletier, Christopher; Pyakuryal, Anil; Lamart, Stephanie; Kim, Jong Oh; Lee, Choonsik
2015-03-01
Organ dose estimation for retrospective epidemiological studies of late effects in radiotherapy patients involves two challenges: radiological images to represent patient anatomy are not usually available for patient cohorts who were treated years ago, and efficient dose reconstruction methods for large-scale patient cohorts are not well established. In the current study, we developed methods to reconstruct organ doses for radiotherapy patients by using a series of computational human phantoms coupled with a commercial treatment planning system (TPS) and a radiotherapy-dedicated Monte Carlo transport code, and performed illustrative dose calculations. First, we developed methods to convert the anatomy and organ contours of the pediatric and adult hybrid computational phantom series to Digital Imaging and Communications in Medicine (DICOM)-image and DICOM-structure files, respectively. The resulting DICOM files were imported to a commercial TPS for simulating radiotherapy and dose calculation for in-field organs. The conversion process was validated by comparing electron densities relative to water and organ volumes between the hybrid phantoms and the DICOM files imported in TPS, which showed agreements within 0.1 and 2%, respectively. Second, we developed a procedure to transfer DICOM-RT files generated from the TPS directly to a Monte Carlo transport code, x-ray Voxel Monte Carlo (XVMC) for more accurate dose calculations. Third, to illustrate the performance of the established methods, we simulated a whole brain treatment for the 10 year-old male phantom and a prostate treatment for the adult male phantom. Radiation doses to selected organs were calculated using the TPS and XVMC, and compared to each other. Organ average doses from the two methods matched within 7%, whereas maximum and minimum point doses differed up to 45%. The dosimetry methods and procedures established in this study will be useful for the reconstruction of organ dose to support retrospective epidemiological studies of late effects in radiotherapy patients.
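The reported comparison between treatment planning system (TPS) and Monte Carlo organ doses reduces to per-organ relative differences. A minimal sketch of that bookkeeping, using made-up dose values purely for illustration (not the study's results):

```python
# Hypothetical organ-average doses (Gy) from a TPS and a Monte Carlo run.
tps_dose = {"brain": 18.2, "thyroid": 0.45, "lens": 2.1}
mc_dose  = {"brain": 17.5, "thyroid": 0.48, "lens": 2.2}

for organ in tps_dose:
    rel_diff = 100.0 * (tps_dose[organ] - mc_dose[organ]) / mc_dose[organ]
    print(f"{organ:8s}  TPS {tps_dose[organ]:6.2f}  MC {mc_dose[organ]:6.2f}  diff {rel_diff:+5.1f}%")
```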
Biosemiotics: a new understanding of life
NASA Astrophysics Data System (ADS)
Barbieri, Marcello
2008-07-01
Biosemiotics is the idea that life is based on semiosis, i.e., on signs and codes. This idea has been strongly suggested by the discovery of the genetic code, but so far it has made little impact in the scientific world and is largely regarded as a philosophy rather than a science. The main reason for this is that modern biology assumes that signs and meanings do not exist at the molecular level, and that the genetic code was not followed by any other organic code for almost four billion years, which implies that it was an utterly isolated exception in the history of life. These ideas have effectively ruled out the existence of semiosis in the organic world, and yet there are experimental facts against all of them. If we look at the evidence of life without the preconditions of the present paradigm, we discover that semiosis is there, in every single cell, and that it has been there since the very beginning. This is what biosemiotics is really about. It is not a philosophy. It is a new scientific paradigm that is rigorously based on experimental facts. Biosemiotics claims that the genetic code (1) is a real code and (2) has been the first of a long series of organic codes that have shaped the history of life on our planet. The reality of the genetic code and the existence of other organic codes imply that life is based on two fundamental processes—copying and coding—and this in turn implies that evolution took place by two distinct mechanisms, i.e., by natural selection (based on copying) and by natural conventions (based on coding). It also implies that the copying of genes works on individual molecules, whereas the coding of proteins operates on collections of molecules, which means that different mechanisms of evolution exist at different levels of organization. This review intends to underline the scientific nature of biosemiotics, and to this purpose, it aims to prove (1) that the cell is a real semiotic system, (2) that the genetic code is a real code, (3) that evolution took place by natural selection and by natural conventions, and (4) that it was natural conventions, i.e., organic codes, that gave origin to the great novelties of macroevolution. Biological semiosis, in other words, is a scientific reality because the codes of life are experimental realities. The time has come, therefore, to acknowledge this fact of life, even if that means abandoning the present theoretical framework in favor of a more general one where biology and semiotics finally come together and become biosemiotics.
Validation of Living Donor Nephrectomy Codes
Lam, Ngan N.; Lentine, Krista L.; Klarenbach, Scott; Sood, Manish M.; Kuwornu, Paul J.; Naylor, Kyla L.; Knoll, Gregory A.; Kim, S. Joseph; Young, Ann; Garg, Amit X.
2018-01-01
Background: Use of administrative data for outcomes assessment in living kidney donors is increasing given the rarity of complications and challenges with loss to follow-up. Objective: To assess the validity of living donor nephrectomy in health care administrative databases compared with the reference standard of manual chart review. Design: Retrospective cohort study. Setting: 5 major transplant centers in Ontario, Canada. Patients: Living kidney donors between 2003 and 2010. Measurements: Sensitivity and positive predictive value (PPV). Methods: Using administrative databases, we conducted a retrospective study to determine the validity of diagnostic and procedural codes for living donor nephrectomies. The reference standard was living donor nephrectomies identified through the province’s tissue and organ procurement agency, with verification by manual chart review. Operating characteristics (sensitivity and PPV) of various algorithms using diagnostic, procedural, and physician billing codes were calculated. Results: During the study period, there were a total of 1199 living donor nephrectomies. Overall, the best algorithm for identifying living kidney donors was the presence of 1 diagnostic code for kidney donor (ICD-10 Z52.4) and 1 procedural code for kidney procurement/excision (1PC58, 1PC89, 1PC91). Compared with the reference standard, this algorithm had a sensitivity of 97% and a PPV of 90%. The diagnostic and procedural codes performed better than the physician billing codes (sensitivity 60%, PPV 78%). Limitations: The donor chart review and validation study was performed in Ontario and may not be generalizable to other regions. Conclusions: An algorithm consisting of 1 diagnostic and 1 procedural code can be reliably used to conduct health services research that requires the accurate determination of living kidney donors at the population level. PMID:29662679
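Validating an administrative-data algorithm against a chart-review reference standard comes down to counting true and false positives. The sketch below is a generic illustration (not the study's code), assuming simple sets of hypothetical patient identifiers.

```python
def sensitivity_ppv(algorithm_hits, reference_cases):
    """Compare patients flagged by a coding algorithm with a chart-review reference standard."""
    hits, ref = set(algorithm_hits), set(reference_cases)
    tp = len(hits & ref)                  # flagged and truly donors
    fp = len(hits - ref)                  # flagged but not donors
    fn = len(ref - hits)                  # donors missed by the algorithm
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    return sensitivity, ppv

# Hypothetical identifiers for illustration only.
algorithm = {"p01", "p02", "p03", "p05"}
reference = {"p01", "p02", "p03", "p04"}
print(sensitivity_ppv(algorithm, reference))   # (0.75, 0.75)
```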
Interfacing modules for integrating discipline specific structural mechanics codes
NASA Technical Reports Server (NTRS)
Endres, Ned M.
1989-01-01
An outline of the organization and capabilities of the Engine Structures Computational Simulator (Simulator) at NASA Lewis Research Center is given. One of the goals of the research at Lewis is to integrate various discipline specific structural mechanics codes into a software system which can be brought to bear effectively on a wide range of engineering problems. This system must possess the qualities of being effective and efficient while still remaining user friendly. The simulator was initially designed for the finite element simulation of gas jet engine components. Currently, the simulator has been restricted to only the analysis of high pressure turbine blades and the accompanying rotor assembly, although the current installation can be expanded for other applications. The simulator presently assists the user throughout its procedures by performing information management tasks, executing external support tasks, organizing analysis modules and executing these modules in the user defined order while maintaining processing continuity.
1989-07-01
DRMS Total Quality Management (TQM) Implementation Plan. Subject terms: TQM (Total Quality Management), Continuous Process Improvement. DRMS TOTAL QUALITY MANAGEMENT (TQM) IMPLEMENTATION PLAN. PURPOSE: The…
Teaching Speech Organization and Outlining Using a Color-Coded Approach.
ERIC Educational Resources Information Center
Hearn, Ralene
The organization/outlining unit in the basic Public Speaking course can be made more interesting by using a color-coded instructional method that captivates students, facilitates understanding, and provides the opportunity for interesting reinforcement activities. The two part lesson includes a mini-lecture with a color-coded outline and a two…
Ho, Christabel Man-Fong; Oladinrin, Olugbenga Timo
2018-01-30
Due to economic globalization, which has been marked by business scandals, scholars and practitioners are increasingly engaged with the implementation of codes of ethics as a regulatory mechanism for stimulating ethical behaviour within an organization. The aim of this study is to examine various organizational practices regarding the effective implementation of codes of ethics within construction contracting companies. Views on ethics management in construction organizations, together with recommendations for improvement, were gleaned through 19 semi-structured interviews involving construction practitioners from various construction companies in Hong Kong. The findings suggest several practices for the effective implementation of codes of ethics that help diffuse ethical behaviour in an organizational setting, including the introduction of effective reward schemes, ethics training for employees, and leadership responsiveness to reported wrongdoing. Since most construction companies in Hong Kong have codes of ethics, the emphasis is on the practical implementation of codes within these organizations. Implications are drawn from the recommended measures to guide construction companies and policy makers.
Jovanovic, Z; Krstic, D; Nikezic, D; Ros, J M Gomez; Ferrari, P
2018-03-01
Monte Carlo simulations were performed to evaluate treatment doses from the widely used radionuclides 133Xe, 99mTc and 81mKr. These radionuclides are used in perfusion and ventilation examinations in nuclear medicine and as indicators of cardiovascular and pulmonary diseases. The objective of this work was to estimate the specific absorbed fractions in surrounding organs and tissues when these radionuclides are incorporated in the lungs. For this purpose, a voxel thorax model was developed and compared with the ORNL phantom. All calculations and simulations were performed with the MCNP5/X code.
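The specific absorbed fraction (SAF) referred to above is the fraction of the emitted energy absorbed in a target organ divided by the target mass. A small worked sketch under assumed numbers (not values from the study):

```python
def specific_absorbed_fraction(energy_emitted_mev, energy_absorbed_mev, target_mass_kg):
    """SAF = absorbed fraction / target mass, in 1/kg."""
    absorbed_fraction = energy_absorbed_mev / energy_emitted_mev
    return absorbed_fraction / target_mass_kg

# Hypothetical tally: source in the lungs, target organ mass 0.33 kg.
print(specific_absorbed_fraction(energy_emitted_mev=1.0e6,
                                 energy_absorbed_mev=2.5e3,
                                 target_mass_kg=0.33))   # ~7.6e-3 kg^-1
```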
General Unknown Screening by Ion Trap LC/MS/MS
2010-04-01
Report date: April 2010. "General Unknown Screening by Ion Trap LC/MS/MS." Table 1 lists analytical data for each of the 359 compounds in the LC/MS/MS library. Introduction: The Federal Aviation…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-27
... mines to upgraded copper products--is highly dependent on global trade. According to CPM Group, a... roughly 55% of global output in 2011, while roughly 46% production is performed in Asia. BILLING CODE 8011... global industrial activity, given copper's prominence in major economic sectors such as construction...
Reed Solomon codes for error control in byte organized computer memory systems
NASA Technical Reports Server (NTRS)
Lin, S.; Costello, D. J., Jr.
1984-01-01
A problem in designing semiconductor memories is to provide some measure of error control without requiring excessive coding overhead or decoding time. In LSI and VLSI technology, memories are often organized on a multiple bit (or byte) per chip basis. For example, some 256K-bit DRAM's are organized in 32Kx8 bit-bytes. Byte oriented codes such as Reed Solomon (RS) codes can provide efficient low overhead error control for such memories. However, the standard iterative algorithm for decoding RS codes is too slow for these applications. Some special decoding techniques for extended single-and-double-error-correcting RS codes which are capable of high speed operation are presented. These techniques are designed to find the error locations and the error values directly from the syndrome without having to use the iterative algorithm to find the error locator polynomial.
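The idea of finding the error location and value "directly from the syndrome" can be shown compactly for a single-error-correcting RS code: with two syndromes S1 = r(a) and S2 = r(a^2), a single byte error of value e at position i gives S1 = e*a^i and S2 = e*a^(2i), so the position is log_a(S2/S1) and the value is S1^2/S2, with no iterative error-locator step. The sketch below is a generic GF(2^8) illustration (primitive polynomial 0x11d, non-systematic encoding), not the decoder design described in the report.

```python
# GF(2^8) arithmetic via log/antilog tables (primitive polynomial 0x11d).
EXP, LOG = [0] * 512, [0] * 256
x = 1
for i in range(255):
    EXP[i], LOG[x] = x, i
    x <<= 1
    if x & 0x100:
        x ^= 0x11d
for i in range(255, 512):
    EXP[i] = EXP[i - 255]

def gf_mul(a, b):
    return 0 if 0 in (a, b) else EXP[LOG[a] + LOG[b]]

def gf_div(a, b):
    return 0 if a == 0 else EXP[(LOG[a] - LOG[b]) % 255]

def poly_eval(p, alpha_power):
    """Evaluate p at x = alpha**alpha_power; p[i] is the coefficient of x**i."""
    acc = 0
    for i, c in enumerate(p):
        if c:
            acc ^= gf_mul(c, EXP[(alpha_power * i) % 255])
    return acc

def poly_mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= gf_mul(ai, bj)
    return out

# Non-systematic encoding: codeword c(x) = m(x) * g(x) with g(x) = (x + a)(x + a^2),
# so c(a) = c(a^2) = 0 for every valid codeword.
g = poly_mul([EXP[1], 1], [EXP[2], 1])
message = [0x12, 0x34, 0x56, 0x78, 0x9a]
codeword = poly_mul(message, g)

codeword[3] ^= 0x5c                      # inject a single byte error at position 3

S1, S2 = poly_eval(codeword, 1), poly_eval(codeword, 2)
pos = LOG[gf_div(S2, S1)]                # error location, directly from the syndromes
val = gf_div(gf_mul(S1, S1), S2)         # error value
codeword[pos] ^= val                     # corrected
print(pos, hex(val))                     # 3 0x5c
```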
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, R.N.
This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: the report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.
Transformable Rhodobacter strains, method for producing transformable Rhodobacter strains
Laible, Philip D.; Hanson, Deborah K.
2018-05-08
The invention provides an organism for expressing foreign DNA, the organism engineered to accept standard DNA carriers. The genome of the organism codes for intracytoplasmic membranes and features an interruption in at least one of the genes coding for restriction enzymes. Further provided is a system for producing biological materials comprising: selecting a vehicle to carry DNA which codes for the biological materials; determining sites on the vehicle's DNA sequence susceptible to restriction enzyme cleavage; choosing an organism to accept the vehicle based on that organism not acting upon at least one of said vehicle's sites; engineering said vehicle to contain said DNA, thereby creating a synthetic vector; and causing the synthetic vector to enter the organism so as to cause expression of said DNA.
Evaluation of audit-based performance measures for dental care plans.
Bader, J D; Shugars, D A; White, B A; Rindal, D B
1999-01-01
Although a set of clinical performance measures, i.e., a report card for dental plans, has been designed for use with administrative data, most plans do not have administrative data systems containing the data needed to calculate the measures. Therefore, we evaluated the use of a set of proxy clinical performance measures calculated from data obtained through chart audits. Chart audits were conducted in seven dental programs--three public health clinics, two dental health maintenance organizations (DHMO), and two preferred provider organizations (PPO). In all instances audits were completed by clinical staff who had been trained using telephone consultation and a self-instructional audit manual. The performance measures were calculated for the seven programs, audit reliability was assessed in four programs, and for one program the audit-based proxy measures were compared to the measures calculated using administrative data. The audit-based measures were sensitive to known differences in program performance. The chart audit procedures yielded reasonably reliable data. However, missing data in patient charts rendered the calculation of some measures problematic--namely, caries and periodontal disease assessment and experience. Agreement between administrative and audit-based measures was good for most, but not all, measures in one program. The audit-based proxy measures represent a complex but feasible approach to the calculation of performance measures for those programs lacking robust administrative data systems. However, until charts contain more complete diagnostic information (i.e., periodontal charting and diagnostic codes or reason-for-treatment codes), accurate determination of these aspects of clinical performance will be difficult.
Campaigning for Organ Donation at Mosques.
Rady, Mohamed Y; Verheijde, Joseph L
2016-09-01
There is a trend of recruiting faith leaders at mosques to overcome religious barriers to organ donation, and to increase donor registration among Muslims. Commentators have suggested that Muslims are not given enough information about organ donation in religious sermons or lectures delivered at mosques. Corrective actions have been recommended, such as funding campaigns to promote organ donation, and increasing the availability of organ donation information at mosques. These actions are recommended despite published literature expressing safety concerns (i.e., do no harm) in living and end-of-life organ donation. Living donors require life-long medical follow-up and treatment for complications that can appear years later. Scientific and medical controversies persist regarding the international guidelines for death determination in end-of-life donation. The medical criteria of death lack validation and can harm donors if surgical procurement is performed without general anesthesia and before biological death. In the moral code of Islam, the prevention of harm holds precedence over beneficence. Moral precepts described in the Quran encourage Muslims to be beneficent, but also to seek knowledge prior to making practical decisions. However, the Quran also contains passages that demand honesty and truthfulness when providing information to those who are seeking knowledge. Currently, information is limited to that which encourages donor registration. Campaigning for organ donation to congregations in mosques should adhere to the moral code of complete, rather than selective, disclosure of information. We recommend as a minimal standard the disclosure of risks, uncertainties, and controversies associated with the organ donation process.
Yu, Alexander C; Cimino, James J
2011-04-01
Objective: Most existing controlled terminologies can be characterized as collections of terms, wherein the terms are arranged in a simple list or organized in a hierarchy. These kinds of terminologies are considered useful for standardizing terms and encoding data and are currently used in many existing information systems. However, they suffer from a number of limitations that make data reuse difficult. Relatively recently, it has been proposed that formal ontological methods can be applied to some of the problems of terminological design. Biomedical ontologies organize concepts (embodiments of knowledge about biomedical reality) whereas terminologies organize terms (what is used to code patient data at a certain point in time, based on the particular terminology version). However, the application of these methods to existing terminologies is not straightforward. The use of these terminologies is firmly entrenched in many systems, and what might seem to be a simple option of replacing these terminologies is not possible. Moreover, these terminologies evolve over time in order to suit the needs of users. Any methodology must therefore take these constraints into consideration, hence the need for formal methods of managing changes. Along these lines, we have developed a formal representation of the concept-term relation, around which we have also developed a methodology for management of terminology changes. The objective of this study was to determine whether our methodology would result in improved retrieval of data. Design: Comparison of two methods for retrieving data encoded with terms from the International Classification of Diseases (ICD-9-CM), based on their recall when retrieving data for ICD-9-CM terms whose codes had changed but which had retained their original meaning (code change). Measurements: Recall and interclass correlation coefficient. Results: Statistically significant differences were detected (p<0.05) with the McNemar test for two terms whose codes had changed. Furthermore, when all the cases are combined in an overall category, our method also performs statistically significantly better (p<0.05). Conclusion: Our study shows that an ontology-based ICD-9-CM data retrieval method that takes into account the effects of terminology changes performs better on recall than one that does not in the retrieval of data for terms whose codes had changed but which retained their original meaning. Copyright © 2011 Elsevier Inc. All rights reserved.
Yu, Alexander C.; Cimino, James J.
2012-01-01
Objective Most existing controlled terminologies can be characterized as collections of terms, wherein the terms are arranged in a simple list or organized in a hierarchy. These kinds of terminologies are considered useful for standardizing terms and encoding data and are currently used in many existing information systems. However, they suffer from a number of limitations that make data reuse difficult. Relatively recently, it has been proposed that formal ontological methods can be applied to some of the problems of terminological design. Biomedical ontologies organize concepts (embodiments of knowledge about biomedical reality) whereas terminologies organize terms (what is used to code patient data at a certain point in time, based on the particular terminology version). However, the application of these methods to existing terminologies is not straightforward. The use of these terminologies is firmly entrenched in many systems, and what might seem to be a simple option of replacing these terminologies is not possible. Moreover, these terminologies evolve over time in order to suit the needs of users. Any methodology must therefore take these constraints into consideration, hence the need for formal methods of managing changes. Along these lines, we have developed a formal representation of the concept-term relation, around which we have also developed a methodology for management of terminology changes. The objective of this study was to determine whether our methodology would result in improved retrieval of data. Design Comparison of two methods for retrieving data encoded with terms from the International Classification of Diseases (ICD-9-CM), based on their recall when retrieving data for ICD-9-CM terms whose codes had changed but which had retained their original meaning (code change). Measurements Recall and interclass correlation coefficient. Results Statistically significant differences were detected (p<0.05) with the McNemar test for two terms whose codes had changed. Furthermore, when all the cases are combined in an overall category, our method also performs statistically significantly better (p < 0.05). Conclusion Our study shows that an ontology-based ICD-9-CM data retrieval method that takes into account the effects of terminology changes performs better on recall than one that does not in the retrieval of data for terms whose codes had changed but which retained their original meaning. PMID:21262390
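The evaluation described above (recall of retrieved records plus McNemar's test on paired retrieval outcomes) can be sketched generically. The example below assumes hypothetical record sets and uses the continuity-corrected chi-square form of McNemar's test; it is not the study's code.

```python
from scipy.stats import chi2

def recall(retrieved, relevant):
    relevant = set(relevant)
    return len(set(retrieved) & relevant) / len(relevant)

def mcnemar(b, c):
    """Continuity-corrected McNemar chi-square for discordant pair counts b and c."""
    stat = (abs(b - c) - 1) ** 2 / (b + c)
    return stat, chi2.sf(stat, df=1)

# Hypothetical retrieval results for one ICD-9-CM term whose code changed.
relevant_records   = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}
method_a_retrieved = {1, 2, 3, 4, 5, 6}           # ignores the code change
method_b_retrieved = {1, 2, 3, 4, 5, 6, 7, 8, 9}  # change-aware, ontology-based

print(recall(method_a_retrieved, relevant_records))   # 0.6
print(recall(method_b_retrieved, relevant_records))   # 0.9
# Discordant pairs: b = retrieved only by method A, c = retrieved only by method B.
b = len(method_a_retrieved - method_b_retrieved)
c = len(method_b_retrieved - method_a_retrieved)
print(mcnemar(b, c) if (b + c) else "no discordant pairs")
```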
Blackwell, C.D.
1988-01-01
Codes for the unique identification of public and private organizations listed in computerized data systems are presented. These codes are used by the U.S. Geological Survey's National Water Data Exchange (NAWDEX), National Water Data Storage and Retrieval System (WATSTORE), National Cartographic Information Center (NCIC), and Office of Water Data Coordination (OWDC). The format structure of the codes is discussed and instructions are given for requesting new books. (Author's abstract)
Benavidez, Teresa; Friedman, Beth
2003-07-01
To ease staffing burdens, Seton Healthcare Network established a home coding program. DNFB claims pending the health information management department's code assignment consistently decreased, reducing the organization's dollars holding by 25 percent. Decreases in contract and as-needed labor contributed to an operational cost savings of about $200,000 per year. The organization was able to fill all of its coding vacancies.
2007-08-01
Report No. DOT/FAA/AM-07/23. Report date: August 2007. Authors: Ray H. Liu, Chih-Hung Wu, Yi-Jun Chen, Chiung-Dan Chang, Jason G.… Part of this research was sponsored by …Investigation and the Office of Aerospace Medicine. Intensity of the internal standard response as the basis…
1988-07-01
Monitoring organization report number: …TR-90-0470. Performing organization: University of Rhode Island, Kingston, RI. Sponsoring address: Building 410, Bolling AFB, DC 20332-6448. Unclassified. Responsible individual: Dr. Anthony J. Matuszko (202…
1983-05-01
Approved for public release; distribution unlimited. …The coding scheme derives from the location of the Budget Line Items in Exhibit P-1 (Supporting Data for the President's Budget); the BLINs reflect the same… Budget line items listed include: MULTI-PURPOSE WHEELED VEH, 305046 COMMERCIAL UTILITY CARGO VEHICLE, 305017 SMALL UNIT SUPPORT VEHICLE (SUSV), 30504… TRUCK, 5T, 6X6…, 305019 TRUCK, 10T, 8X8…
NASA Astrophysics Data System (ADS)
Jacques, Diederik; Gérard, Fréderic; Mayer, Uli; Simunek, Jirka; Leterme, Bertrand
2016-04-01
A large number of organic matter degradation, CO2 transport and dissolved organic matter models have been developed during the last decades. However, organic matter degradation models are in many cases strictly hard-coded in terms of organic pools, degradation kinetics and dependency on environmental variables. The scientific input of the model user is typically limited to the adjustment of input parameters. In addition, the coupling with geochemical soil processes including aqueous speciation, pH-dependent sorption and colloid-facilitated transport are not incorporated in many of these models, strongly limiting the scope of their application. Furthermore, the most comprehensive organic matter degradation models are combined with simplified representations of flow and transport processes in the soil system. We illustrate the capability of generic reactive transport codes to overcome these shortcomings. The formulations of reactive transport codes include a physics-based continuum representation of flow and transport processes, while biogeochemical reactions can be described as equilibrium processes constrained by thermodynamic principles and/or kinetic reaction networks. The flexibility of these type of codes allows for straight-forward extension of reaction networks, permits the inclusion of new model components (e.g.: organic matter pools, rate equations, parameter dependency on environmental conditions) and in such a way facilitates an application-tailored implementation of organic matter degradation models and related processes. A numerical benchmark involving two reactive transport codes (HPx and MIN3P) demonstrates how the process-based simulation of transient variably saturated water flow (Richards equation), solute transport (advection-dispersion equation), heat transfer and diffusion in the gas phase can be combined with a flexible implementation of a soil organic matter degradation model. The benchmark includes the production of leachable organic matter and inorganic carbon in the aqueous and gaseous phases, as well as different decomposition functions with first-order, linear dependence or nonlinear dependence on a biomass pool. In addition, we show how processes such as local bioturbation (bio-diffusion) can be included implicitly through a Fickian formulation of transport of soil organic matter. Coupling soil organic matter models with generic and flexible reactive transport codes offers a valuable tool to enhance insights into coupled physico-chemical processes at different scales within the scope of C-biogeochemical cycles, possibly linked with other chemical elements such as plant nutrients and pollutants.
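The benchmark's decomposition kinetics (first-order decay, linear dependence, or nonlinear dependence on a biomass pool) reduce to a small system of ODEs. The sketch below is a generic forward-Euler toy with assumed rate constants and pool sizes; it is not the HPx or MIN3P implementation.

```python
# Toy two-pool organic matter model: a substrate pool lost through a first-order
# pathway and a biomass-mediated (nonlinear, Monod-type) pathway; CO2 accumulates
# the first-order loss plus the non-assimilated fraction of the microbial pathway.
dt, t_end = 0.1, 365.0                  # days
k_first_order = 0.002                   # 1/day
v_max, k_half, yield_coeff = 0.05, 5.0, 0.3

substrate, biomass, co2 = 100.0, 1.0, 0.0   # arbitrary carbon units
t = 0.0
while t < t_end:
    r1 = k_first_order * substrate                              # first-order decay
    r2 = v_max * biomass * substrate / (k_half + substrate)     # biomass-mediated decay
    substrate -= (r1 + r2) * dt
    biomass   += yield_coeff * r2 * dt - 0.01 * biomass * dt    # growth minus mortality
    co2       += (r1 + (1.0 - yield_coeff) * r2) * dt
    t += dt
print(round(substrate, 2), round(biomass, 2), round(co2, 2))
```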
Surveyor Management of Hospital Accreditation Program: A Thematic Analysis Conducted in Iran.
Teymourzadeh, Ehsan; Ramezani, Mozhdeh; Arab, Mohammad; Rahimi Foroushani, Abbas; Akbari Sari, Ali
2016-05-01
The surveyors in hospital accreditation program are considered as the core of accreditation programs. So, the reliability and validity of the accreditation program heavily depend on their performance. This study aimed to identify the dimensions and factors affecting surveyor management of hospital accreditation programs in Iran. This qualitative study used a thematic analysis method, and was performed in Iran in 2014. The study participants included experts in the field of hospital accreditation, and were derived from three groups: 1. Policy-makers, administrators, and surveyors of the accreditation bureau, the ministry of health and medical education, Iranian universities of medical science; 2. Healthcare service providers, and 3. University professors and faculty members. The data were collected using semi-structured in-depth interviews. Following text transcription and control of compliance with the original text, MAXQDA10 software was used to code, classify, and organize the interviews in six stages. The findings from the analysis of 21 interviews were first classified in the form of 1347 semantic units, 11 themes, 17 sub-themes, and 248 codes. These were further discussed by an expert panel, which then resulted in the emergence of seven main themes - selection and recruitment of the surveyor team, organization of the surveyor team, planning to perform surveys, surveyor motivation and retention, surveyor training, surveyor assessment, and recommendations - as well as 27 sub-themes, and 112 codes. The dimensions and variables affecting the surveyors' management were identified and classified on the basis of existing scientific methods in the form of a conceptual framework. Using the results of this study, it would certainly be possible to take a great step toward enhancing the reliability of surveys and the quality and safety of services, while effectively managing accreditation program surveyors.
1984-12-01
34MISCELLANEOUS" ACCOUNT CATEGORY WITHIN THE DOD INSTRUCTION 7220.29-H DEPOT LEVEL MAINTENANCE COST ACCOUNTING SYSTEM by a. Steven Eugene Lehr CDecember 1984...PERFORMING ONG. REPORT NUMBER Maintenance Cost Accounting System 7. AUTHOR(&) S. CONTRACT OR GRANT NUMBER(@) Steven Eugene Lehr 9. PERFORMING ORGANIZATION...Availability Codes IS. KEY WORDS (Continue on reverse *ids It necessary and Identify by block number) Dvi Special Uniform Cost Accounting System DoD
Workshop on Integrated Crew Resource Management (CRM), 19-21 November 1991
1992-03-01
Report No. DOT/FAA/RD-92/5 (AD-A252 980), Research and Development Service, Washington, DC 20591. Title: Workshop on Integrated Crew Resource Management (CRM). Report date: May 1992. Performing organization code: ARD-1.
Oladinrin, Olugbenga Timo; Ho, Christabel Man-Fong
2016-08-01
Several researchers have identified codes of ethics (CoEs) as tools that stimulate positive ethical behavior by shaping the organisational decision-making process, but few have considered the information needed for code implementation. Beyond being a legal and moral responsibility, ethical behavior needs to become an organisational priority, which requires an alignment process that integrates employee behavior with the organisation's ethical standards. This paper discusses processes for the responsible implementation of CoEs based on an extensive review of the literature. The internationally recognized European Foundation for Quality Management Excellence Model (EFQM model) is proposed as a suitable framework for assessing an organisation's ethical performance, including CoE embeddedness. The findings presented herein have both practical and research implications. They will encourage construction practitioners to shift their attention from ethical policies to possible enablers of CoE implementation and serve as a foundation for further research on ethical performance evaluation using the EFQM model. This is the first paper to discuss the model's use in the context of ethics in construction practice.
Posttest analysis of the 1:6-scale reinforced concrete containment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pfeiffer, P.A.; Kennedy, J.M.; Marchertas, A.H.
A prediction of the response of the Sandia National Laboratories 1:6-scale reinforced concrete containment model test was made by Argonne National Laboratory. ANL, along with nine other organizations, performed a detailed nonlinear response analysis of the 1:6-scale model containment subjected to overpressurization in the fall of 1986. The two-dimensional code TEMP-STRESS and the three-dimensional NEPTUNE code were utilized (1) to predict the global response of the structure, (2) to identify global failure sites and the corresponding failure pressures and (3) to identify some local failure sites and pressure levels. A series of axisymmetric models was studied with the two-dimensional computer program TEMP-STRESS. The comparison of these pretest computations with test data from the containment model has provided a test for the capability of the respective finite element codes to predict global failure modes, and hence serves as a validation of these codes. Only the two-dimensional analyses will be discussed in this paper. 3 refs., 10 figs.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 3101 and 3111 of the Code were paid with respect to remuneration paid by the organization to its... respect to remuneration for services performed on or after April 1, 1973. For purposes of the previous... quarter in which such period began. However, such waiver is effective only with respect to remuneration...
1981-10-01
…unique alphanumeric designation assigned by the performing organization or provided by the sponsoring organization in accordance with American… for cataloging. (b) Identifiers and Open-Ended Terms. Use identifiers for project names, code names, equipment designators, etc. Use open-ended… spool. Note: these components are designed to function together or with the BASS alone, if internal control of job processing is not a requirement at a…
Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCS)
2006-12-01
…Encapsulation; HMAC, Keyed-Hash Message Authentication Code; ICMP, Internet Control Message Protocol; IEEE, Institute of Electrical and Electronics Engineers; IETF, Internet Engineering Task Force; IOS, Internetwork Operating System; IP, Internet Protocol; ITU, International Telecommunication Union; LAN, Local Area… network computing. Most organizations today have sophisticated networks that are connected to the Internet. The major benefit reaped from such a…
Stop Codon Reassignment in the Wild
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ivanova, Natalia; Schwientek, Patrick; Tripp, H. James
Since the discovery of the genetic code and protein translation mechanisms (1), a limited number of variations of the standard assignment between unique base triplets (codons) and their encoded amino acids and translational stop signals have been found in bacteria and phages (2-3). Given the apparent ubiquity of the canonical genetic code, the design of genomically recoded organisms with non-canonical codes has been suggested as a means to prevent horizontal gene transfer between laboratory and environmental organisms (4). It is also predicted that genomically recoded organisms are immune to infection by viruses, under the assumption that phages and their hosts must share a common genetic code (5). This paradigm is supported by the observation of increased resistance of genomically recoded bacteria to phages with a canonical code (4). Despite these assumptions and accompanying lines of evidence, it remains unclear whether differential and non-canonical codon usage represents an absolute barrier to phage infection and genetic exchange between organisms. Our knowledge of the diversity of genetic codes and their use by viruses and their hosts is primarily derived from the analysis of cultivated organisms. Advances in single-cell sequencing and metagenome assembly technologies have enabled the reconstruction of genomes of uncultivated bacterial and archaeal lineages (6). These initial findings suggest that large scale systematic studies of uncultivated microorganisms and viruses may reveal the extent and modes of divergence from the canonical genetic code operating in nature. To explore alternative genetic codes, we carried out a systematic analysis of stop codon reassignments from the canonical TAG amber, TGA opal, and TAA ochre codons in assembled metagenomes from environmental and host-associated samples, single-cell genomes of uncultivated bacteria and archaea, and a collection of phage sequences.
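At the sequence level, screening for stop codon reassignment amounts to counting how often each canonical stop triplet appears in-frame inside otherwise long open reading frames. The snippet below is a simplified illustration of that counting with a made-up sequence; it is not the pipeline used in the study.

```python
STOPS = ("TAA", "TAG", "TGA")

def in_frame_stop_counts(orf_sequence):
    """Count canonical stop triplets inside an ORF (excluding the terminal codon)."""
    counts = {s: 0 for s in STOPS}
    codons = [orf_sequence[i:i + 3] for i in range(0, len(orf_sequence) - 3, 3)]
    for codon in codons:
        if codon in STOPS:
            counts[codon] += 1
    return counts

# Hypothetical ORF in which TGA appears in-frame, hinting at possible reassignment.
orf = "ATGGCTTGAGGTTGACCTAAATAA"
print(in_frame_stop_counts(orf))   # {'TAA': 0, 'TAG': 0, 'TGA': 2}
```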
Papaefstathiou, Giannis S; Friscić, Tomislav; MacGillivray, Leonard R
2005-10-19
A metal organic framework with two different nodes (circle and square) and a structure related to one of the 20 known 2-uniform nets has been constructed using an organic building unit that codes for multiply fused nodes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. George L Mesina
Our ultimate goal is to create and maintain RELAP5-3D as the best software tool available to analyze nuclear power plants. This begins with writing excellent programming and requires thorough testing. This document covers development of RELAP5-3D software, the behavior of the RELAP5-3D program that must be maintained, and code testing. RELAP5-3D must perform in a manner consistent with previous code versions with backward compatibility for the sake of the users. Thus file operations, code termination, input and output must remain consistent in form and content while adding appropriate new files, input and output as new features are developed. As computer hardware, operating systems, and other software change, RELAP5-3D must adapt and maintain performance. The code must be thoroughly tested to ensure that it continues to perform robustly on the supported platforms. The coding must be written in a consistent manner that makes the program easy to read to reduce the time and cost of development, maintenance and error resolution. The programming guidelines presented here are intended to institutionalize a consistent way of writing FORTRAN code for the RELAP5-3D computer program that will minimize errors and rework. A common format and organization of program units creates a unifying look and feel to the code. This in turn increases readability and reduces time required for maintenance, development and debugging. It also aids new programmers in reading and understanding the program. Therefore, when undertaking development of the RELAP5-3D computer program, the programmer must write computer code that follows these guidelines. This set of programming guidelines creates a framework of good programming practices, such as initialization, structured programming, and vector-friendly coding. It sets out formatting rules for lines of code, such as indentation, capitalization, spacing, etc. It creates limits on program units, such as subprograms, functions, and modules. It establishes documentation guidance on internal comments. The guidelines apply to both existing and new subprograms. They are written for both FORTRAN 77 and FORTRAN 95. The guidelines are not so rigorous as to inhibit a programmer's unique style, but do restrict the variations in acceptable coding to create sufficient commonality that new readers will find the coding in each new subroutine familiar. It is recognized that this is a "living" document and must be updated as languages, compilers, and computer hardware and software evolve.
Investigation of Near Shannon Limit Coding Schemes
NASA Technical Reports Server (NTRS)
Kwatra, S. C.; Kim, J.; Mo, Fan
1999-01-01
Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes. Both coding schemes can achieve performance near the Shannon limit. The performance of the schemes is obtained using computer simulations. This report has three sections. The first section is the introduction, which discusses fundamental knowledge about coding, block coding, and convolutional coding. In the second section, the basic concepts of convolutional turbo codes are introduced and the performance of turbo codes, especially high rate turbo codes, is presented from the simulation results. After introducing all the parameters that help turbo codes achieve such good performance, it is concluded that the output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, the performance bounds for turbo codes are given. Then, the relationships between the output weight distribution and factors such as the generator polynomial, the interleaver, and the puncturing pattern are examined. The criterion for the best selection of system components is provided. The puncturing pattern algorithm is discussed in detail. Different puncturing patterns are compared for each high rate. For most of the high rate codes, the puncturing pattern does not show any significant effect on the code performance if a pseudo-random interleaver is used in the system. For some special rate codes with poor performance, an alternative puncturing algorithm is designed which restores their performance close to the Shannon limit. Finally, in section three, for iterative decoding of block codes, the method of building a trellis for block codes, the structure of the iterative decoding system, and the calculation of extrinsic values are discussed.
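Puncturing, which the report examines for obtaining high-rate turbo codes, simply deletes parity bits according to a periodic pattern before transmission. A minimal generic illustration follows; the specific patterns studied in the report are not reproduced here, and the pattern shown is a common textbook choice assumed for the example.

```python
def puncture(systematic, parity1, parity2, pattern):
    """Interleave the three streams and drop bits where the periodic pattern holds a 0.
    pattern is a dict of 0/1 lists, one per stream, all of the same period."""
    period = len(pattern["sys"])
    out = []
    for i, bits in enumerate(zip(systematic, parity1, parity2)):
        for stream, bit in zip(("sys", "p1", "p2"), bits):
            if pattern[stream][i % period]:
                out.append(bit)
    return out

# Rate-1/3 mother code punctured to rate 1/2: keep all systematic bits and
# alternate between the two parity streams.
pattern = {"sys": [1, 1], "p1": [1, 0], "p2": [0, 1]}
sys_bits = [1, 0, 1, 1]
p1_bits  = [0, 1, 1, 0]
p2_bits  = [1, 1, 0, 0]
print(puncture(sys_bits, p1_bits, p2_bits, pattern))   # 8 bits kept out of 12
```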
2014-01-01
Background The pediatric complex chronic conditions (CCC) classification system, developed in 2000, requires revision to accommodate the International Classification of Disease 10th Revision (ICD-10). To update the CCC classification system, we incorporated ICD-9 diagnostic codes that had been either omitted or incorrectly specified in the original system, and then translated between ICD-9 and ICD-10 using General Equivalence Mappings (GEMs). We further reviewed all codes in the ICD-9 and ICD-10 systems to include both diagnostic and procedural codes indicative of technology dependence or organ transplantation. We applied the provisional CCC version 2 (v2) system to death certificate information and 2 databases of health utilization, reviewed the resulting CCC classifications, and corrected any misclassifications. Finally, we evaluated performance of the CCC v2 system by assessing: 1) the stability of the system between ICD-9 and ICD-10 codes using data which included both ICD-9 codes and ICD-10 codes; 2) the year-to-year stability before and after ICD-10 implementation; and 3) the proportions of patients classified as having a CCC in both the v1 and v2 systems. Results The CCC v2 classification system consists of diagnostic and procedural codes that incorporate a new neonatal CCC category as well as domains of complexity arising from technology dependence or organ transplantation. CCC v2 demonstrated close comparability between ICD-9 and ICD-10 and did not detect significant discontinuity in temporal trends of death in the United States. Compared to the original system, CCC v2 resulted in a 1.0% absolute (10% relative) increase in the number of patients identified as having a CCC in national hospitalization dataset, and a 0.4% absolute (24% relative) increase in a national emergency department dataset. Conclusions The updated CCC v2 system is comprehensive and multidimensional, and provides a necessary update to accommodate widespread implementation of ICD-10. PMID:25102958
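The ICD-9-to-ICD-10 translation step described above relies on General Equivalence Mappings (GEMs), which are essentially many-to-many lookup tables. A toy sketch of that lookup, using a small simplified mapping and category table purely for illustration (the real GEMs and CCC v2 code lists are far larger):

```python
# Simplified fragment of a GEM-style forward mapping (ICD-9 -> ICD-10) and a
# CCC-style category table; both are illustrative, not the published tables.
gem_forward = {
    "343.9": ["G80.9"],              # cerebral palsy, unspecified
    "V45.1": ["Z99.2"],              # renal dialysis status (technology dependence)
}
ccc_category = {
    "G80.9": "neuromuscular",
    "Z99.2": "technology dependence",
}

def classify(icd9_codes):
    """Translate ICD-9 codes through the GEM and collect CCC categories."""
    categories = set()
    for code in icd9_codes:
        for icd10 in gem_forward.get(code, []):
            if icd10 in ccc_category:
                categories.add(ccc_category[icd10])
    return categories

print(classify(["343.9", "V45.1", "250.00"]))   # the third code maps to no CCC category here
```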
Reassigning stop codons via translation termination: How a few eukaryotes broke the dogma.
Alkalaeva, Elena; Mikhailova, Tatiana
2017-03-01
The genetic code determines how amino acids are encoded within mRNA. It is universal among the vast majority of organisms, although several exceptions are known. Variant genetic codes are found in ciliates, mitochondria, and numerous other organisms. All revealed genetic codes (standard and variant) have at least one codon encoding a translation stop signal. However, recently two new genetic codes with a reassignment of all three stop codons were revealed in studies examining the protozoa transcriptomes. Here, we discuss this finding and the recent studies of variant genetic codes in eukaryotes. We consider the possible molecular mechanisms allowing the use of certain codons as sense and stop signals simultaneously. The results obtained by studying these amazing organisms represent a new and exciting insight into the mechanism of stop codon decoding in eukaryotes. Also see the video abstract here. © 2017 WILEY Periodicals, Inc.
DRG benchmarking study establishes national coding norms.
Vaul, J H
1998-05-01
With the increase in fraud and abuse investigations, healthcare financial managers should examine their organization's medical record coding procedures. The Federal government and third-party payers are looking specifically for improper billing of outpatient services, unbundling of procedures to increase payment, assigning higher-paying DRG codes for inpatient claims, and other abuses. A recent benchmarking study of Medicare Provider Analysis and Review (MEDPAR) data has established national norms for hospital coding and case mix based on DRGs and has revealed the majority of atypical coding cases fall into six DRG pairs. Organizations with a greater percentage of atypical cases--those more likely to be scrutinized by Federal investigators--will want to conduct suitable review and be sure appropriate documentation exists to justify the coding.
Maximum likelihood decoding analysis of Accumulate-Repeat-Accumulate Codes
NASA Technical Reports Server (NTRS)
Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung
2004-01-01
Repeat-Accumulate (RA) codes are the simplest turbo-like codes that achieve good performance. However, they cannot compete with turbo codes or low-density parity-check (LDPC) codes as far as performance is concerned. Accumulate-Repeat-Accumulate (ARA) codes, a subclass of LDPC codes, are obtained by adding a precoder in front of punctured RA codes, where an accumulator is chosen as the precoder. These codes are not only very simple but also achieve excellent performance with iterative decoding. In this paper, the performance of these codes under maximum likelihood (ML) decoding is analyzed and compared to random codes using very tight bounds. The weight distribution of some simple ARA codes is obtained, and through the existing tightest bounds we show that the ML SNR threshold of ARA codes approaches the performance of random codes very closely. We also show that the use of a precoder improves the SNR threshold, while the interleaving gain remains unchanged with respect to the punctured RA code.
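The tight bounds referenced here build on the classic union bound, in which the ML word-error probability on an AWGN channel is bounded by a sum over the code's weight distribution: P_e <= sum_d A_d * Q(sqrt(2*d*R*Eb/N0)). The snippet below evaluates only this simple union bound with an assumed toy weight enumerator; it is not the tighter bounding technique of the paper.

```python
import math

def q_func(x):
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def union_bound(weight_distribution, rate, ebno_db):
    """Union bound on ML word-error probability from the weight enumerator {d: A_d}."""
    ebno = 10.0 ** (ebno_db / 10.0)
    return sum(a_d * q_func(math.sqrt(2.0 * d * rate * ebno))
               for d, a_d in weight_distribution.items())

# Assumed toy weight distribution (minimum distance 10) for a rate-1/2 code.
A = {10: 12, 12: 45, 14: 130}
for ebno_db in (1.0, 2.0, 3.0):
    print(ebno_db, union_bound(A, rate=0.5, ebno_db=ebno_db))
```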
Code of Federal Regulations, 2010 CFR
2010-01-01
... Planning Organization means that organization required by the Department of Transportation, and designated... planning provisions in a Standard Metropolitan Statistical Area. Model Energy Code, 1993, including Errata, means the model building code published by the Council of American Building Officials, which is...
Reconstruction of Human Monte Carlo Geometry from Segmented Images
NASA Astrophysics Data System (ADS)
Zhao, Kai; Cheng, Mengyun; Fan, Yanchang; Wang, Wen; Long, Pengcheng; Wu, Yican
2014-06-01
Human computational phantoms have been used extensively for scientific experimental analysis and experimental simulation. This article presents a method for reconstructing human geometry from a series of segmented images of a Chinese visible human dataset. The phantom geometry describes the detailed structure of each organ and can be converted into the input files of Monte Carlo codes for dose calculation. A whole-body computational phantom of a Chinese adult female, named Rad-HUMAN and comprising about 28.8 billion voxels, has been established by the FDS Team. For convenient processing, different organs in the images were segmented with different RGB colors and the voxels were assigned positions within the dataset. For refinement, the positions were first sampled. Second, although the large number of voxels inside an organ are three-dimensionally adjacent, there was no thorough mergence method to reduce the number of cells needed to describe the organ. In this study, the voxels on the organ surface were taken into consideration during the mergence, which produces fewer cells for the organs. At the same time, an index-based sorting algorithm was put forward to enhance the mergence speed. Finally, Rad-HUMAN, which includes a total of 46 organs and tissues, was described by cuboids in the Monte Carlo geometry for the simulation. The Monte Carlo geometry was constructed directly from the segmented images and the voxels were merged exhaustively. Each organ geometry model was constructed without ambiguity or self-crossing, and its geometry information represents the accurate appearance and precise interior structure of the organ. The constructed geometry, which largely retains the original shape of the organs, can easily be written to the input files of different Monte Carlo codes such as MCNP. Its universal applicability and high performance were experimentally verified.
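The voxel mergence described above can be illustrated at its simplest as run-length merging of same-organ voxels along one axis, so that a row of identical labels becomes a single cuboid-like cell. The sketch below is a generic simplification (one axis only, no surface handling or index-based sorting), not the authors' algorithm.

```python
def merge_rows(label_grid):
    """Collapse runs of identical organ labels along each row into cuboid-like spans.
    label_grid[z][y][x] holds an integer organ ID; returns (z, y, x_start, x_end, label)."""
    cuboids = []
    for z, plane in enumerate(label_grid):
        for y, row in enumerate(plane):
            start = 0
            for x in range(1, len(row) + 1):
                if x == len(row) or row[x] != row[start]:
                    if row[start] != 0:                       # 0 = background
                        cuboids.append((z, y, start, x - 1, row[start]))
                    start = x
    return cuboids

# Tiny hypothetical labelled slab: organ 7 occupies a 1 x 2 x 3 block.
grid = [[[0, 7, 7, 7, 0],
         [0, 7, 7, 7, 0]]]
print(merge_rows(grid))   # [(0, 0, 1, 3, 7), (0, 1, 1, 3, 7)]
```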
Selected DOE Headquarters publications, October 1977-September 1979
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1979-11-01
This sixth issue of cumulative listings of DOE Headquarters publications covers the first two years of the Department's operation (October 1, 1977 - September 30, 1979). It lists two groups of publications issued by then-existing Headquarters organizations and provides an index to their title keywords. The two groups of publications are publications assigned a DOE/XXX-type report number code and Headquarters contractor reports prepared by contractors (and published by DOE) to describe research and development work they have performed for the Department. Certain publications are omitted. They include such items as pamphlets, fact sheets, bulletins, newsletters, and telephone directories, as well as headquarters publications issued under the DOE-tr (DOE translation) and CONF (conference proceedings) codes, and technical reports from the Jet Propulsion Laboratory and NASA issued under DOE/JPL and DOE/NASA codes. The contents of this issue will not be repeated in subsequent issues of DOE/AD-0010. (RWR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyack, B.E.; Dhir, V.K.; Gieseke, J.A.
1992-03-01
MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. The newest version of MELCOR is Version 1.8.1, July 1991. MELCOR development has reached the point that the United States Nuclear Regulatory Commission sponsored a broad technical review by recognized experts to determine or confirm the technical adequacy of the code for the serious and complex analyses it is expected to perform. For this purpose, an eight-member MELCOR Peer Review Committee was organized. The Committee has completed its review of the MELCOR code: the review process and findings of the MELCOR Peer Review Committee are documented in this report. The Committee has determined that recommendations in five areas are appropriate: (1) MELCOR numerics, (2) models missing from MELCOR Version 1.8.1, (3) existing MELCOR models needing revision, (4) the need for expanded MELCOR assessment, and (5) documentation.
Region-Based Prediction for Image Compression in the Cloud.
Begaint, Jean; Thoreau, Dominique; Guillotel, Philippe; Guillemot, Christine
2018-04-01
Thanks to the increasing number of images stored in the cloud, external image similarities can be leveraged to efficiently compress images by exploiting inter-image correlations. In this paper, we propose a novel image prediction scheme for cloud storage. Unlike current state-of-the-art methods, we use a semi-local approach to exploit inter-image correlation. The reference image is first segmented into multiple planar regions determined from matched local features and super-pixels. The geometric and photometric disparities between the matched regions of the reference image and the current image are then compensated. Finally, multiple references are generated from the estimated compensation models and organized in a pseudo-sequence to differentially encode the input image using classical video coding tools. Experimental results demonstrate that the proposed approach yields significant rate-distortion performance improvements compared with current image inter-coding solutions such as High Efficiency Video Coding.
Power Systems Modeling for the ONR SSL-TM Program
2015-10-01
Report documentation page (only fragments recoverable). Performing organization: The University of Texas at Austin, Center for Electromechanics, 10100 Burnet Road, Austin, TX 78712. ... The Naval Postgraduate School (NPS) and the University of Texas Center for Electromechanics (UT) have collaborated to develop simulation models of electrical ...
1990-11-06
Naval Research Laboratory, Washington, DC 20375-5000. NRL Memorandum Report 6741: Fiber Optic Feed, by D. Stilwell, M. G. Parent, and L. Goldberg. Approved for public release; distribution unlimited. Abstract fragment: This report details a Fiber Optic Feed ...
Government Furnished Property: Management and Accounting.
1986-06-01
Report documentation page (only fragments recoverable). ... (DAR), Armed Services Procurement Regulation (ASPR), GAO and service comptroller guidelines, and contract administration procedures. ... Contractor-acquired property is property procured or otherwise provided by the contractor for the performance of a contract, title to which is vested in ...
1984-12-31
Report documentation page (only fragments recoverable). Washington, DC 20375-5000. ... The pentane mobile phase was maintained at a flow rate of 6.0 ml/min with a Milton Roy Constametric pump operating in the 400-600 psi range. The injector was ...
RF Characteristics of Mica-Z Wireless Sensor Network Motes
2006-03-01
Report documentation page (only fragments recoverable). Mica-Z Wireless Sensor Network Motes, by Swee Jin Koh, March 2006; Thesis Advisor: Gurminder Singh; Thesis Co-Advisor: John C...; Naval Postgraduate School. ... ad-hoc deployment. Subject terms: Wireless Sensor Network. 83 pages.
Space station human productivity study. Volume 4: Issues
NASA Technical Reports Server (NTRS)
1985-01-01
The 305 Issues contained in this volume represent topics recommended for study in order to develop requirements in support of space station crew performance/productivity. The overall subject matter, space station elements affecting crew productivity, was organized into a coded subelement listing, which is included for the reader's reference. Each Issue is numbered according to the 5-digit topical coding scheme. The requirements column on each Issue page shows a cross-reference to the unresolved requirement statement(s). Because topical overlaps were frequently encountered, many initial Issues were consolidated. Apparent gaps, therefore, may be accounted for by an Issue described within a related subelement. A glossary of abbreviations used throughout the study documentation is also included.
Establishing ethics in an organization by using principles.
Hawks, Val D; Benzley, Steven E; Terry, Ronald E
2004-04-01
Laws, codes, and rules are essential for any community, public or private, to operate in an orderly and productive fashion. Without laws and codes, anarchy and chaos abound and the purpose and role of the organization is lost. However, the danger is significant, and the damage serious and far-reaching, when individuals or organizations become so focused on rules, laws, and specifications that basic principles are ignored. This paper discusses the purpose of laws, rules, and codes in order to clarify the basic principles behind them. With such an understanding, the level of ethical and moral behavior can be raised without imposing detailed rules.
Ji, Yongsung; Zeigler, David F; Lee, Dong Su; Choi, Hyejung; Jen, Alex K-Y; Ko, Heung Cho; Kim, Tae-Wook
2013-01-01
Flexible organic memory devices are one of the integral components for future flexible organic electronics. However, high-density all-organic memory cell arrays on malleable substrates without cross-talk have not been demonstrated because of difficulties in their fabrication and relatively poor performance to date. Here we demonstrate the first flexible all-organic 64-bit memory cell array possessing a one diode-one resistor architecture. Our all-organic one diode-one resistor cell exhibits excellent rewritable switching characteristics, even during and after harsh physical stresses. The write-read-erase-read output sequence of the cells corresponds perfectly to the external pulse signal regardless of substrate deformation. The one diode-one resistor cell array is clearly addressed at the specified cells and encoded letters based on the standard ASCII character code. Our study on integrated organic memory cell arrays suggests that the all-organic one diode-one resistor cell architecture is suitable for high-density flexible organic memory applications in the future.
Illum, Niels Ove; Gradel, Kim Oren
2017-01-01
AIM To help parents assess disability in their own children using World Health Organization (WHO) International Classification of Functioning, Disability and Health, Child and Youth Version (ICF-CY) code qualifier scoring and to assess the validity and reliability of the data sets obtained. METHOD Parents of 162 children with spina bifida, spinal muscular atrophy, muscular disorders, cerebral palsy, visual impairment, hearing impairment, mental disability, or disability following brain tumours performed scoring for 26 body functions qualifiers (b codes) and activities and participation qualifiers (d codes). Scoring was repeated after 6 months. Psychometric and Rasch data analysis was undertaken. RESULTS The initial and repeated data had Cronbach α of 0.96 and 0.97, respectively. Inter-code correlation was 0.54 (range: 0.23-0.91) and 0.76 (range: 0.20-0.92). The corrected code-total correlations were 0.72 (range: 0.49-0.83) and 0.75 (range: 0.50-0.87). When repeated, the ICF-CY code qualifier scoring showed a correlation R of 0.90. Rasch analysis of the selected ICF-CY code data demonstrated a mean measure of 0.00 and 0.00, respectively. Code qualifier infit mean square (MNSQ) had a mean of 1.01 and 1.00. The mean corresponding outfit MNSQ was 1.05 and 1.01. The ICF-CY code τ thresholds and category measures were continuous when assessed and reassessed by parents. Participating children had a mean of 56 code scores (range: 26-130) before and a mean of 55.9 scores (range: 25-125) after repeat. Corresponding measures were −1.10 (range: −5.31 to 5.25) and −1.11 (range: −5.42 to 5.36), respectively. Based on measures obtained on the 2 occasions, the correlation coefficient R was 0.84. The child code map showed coherence of ICF-CY codes at each level. There was continuity in covering the range across disabilities. And, first and foremost, the distribution of codes reflected a true continuity in disability, with codes for motor functions activated first, then codes for cognitive functions, and, finally, codes for more complex functions. CONCLUSIONS Parents can assess their own children in a valid and reliable way, and if the WHO ICF-CY second-level code data set is functioning in a clinically sound way, it can be employed as a tool for identifying the severity of disabilities and for monitoring changes in those disabilities over time. The ICF-CY codes selected in this study might be one cornerstone in forming a national or even international generic set of ICF-CY codes for the benefit of children with disabilities, their parents, and caregivers and for the whole community supporting children with disabilities on a daily and perpetual basis. PMID:28680270
Current and anticipated uses of the thermal hydraulics codes at the NRC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caruso, R.
1997-07-01
The focus of Thermal-Hydraulic computer code usage in nuclear regulatory organizations has undergone a considerable shift since the codes were originally conceived. Less work is being done in the area of "Design Basis Accidents," and much more emphasis is being placed on analysis of operational events, probabilistic risk/safety assessment, and maintenance practices. All of these areas need support from Thermal-Hydraulic computer codes to model the behavior of plant fluid systems, and they all need the ability to perform large numbers of analyses quickly. It is therefore important for the T/H codes of the future to be able to support these needs, by providing robust, easy-to-use tools that produce easy-to-understand results for a wider community of nuclear professionals. These tools need to take advantage of the great advances that have occurred recently in computer software, by providing users with graphical user interfaces for both input and output. In addition, reduced costs of computer memory and other hardware have removed the need for excessively complex data structures and numerical schemes, which make the codes more difficult and expensive to modify, maintain, and debug, and which increase problem run-times. Future versions of the T/H codes should also be structured in a modular fashion, to allow for the easy incorporation of new correlations, models, or features, and to simplify maintenance and testing. Finally, it is important that future T/H code developers work closely with the code user community, to ensure that the codes meet the needs of those users.
Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution
NASA Astrophysics Data System (ADS)
Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi
2015-05-01
In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for simulation of neutron/gamma pulse height distribution is reported. Since liquid organic scintillators like NE-213 are well suited and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector is selected for simulation in the present study. The proposed algorithm for simulation includes four main steps. The first step is the modeling of the neutron/gamma particle transport and their interactions with the materials in the environment and detector volume. In the second step, the number of scintillation photons due to charged particles such as electrons, alphas, protons and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and lightguide is simulated. Finally, the resolution corresponding to the experiment is considered in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed computer code is applicable to both neutron and gamma sources. Hence, the discrimination of neutron and gamma in mixed fields may be performed using the MCNPX-ESUT computer code. The main feature of the MCNPX-ESUT computer code is that the neutron/gamma pulse height simulation may be performed without needing any sort of post-processing. In the present study, the pulse height distributions due to a monoenergetic neutron/gamma source in an NE-213 detector are simulated using the MCNPX-ESUT computer code. The simulated neutron pulse height distributions are validated through comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 664 (2012) 304-309) and the results obtained from similar computer codes like SCINFUL, NRESP7 and Geant4. The simulated gamma pulse height distribution for a 137Cs source is also compared with the experimental data.
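The final resolution-broadening step of such a simulation can be sketched as follows; the Gaussian resolution function, its parameters, and the fake light-output sample are assumptions for illustration and are not taken from the MCNPX-ESUT code itself.

import numpy as np

def broadened_spectrum(light_outputs, bins, a=0.1, b=0.05, c=0.002, rng=None):
    # Apply a Gaussian resolution function to event-wise light outputs and
    # histogram the result.  The common parameterization
    #   dL/L = sqrt(a^2 + b^2/L + c^2/L^2)   (FWHM, converted to sigma below)
    # is an assumption here, not necessarily the one used in MCNPX-ESUT.
    rng = rng or np.random.default_rng(0)
    L = np.clip(np.asarray(light_outputs, dtype=float), 1e-6, None)  # avoid /0
    sigma = L * np.sqrt(a**2 + b**2 / L + c**2 / L**2) / 2.355
    smeared = rng.normal(L, sigma)
    counts, edges = np.histogram(smeared, bins=bins)
    return counts, edges

events = np.random.default_rng(1).exponential(0.5, 10000)  # fake light yields
counts, edges = broadened_spectrum(events, bins=np.linspace(0, 3, 121))
print(counts[:10])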
NASA Astrophysics Data System (ADS)
Rogov, A.; Pepyolyshev, Yu.; Carta, M.; d'Angelo, A.
Scintillation detectors (SDs) are widely used in neutron and gamma spectrometry in count mode. Organic scintillators operating in count mode are rather well investigated; they are usually applied to measure the amplitude and time distributions of pulses caused by single interaction events of neutrons or gammas with the scintillator material. In a large area of scientific research, however, scintillation detectors can alternatively be used in current mode, by recording the average current from the detector, for example in measurements of the neutron pulse shape at pulsed reactors or other pulsed neutron sources. To obtain a rather large volume of experimental data at pulsed neutron sources, it is necessary to use a current-mode detector for the registration of fast neutrons. Many parameters of the SD change in the transition from count mode to current mode. For example, the detector efficiency is different in counting and current modes, and many effects connected with timing accuracy become substantial. Besides, for the registration of solely fast neutrons, as is required in many measurements in the mixed radiation field of pulsed neutron sources, the SD efficiency has to be determined with a gamma-radiation shield present. Until now there have been no calculations or experimental data on SD current-mode operation. The response functions of the detectors can be either measured in high-precision reference fields or calculated by computer simulation. We have used the MCNP code [1] and carried out some experiments to investigate the performance of the plastic scintillator in current mode. There are numerous programs that perform simulations similar to the MCNP code; for example, for neutrons there are [2-4], and for photons [5-8]. However, all of the known codes (SCINFUL, NRESP4, SANDYL, EGS49) have more stringent restrictions on the source, geometry and detector characteristics. In the MCNP code most of these restrictions are absent, and one only needs to write special additions for proton and electron recoil and for the conversion of transferred energy to light output. These code modifications allow all processes in the organic scintillator that influence the light yield to be taken into account.
Reinventing radiology reimbursement.
Marshall, John; Adema, Denise
2005-01-01
Lee Memorial Health System (LMHS), located in southwest Florida, consists of 5 hospitals, a home health agency, a skilled nursing facility, multiple outpatient centers, walk-in medical centers, and primary care physician offices. LMHS annually performs more than 300,000 imaging procedures with gross imaging revenues exceeding $350 million. In fall 2002, LMHS received the results of an independent audit of its IR coding. The overall IR coding error rate was determined to be 84.5%. The projected net financial impact of these errors was an annual reimbursement loss of $182,000. To address the issues of coding errors and reimbursement loss, LMHS implemented its clinical reimbursement specialist (CRS) system in October 2003, as an extension of financial services' reimbursement division. LMHS began with CRSs in 3 service lines: emergency department, cardiac catheterization, and radiology. These 3 CRSs coordinate all facets of their respective areas' chargemaster, patient charges, coding, and reimbursement functions while serving as a resident coding expert within their clinical areas. The radiology reimbursement specialist (RRS) combines an experienced radiologic technologist, interventional technologist, medical records coder, financial auditor, reimbursement specialist, and biller into a single position. The RRS's radiology experience and technologist knowledge are key assets to resolving coding conflicts and handling complex interventional coding. In addition, performing a daily charge audit and an active code review are essential if an organization is to eliminate coding errors. One of the inherent effects of eliminating coding errors is the capturing of additional RVUs and units of service. During its first year, based on account level detail, the RRS system increased radiology productivity through the additional capture of just more than 3,000 RVUs and 1,000 additional units of service. In addition, the physicians appreciate having someone who "keeps up with all the coding changes" and looks out for the charges. By assisting a few physicians' staff with coding questions, providing coding updates, and allowing them to sit in on educational sessions, at least 2 physicians have transferred some of their volume to LMHS from a competitor. The provision of a "clean account," without coding errors, allows the biller to avoid the rework and billing delays caused by coding issues. During the first quarter of the RRS system, the billers referred an average of 9 accounts per day for coding resolution. During the fourth quarter of the system, these referrals were reduced to less than one per day. Prior to the RRS system, resolving these issues took an average of 4 business days. Now the conflicts are resolved within 24 hours.
Regulatory changes that affect coding for immunotherapy.
Atwater, J Spencer
2006-02-01
During the past decade, a variety of federal regulations have had a significant impact on the way allergen immunotherapy is reimbursed and how Current Procedural Terminology (CPT) codes are used for this purpose. As mandated by the US Congress, the Centers for Medicare and Medicaid Services (CMS) through the Office of the Inspector General (OIG) targeted immunotherapy codes for scrutiny, because they are some of the most frequently used codes. To examine how federal regulations have affected reimbursement for allergy immunotherapy and other allergy services. A review was performed of the OIG survey of allergy immunotherapy and the OIG recommendations on CPT coding compliance guidelines. A preliminary survey found problems with medical appropriateness of allergen immunotherapy. For this reason, the OIG performed a more comprehensive study of 301 physicians using code 95165 to analyze by medical record and billing data whether the new billing rules were being correctly used and found that only 44% of physicians were following the new definition of a billable dose. In the early 1990s, the federal government served notice of its intent to more aggressively identify and prosecute health care providers who improperly billed and collected for medical services. Through the adoption of the 1991 US Sentencing Commission Guidelines, the government sought to enhance compliance by mandating lesser criminal penalties for violating organizations that nevertheless maintained and operated "effective compliance plans." In 2002, the OIG audited health care providers and recouped $14.4 billion in improper payments by Medicare. Between January and June 2003, Medicare excluded 1,241 individual providers and health care entities due to fraudulent billing practices. Federal regulations have significantly affected reimbursement for allergy immunotherapy and other allergy services. Allergists need to be aware of these changes and implement the new recommendations into their practices.
Sudoku Puzzles for First-Year Organic Chemistry Students
ERIC Educational Resources Information Center
Perez, Alice L.; Lamoureux, G.
2007-01-01
A Sudoku puzzle was designed to teach undergraduate organic chemistry students about amino acids and functional groups. The puzzles focus on helping students learn the names, 3-letter codes, and 1-letter codes of common amino acids and functional groups.
Kangaroo – A pattern-matching program for biological sequences
2002-01-01
Background Biologists are often interested in performing a simple database search to identify proteins or genes that contain a well-defined sequence pattern. Many databases do not provide straightforward or readily available query tools to perform simple searches, such as identifying transcription binding sites, protein motifs, or repetitive DNA sequences. However, in many cases simple pattern-matching searches can reveal a wealth of information. We present in this paper a regular expression pattern-matching tool that was used to identify short repetitive DNA sequences in human coding regions for the purpose of identifying potential mutation sites in mismatch repair deficient cells. Results Kangaroo is a web-based regular expression pattern-matching program that can search for patterns in DNA, protein, or coding region sequences in ten different organisms. The program is implemented to facilitate a wide range of queries with no restriction on the length or complexity of the query expression. The program is accessible on the web at http://bioinfo.mshri.on.ca/kangaroo/ and the source code is freely distributed at http://sourceforge.net/projects/slritools/. Conclusion A low-level simple pattern-matching application can prove to be a useful tool in many research settings. For example, Kangaroo was used to identify potential genetic targets in a human colorectal cancer variant that is characterized by a high frequency of mutations in coding regions containing mononucleotide repeats. PMID:12150718
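A minimal example of the kind of regular-expression search described above, here scanning a coding sequence for mononucleotide repeats, might look like this; the length threshold and example sequence are made up, and this is not Kangaroo's actual implementation.

import re

def mononucleotide_repeats(seq, min_len=8):
    # Find runs of a single base of length >= min_len, the kind of simple
    # pattern described above (e.g. candidate mutation sites in coding regions).
    pattern = re.compile(r"([ACGT])\1{%d,}" % (min_len - 1))
    return [(m.start(), m.group()) for m in pattern.finditer(seq.upper())]

cds = "atggcaAAAAAAAAttgccgTTTTTTTTTTcga"
print(mononucleotide_repeats(cds))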
77 FR 12202 - Public Inspection of Material Relating to Tax-Exempt Organizations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-29
...This document contains final regulations pertaining to the public inspection of material relating to tax-exempt organizations and final regulations pertaining to the public inspection of written determinations and background file documents. These regulations are necessary to clarify rules relating to information and materials made available by the IRS for public inspection under the Internal Revenue Code (Code). The final regulations affect certain organizations exempt from Federal income tax, organizations that were exempt but are no longer exempt from Federal income tax, and organizations that were denied tax-exempt status.
Rep. Stupak, Bart [D-MI-1]
2009-03-16
House - 05/04/2009: Referred to the Subcommittee on Government Management, Organization, and Procurement. Bill status: Introduced.
Bearing performance degradation assessment based on time-frequency code features and SOM network
NASA Astrophysics Data System (ADS)
Zhang, Yan; Tang, Baoping; Han, Yan; Deng, Lei
2017-04-01
Bearing performance degradation assessment and prognostics are extremely important in supporting maintenance decision and guaranteeing the system’s reliability. To achieve this goal, this paper proposes a novel feature extraction method for the degradation assessment and prognostics of bearings. Features of time-frequency codes (TFCs) are extracted from the time-frequency distribution using a hybrid procedure based on short-time Fourier transform (STFT) and non-negative matrix factorization (NMF) theory. An alternative way to design the health indicator is investigated by quantifying the similarity between feature vectors using a self-organizing map (SOM) network. On the basis of this idea, a new health indicator called time-frequency code quantification error (TFCQE) is proposed to assess the performance degradation of the bearing. This indicator is constructed based on the bearing real-time behavior and the SOM model that is previously trained with only the TFC vectors under the normal condition. Vibration signals collected from the bearing run-to-failure tests are used to validate the developed method. The comparison results demonstrate the superiority of the proposed TFCQE indicator over many other traditional features in terms of feature quality metrics, incipient degradation identification and achieving accurate prediction. Highlights • Time-frequency codes are extracted to reflect the signals’ characteristics. • SOM network served as a tool to quantify the similarity between feature vectors. • A new health indicator is proposed to demonstrate the whole stage of degradation development. • The method is useful for extracting the degradation features and detecting the incipient degradation. • The superiority of the proposed method is verified using experimental data.
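The quantization-error idea behind the TFCQE indicator can be sketched with a plain nearest-prototype codebook standing in for a trained SOM (no neighborhood function, and random vectors in place of the STFT/NMF time-frequency code features); all values below are illustrative assumptions.

import numpy as np

def train_codebook(healthy_features, n_nodes=16, iters=200, seed=0):
    # Very small stand-in for an SOM trained on healthy-condition feature
    # vectors: plain competitive learning without a neighborhood function.
    rng = np.random.default_rng(seed)
    X = np.asarray(healthy_features, float)
    W = X[rng.choice(len(X), n_nodes, replace=True)].copy()
    for _ in range(iters):
        x = X[rng.integers(len(X))]
        bmu = np.argmin(np.linalg.norm(W - x, axis=1))   # best-matching unit
        W[bmu] += 0.1 * (x - W[bmu])                     # move winner toward sample
    return W

def quantization_error(W, feature_vector):
    # Health indicator: distance from the current feature vector to its BMU.
    return np.min(np.linalg.norm(W - feature_vector, axis=1))

healthy = np.random.default_rng(1).normal(0.0, 1.0, (500, 8))
W = train_codebook(healthy)
print(quantization_error(W, healthy[0]))          # small: normal condition
print(quantization_error(W, healthy[0] + 5.0))    # large: degraded condition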
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, Michael T.; Safdari, Masoud; Kress, Jessica E.
The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and has few requirements on those physics codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewrite of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure, 2) production of example open-source multiphysics tools using IMPACT, and 3) identification and engagement of interested organizations in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem being built using IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public gitHUB.org system to anyone interested in multiphysics code coupling. Many of the basic documents explaining use and architecture of IMPACT are also attached as appendices to this document. Online HTML documentation is available through the gitHUB site. There are over 100 unit tests provided that run through the Illinois Rocstar Application Development (IRAD) lightweight testing infrastructure that is also supplied along with IMPACT. The package as a whole provides an excellent base for developing high-quality multiphysics applications using modern software development practices. To facilitate understanding how to utilize IMPACT effectively, two multiphysics systems have been developed and are available open-source through gitHUB. The simpler of the two systems, named ElmerFoamFSI in the repository, is a multiphysics, fluid-structure-interaction (FSI) coupling of the solid mechanics package Elmer with a fluid dynamics module from OpenFOAM. This coupling illustrates how to combine software packages that are unrelated by either author or architecture and combine them into a robust, parallel multiphysics system. A more complex multiphysics tool is the Illinois Rocstar Rocstar Multiphysics code that was rebuilt during the project around IMPACT. Rocstar Multiphysics was already an HPC multiphysics tool, but now that it has been rearchitected around IMPACT, it can be readily expanded to capture new and different physics in the future. In fact, during this project, the Elmer and OpenFOAM tools were also coupled into Rocstar Multiphysics and demonstrated. The full Rocstar Multiphysics codebase is also available on gitHUB, and licensed for any organization to use as they wish.
Finally, the new IMPACT product is already being used in several multiphysics code coupling projects for the Air Force, NASA and the Missile Defense Agency, and initial work on expansion of the IMPACT-enabled Rocstar Multiphysics has begun in support of a commercial company. These initiatives promise to expand the interest and reach of IMPACT and Rocstar Multiphysics, ultimately leading to the envisioned standardization and consortium of users that was one of the goals of this project.
Ribeiro, Aridiane Alves; Arantes, Cássia Irene Spinelli; Gualda, Dulce Maria Rosa; Rossi, Lídia Aparecida
2017-06-01
This case study aimed to interpret the underlying historical and cultural aspects of the provision of care at an indigenous healthcare service facility. This is interpretive, case-study research with a qualitative approach, conducted in 2012 at the Indigenous Health Support Center (CASAI) of the State of Mato Grosso do Sul, Brazil. Data were collected by means of systematic observation, documentary analyses and semi-structured interviews with ten health professionals. Data review was performed according to an approach based on social anthropology and health anthropology. The anthropological concepts of social code and ethnocentrism underpinned the interpretation of outcomes. Two categories were identified: CASAI, a space between streets and village; Ethnocentrism and indigenous health care. Healthcare practice and the current social code influence each other. The street social code prevails in the social environment under study. The institutional organization and professionals' appreciation of the indigenous biological body are decisive to the provision of care under the street social code perspective. Professionals' concepts evidence ethnocentrism in healthcare. Workers, however, try to adopt a relativized view vis-à-vis indigenous people at CASAI.
Implementation of algebraic stress models in a general 3-D Navier-Stokes method (PAB3D)
NASA Technical Reports Server (NTRS)
Abdol-Hamid, Khaled S.
1995-01-01
A three-dimensional multiblock Navier-Stokes code, PAB3D, which was developed for propulsion integration and general aerodynamic analysis, has been used extensively by NASA Langley and other organizations to perform both internal (exhaust) and external flow analysis of complex aircraft configurations. This code was designed to solve the simplified Reynolds Averaged Navier-Stokes equations. A two-equation k-epsilon turbulence model has been used with considerable success, especially for attached flows. Accurately predicting transonic shock wave location and pressure recovery in separated flow regions has been more difficult. Two algebraic Reynolds stress models (ASM) have recently been implemented in the code that greatly improved the code's ability to predict these difficult flow conditions. Good agreement with Direct Numerical Simulation (DNS) for a subsonic flat plate was achieved with ASMs developed by Shih, Zhu, and Lumley and by Gatski and Speziale. Good predictions were also achieved at subsonic and transonic Mach numbers for shock location and trailing edge boattail pressure recovery on a single-engine afterbody/nozzle model.
Lin, Changyu; Zou, Ding; Liu, Tao; Djordjevic, Ivan B
2016-08-08
A mutual information inspired nonbinary coded modulation design with non-uniform shaping is proposed. Instead of traditional power-of-two signal constellation sizes, we design 5-QAM, 7-QAM and 9-QAM constellations, which can be used in adaptive optical networks. The non-uniform shaping and the LDPC code rate are jointly considered in the design, which results in a scheme with better performance at the same SNR values. The matched nonbinary (NB) LDPC code is used for this scheme, which further improves the coding gain and the overall performance. We analyze both coding performance and system SNR performance. We show that the proposed NB LDPC-coded 9-QAM has more than 2 dB gain in symbol SNR compared to traditional LDPC-coded star-8-QAM. On the other hand, the proposed NB LDPC-coded 5-QAM and 7-QAM have even better performance than LDPC-coded QPSK.
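To make non-uniform shaping concrete, the sketch below builds a 3-by-3 grid 9-QAM constellation and assigns Maxwell-Boltzmann symbol probabilities; this shaping rule and the parameter values are common illustrative choices and are not necessarily the design used in the paper.

import numpy as np

def qam9():
    # 3x3 grid constellation (9-QAM), centered at the origin.
    return np.array([x + 1j * y for y in (-1, 0, 1) for x in (-1, 0, 1)])

def maxwell_boltzmann_probs(points, lam):
    # Non-uniform (shaped) symbol probabilities p_i proportional to exp(-lam*|x_i|^2).
    w = np.exp(-lam * np.abs(points) ** 2)
    return w / w.sum()

pts = qam9()
for lam in (0.0, 0.5, 1.0):
    p = maxwell_boltzmann_probs(pts, lam)
    avg_power = np.sum(p * np.abs(pts) ** 2)   # shaping lowers the average power
    print(lam, round(avg_power, 3))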
Compressive Hyperspectral Imaging and Anomaly Detection
2010-02-01
Report documentation page (only fragments recoverable). Performing organization: Level Set Systems, 1058 Embury Street, Pacific Palisades, CA 90272; report number 1A-2010. ... were obtained from a simple algorithm, namely, the atoms in the trained image were very similar to the simple-cell receptive fields in early vision ... Reference: Field, "Emergence of simple-cell receptive field properties by learning a sparse code for natural images," Nature 381(6583), pp. 607-609, 1996.
2009-12-01
Direct and Indirect Approaches: Implications for Ending the Violence in Southern Thailand, by Chaiyo Rodthong, December 2009. Naval Postgraduate School, Monterey, CA. Approved for public release; distribution is unlimited. Abstract fragment: The instability in the Southern Border Provinces of Thailand resurged on ...
1995-01-01
1995 Ship Production Symposium, Paper No. 24: Absenteeism Management. U.S. Department of the Navy, Carderock Division, Naval Surface Warfare Center, Code 2230 - Design Integration Tools.
Center for Nonlinear Phenomena and Magnetic Materials
1992-12-04
Report documentation page (only fragments recoverable). Performing organization: Howard University / ComSERC, 2216 6th St., N.W., Suite 205, Washington, D.C. 20059. ... contract on the research environment at Howard University. ... October 25, 1991: Dr. Gerald Chachere, Math Dept., Howard University, Visualization - Improved Marching Cubes. January 27, 1992: Dr. Gerald Chachere, Math ...
22 CFR 226.42 - Codes of conduct.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Relations AGENCY FOR INTERNATIONAL DEVELOPMENT ADMINISTRATION OF ASSISTANCE AWARDS TO U.S. NON-GOVERNMENTAL ORGANIZATIONS Post-award Requirements Procurement Standards § 226.42 Codes of conduct. The recipient shall... immediate family, his or her partner, or an organization which employs or is about to employ any of the...
Design and Implementation of a Distributed Version of the NASA Engine Performance Program
NASA Technical Reports Server (NTRS)
Cours, Jeffrey T.
1994-01-01
Distributed NEPP is a new version of the NASA Engine Performance Program that runs in parallel on a collection of Unix workstations connected through a network. The program is fault-tolerant, efficient, and shows significant speed-up in a multi-user, heterogeneous environment. This report describes the issues involved in designing distributed NEPP, the algorithms the program uses, and the performance distributed NEPP achieves. It develops an analytical model to predict and measure the performance of the simple distribution, multiple distribution, and fault-tolerant distribution algorithms that distributed NEPP incorporates. Finally, the appendices explain how to use distributed NEPP and document the organization of the program's source code.
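A rough sketch of fault-tolerant fan-out of independent cases is shown below; local worker processes, the retry policy, and the dummy run_case function are assumptions standing in for distributed NEPP's actual workstation scheduling and distribution algorithms.

from concurrent.futures import ProcessPoolExecutor, as_completed

def run_case(case_id):
    # Placeholder for one independent engine-cycle analysis; in distributed
    # NEPP each such case would run on a remote workstation.
    if case_id == 3:
        raise RuntimeError("simulated worker failure")
    return case_id, case_id ** 2

def run_all(cases, max_retries=2):
    # Simple fault-tolerant fan-out: resubmit a case if its worker fails,
    # dropping it after max_retries unsuccessful attempts.
    results, attempts = {}, {c: 0 for c in cases}
    pending = set(cases)
    with ProcessPoolExecutor() as pool:
        while pending:
            futures = {pool.submit(run_case, c): c for c in pending}
            pending = set()
            for fut in as_completed(futures):
                c = futures[fut]
                try:
                    cid, value = fut.result()
                    results[cid] = value
                except Exception:
                    attempts[c] += 1
                    if attempts[c] <= max_retries:
                        pending.add(c)   # retry the failed case
    return results

if __name__ == "__main__":
    print(run_all(range(6)))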
Alternative Fuels Data Center: Codes and Standards Basics
... the American National Standards Institute regulates how organizations publish codes and standards. Legal enforcement: codes and standards are legally enforceable when jurisdictions adopt them by reference or direct incorporation into their regulations. When jurisdictions adopt codes, they also adopt ...
A traffic-light coding system to organize emergency surgery across surgical disciplines.
Leppäniemi, A; Jousela, I
2014-01-01
Emergency surgery is associated with night-time procedures and disruption of elective surgery. An analysis was undertaken of the effect of classifying emergency operations uniformly with a three-tier urgency colour code and the use of dedicated daytime operating rooms. Observed changes from 2001 to 2012 in the number, timing and ability to meet the urgency-designated colour code deadline were retrieved from the computer-based operating theatre organization system for all emergency operations. The number of emergency operations performed annually ranged from 3330 to 4341, with an increasing trend. The proportion of night-time emergency operations decreased from 27.4 per cent (2563 of 9347) before to 23.5 per cent (7731 of 32,959) after introduction of the colour coding system in 2004 (χ2 = 61.94, 1 d.f., P < 0.001). In 2007, owing to long preoperative delays in patients with acute appendicitis and acute cholecystitis, colour codes for these patients were upgraded from 'orange' to 'red' and from 'yellow' to 'orange' respectively. The proportion of patients operated on with a red code before and after this change increased from 45.2 per cent (5831 of 12,907 operations) to 62.7 per cent (13,020 of 20,778 operations; χ2 = 986.99, 1 d.f., P < 0.001). In 2012, the office-hours raw utilization time for the principal emergency operation theatre was 85.4 per cent. The structural separation of elective and emergency surgery, the use of dedicated daytime operating theatres and the implementation of a universal classification of emergency operations reduced night-time surgery, improved the efficiency of operating theatre utilization during daytime, shortened preoperative delay in patients requiring urgent surgery, and enabled monitoring and corrective actions for providing emergency surgery services. © 2013 BJS Society Ltd. Published by John Wiley & Sons Ltd.
Constructing LDPC Codes from Loop-Free Encoding Modules
NASA Technical Reports Server (NTRS)
Divsalar, Dariush; Dolinar, Samuel; Jones, Christopher; Thorpe, Jeremy; Andrews, Kenneth
2009-01-01
A method of constructing certain low-density parity-check (LDPC) codes by use of relatively simple loop-free coding modules has been developed. The subclasses of LDPC codes to which the method applies include accumulate-repeat-accumulate (ARA) codes, accumulate-repeat-check-accumulate codes, and the codes described in Accumulate-Repeat-Accumulate-Accumulate Codes (NPO-41305), NASA Tech Briefs, Vol. 31, No. 9 (September 2007), page 90. All of the affected codes can be characterized as serial/parallel (hybrid) concatenations of such relatively simple modules as accumulators, repetition codes, differentiators, and punctured single-parity check codes. These are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels. These codes can also be characterized as hybrid turbolike codes that have projected graph or protograph representations (for example see figure); these characteristics make it possible to design high-speed iterative decoders that utilize belief-propagation algorithms. The present method comprises two related submethods for constructing LDPC codes from simple loop-free modules with circulant permutations. The first submethod is an iterative encoding method based on the erasure-decoding algorithm. The computations required by this method are well organized because they involve a parity-check matrix having a block-circulant structure. The second submethod involves the use of block-circulant generator matrices. The encoders of this method are very similar to those of recursive convolutional codes. Some encoders according to this second submethod have been implemented in a small field-programmable gate array that operates at a speed of 100 megasymbols per second. By use of density evolution (a computational-simulation technique for analyzing the performance of LDPC codes), it has been shown through some examples that as the block size goes to infinity, low iterative decoding thresholds close to channel capacity limits can be achieved for codes of the type in question having low maximum variable node degrees. The decoding thresholds in these examples are lower than those of the best-known unstructured irregular LDPC codes constrained to have the same maximum node degrees. Furthermore, the present method enables the construction of codes of any desired rate with thresholds that stay uniformly close to their respective channel capacity thresholds.
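The block-circulant structure mentioned above can be illustrated by lifting a small protograph base matrix with circulant permutation matrices; the base matrix and shift values below are hypothetical and are not one of the protographs from the article.

import numpy as np

def circulant_permutation(size, shift):
    # size x size identity matrix with its columns cyclically shifted.
    return np.roll(np.eye(size, dtype=int), shift, axis=1)

def lift_protograph(base, size):
    # Replace each nonnegative entry of the base (protograph) matrix with a
    # circulant permutation of the given shift, and each -1 with a zero block.
    rows = []
    for brow in base:
        blocks = [np.zeros((size, size), dtype=int) if s < 0
                  else circulant_permutation(size, s) for s in brow]
        rows.append(np.hstack(blocks))
    return np.vstack(rows)

# Hypothetical 2x4 base matrix; shifts chosen only for illustration.
base = np.array([[0, 1, 2, -1],
                 [3, -1, 0, 1]])
H = lift_protograph(base, size=5)
print(H.shape)   # (10, 20) block-circulant parity-check matrix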
Biodegradation of paint stripper solvents in a modified gas lift loop bioreactor.
Vanderberg-Twary, L; Steenhoudt, K; Travis, B J; Hanners, J L; Foreman, T M; Brainard, J R
1997-07-05
Paint stripping wastes generated during the decontamination and decommissioning of former nuclear facilities contain paint stripping organics (dichloromethane, 2-propanol, and methanol) and bulk materials containing paint pigments. It is desirable to degrade the organic residues as part of an integrated chemical-biological treatment system. We have developed a modified gas lift loop bioreactor employing a defined consortium of Rhodococcus rhodochrous strain OFS and Hyphomicrobium sp. DM-2 that degrades paint stripper organics. Mass transfer coefficients and kinetic constants for biodegradation in the system were determined. It was found that transfer of organic substrates from surrogate waste into the air and further into the liquid medium in the bioreactor were rapid processes, occurring within minutes. Monod kinetics was employed to model the biodegradation of paint stripping organics. Analysis of the bioreactor process was accomplished with BIOLAB, a mathematical code that simulates coupled mass transfer and biodegradation processes. This code was used to fit experimental data to Monod kinetics and to determine kinetic parameters. The BIOLAB code was also employed to compare activities in the bioreactor of individual microbial cultures to the activities of combined cultures in the bioreactor. This code is of benefit for further optimization and scale-up of the bioreactor for treatment of paint stripping and other volatile organic wastes in bulk materials.
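For reference, Monod kinetics of the kind fitted with BIOLAB couples substrate depletion to biomass growth; a minimal explicit-Euler sketch, with made-up parameter values rather than the constants determined in this study, is:

import numpy as np

def monod_step(S, X, dt, mu_max, Ks, Y):
    # One explicit-Euler step of Monod growth:
    #   dX/dt = mu_max * S/(Ks + S) * X
    #   dS/dt = -(1/Y) * mu_max * S/(Ks + S) * X
    mu = mu_max * S / (Ks + S)
    return S - dt * mu * X / Y, X + dt * mu * X

# Illustrative parameters only (per-hour rate, mg/L concentrations); not the
# fitted values from the BIOLAB analysis described above.
S, X = 100.0, 5.0
for _ in range(2000):
    S, X = monod_step(S, X, dt=0.01, mu_max=0.3, Ks=10.0, Y=0.5)
print(round(S, 2), round(X, 2))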
Selected DOE headquarters publications
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1979-04-01
This publication provides listings of (mainly policy and programmatic) publications which have been issued by headquarters organizations of the Department of Energy; assigned a DOE/XXX- type report number code, where XXX is the 1- to 4-letter code for the issuing headquarters organization; received by the Energy Library; and made available to the public.
Mu, Chuang; Wang, Ruijia; Li, Tianqi; Li, Yuqiang; Tian, Meilin; Jiao, Wenqian; Huang, Xiaoting; Zhang, Lingling; Hu, Xiaoli; Wang, Shi; Bao, Zhenmin
2016-08-01
Long non-coding RNA (lncRNA) structurally resembles mRNA but cannot be translated into protein. Although the systematic identification and characterization of lncRNAs have been increasingly reported in model species, information concerning non-model species is still lacking. Here, we report the first systematic identification and characterization of lncRNAs in two sea cucumber species: (1) Apostichopus japonicus during lipopolysaccharide (LPS) challenge and in heathy tissues and (2) Holothuria glaberrima during radial organ complex regeneration, using RNA-seq datasets and bioinformatics analysis. We identified A. japonicus and H. glaberrima lncRNAs that were differentially expressed during LPS challenge and radial organ complex regeneration, respectively. Notably, the predicted lncRNA-microRNA-gene trinities revealed that, in addition to targeting protein-coding transcripts, miRNAs might also target lncRNAs, thereby participating in a potential novel layer of regulatory interactions among non-coding RNA classes in echinoderms. Furthermore, the constructed coding-non-coding network implied the potential involvement of lncRNA-gene interactions during the regulation of several important genes (e.g., Toll-like receptor 1 [TLR1] and transglutaminase-1 [TGM1]) in response to LPS challenge and radial organ complex regeneration in sea cucumbers. Overall, this pioneer systematic identification, annotation, and characterization of lncRNAs in echinoderm pave the way for similar studies and future genetic, genomic, and evolutionary research in non-model species.
PoMiN: A Post-Minkowskian N-body Solver
NASA Astrophysics Data System (ADS)
Feng, Justin; Baumann, Mark; Hall, Bryton; Doss, Joel; Spencer, Lucas; Matzner, Richard
2018-06-01
In this paper, we introduce PoMiN, a lightweight N-body code based on the post-Minkowskian N-body Hamiltonian of Ledvinka et al., which includes general relativistic effects up to first order in Newton’s constant G, and all orders in the speed of light c. PoMiN is written in C and uses a fourth-order Runge–Kutta integration scheme. PoMiN has also been written to handle an arbitrary number of particles (both massive and massless), with a computational complexity that scales as O(N 2). We describe the methods we used to simplify and organize the Hamiltonian, and the tests we performed (convergence, conservation, and analytical comparison tests) to validate the code.
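The integration scheme can be illustrated with a classical fourth-order Runge-Kutta step applied to an O(N^2) pairwise interaction; the sketch below substitutes the Newtonian force for the post-Minkowskian Hamiltonian of Ledvinka et al. (which is not reproduced here), and the two-body initial data are arbitrary.

import numpy as np

def newtonian_accels(q, m, G=1.0):
    # Pairwise Newtonian accelerations; an O(N^2) pair loop, standing in for
    # the derivatives of the post-Minkowskian Hamiltonian that PoMiN uses.
    n = len(m)
    a = np.zeros_like(q)
    for i in range(n):
        for j in range(n):
            if i != j:
                d = q[j] - q[i]
                a[i] += G * m[j] * d / np.linalg.norm(d) ** 3
    return a

def rk4_step(q, p, m, dt):
    # Classical fourth-order Runge-Kutta step for dq/dt = p/m, dp/dt = m*a(q).
    def deriv(q, p):
        return p / m[:, None], m[:, None] * newtonian_accels(q, m)
    k1q, k1p = deriv(q, p)
    k2q, k2p = deriv(q + 0.5 * dt * k1q, p + 0.5 * dt * k1p)
    k3q, k3p = deriv(q + 0.5 * dt * k2q, p + 0.5 * dt * k2p)
    k4q, k4p = deriv(q + dt * k3q, p + dt * k3p)
    q_new = q + dt / 6 * (k1q + 2 * k2q + 2 * k3q + k4q)
    p_new = p + dt / 6 * (k1p + 2 * k2p + 2 * k3p + k4p)
    return q_new, p_new

# Two equal masses in a bound orbit (G = 1 units).
m = np.array([1.0, 1.0])
q = np.array([[-0.5, 0.0, 0.0], [0.5, 0.0, 0.0]])
p = np.array([[0.0, -0.5, 0.0], [0.0, 0.5, 0.0]])
for _ in range(1000):
    q, p = rk4_step(q, p, m, dt=0.01)
print(q)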
Mille, Matthew M; Jung, Jae Won; Lee, Choonik; Kuzmin, Gleb A; Lee, Choonsik
2018-06-01
Radiation dosimetry is an essential input for epidemiological studies of radiotherapy patients aimed at quantifying the dose-response relationship of late-term morbidity and mortality. Individualised organ dose must be estimated for all tissues of interest located in-field, near-field, or out-of-field. Whereas conventional measurement approaches are limited to points in water or anthropomorphic phantoms, computational approaches using patient images or human phantoms offer greater flexibility and can provide more detailed three-dimensional dose information. In the current study, we systematically compared four different dose calculation algorithms so that dosimetrists and epidemiologists can better understand the advantages and limitations of the various approaches at their disposal. The four dose calculation algorithms considered were as follows: the (1) Analytical Anisotropic Algorithm (AAA) and (2) Acuros XB algorithm (Acuros XB), as implemented in the Eclipse treatment planning system (TPS); (3) a Monte Carlo radiation transport code, EGSnrc; and (4) an accelerated Monte Carlo code, the x-ray Voxel Monte Carlo (XVMC). The four algorithms were compared in terms of their accuracy and appropriateness in the context of dose reconstruction for epidemiological investigations. Accuracy in peripheral dose was evaluated first by benchmarking the calculated dose profiles against measurements in a homogeneous water phantom. Additional simulations in a heterogeneous cylinder phantom evaluated the performance of the algorithms in the presence of tissue heterogeneity. In general, we found that the algorithms contained within the commercial TPS (AAA and Acuros XB) were fast and accurate in-field or near-field, but not acceptable out-of-field. Therefore, the TPS is best suited for epidemiological studies involving large cohorts and where the organs of interest are located in-field or partially in-field. The EGSnrc and XVMC codes showed excellent agreement with measurements both in-field and out-of-field. The EGSnrc code was the most accurate dosimetry approach, but was too slow to be used for large-scale epidemiological cohorts. The XVMC code showed similar accuracy to EGSnrc, but was significantly faster, and thus epidemiological applications seem feasible, especially when the organs of interest reside far away from the field edge.
Extensible Computational Chemistry Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-08-09
ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of researchers being able to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis including creating publication quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.
Organic brain syndrome treated with oxiracetam. A double-blind randomized controlled trial.
Hjorther, A; Browne, E; Jakobsen, K; Viskum, P; Gyntelberg, F
1987-04-01
In a 12-week double-blind study, oxiracetam (CGP 21690 E), a new nootropic drug, at a dose of 2.4 mg per day, was compared to placebo in the treatment of 106 middle-aged patients suffering from mild to moderate organic brain syndrome due to prolonged exposure to organic solvents. At the beginning of the study and after 12 weeks of treatment, the patients underwent a battery of neuropsychological tests to determine their mental and memory functioning. A symptom questionnaire consisting of 21 items was rated pre-treatment, and improvement or worsening of any of the symptoms was recorded monthly. At the end of the study a global evaluation was performed by the patients themselves, their relatives, the psychologist and the doctor. The code was not broken until the final writing of this paper. No statistically significant differences were observed between the treatment groups in any of the above-mentioned evaluations; nor were any differences in neuropsychological test performance observed. Thus, oxiracetam seems to have no effect in the treatment of organic brain syndrome.
Case reviews of infections of the spine in patients with a history of solid organ transplantation.
Falakassa, Jonathan; Hirsch, Brandon P; Norton, Robert P; Mendez-Zfass, Matthew; Eismont, Frank J
2014-09-01
Retrospective clinical case series. To report on the epidemiological, microbiological, and clinical characteristics of spinal infections in patients who have undergone solid organ transplantation. Spine infections remain a therapeutic challenge, particularly in patients who are immunocompromised. Solid organ transplant patients represent a growing population of immunocompromised hosts. To our knowledge, no previous reports have examined the clinical characteristics spinal infections in this at-risk population in a systematic fashion. The records of patients with a history of solid organ transplantation from January 2007 through December 2012 were identified using Current Procedural Terminology procedure codes. Patients with spine infections who have received transplants were then identified using International Classification of Diseases, Ninth Revision codes for spine infection. In addition to demographic data, we recorded medical comorbidities, immunosuppressant medications, laboratory results, culture data, treatment received, and short-term results. During this 6-year period, 2764 solid organ transplants were performed at our institution. Of this cohort, 6 patients (0.22%) were treated for a spinal infection. Patient's age ranged from 51 to 80 years (mean, 63 yr). All spine infections occurred within 1 year after organ transplantation. All patients had an elevated erythrocyte sedimentation rate. Only 1 patient had an elevated white blood cell count. The most common organisms were Escherichia coli and Staphylococcus. Four patients required surgical treatment. All patients had complete resolution of symptoms. Our data suggest that patients with a history of solid organ transplantation may be more susceptible to developing spine infections than the general population. The most common organisms in our cohort were E. coli and Staphylococcus. Spine infections caused by atypical organisms do also occur in the organ transplant population, as is the case in other immunocompromised patients. The identification of these organisms and timely institution of treatment remains critical in the management of this at-risk population. 4.
NASA Technical Reports Server (NTRS)
Rajpal, Sandeep; Rhee, DoJun; Lin, Shu
1997-01-01
In this paper, we will use a previously proposed construction technique to construct multidimensional trellis coded modulation (TCM) codes for both the additive white Gaussian noise (AWGN) and the fading channels. Analytical performance bounds and simulation results show that these codes perform very well and achieve significant coding gains over uncoded reference modulation systems. In addition, the proposed technique can be used to construct codes which have a performance/decoding complexity advantage over the codes listed in the literature.
The Molecular and Cellular Basis of Taste Coding in the Legs of Drosophila
Ling, Frederick; Dahanukar, Anupama; Weiss, Linnea A.; Kwon, Jae Young
2014-01-01
To understand the principles of taste coding, it is necessary to understand the functional organization of the taste organs. Although the labellum of the Drosophila melanogaster head has been described in detail, the tarsal segments of the legs, which collectively contain more taste sensilla than the labellum, have received much less attention. We performed a systematic anatomical, physiological, and molecular analysis of the tarsal sensilla of Drosophila. We construct an anatomical map of all five tarsal segments of each female leg. The taste sensilla of the female foreleg are systematically tested with a panel of 40 diverse compounds, yielding a response matrix of ∼500 sensillum–tastant combinations. Six types of sensilla are characterized. One type was tuned remarkably broadly: it responded to 19 of 27 bitter compounds tested, as well as sugars; another type responded to neither. The midleg is similar to, but distinct from, the foreleg. The response specificities of the tarsal sensilla differ from those of the labellum, as do n-dimensional taste spaces constructed for each organ, enhancing the capacity of the fly to encode and respond to gustatory information. We examined the expression patterns of all 68 gustatory receptors (Grs). A total of 28 Gr–GAL4 drivers are expressed in the legs. We constructed a receptor-to-sensillum map of the legs and a receptor-to-neuron map. Fourteen Gr–GAL4 drivers are expressed uniquely in the bitter-sensing neuron of the sensillum that is tuned exceptionally broadly. Integration of the molecular and physiological maps provides insight into the underlying basis of taste coding. PMID:24849350
Practical moral codes in the transgenic organism debate.
Cooley, D R; Goreham, Gary; Youngs, George A
2004-01-01
In one study funded by the United States Department of Agriculture, people from North Dakota were interviewed to discover which moral principles they use in evaluating the morality of transgenic organisms and their introduction into markets. It was found that although the moral codes the human subjects employed were very similar, their views on transgenics were vastly different. In this paper, the codes that were used by the respondents are developed, compared to that of the academically composed Belmont Report, and then modified to create the more practical Common Moral Code. At the end, it is shown that the Common Moral Code has inherent inconsistency flaws that might be resolvable, but would require extensive work on the definition of terms and principles. However, the effort is worthwhile, especially if it results in a common moral code that all those involved in the debate are willing to use in negotiating a resolution to their differences.
On the evolution of primitive genetic codes.
Weberndorfer, Günter; Hofacker, Ivo L; Stadler, Peter F
2003-10-01
The primordial genetic code was probably a drastically simplified ancestor of the canonical code that is used by contemporary cells. In order to understand how the present-day code came about, we first need to explain how the language of the building plan can change without destroying the encoded information. In this work we introduce a minimal organism model that is based on biophysically reasonable descriptions of RNA and protein, namely secondary structure folding and knowledge-based potentials. The evolution of a population of such organisms under competition for a common resource is simulated explicitly at the level of individual replication events. Starting with very simple codes, and hence greatly reduced amino acid alphabets, we observe a diversification of the codes in most simulation runs. The driving force behind this effect is the possibility of producing fitter proteins when the repertoire of amino acids is enlarged.
Decoding of DBEC-TBED Reed-Solomon codes. [Double-Byte-Error-Correcting, Triple-Byte-Error-Detecting
NASA Technical Reports Server (NTRS)
Deng, Robert H.; Costello, Daniel J., Jr.
1987-01-01
A problem in designing semiconductor memories is to provide some measure of error control without requiring excessive coding overhead or decoding time. In LSI and VLSI technology, memories are often organized on a multiple bit (or byte) per chip basis. For example, some 256 K bit DRAM's are organized in 32 K x 8 bit-bytes. Byte-oriented codes such as Reed-Solomon (RS) codes can provide efficient low overhead error control for such memories. However, the standard iterative algorithm for decoding RS codes is too slow for these applications. The paper presents a special decoding technique for double-byte-error-correcting, triple-byte-error-detecting RS codes which is capable of high-speed operation. This technique is designed to find the error locations and the error values directly from the syndrome without having to use the iterative algorithm to find the error locator polynomial.
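For readers unfamiliar with syndrome-based decoding, the sketch below illustrates only the first step such a direct decoder relies on: computing Reed-Solomon syndromes over GF(2^8) from a received byte vector. It is a minimal illustration under stated assumptions (the field polynomial 0x11d, helper names, and toy parameters are mine), not the decoder or code parameters from the paper.

```python
# Illustrative sketch (not the paper's decoder): Reed-Solomon syndrome
# computation over GF(2^8). The primitive polynomial 0x11d and the toy
# parameters are assumptions for the example only.

# Build GF(256) exp/log tables for x^8 + x^4 + x^3 + x^2 + 1 (0x11d).
GF_EXP = [0] * 512
GF_LOG = [0] * 256
x = 1
for i in range(255):
    GF_EXP[i] = x
    GF_LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= 0x11d
for i in range(255, 512):
    GF_EXP[i] = GF_EXP[i - 255]

def gf_mul(a, b):
    """Multiply two GF(256) elements via the log/antilog tables."""
    if a == 0 or b == 0:
        return 0
    return GF_EXP[GF_LOG[a] + GF_LOG[b]]

def poly_eval(poly, x):
    """Evaluate a polynomial over GF(256) by Horner's rule; poly[0] is the leading coefficient."""
    y = 0
    for coeff in poly:
        y = gf_mul(y, x) ^ coeff
    return y

def syndromes(received, nsym):
    """Evaluate the received word at alpha^1 .. alpha^nsym; all zeros means no detectable error."""
    return [poly_eval(received, GF_EXP[j]) for j in range(1, nsym + 1)]

# An undisturbed (here, all-zero) word yields all-zero syndromes. A direct decoder
# of the kind described above maps nonzero syndromes straight to error locations
# and values instead of running the iterative (Berlekamp-Massey) algorithm.
assert all(s == 0 for s in syndromes([0] * 255, 32))
```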
1988-07-29
Volume II: ACF/ACCC Terminal and En Route Controllers (CHG 1), DOT/FAA, 6 July 1987.
Vladimirov, N V; Likhoshvaĭ, V A; Matushkin, Iu G
2007-01-01
Gene expression is known to correlate with degree of codon bias in many unicellular organisms. However, such correlation is absent in some organisms. Recently we demonstrated that inverted complementary repeats within a coding DNA sequence must be considered for proper estimation of translation efficiency, since they may form secondary structures that obstruct ribosome movement. We have developed a program for estimation of potential coding DNA sequence expression in a defined unicellular organism using its genome sequence. The program computes an elongation efficiency index. Computation is based on estimation of coding DNA sequence elongation efficiency, taking into account three key factors: codon bias, average number of inverted complementary repeats, and free energy of potential stem-loop structures formed by the repeats. The influence of these factors on translation is numerically estimated. An optimal proportion of these factors is computed for each organism individually. Quantitative translational characteristics of 384 unicellular organisms (351 bacteria, 28 archaea, 5 eukaryota) have been computed using their annotated genomes from NCBI GenBank. Five potential evolutionary strategies of translational optimization have been determined among the studied organisms. A considerable difference in preferred translational strategies between Bacteria and Archaea has been revealed. Significant correlations between the elongation efficiency index and gene expression levels have been shown for two organisms (S. cerevisiae and H. pylori) using available microarray data. The proposed method allows numerical estimation of coding DNA sequence translation efficiency and optimization of the nucleotide composition of heterologous genes in unicellular organisms. http://www.mgs.bionet.nsc.ru/mgs/programs/eei-calculator/.
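The abstract names the three ingredients of the index but not the published formula, so the following is only a minimal sketch assuming a simple weighted combination of those ingredients; the helper functions, k-mer length, GC-based stem proxy, and weights are hypothetical.

```python
# Minimal sketch (not the authors' published EEI formula): combine codon bias,
# inverted complementary repeats, and a crude stem-stability proxy into one score.
# The weights w1..w3, k-mer length, and preferred-codon set are illustrative assumptions.
from collections import Counter

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def codon_bias(cds, preferred):
    """Fraction of codons belonging to a (hypothetical) preferred-codon set."""
    codons = [cds[i:i + 3] for i in range(0, len(cds) - 2, 3)]
    return sum(c in preferred for c in codons) / max(len(codons), 1)

def inverted_repeats(cds, k=6):
    """Count k-mers whose reverse complement also occurs in the sequence."""
    kmers = Counter(cds[i:i + k] for i in range(len(cds) - k + 1))
    hits = 0
    for kmer, n in kmers.items():
        if kmer.translate(COMPLEMENT)[::-1] in kmers:
            hits += n
    return hits / max(len(cds) - k + 1, 1)

def stem_stability(cds):
    """Very rough proxy: GC fraction, standing in for stem-loop free energy."""
    return sum(b in "GC" for b in cds) / max(len(cds), 1)

def elongation_score(cds, preferred, w1=1.0, w2=1.0, w3=1.0):
    # Higher codon bias helps elongation; repeats and stable stems hinder it.
    return w1 * codon_bias(cds, preferred) - w2 * inverted_repeats(cds) - w3 * stem_stability(cds)

# Toy usage with a made-up preferred-codon set:
print(elongation_score("ATGGCTGCCAAAGGCAGCCATTAA", {"GCT", "GCC", "AAA"}))
```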
The adaption and use of research codes for performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liebetrau, A.M.
1987-05-01
Models of real-world phenomena are developed for many reasons. The models are usually, if not always, implemented in the form of a computer code. The characteristics of a code are determined largely by its intended use. Realizations or implementations of detailed mathematical models of complex physical and/or chemical processes are often referred to as research or scientific (RS) codes. Research codes typically require large amounts of computing time. One example of an RS code is a finite-element code for solving complex systems of differential equations that describe mass transfer through some geologic medium. Considerable computing time is required because computations are done at many points in time and/or space. Codes used to evaluate the overall performance of real-world physical systems are called performance assessment (PA) codes. Performance assessment codes are used to conduct simulated experiments involving systems that cannot be directly observed. Thus, PA codes usually involve repeated simulations of system performance in situations that preclude the use of conventional experimental and statistical methods. 3 figs.
A proven approach for more effective software development and maintenance
NASA Technical Reports Server (NTRS)
Pajerski, Rose; Hall, Dana; Sinclair, Craig
1994-01-01
Modern space flight mission operations and associated ground data systems are increasingly dependent upon reliable, quality software. Critical functions such as command load preparation, health and status monitoring, communications link scheduling and conflict resolution, and transparent gateway protocol conversion are routinely performed by software. Given budget constraints and the ever increasing capabilities of processor technology, the next generation of control centers and data systems will be even more dependent upon software across all aspects of performance. A key challenge now is to implement improved engineering, management, and assurance processes for the development and maintenance of that software; processes that cost less, yield higher quality products, and that self-correct for continual improvement evolution. The NASA Goddard Space Flight Center has a unique experience base that can be readily tapped to help solve the software challenge. Over the past eighteen years, the Software Engineering Laboratory within the code 500 Flight Dynamics Division has evolved a software development and maintenance methodology that accommodates the unique characteristics of an organization while optimizing and continually improving the organization's software capabilities. This methodology relies upon measurement, analysis, and feedback much analogous to that of control loop systems. It is an approach with a time-tested track record proven through repeated applications across a broad range of operational software development and maintenance projects. This paper describes the software improvement methodology employed by the Software Engineering Laboratory, and how it has been exploited within the Flight Dynamics Division with GSFC Code 500. Examples of specific improvement in the software itself and its processes are presented to illustrate the effectiveness of the methodology. Finally, the initial findings are given when this methodology was applied across the mission operations and ground data systems software domains throughout Code 500.
Exploratory Experimental Investigation of a Wave Propeller
1992-03-01
Calderwood, Michael S.; Kleinman, Ken; Murphy, Michael V.; Platt, Richard; Huang, Susan S.
2014-01-01
Background: Deep and organ/space surgical site infections (D/OS SSI) cause significant morbidity, mortality, and costs. Rates are publicly reported and increasingly used as quality metrics affecting hospital payment. Lack of standardized surveillance methods threatens the accuracy of reported data and decreases confidence in comparisons based upon these data. Methods: We analyzed data from national validation studies that used Medicare claims to trigger chart review for SSI confirmation after coronary artery bypass graft surgery (CABG) and hip arthroplasty. We evaluated code performance (sensitivity and positive predictive value) to select diagnosis codes that best identified D/OS SSI. Codes were analyzed individually and in combination. Results: Analysis included 143 patients with D/OS SSI after CABG and 175 patients with D/OS SSI after hip arthroplasty. For CABG, 9 International Classification of Diseases, 9th Revision (ICD-9) diagnosis codes identified 92% of D/OS SSI, with 1 D/OS SSI identified for every 4 cases with a diagnosis code. For hip arthroplasty, 6 ICD-9 diagnosis codes identified 99% of D/OS SSI, with 1 D/OS SSI identified for every 2 cases with a diagnosis code. Conclusions: This standardized and efficient approach for identifying D/OS SSI can be used by hospitals to improve case detection and public reporting. This method can also be used to identify potential D/OS SSI cases for review during hospital audits for data validation. PMID:25734174
Development of a Communication System Compatible with Chemical Protective Clothing and Equipment.
1986-06-23
Operation manuals for the engineering prototypes of the transceiver discuss the Chemical Protective Clothing Communication System. The U.S. Coast Guard and NASA joined in a project to develop a communications system to operate inside protective suits used in chemical spills.
Modeling and Prediction of Corrosion-Fatigue Failures in AF1410 Steel Test Specimens
2009-01-12
Sponsored by the Office of Naval Research; performed by the Structures Division (Code 4.3.3) and the University of Dayton Research Institute. To address these issues, NAVAIR has initiated a multiyear research program to investigate and quantify the fatigue life reduction due to ...
Thermally Optimized Paradigm of Thermal Management (TOP-M)
2017-07-18
Final technical report, July 2015 - July 2017 (NICOP). The main goal of this research was to present a new thermal management approach, which combines thermally aware Very/Ultra Large Scale Integration ...
Bibliography of Joint Aircraft Survivability Reports and Related Documents
1994-07-01
Topics covered in the reports include: synthetic and preparative procedures for new materials developed; a new concept of fire control by dry chemical agents; and information for users on the implementation of the MJU-7A/B, MJU-8A/B, and MJU-10 (author: James T. Sweeten, Jr.; performing organization: ARC Professional Services Group, Information Systems Division).
Sensible Heat Flux Related to Variations in Atmospheric Turbulence Kinetic Energy on a Sandy Beach
2017-06-01
Thesis by Jessica S. Koscinski, Naval Postgraduate School, June 2017. Subject terms: sensible heat flux, turbulence kinetic energy, surf zone.
Autoregressive Methods for Spectral Estimation from Interferograms.
1986-09-19
Scientific Report No. 17, prepared under subcontract to the Center for Space Engineering, Utah State University, Logan, UT 84322-4140; monitoring organization: Air Force Geophysics Laboratory.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-25
... the International Maritime Organization's Development of a Mandatory Code for Ships Operating in Polar... States Coast Guard will hold a public workshop in Washington, DC on topics related to the development of... Polar Code). Various safety topics will be discussed including design, equipment, and operational...
77 FR 6005 - Application for Recognition as a 501(c)(29) Organization
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-07
... Application for Recognition as a 501(c)(29) Organization AGENCY: Internal Revenue Service (IRS), Treasury...: For date of applicability, see Sec. 1.501(c)(29)-1T(c). FOR FURTHER INFORMATION CONTACT: Amy Franklin...: Background Section 501(c)(29) of the Internal Revenue Code (Code) provides requirements for tax exemption...
77 FR 6027 - Application for Recognition as a 501(c)(29) Organization
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-07
... Application for Recognition as a 501(c)(29) Organization AGENCY: Internal Revenue Service (IRS), Treasury...) relating to section 501(c)(29) of the Internal Revenue Code (Code). The temporary regulations provide that... health insurance issuer (within the meaning of section 1322(c) of the Patient Protection and Affordable...
Error control for reliable digital data transmission and storage systems
NASA Technical Reports Server (NTRS)
Costello, D. J., Jr.; Deng, R. H.
1985-01-01
A problem in designing semiconductor memories is to provide some measure of error control without requiring excessive coding overhead or decoding time. In LSI and VLSI technology, memories are often organized on a multiple bit (or byte) per chip basis. For example, some 256K-bit DRAM's are organized in 32Kx8 bit-bytes. Byte-oriented codes such as Reed-Solomon (RS) codes can provide efficient low overhead error control for such memories. However, the standard iterative algorithm for decoding RS codes is too slow for these applications. In this paper we present some special decoding techniques for extended single- and double-byte-error-correcting RS codes which are capable of high speed operation. These techniques are designed to find the error locations and the error values directly from the syndrome without having to use the iterative algorithm to find the error locator polynomial. Two codes are considered: (1) a d sub min = 4 single-byte-error-correcting (SBEC), double-byte-error-detecting (DBED) RS code; and (2) a d sub min = 6 double-byte-error-correcting (DBEC), triple-byte-error-detecting (TBED) RS code.
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.
2010-01-01
The space radiation environment, particularly solar particle events (SPEs), poses the risk of acute radiation sickness (ARS) to humans; and organ doses from SPE exposure may reach critical levels during extra vehicular activities (EVAs) or within lightly shielded spacecraft. NASA has developed an organ dose projection model using the BRYNTRN with SUMDOSE computer codes, and a probabilistic model of Acute Radiation Risk (ARR). The codes BRYNTRN and SUMDOSE, written in FORTRAN, are a Baryon transport code and an output data processing code, respectively. The ARR code is written in C. The risk projection models of organ doses and ARR take the output from BRYNTRN as an input to their calculations. BRYNTRN code operation requires extensive input preparation. With a graphical user interface (GUI) to handle input and output for BRYNTRN, the response models can be connected easily and correctly to BRYNTRN in a user-friendly way. A GUI for the Acute Radiation Risk and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of input and output manipulations, which are required for operations of the ARRBOD modules: BRYNTRN, SUMDOSE, and the ARR probabilistic response model. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations in the mission operations directorate (MOD), and space biophysics researchers. The ARRBOD GUI will serve as a proof-of-concept example for future integration of other human space applications risk projection models. The current version of the ARRBOD GUI is a new self-contained product and will have follow-on versions, as options are added: 1) human geometries of MAX/FAX in addition to CAM/CAF; 2) shielding distributions for spacecraft, Mars surface and atmosphere; 3) various space environmental and biophysical models; and 4) other response models to be connected to the BRYNTRN. The major components of the overall system, the subsystem interconnections, and external interfaces are described in this report; and the ARRBOD GUI product is explained step by step in order to serve as a tutorial.
Lowet, Eric; Roberts, Mark; Hadjipapas, Avgis; Peter, Alina; van der Eerden, Jan; De Weerd, Peter
2015-02-01
Fine-scale temporal organization of cortical activity in the gamma range (∼25-80Hz) may play a significant role in information processing, for example by neural grouping ('binding') and phase coding. Recent experimental studies have shown that the precise frequency of gamma oscillations varies with input drive (e.g. visual contrast) and that it can differ among nearby cortical locations. This has challenged theories assuming widespread gamma synchronization at a fixed common frequency. In the present study, we investigated which principles govern gamma synchronization in the presence of input-dependent frequency modulations and whether they are detrimental for meaningful input-dependent gamma-mediated temporal organization. To this aim, we constructed a biophysically realistic excitatory-inhibitory network able to express different oscillation frequencies at nearby spatial locations. Similarly to cortical networks, the model was topographically organized with spatially local connectivity and spatially-varying input drive. We analyzed gamma synchronization with respect to phase-locking, phase-relations and frequency differences, and quantified the stimulus-related information represented by gamma phase and frequency. By stepwise simplification of our models, we found that the gamma-mediated temporal organization could be reduced to basic synchronization principles of weakly coupled oscillators, where input drive determines the intrinsic (natural) frequency of oscillators. The gamma phase-locking, the precise phase relation and the emergent (measurable) frequencies were determined by two principal factors: the detuning (intrinsic frequency difference, i.e. local input difference) and the coupling strength. In addition to frequency coding, gamma phase contained complementary stimulus information. Crucially, the phase code reflected input differences, but not the absolute input level. This property of relative input-to-phase conversion, contrasting with latency codes or slower oscillation phase codes, may resolve conflicting experimental observations on gamma phase coding. Our modeling results offer clear testable experimental predictions. We conclude that input-dependency of gamma frequencies could be essential rather than detrimental for meaningful gamma-mediated temporal organization of cortical activity.
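The detuning-versus-coupling principle invoked here can be illustrated with a minimal two-oscillator sketch. This is not the authors' biophysically realistic excitatory-inhibitory network; it is a Kuramoto-style toy model, and all parameter values are assumptions chosen only to show locking versus drift.

```python
# Minimal sketch of the weakly-coupled-oscillator principle (not the authors'
# E-I network model): two phase oscillators with different intrinsic frequencies
# ("detuning") phase-lock when the coupling is strong enough relative to the
# detuning, and drift otherwise. Parameter values are illustrative only.
import numpy as np

def simulate(detuning_hz, coupling_hz, f0=45.0, t_max=2.0, dt=1e-4):
    """Return the wrapped phase-difference trajectory of two coupled oscillators."""
    w1 = 2 * np.pi * (f0 + detuning_hz / 2)   # intrinsic (natural) frequencies, rad/s
    w2 = 2 * np.pi * (f0 - detuning_hz / 2)
    k = 2 * np.pi * coupling_hz               # coupling strength, rad/s
    phi1, phi2 = 0.0, np.pi / 2
    diffs = []
    for _ in range(int(t_max / dt)):
        d1 = w1 + k * np.sin(phi2 - phi1)     # simple Euler integration
        d2 = w2 + k * np.sin(phi1 - phi2)
        phi1 += d1 * dt
        phi2 += d2 * dt
        diffs.append((phi1 - phi2 + np.pi) % (2 * np.pi) - np.pi)
    return np.array(diffs)

locked = simulate(detuning_hz=3.0, coupling_hz=10.0)     # small detuning: settles to a fixed phase lag
drifting = simulate(detuning_hz=30.0, coupling_hz=10.0)  # large detuning: phase difference keeps cycling
print("locked phase lag (rad):", round(float(locked[-1]), 3))
print("drifting phase spread (rad):", round(float(np.ptp(drifting)), 3))
```

In this toy setting the settled phase lag grows with the detuning, which is the sense in which a phase code can carry information about input differences rather than absolute input level.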
Jones, Dean P; Sies, Helmut
2015-09-20
The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O₂ and H₂O₂ contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine.
DOT National Transportation Integrated Search
2001-02-01
Problems, solutions and recommendations for implementation have been contributed by 16 of the 27 CODES states and organized as appropriate under the administrative, linkage and application requirements for a Crash Outcome Data Evaluation System (CODE...
Jäckel, D; Schlothauer, N I; Zeeb, H; Wagner, G; Sachse, M M
2018-04-12
Organ transplant recipients have an up to 250-fold higher risk of developing skin cancer. This article evaluated the utilisation of skin cancer screening and the treatment costs for skin cancer in organ transplant recipients. Patients insured with AOK Bremen/Bremerhaven were identified, and the need for skin cancer prevention training was derived. The number of organ transplant recipients (ICD code Z94.0-4) with and without any history of skin cancer (ICD code C43/C44), the utilisation of dermatologic health care services, and the costs for treatments with the diagnosis Z94.0-4 with and without C43/C44 were evaluated. The analyses were carried out for the period 2009-2014 by using the accounting systems of the AOK. Between 2009 and 2014, 231 organ transplant recipients were recorded. By mid-2014, 20% of these insured persons developed skin cancer and the mean incidence was 2.76% per year. On average, 43% of these patients were seen by a dermatologist at least once a year, whereby only 15% of the organ transplant recipients participated in the annual skin cancer screening. In 29% of the patients without any history of skin cancer, a skin examination was never performed by a dermatologist or a general practitioner. In all, 17 inpatient cases of organ transplant recipients with the primary diagnosis C43/C44 were analyzed. This resulted in total costs of 54,707 € (on average about 3200 € per case). The increased incidence of skin cancer and the associated treatment costs indicate the need for skin cancer prevention training.
NASA Technical Reports Server (NTRS)
Hinds, Erold W. (Principal Investigator)
1996-01-01
This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
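As a rough illustration of exercising an RS(255,223) outer code in software, the sketch below uses the third-party reedsolo package (assumed to be installed); it is not the study's coding system, and the inner modulation block code and interleaver are not modeled.

```python
# Illustrative only: encode/decode cycle for an RS(255,223) outer code using
# the third-party `reedsolo` package (pip install reedsolo). This is a sketch
# under stated assumptions, not the concatenated system analyzed in the report.
from reedsolo import RSCodec

rs = RSCodec(32)            # 32 parity bytes -> RS(255, 223) over GF(256)
msg = bytes(range(223))     # one full-length information block
codeword = rs.encode(msg)   # 255-byte codeword (message + parity)
assert len(codeword) == 255

# Corrupt 16 bytes; RS(255,223) corrects up to floor(32/2) = 16 byte errors.
corrupted = bytearray(codeword)
for i in range(0, 160, 10):
    corrupted[i] ^= 0xFF

decoded = rs.decode(bytes(corrupted))
# Recent reedsolo versions return (message, message+ecc, errata_positions);
# older versions return just the message bytearray.
message = decoded[0] if isinstance(decoded, tuple) else decoded
assert bytes(message) == msg
```

In a concatenated system of the kind described, an interleaver would sit between this outer code and the inner modulation code so that inner-decoder error bursts are spread across several RS codewords.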
Semantic and visual memory codes in learning disabled readers.
Swanson, H L
1984-02-01
Two experiments investigated whether learning disabled readers' impaired recall is due to multiple coding deficiencies. In Experiment 1, learning disabled and skilled readers viewed nonsense pictures without names or with either relevant or irrelevant names with respect to the distinctive characteristics of the picture. Both types of names improved recall of nondisabled readers, while learning disabled readers exhibited better recall for unnamed pictures. No significant difference in recall was found between name training (relevant, irrelevant) conditions within reading groups. In Experiment 2, both reading groups participated in recall training for complex visual forms labeled with unrelated words, hierarchically related words, or without labels. A subsequent reproduction transfer task showed a facilitation in performance in skilled readers due to labeling, with learning disabled readers exhibiting better reproduction for unnamed pictures. Measures of output organization (clustering) indicated that recall is related to the development of superordinate categories. The results suggest that learning disabled children's reading difficulties are due to an inability to activate a semantic representation that interconnects visual and verbal codes.
International variation in the definition of ‘main condition’ in ICD-coded health data
Quan, H.; Moskal, L.; Forster, A.J.; Brien, S.; Walker, R.; Romano, P.S.; Sundararajan, V.; Burnand, B.; Henriksson, G.; Steinum, O.; Droesler, S.; Pincus, H.A.; Ghali, W.A.
2014-01-01
Hospital-based medical records are abstracted to create International Classification of Disease (ICD) coded discharge health data in many countries. The ‘main condition’ is not defined in a consistent manner internationally. Some countries employ a ‘reason for admission’ rule as the basis for the main condition, while other countries employ a ‘resource use’ rule. A few countries have recently transitioned from one of these approaches to the other. The definition of ‘main condition’ in such ICD data matters when it is used to define a disease cohort to assign diagnosis-related groups and to perform risk adjustment. We propose a method of harmonizing the international definition to enable researchers and international organizations using ICD-coded health data to aggregate or compare hospital care and outcomes across countries in a consistent manner. Inter-observer reliability of alternative harmonization approaches should be evaluated before finalizing the definition and adopting it worldwide. PMID:24990594
Effects of verbal and nonverbal interference on spatial and object visual working memory.
Postle, Bradley R; D'Esposito, Mark; Corkin, Suzanne
2005-03-01
We tested the hypothesis that a verbal coding mechanism is necessarily engaged by object, but not spatial, visual working memory tasks. We employed a dual-task procedure that paired n-back working memory tasks with domain-specific distractor trials inserted into each interstimulus interval of the n-back tasks. In two experiments, object n-back performance demonstrated greater sensitivity to verbal distraction, whereas spatial n-back performance demonstrated greater sensitivity to motion distraction. Visual object and spatial working memory may differ fundamentally in that the mnemonic representation of featural characteristics of objects incorporates a verbal (perhaps semantic) code, whereas the mnemonic representation of the location of objects does not. Thus, the processes supporting working memory for these two types of information may differ in more ways than those dictated by the "what/where" organization of the visual system, a fact more easily reconciled with a component process than a memory systems account of working memory function.
Simulation of Comet Impact and Survivability of Organic Compounds
NASA Astrophysics Data System (ADS)
Liu, Benjamin; Lomov, Ilya; Blank, Jennifer; Antoun, Tarabay
2007-06-01
Comets have been proposed as a mechanism for the transport of complex organic compounds to Earth. For this to occur, a significant fraction of organic compounds must survive the shock loading, in particular the high temperatures, due to impact. 2D and 3D numerical simulations were performed to study the thermodynamic states due to a comet impact. The comet was modeled as a 1-km diameter icy sphere traveling at the Earth's escape velocity (11 km/s) impacting a half-space of basalt. Simulations were performed with GEODYN, a parallel, multi-material, Godunov-based Eulerian code employing adaptive mesh refinement. A constitutive model calibrated for hard rock was used for basalt. Tabular equations of state were used to account for the extreme conditions present upon shock loading. A major focus of the study was tracking the thermodynamic state of the comet material. Both the maximum temperature experienced and the phase were tracked for each point in the comet. Temperature histories in the comet were also recorded. These quantities were used to estimate the viability of organic compounds upon impact. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1995-01-01
This report focuses on the results obtained during the PI's recent sabbatical leave at the Swiss Federal Institute of Technology (ETH) in Zurich, Switzerland, from January 1, 1995 through June 30, 1995. Two projects investigated various properties of TURBO codes, a new form of concatenated coding that achieves near channel capacity performance at moderate bit error rates. The performance of TURBO codes is explained in terms of the code's distance spectrum. These results explain both the near capacity performance of the TURBO codes and the observed 'error floor' for moderate and high signal-to-noise ratios (SNR's). A semester project, entitled 'The Realization of the Turbo-Coding System,' involved a thorough simulation study of the performance of TURBO codes and verified the results claimed by previous authors. A copy of the final report for this project is included as Appendix A. A diploma project, entitled 'On the Free Distance of Turbo Codes and Related Product Codes,' includes an analysis of TURBO codes and an explanation for their remarkable performance. A copy of the final report for this project is included as Appendix B.
Stilp, Christian E.; Kluender, Keith R.
2012-01-01
To the extent that sensorineural systems are efficient, redundancy should be extracted to optimize transmission of information, but perceptual evidence for this has been limited. Stilp and colleagues recently reported efficient coding of robust correlation (r = .97) among complex acoustic attributes (attack/decay, spectral shape) in novel sounds. Discrimination of sounds orthogonal to the correlation was initially inferior but later comparable to that of sounds obeying the correlation. These effects were attenuated for less-correlated stimuli (r = .54) for reasons that are unclear. Here, statistical properties of correlation among acoustic attributes essential for perceptual organization are investigated. Overall, simple strength of the principal correlation is inadequate to predict listener performance. Initial superiority of discrimination for statistically consistent sound pairs was relatively insensitive to decreased physical acoustic/psychoacoustic range of evidence supporting the correlation, and to more frequent presentations of the same orthogonal test pairs. However, increased range supporting an orthogonal dimension has substantial effects upon perceptual organization. Connectionist simulations and Eigenvalues from closed-form calculations of principal components analysis (PCA) reveal that perceptual organization is near-optimally weighted to shared versus unshared covariance in experienced sound distributions. Implications of reduced perceptual dimensionality for speech perception and plausible neural substrates are discussed. PMID:22292057
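The closed-form PCA argument can be illustrated with a toy calculation on synthetic two-attribute data; the correlations r = .97 and r = .54 come from the abstract, but the generated data and the code below are illustrative stand-ins, not the study's stimuli or analysis scripts.

```python
# Toy sketch of the closed-form PCA calculation referred to above: eigenvalues
# of the 2x2 covariance of two correlated "acoustic attributes" show how much
# variance lies along the shared (correlated) axis versus the orthogonal axis.
# The synthetic data are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def eigen_split(target_r, n=10000):
    cov = np.array([[1.0, target_r], [target_r, 1.0]])
    x = rng.multivariate_normal([0.0, 0.0], cov, size=n)  # attribute 1, attribute 2
    evals = np.linalg.eigvalsh(np.cov(x, rowvar=False))    # closed-form 2x2 PCA
    evals = np.sort(evals)[::-1]
    return evals / evals.sum()                             # shared vs. unshared variance fractions

print("r = 0.97:", np.round(eigen_split(0.97), 3))  # nearly all variance on the shared axis
print("r = 0.54:", np.round(eigen_split(0.54), 3))  # a substantial fraction lies off-axis
```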
"Hour of Code": Can It Change Students' Attitudes toward Programming?
ERIC Educational Resources Information Center
Du, Jie; Wimmer, Hayden; Rada, Roy
2016-01-01
The Hour of Code is a one-hour introduction to computer science organized by Code.org, a non-profit dedicated to expanding participation in computer science. This study investigated the impact of the Hour of Code on students' attitudes towards computer programming and their knowledge of programming. A sample of undergraduate students from two…
Codes, Costs, and Critiques: The Organization of Information in "Library Quarterly", 1931-2004
ERIC Educational Resources Information Center
Olson, Hope A.
2006-01-01
This article reports the results of a quantitative and thematic content analysis of the organization of information literature in the "Library Quarterly" ("LQ") between its inception in 1931 and 2004. The majority of articles in this category were published in the first half of "LQ's" run. Prominent themes have included cataloging codes and the…
Clustering of neural code words revealed by a first-order phase transition
NASA Astrophysics Data System (ADS)
Huang, Haiping; Toyoizumi, Taro
2016-06-01
A network of neurons in the central nervous system collectively represents information by its spiking activity states. Typically observed states, i.e., code words, occupy only a limited portion of the state space due to constraints imposed by network interactions. Geometrical organization of code words in the state space, critical for neural information processing, is poorly understood due to its high dimensionality. Here, we explore the organization of neural code words using retinal data by computing the entropy of code words as a function of Hamming distance from a particular reference codeword. Specifically, we report that the retinal code words in the state space are divided into multiple distinct clusters separated by entropy-gaps, and that this structure is shared with well-known associative memory networks in a recallable phase. Our analysis also elucidates a special nature of the all-silent state. The all-silent state is surrounded by the densest cluster of code words and located within a reachable distance from most code words. This code-word space structure quantitatively predicts typical deviation of a state-trajectory from its initial state. Altogether, our findings reveal a non-trivial heterogeneous structure of the code-word space that shapes information representation in a biological network.
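A minimal version of the entropy-by-Hamming-distance analysis can be sketched as follows; the binary patterns are synthetic stand-ins for population spiking data (not the retinal recordings), and the sparsity level, reference word, and sample size are assumptions.

```python
# Minimal sketch of the analysis described above: entropy of observed code words
# as a function of Hamming distance from a reference word (here the all-silent
# state). The synthetic sparse binary patterns are illustrative placeholders.
import numpy as np
from collections import Counter, defaultdict

rng = np.random.default_rng(1)
n_neurons, n_samples = 20, 50000
words = (rng.random((n_samples, n_neurons)) < 0.08).astype(np.uint8)  # sparse activity

# Reference codeword: the all-silent (all-zero) state, so Hamming distance
# is simply the number of active neurons in each word.
distances = words.sum(axis=1)

def entropy_bits(counts):
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

by_distance = defaultdict(Counter)
for w, d in zip(map(tuple, words), distances):
    by_distance[int(d)][w] += 1

for d in sorted(by_distance):
    c = by_distance[d]
    print(f"distance {d}: {sum(c.values()):6d} words, entropy {entropy_bits(c):5.2f} bits")
```

Entropy gaps at particular distances in such a profile would indicate clustering of code words around the reference state, which is the structure the paper reports for retinal data.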
NASA Astrophysics Data System (ADS)
Reese, Keturah
Under the direction of Sharon Murphy Augustine, Ph.D. (Curriculum and Instruction). There was a substantial performance gap between African American students and other ethnic groups. Additionally, African American students in a Title I school were at a significantly high risk of not meeting or exceeding standards on performance tests in science. Past reports have shown average gains in some subject areas and declines in others (NCES, 2011; GADOE, 2012). Current instructional strategies and the lack of literacy within the biology classroom created a problem for African American high school students on national and state assessments. The purpose of this study was to examine the perceptions of African American students and teachers in the context of literacy and biology through the incorporation of an interactive notebook and other literacy strategies. The data were collected in three ways: field notes from a two-week observation period within the biology classroom, student and teacher interviews, and student work samples. During the observations, student work collection, and interviews, I looked for the following codes: active learning, constructive learning, collaborative learning, authentic learning, and intentional learning. In the process of coding for the pre-determined codes, three more codes emerged: organization, studying/student ownership, and student-teacher relationships. Students and teachers both solidified the notion that literacy and biology worked well together. The implemented literacy strategies were something that both teachers and students appreciated in their learning of biology. Overall, students and teachers perceived that the interactive notebook, along with Cornell notes, Thinking Maps, close reads, writing, lab experiments, and group work, created meaningful learning experiences within the biology classroom.
Light transport feature for SCINFUL.
Etaati, G R; Ghal-Eh, N
2008-03-01
An extended version of the scintillator response function prediction code SCINFUL has been developed by incorporating PHOTRACK, a Monte Carlo light transport code. Comparisons of calculated and experimental results for organic scintillators exposed to neutrons show that the extended code improves the predictive capability of SCINFUL.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colón, Yamil J.; Gómez-Gualdrón, Diego A.; Snurr, Randall Q.
Metal-organic frameworks (MOFs) are promising materials for a range of energy and environmental applications. Here we describe in detail a computational algorithm and code to generate MOFs based on edge-transitive topological nets for subsequent evaluation via molecular simulation. This algorithm has been previously used by us to construct and evaluate 13 512 MOFs of 41 different topologies for cryo-adsorbed hydrogen storage. Grand canonical Monte Carlo simulations are used here to evaluate the 13 512 structures for the storage of gaseous fuels such as hydrogen and methane and nondistillative separation of xenon/krypton mixtures at various operating conditions. MOF performance for both gaseous fuel storage and xenon/krypton separation is influenced by topology. Simulation data suggest that gaseous fuel storage performance is topology-dependent due to MOF properties such as void fraction and surface area combining differently in different topologies, whereas xenon/krypton separation performance is topology-dependent due to how topology constrains the pore size distribution.
Structural architecture of the human long non-coding RNA, steroid receptor RNA activator
Novikova, Irina V.; Hennelly, Scott P.; Sanbonmatsu, Karissa Y.
2012-01-01
While functional roles of several long non-coding RNAs (lncRNAs) have been determined, the molecular mechanisms are not well understood. Here, we report the first experimentally derived secondary structure of a human lncRNA, the steroid receptor RNA activator (SRA), 0.87 kb in size. The SRA RNA is a non-coding RNA that coactivates several human sex hormone receptors and is strongly associated with breast cancer. Coding isoforms of SRA are also expressed to produce proteins, making the SRA gene a unique bifunctional system. Our experimental findings (SHAPE, in-line, DMS and RNase V1 probing) reveal that this lncRNA has a complex structural organization, consisting of four domains, with a variety of secondary structure elements. We examine the coevolution of the SRA gene at the RNA structure and protein structure levels using comparative sequence analysis across vertebrates. Rapid evolutionary stabilization of RNA structure, combined with frame-disrupting mutations in conserved regions, suggests that evolutionary pressure preserves the RNA structural core rather than its translational product. We perform similar experiments on alternatively spliced SRA isoforms to assess their structural features. PMID:22362738
Simonaitis, Linas; McDonald, Clement J
2009-10-01
The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the in-patient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
Liu, Charles; Kayima, Peter; Riesel, Johanna; Situma, Martin; Chang, David; Firth, Paul
2017-11-01
The lack of a classification system for surgical procedures in resource-limited settings hinders outcomes measurement and reporting. Existing procedure coding systems are prohibitively large and expensive to implement. We describe the creation and prospective validation of 3 brief procedure code lists applicable in low-resource settings, based on analysis of surgical procedures performed at Mbarara Regional Referral Hospital, Uganda's second largest public hospital. We reviewed operating room logbooks to identify all surgical operations performed at Mbarara Regional Referral Hospital during 2014. Based on the documented indication for surgery and procedure(s) performed, we assigned each operation up to 4 procedure codes from the International Classification of Diseases, 9th Revision, Clinical Modification. Coding of procedures was performed by 2 investigators, and a random 20% of procedures were coded by both investigators. These codes were aggregated to generate procedure code lists. During 2014, 6,464 surgical procedures were performed at Mbarara Regional Referral Hospital, to which we assigned 435 unique procedure codes. Substantial inter-rater reliability was achieved (κ = 0.7037). The 111 most common procedure codes accounted for 90% of all codes assigned, 180 accounted for 95%, and 278 accounted for 98%. We considered these sets of codes as 3 procedure code lists. In a prospective validation, we found that these lists described 83.2%, 89.2%, and 92.6% of surgical procedures performed at Mbarara Regional Referral Hospital during August to September of 2015, respectively. Empirically generated brief procedure code lists based on International Classification of Diseases, 9th Revision, Clinical Modification can be used to classify almost all surgical procedures performed at a Ugandan referral hospital. Such a standardized procedure coding system may enable better surgical data collection for administration, research, and quality improvement in resource-limited settings.
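The list-building step can be sketched in a few lines: rank the assigned codes by frequency and keep the smallest prefix that reaches each coverage threshold. The codes and counts below are hypothetical placeholders, not the Mbarara logbook data.

```python
# Sketch of the brief-list construction described above: rank assigned procedure
# codes by frequency and keep the smallest prefix covering 90%, 95%, and 98% of
# all code assignments. The counts below are made-up placeholders.
from collections import Counter
from itertools import accumulate

assigned_codes = ["74.1"] * 900 + ["79.35"] * 500 + ["86.22"] * 300 + \
                 ["53.49"] * 150 + ["85.21"] * 100 + ["04.43"] * 50   # hypothetical ICD-9-CM codes

counts = Counter(assigned_codes)
ranked = counts.most_common()                       # [(code, count), ...], most frequent first
total = sum(counts.values())
cumulative = list(accumulate(n for _, n in ranked))

def brief_list(threshold):
    """Smallest prefix of the ranked codes covering `threshold` of assignments."""
    for i, c in enumerate(cumulative, start=1):
        if c / total >= threshold:
            return [code for code, _ in ranked[:i]]
    return [code for code, _ in ranked]

for t in (0.90, 0.95, 0.98):
    lst = brief_list(t)
    print(f"{int(t * 100)}% coverage: {len(lst)} codes -> {lst}")
```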
Defense Depot Mechanicsburg Total Quality Management Implementation Plan
1989-06-01
Mechanics of Air-Inflated Drop-Stitch Fabric Panels Subject to Bending Loads
2013-08-15
Authors: Paul V. Cavallaro, Christopher J. Hart, and Ali M. Sadegh; Naval Undersea Warfare Center Division Newport and Navatek, Ltd. The technical reviewer was Geoffrey R. Moss (Code 1522); the authors gratefully acknowledge Martin S. Leff.
Neuron Learning to Network Organization.
1983-12-20
Sponsored by the Office of Naval Research (Code 442). A legible fragment notes that, in a certain sense, much is known: a set of coupled non-linear differential equations, including time delays, can be written down.
Development of Alabama Resources Information System (ARIS)
NASA Technical Reports Server (NTRS)
Herring, B. E.; Vachon, R. I.
1976-01-01
A formal, organized set of information concerning the development status of the Alabama Resources Information System (ARIS) as of September 1976 is provided. A series of computer source language programs, and flow charts related to each of the computer programs to provide greater ease in performing future change are presented. Listings of the variable names, and their meanings, used in the various source code programs, and copies of the various user manuals which were prepared through this time are given.
NASA Astrophysics Data System (ADS)
Armstrong, D. J.; Pollacco, D.; Santerne, A.
2017-03-01
A crucial step in planet hunting surveys is to select the best candidates for follow-up observations, given limited telescope resources. This is often performed by human 'eyeballing', a time consuming and statistically awkward process. Here, we present a new, fast machine learning technique to separate true planet signals from astrophysical false positives. We use self-organizing maps (SOMs) to study the transit shapes of Kepler and K2 known and candidate planets. We find that SOMs are capable of distinguishing known planets from known false positives with a success rate of 87.0 per cent, using the transit shape alone. Furthermore, they do not require any candidate to be dispositioned prior to use, meaning that they can be used early in a mission's lifetime. A method for classifying candidates using a SOM is developed, and applied to previously unclassified members of the Kepler Objects of Interest (KOI) list as well as candidates from the K2 mission. The method is extremely fast, taking minutes to run the entire KOI list on a typical laptop. We make PYTHON code for performing classifications publicly available, using either new SOMs or those created in this work. The SOM technique represents a novel method for ranking planetary candidate lists, and can be used both alone or as part of a larger autovetting code.
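A minimal version of this kind of workflow can be sketched with the third-party minisom package; this is an assumption for illustration (the authors released their own PYTHON code), and the synthetic transit shapes and planet-likeness scoring below are placeholders, not the published classifier.

```python
# Illustrative sketch only (not the authors' released code): train a small
# self-organizing map on phase-folded transit shapes, then score new candidates
# by which map cells known planets versus false positives tend to occupy.
# Assumes the third-party `minisom` package; the "transit shapes" are synthetic.
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(2)
n_bins = 48
phase = np.linspace(-0.5, 0.5, n_bins)

def box_transit(depth, width):   # crude planet-like (flat-bottomed) shape
    return 1.0 - depth * (np.abs(phase) < width)

def v_transit(depth, width):     # crude eclipsing-binary-like (V-shaped) false positive
    return 1.0 - np.clip(depth * (1 - np.abs(phase) / width), 0, None)

planets = np.array([box_transit(rng.uniform(.002, .02), rng.uniform(.05, .15)) for _ in range(200)])
fps = np.array([v_transit(rng.uniform(.01, .1), rng.uniform(.1, .3)) for _ in range(200)])
data = np.vstack([planets, fps]) + rng.normal(0, 5e-4, (400, n_bins))
labels = np.array([1] * 200 + [0] * 200)

som = MiniSom(8, 8, n_bins, sigma=1.5, learning_rate=0.5, random_seed=0)
som.random_weights_init(data)
som.train_random(data, 5000)

# Fraction of known planets mapped to each cell acts as a simple planet-likeness score.
planet_frac = np.zeros((8, 8))
hits = np.zeros((8, 8))
for shape, label in zip(data, labels):
    i, j = som.winner(shape)
    hits[i, j] += 1
    planet_frac[i, j] += label
planet_frac = np.divide(planet_frac, hits, out=np.zeros_like(planet_frac), where=hits > 0)

candidate = box_transit(0.01, 0.1) + rng.normal(0, 5e-4, n_bins)
print("planet-likeness of candidate:", planet_frac[som.winner(candidate)])
```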
Performance Analysis of New Binary User Codes for DS-CDMA Communication
NASA Astrophysics Data System (ADS)
Usha, Kamle; Jaya Sankar, Kottareddygari
2016-03-01
This paper analyzes new binary spreading codes through correlation properties and also presents their performance over an additive white Gaussian noise (AWGN) channel. The proposed codes are constructed using Gray and inverse Gray codes. In this paper, an n-bit Gray code appended with its n-bit inverse Gray code to construct 2n-length binary user codes is discussed. Like Walsh codes, these binary user codes are available in sizes that are powers of two; additionally, code sets of length 6 and their even multiples are also available. The simple construction technique and generation of code sets of different sizes are the salient features of the proposed codes. Walsh codes and Gold codes are considered for comparison in this paper, as these are popularly used for synchronous and asynchronous multi-user communications, respectively. In the current work the auto- and cross-correlation properties of the proposed codes are compared with those of Walsh codes and Gold codes. Performance of the proposed binary user codes for both synchronous and asynchronous direct sequence CDMA communication over the AWGN channel is also discussed in this paper. The proposed binary user codes are found to be suitable for both synchronous and asynchronous DS-CDMA communication.
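The abstract does not give the exact construction, so the sketch below assumes one plausible reading: concatenate the n-bit Gray codeword of an index with its n-bit inverse-Gray codeword, map bits to ±1 chips, and inspect the resulting full-period correlations. The inverse-Gray definition used here is an assumption.

```python
# Sketch under stated assumptions (not necessarily the paper's exact mapping):
# each 2n-chip user code is an n-bit Gray codeword followed by the n-bit
# inverse-Gray codeword of the same index, with bits mapped to +/-1 chips.
import numpy as np

def gray(i):
    return i ^ (i >> 1)

def inverse_gray(g):
    # Inverse of the Gray mapping: recover the index whose Gray code is g.
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def bits(value, n):
    return [(value >> (n - 1 - k)) & 1 for k in range(n)]

def user_codes(n):
    codes = []
    for i in range(2 ** n):
        word = bits(gray(i), n) + bits(inverse_gray(i), n)   # 2n chips
        codes.append(np.array([1 if b else -1 for b in word]))
    return np.array(codes)

codes = user_codes(3)                       # eight codes of length 6
corr = codes @ codes.T / codes.shape[1]     # normalized full-period correlations
print("code set shape:", codes.shape)
print("peak auto-correlation:", corr[0, 0])
print("max |cross-correlation|:", np.abs(corr - np.eye(len(codes))).max())
```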
DOE Office of Scientific and Technical Information (OSTI.GOV)
Killough, G.G.; Rohwer, P.S.
1974-03-01
INDOS1, INDOS2, and INDOS3 (the INDOS codes) are conversational FORTRAN IV programs, implemented for use in time-sharing mode on the ORNL PDP-10 system. These codes use ICRP10-10A models to estimate the radiation dose to an organ of the body of Reference Man resulting from the ingestion or inhalation of any one of various radionuclides. Two patterns of intake are simulated: intakes at discrete times and continuous intake at a constant rate. The INDOS codes provide tabular output of dose rate and dose vs. time, graphical output of dose vs. time, and punched-card output of organ burden and dose vs. time. The models of internal dose calculation are discussed and instructions for the use of the INDOS codes are provided. The INDOS codes are available from the Radiation Shielding Information Center, Oak Ridge National Laboratory, P.O. Box X, Oak Ridge, Tennessee 37830.
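A minimal sketch of the kind of calculation the INDOS codes automate is given below for the continuous-intake case: a single-compartment retention model integrated to give organ burden and cumulative dose versus time. The half-time, deposition fraction, dose-per-decay factor, and intake rate are hypothetical placeholders, not ICRP 10/10A values.

```python
# Minimal single-compartment sketch of the kind of calculation the INDOS
# codes automate; the half-time, dose factor, deposition fraction, and
# intake rate below are hypothetical placeholders, not ICRP 10/10A values.
import numpy as np

half_time_d = 50.0                 # effective half-time in the organ (days), placeholder
lam = np.log(2) / half_time_d      # removal constant (1/day)
intake_rate = 1.0e3                # continuous intake (Bq/day), placeholder
f_organ = 0.1                      # fraction of intake deposited in the organ, placeholder
dose_per_decay = 2.0e-10           # rem per nuclear transformation, placeholder

t = np.linspace(0, 365, 3651)      # days
# organ burden q(t) for constant continuous intake: dq/dt = f*I - lam*q
q = f_organ * intake_rate / lam * (1 - np.exp(-lam * t))       # Bq in organ
dose_rate = q * dose_per_decay * 86400                          # rem/day
dose = np.cumsum(dose_rate) * (t[1] - t[0])                     # cumulative dose vs. time (rem)
print(f"burden at 1 y: {q[-1]:.3e} Bq, dose at 1 y: {dose[-1]:.3e} rem")
```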
Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R
2008-05-15
A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.
Acute Radiation Risk and BRYNTRN Organ Dose Projection Graphical User Interface
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Hu, Shaowen; Nounu, Hateni N.; Kim, Myung-Hee
2011-01-01
The integration of human space applications risk projection models of organ dose and acute radiation risk has been a key problem. NASA has developed an organ dose projection model using the BRYNTRN and SUM DOSE computer codes, and a probabilistic model of Acute Radiation Risk (ARR). BRYNTRN is a baryon transport code and SUM DOSE is an output data processing code. The risk projection models of organ doses and ARR take the output from BRYNTRN as an input to their calculations. With a graphical user interface (GUI) to handle input and output for BRYNTRN, the response models can be connected easily and correctly to BRYNTRN. A GUI for the ARR and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of the input and output manipulations required for operation of the ARRBOD modules. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations staff in the mission operations directorate (MOD), and space biophysics researchers. BRYNTRN code operation requires extensive input preparation, and the GUI allows this input and output to be handled and passed to the response models easily and correctly. The purpose of the GUI development for ARRBOD is to provide seamless integration of input and output manipulations for the operation of the projection modules (BRYNTRN, SUM DOSE, and the ARR probabilistic response model) in assessing the acute risk and the organ doses from significant Solar Particle Events (SPEs). The assessment of astronauts' radiation risk from SPEs supports mission design and operational planning to manage radiation risks in future space missions. The ARRBOD GUI can identify suitable shielding solutions, using gender-specific organ dose assessments, in order to avoid ARR symptoms and to stay within the current NASA short-term dose limits. A quantified evaluation of ARR severities for any given shielding configuration and a specified EVA or other mission scenario can be made to guide alternative solutions for attaining the objectives set by mission planners. The ARRBOD GUI estimates the whole-body effective dose, organ doses, and acute radiation sickness symptoms for astronauts, from which operational strategies and capabilities can be developed for the protection of astronauts from SPEs in the planning of future lunar surface scenarios, exploration of near-Earth objects, and missions to Mars.
NASA Astrophysics Data System (ADS)
Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf
2016-11-01
This paper proposes a new code to optimize the performance of a spectral amplitude coding-optical code division multiple access (SAC-OCDMA) system. The unique two-matrix structure of the proposed enhanced multi-diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). Performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytical and simulation analysis by referring to bit error rate (BER), signal-to-noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces the receiver complexity, and gives better performance than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.
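For orientation, the snippet below applies a Gaussian-approximation relation commonly used in SAC-OCDMA analyses, BER = (1/2)erfc(sqrt(SNR/8)), to a few placeholder SNR values; the EMD-code SNR expression itself (which depends on code weight, PIIN, shot and thermal noise) is not reproduced here.

```python
# Gaussian-approximation relation often used in SAC-OCDMA performance
# analyses; the SNR values below are placeholders, not the EMD-code SNR
# expression derived in the paper.
import math

def ber_from_snr(snr):
    return 0.5 * math.erfc(math.sqrt(snr / 8.0))

for snr_db in (15, 20, 25):
    snr = 10 ** (snr_db / 10)          # convert dB to linear SNR
    print(f"SNR = {snr_db} dB -> BER ~ {ber_from_snr(snr):.2e}")
```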
Stability or stasis in the names of organisms: the evolving codes of nomenclature.
Knapp, Sandra; Lamas, Gerardo; Lughadha, Eimear Nic; Novarino, Gianfranco
2004-01-01
Nomenclature, far from being a dry dusty subject, is today more relevant than ever before. Researchers into genomics are discovering again the need for systems of nomenclature: names are what we use to communicate about organisms, and by extension the rest of their biology. Here, we briefly outline the history of the published international codes of nomenclature, tracing them from the time of Linnaeus in the eighteenth century to the present day. We then outline some of what we feel are the major challenges that face the codes in the twenty-first century, focusing primarily on publication, priority, typification and the role of science in the naming of organisms. We conclude that the codes are essential for taxonomists in the pursuance of their science, and that the democratic nature of decision-making in the regulation of the rules of nomenclature, though sometimes perceived as a potential weakness, is in fact one of its great strengths. PMID:15253348
Song, Yuhyun; Leman, Scotland; Monteil, Caroline L.; Heath, Lenwood S.; Vinatzer, Boris A.
2014-01-01
A broadly accepted and stable biological classification system is a prerequisite for biological sciences. It provides the means to describe and communicate about life without ambiguity. Current biological classification and nomenclature use the species as the basic unit and require lengthy and laborious species descriptions before newly discovered organisms can be assigned to a species and be named. The current system is thus inadequate to classify and name the immense genetic diversity within species that is now being revealed by genome sequencing on a daily basis. To address this lack of a general intra-species classification and naming system adequate for today’s speed of discovery of new diversity, we propose a classification and naming system that is exclusively based on genome similarity and that is suitable for automatic assignment of codes to any genome-sequenced organism without requiring any phenotypic or phylogenetic analysis. We provide examples demonstrating that genome similarity-based codes largely align with current taxonomic groups at many different levels in bacteria, animals, humans, plants, and viruses. Importantly, the proposed approach is only slightly affected by the order of code assignment and can thus provide codes that reflect similarity between organisms and that do not need to be revised upon discovery of new diversity. We envision genome similarity-based codes to complement current biological nomenclature and to provide a universal means to communicate unambiguously about any genome-sequenced organism in fields as diverse as biodiversity research, infectious disease control, human and microbial forensics, animal breed and plant cultivar certification, and human ancestry research. PMID:24586551
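The sketch below is only a toy illustration of the general idea of similarity-based hierarchical codes, not the authors' published method: genomes are compared with a k-mer Jaccard similarity and greedily assigned dot-separated codes at increasingly strict thresholds, so that similar genomes share code prefixes.

```python
# Illustrative sketch of assigning hierarchical, similarity-based codes:
# each genome gets the code of the first earlier genome it resembles at a
# given threshold level, and a new index otherwise. The k-mer Jaccard
# similarity and the thresholds are toy choices, not the authors' method.
def kmer_set(seq, k=8):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def similarity(a, b):
    sa, sb = kmer_set(a), kmer_set(b)
    return len(sa & sb) / max(1, len(sa | sb))

def assign_codes(genomes, thresholds=(0.3, 0.6, 0.9)):
    codes = []
    for g in genomes:
        code = []
        pool = list(zip(genomes[:len(codes)], codes))
        for level, thr in enumerate(thresholds):
            # earlier genomes already sharing this code prefix and similar enough
            matches = [c for h, c in pool
                       if c[:level] == code and similarity(g, h) >= thr]
            code.append(matches[0][level] if matches else
                        1 + max((c[level] for h, c in pool if c[:level] == code),
                                default=0))
        codes.append(code)
    return [".".join(map(str, c)) for c in codes]

print(assign_codes(["ACGTACGTACGTAA", "ACGTACGTACGTAC", "TTTTGGGGCCCCAAAA"]))
```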
Matsumoto, Shinnosuke; Koba, Yusuke; Kohno, Ryosuke; Lee, Choonsik; Bolch, Wesley E; Kai, Michiaki
2016-04-01
Proton therapy has the physical advantage of a Bragg peak that can provide a better dose distribution than conventional x-ray therapy. However, radiation exposure of normal tissues cannot be ignored, because it is likely to increase the risk of secondary cancer. Evaluating the secondary neutrons generated by the interaction of the proton beam with the treatment beam-line structure is therefore necessary for optimizing radiation protection in proton therapy. In this research, organ doses and energy spectra from secondary neutrons were calculated using Monte Carlo simulations. The Monte Carlo code known as the Particle and Heavy Ion Transport code System (PHITS) was used to simulate proton transport and its interaction with the treatment beam-line structure, which modeled the double-scattering body of the treatment nozzle at the National Cancer Center Hospital East. The doses to the organs of a hybrid computational phantom representing a 5-y-old boy were calculated. In general, secondary neutron doses were found to decrease with increasing distance from the treatment field. Secondary neutron energy spectra were characterized by incident neutrons with three energy peaks: 1×10, 1, and 100 MeV. A block collimator and a patient collimator contributed significantly to organ doses. In particular, the secondary neutrons from the patient collimator were 30 times higher than those from the first scatterer. These results suggest that proactive protection will be required in the design of treatment beam-line structures and that organ doses from secondary neutrons may be reduced.
Follow the Code: Rules or Guidelines for Academic Deans' Behavior?
ERIC Educational Resources Information Center
Bray, Nathaniel J.
2012-01-01
In the popular movie series "Pirates of the Caribbean," there is a pirate code that influences how pirates behave in unclear situations, with a running joke about whether the code is either a set of rules or guidelines for behavior. Codes of conduct in any social group or organization can have much the same feel; they can provide clarity and…
Three decades of the WHO code and marketing of infant formulas.
Forsyth, Stewart
2012-05-01
The International Code of Marketing of Breast Milk Substitutes states that governments, non-governmental organizations, experts, consumers and industry need to cooperate in activities aimed at improving infant nutrition. However, the evidence from the last three decades is that of a series of disputes, legal proceedings and boycotts. The purpose of this review is to assess the overall progress in the implementation of the Code and to examine the problematic areas of monitoring, compliance and governance. There are continuing issues of implementation, monitoring and compliance which predominantly reflect weak governance. Many Member States have yet to fully implement the Code recommendations and most States do not have adequate monitoring and reporting mechanisms. Application of the Code in developed countries may be undermined by a lack of consensus on the WHO recommendation of 6 months exclusive breastfeeding. There is evidence of continuing conflict and acrimony, especially between non-government organizations and industry. Measures need to be taken to encourage the Member States to implement the Code and to establish the governance systems that will not only ensure effective implementation and monitoring of the Code, but also deliver the Code within a spirit of participation, collaboration and trust.
Soldavini, Jessica; Taillie, Lindsey Smith
2017-08-01
In 1981, the World Health Organization adopted the International Code of Marketing of Breast-milk Substitutes ( International Code), with subsequent resolutions adopted since then. The International Code contributes to the safe and adequate provision of nutrition for infants by protecting and promoting breastfeeding and ensuring that human milk substitutes, when necessary, are used properly through adequate information and appropriate marketing and distribution. Despite the World Health Organization recommendations for all member nations to implement the International Code in its entirety, the United States has yet to take action to translate it into any national measures. In 2012, only 22.3% of infants in the United States met the American Academy of Pediatrics recommendation of at least 6 months of exclusive breastfeeding. Countries adopting legislation reflecting the provisions of the International Code have seen increases in breastfeeding rates. This article discusses recommendations for translating the International Code into U.S. policy. Adopting legislation that implements, monitors, and enforces the International Code in its entirety has the potential to contribute to increased rates of breastfeeding in the United States, which can lead to improved health outcomes in both infants and breastfeeding mothers.
NASA Astrophysics Data System (ADS)
Jellali, Nabiha; Najjar, Monia; Ferchichi, Moez; Rezig, Houria
2017-07-01
In this paper, a new two-dimensional spectral/spatial code family, named two-dimensional dynamic cyclic shift (2D-DCS) codes, is introduced. The 2D-DCS codes are derived from the dynamic cyclic shift code for the spectral and spatial coding. The proposed system can fully eliminate the multiple access interference (MAI) by using the MAI cancellation property. The effects of shot noise, phase-induced intensity noise and thermal noise are used to analyze the code performance. In comparison with existing two-dimensional (2D) codes, such as 2D perfect difference (2D-PD), 2D extended enhanced double weight (2D-Extended-EDW) and 2D hybrid (2D-FCC/MDW) codes, the numerical results show that our proposed codes have the best performance. By keeping the same code length and increasing the spatial code, the performance of our 2D-DCS system is enhanced: it provides higher data rates while using lower transmitted power and a smaller spectral width.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 50 Wildlife and Fisheries 9 2013-10-01 2013-10-01 false Codes for the Representation of Names of Countries (Established by the International Organization for Standardization) A Appendix A to Chapter I.... Papua New Guinea PG. Paraguay PY. Peru PE. Philippines PH. Poland PL. Portugal PT. Qatar QA. Republic of...
Code of Federal Regulations, 2012 CFR
2012-10-01
... 50 Wildlife and Fisheries 9 2012-10-01 2012-10-01 false Codes for the Representation of Names of Countries (Established by the International Organization for Standardization) A Appendix A to Chapter I.... Papua New Guinea PG. Paraguay PY. Peru PE. Philippines PH. Poland PL. Portugal PT. Qatar QA. Republic of...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-27
... (also known as origin code) refers to the participant types listed in Rule 1080.08(b) and Rule 1000(b..., and, therefore, is referring to the participant origin codes in Rule 1080.08(b) only. The proposed...-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing of Proposed Rule Change Relating to Which...
BASiNET-BiologicAl Sequences NETwork: a case study on coding and non-coding RNAs identification.
Ito, Eric Augusto; Katahira, Isaque; Vicente, Fábio Fernandes da Rocha; Pereira, Luiz Filipe Protasio; Lopes, Fabrício Martins
2018-06-05
With the emergence of Next Generation Sequencing (NGS) technologies, a large volume of sequence data, in particular from de novo sequencing, is rapidly produced at relatively low cost. In this context, computational tools are increasingly important to assist in the identification of relevant information to understand the functioning of organisms. This work introduces BASiNET, an alignment-free tool for classifying biological sequences based on feature extraction from complex network measurements. The method initially transforms the sequences and represents them as complex networks. It then extracts topological measures and constructs a feature vector that is used to classify the sequences. The method was evaluated in the classification of coding and non-coding RNAs of 13 species and compared to the CNCI, PLEK and CPC2 methods. BASiNET outperformed all compared methods in all adopted organisms and datasets. BASiNET classified sequences in all organisms with high accuracy and low standard deviation, showing that the method is robust and not biased by the organism. The proposed methodology is implemented in open source in the R language and is freely available for download at https://cran.r-project.org/package=BASiNET.
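BASiNET itself is an R package; the Python sketch below only illustrates the general idea under assumed choices (k-mer nodes, edges between consecutive k-mers, a handful of networkx measures): map a sequence to a complex network and extract topological measures as a feature vector for a downstream classifier.

```python
# Illustrative Python sketch of the general idea behind alignment-free,
# network-based classification (the actual BASiNET tool is an R package):
# nodes are overlapping k-mers, edges connect consecutive k-mers, and
# topological measures form the feature vector. k and the chosen measures
# are illustrative assumptions.
import networkx as nx

def sequence_to_network(seq, k=3):
    g = nx.Graph()
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    for a, b in zip(kmers, kmers[1:]):
        if a != b:                      # skip self-loops from homopolymer runs
            g.add_edge(a, b)
    return g

def topological_features(g):
    degs = [d for _, d in g.degree()]
    return [
        g.number_of_nodes(),
        g.number_of_edges(),
        sum(degs) / len(degs),          # average degree
        nx.average_clustering(g),       # clustering coefficient
        nx.density(g),
    ]

rna = "AUGGCUAAGGCUUAGGCUAUGCGAUAGGCUUAA"
features = topological_features(sequence_to_network(rna))
print(features)   # would feed a downstream classifier in a real pipeline
```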
Jones, Dean P.
2015-01-01
Abstract Significance: The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O2 and H2O2 contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Recent Advances: Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Critical Issues: Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Future Directions: Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine. Antioxid. Redox Signal. 23, 734–746. PMID:25891126
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giantsoudi, D; Schuemann, J; Dowdell, S
Purpose: For proton radiation therapy, Monte Carlo simulation (MCS) methods are recognized as the gold-standard dose calculation approach. Although previously unrealistic due to limitations in available computing power, GPU-based applications allow MCS of proton treatment fields to be performed in routine clinical use, on time scales comparable to those of conventional pencil-beam algorithms. This study focuses on validating the results of our GPU-based code (gPMC) against a fully implemented proton therapy Monte Carlo code (TOPAS) for clinical patient cases. Methods: Two treatment sites were selected to provide clinical cases for this study: head-and-neck cases, due to anatomical geometrical complexity (air cavities and density heterogeneities) that makes dose calculation very challenging, and prostate cases, due to the higher proton energies used and the close proximity of the treatment target to sensitive organs at risk. Both gPMC and TOPAS were used to calculate three-dimensional dose distributions for all patients in this study. Comparisons were performed based on target coverage indices (mean dose, V90 and D90) and gamma index distributions for 2% of the prescription dose and 2 mm. Results: For seven out of eight studied cases, mean target dose, V90 and D90 differed by less than 2% between the TOPAS and gPMC dose distributions. Gamma index analysis for all prostate patients resulted in a passing rate of more than 99% of voxels in the target. Four out of five head-and-neck cases showed a gamma index passing rate for the target of more than 99%, the fifth having a passing rate of 93%. Conclusion: Our current work showed excellent agreement between our GPU-based MCS code and a fully implemented proton therapy Monte Carlo code for a group of dosimetrically challenging patient cases.
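As a reminder of how the comparison metric works, the sketch below computes a simplified one-dimensional global gamma index for a 2%/2 mm criterion on toy dose profiles; the study's comparisons are three-dimensional and use dedicated tools, so this is illustrative only.

```python
# Simplified 1-D global gamma-index sketch for a 2%/2 mm criterion; clinical
# comparisons such as the one above are 3-D and use dedicated tools, so this
# is only meant to illustrate the metric itself.
import numpy as np

def gamma_1d(x, dose_ref, dose_eval, dose_crit, dist_crit):
    gammas = np.empty_like(dose_ref)
    for i, (xi, dr) in enumerate(zip(x, dose_ref)):
        dd = (dose_eval - dr) / dose_crit        # dose-difference term
        dx = (x - xi) / dist_crit                # distance-to-agreement term
        gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gammas

x = np.linspace(0, 100, 201)                      # positions in mm
ref = np.exp(-((x - 50) / 15) ** 2)               # toy reference dose
ev = np.exp(-((x - 50.5) / 15) ** 2) * 1.01       # toy evaluated dose
g = gamma_1d(x, ref, ev, dose_crit=0.02 * ref.max(), dist_crit=2.0)
print("passing rate:", np.mean(g <= 1.0))
```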
Performance and structure of single-mode bosonic codes
NASA Astrophysics Data System (ADS)
Albert, Victor V.; Noh, Kyungjoo; Duivenvoorden, Kasper; Young, Dylan J.; Brierley, R. T.; Reinhold, Philip; Vuillot, Christophe; Li, Linshu; Shen, Chao; Girvin, S. M.; Terhal, Barbara M.; Jiang, Liang
2018-03-01
The early Gottesman, Kitaev, and Preskill (GKP) proposal for encoding a qubit in an oscillator has recently been followed by cat- and binomial-code proposals. Numerically optimized codes have also been proposed, and we introduce codes of this type here. These codes have yet to be compared using the same error model; we provide such a comparison by determining the entanglement fidelity of all codes with respect to the bosonic pure-loss channel (i.e., photon loss) after the optimal recovery operation. We then compare achievable communication rates of the combined encoding-error-recovery channel by calculating the channel's hashing bound for each code. Cat and binomial codes perform similarly, with binomial codes outperforming cat codes at small loss rates. Despite not being designed to protect against the pure-loss channel, GKP codes significantly outperform all other codes for most values of the loss rate. We show that the performance of GKP and some binomial codes increases monotonically with increasing average photon number of the codes. In order to corroborate our numerical evidence of the cat-binomial-GKP order of performance occurring at small loss rates, we analytically evaluate the quantum error-correction conditions of those codes. For GKP codes, we find an essential singularity in the entanglement fidelity in the limit of vanishing loss rate. In addition to comparing the codes, we draw parallels between binomial codes and discrete-variable systems. First, we characterize one- and two-mode binomial as well as multiqubit permutation-invariant codes in terms of spin-coherent states. Such a characterization allows us to introduce check operators and error-correction procedures for binomial codes. Second, we introduce a generalization of spin-coherent states, extending our characterization to qudit binomial codes and yielding a multiqudit code.
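For readers unfamiliar with single-mode bosonic encodings, the snippet below builds even and odd cat states (the basic ingredients of cat codes) in a truncated Fock basis and reports their mean photon numbers; the amplitude and truncation are arbitrary, and the binomial, GKP, and numerically optimized codes and the fidelity calculations compared in the paper are not reproduced.

```python
# Constructs even/odd cat states (the basic ingredients of cat codes) in a
# truncated Fock basis and reports their mean photon numbers; alpha and the
# truncation size are illustrative, and this is not a reproduction of the
# codes or the entanglement-fidelity calculations compared in the paper.
import numpy as np
from math import factorial

def coherent(alpha, n_max):
    n = np.arange(n_max)
    amps = np.array([alpha ** k / np.sqrt(factorial(k)) for k in n],
                    dtype=complex)
    return np.exp(-abs(alpha) ** 2 / 2) * amps

def normalized(v):
    return v / np.linalg.norm(v)

alpha, n_max = 2.0, 40
plus = normalized(coherent(alpha, n_max) + coherent(-alpha, n_max))   # even cat
minus = normalized(coherent(alpha, n_max) - coherent(-alpha, n_max))  # odd cat

n = np.arange(n_max)
for name, state in (("even cat", plus), ("odd cat", minus)):
    nbar = float(np.sum(n * np.abs(state) ** 2))
    print(f"{name}: mean photon number ~ {nbar:.2f}")
```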
Modeling the acute health effects of astronauts from exposure to large solar particle events.
Hu, Shaowen; Kim, Myung-Hee Y; McClellan, Gene E; Cucinotta, Francis A
2009-04-01
Radiation exposure from Solar Particle Events (SPE) presents a significant health concern for astronauts for exploration missions outside the protection of the Earth's magnetic field, which could impair their performance and result in the possibility of failure of the mission. Assessing the potential for early radiation effects under such adverse conditions is of prime importance. Here we apply a biologically based mathematical model that describes the dose- and time-dependent early human responses that constitute the prodromal syndromes to consider acute risks from SPEs. We examine the possible early effects on crews from exposure to some historically large solar events on lunar and/or Mars missions. The doses and dose rates of specific organs were calculated using the Baryon radiation transport (BRYNTRN) code and a computerized anatomical man model, while the hazard of the early radiation effects and performance reduction were calculated using the Radiation-Induced Performance Decrement (RIPD) code. Based on model assumptions we show that exposure to these historical events would cause moderate early health effects to crew members inside a typical spacecraft or during extra-vehicular activities, if effective shielding and medical countermeasure tactics were not provided. We also calculate possible even worse cases (double intensity, multiple occurrences in a short period of time, etc.) to estimate the severity, onset and duration of various types of early illness. Uncertainties in the calculation due to limited data on relative biological effectiveness and dose-rate modifying factors for protons and secondary radiation, and the identification of sensitive sites in critical organs are discussed.
Gorbenko, Ksenia O.; Fraze, Taressa; Lewis, Valerie A.
2017-01-01
INTRODUCTION Accountable care organizations (ACOs) are a value-based payment model in the United States rooted in holding groups of healthcare providers financially accountable for the quality and total cost of care of their attributed population. To succeed in reaching their quality and efficiency goals, ACOs implement a variety of care delivery changes, including workforce redesign. Patient support personnel (PSP)—non-physician staff such as care coordinators, community health workers, and others—are critical to restructuring care delivery. Little is known about how ACOs are redesigning their patient support personnel in terms of responsibilities, location, and evaluation. METHODS We conducted semi-structured one-hour interviews with 25 executives at 16 distinct ACOs. The interviews were recorded, transcribed, and coded for themes, using a qualitative coding and analysis process. RESULTS ACOs deployed PSP to perform four clusters of responsibilities: care provision, care coordination, logistical help with transportation, and social and emotional support. ACOs deployed these personnel strategically across settings (primary care, inpatient services, emergency department, home care and community) depending on their population needs. Most ACOs used personnel with the same level of training across settings. Few ACOs planned to conduct a comprehensive evaluation of their PSP to optimize their value. DISCUSSION ACO strategies in workforce redesign indicate a shift from a physician-centered to a team-based approach. Employing personnel with varying levels of clinical training to perform different tasks can help further optimize care delivery. More robust evaluation of the deployment of PSP and their performance is needed to demonstrate cost-saving benefits of workforce redesign. PMID:28217305
Error Control Coding Techniques for Space and Satellite Communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.; Takeshita, Oscar Y.; Cabral, Hermano A.
1998-01-01
It is well known that the BER performance of a parallel concatenated turbo-code improves roughly as 1/N, where N is the information block length. However, it has been observed by Benedetto and Montorsi that for most parallel concatenated turbo-codes, the FER performance does not improve monotonically with N. In this report, we study the FER of turbo-codes, and the effects of their concatenation with an outer code. Two methods of concatenation are investigated: across several frames and within each frame. Some asymmetric codes are shown to have excellent FER performance with an information block length of 16384. We also show that the proposed outer coding schemes can improve the BER performance as well by eliminating pathological frames generated by the iterative MAP decoding process.
Los Alamos radiation transport code system on desktop computing platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.
The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss the hardware systems on which the codes run and present code performance comparisons for various machines.
Predicting the Performance of an Axial-Flow Compressor
NASA Technical Reports Server (NTRS)
Steinke, R. J.
1986-01-01
Stage-stacking computer code (STGSTK) developed for predicting off-design performance of multistage axial-flow compressors. Code uses meanline stage-stacking method. Stage and cumulative compressor performance calculated from representative meanline velocity diagrams located at rotor inlet and outlet meanline radii. Numerous options available within code. Code developed so users can modify correlations to suit their needs.
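The snippet below illustrates only the basic stage-stacking arithmetic, not the STGSTK correlations: per-stage pressure ratios and isentropic efficiencies (placeholder values) are accumulated into an overall pressure ratio, exit temperature, and overall isentropic efficiency.

```python
# Minimal illustration of stage-stacking arithmetic (not the STGSTK
# correlations themselves): given each stage's pressure ratio and isentropic
# efficiency, accumulate overall pressure ratio, temperature rise, and an
# overall isentropic efficiency. Stage values below are placeholders.
gamma = 1.4
T_in = 288.15  # inlet total temperature, K

stages = [  # (stage pressure ratio, stage isentropic efficiency)
    (1.35, 0.88), (1.32, 0.88), (1.30, 0.87), (1.28, 0.87), (1.26, 0.86),
]

T = T_in
pr_total = 1.0
for pr, eta in stages:
    dT_ideal = T * (pr ** ((gamma - 1) / gamma) - 1)
    T += dT_ideal / eta          # actual stage exit temperature
    pr_total *= pr               # cumulative pressure ratio

eta_overall = (pr_total ** ((gamma - 1) / gamma) - 1) / (T / T_in - 1)
print(f"overall PR = {pr_total:.2f}, exit T = {T:.1f} K, "
      f"overall isentropic efficiency = {eta_overall:.3f}")
```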
Software ``Best'' Practices: Agile Deconstructed
NASA Astrophysics Data System (ADS)
Fraser, Steven
This workshop will explore the intersection of agility and software development in a world of legacy code-bases and large teams. Organizations with hundreds of developers and code-bases exceeding a million or tens of millions of lines of code are seeking new ways to expedite development while retaining and attracting staff who desire to apply “agile” methods. This is a situation where specific agile practices may be embraced outside of their usual zone of applicability. Here is where practitioners must understand both what “best practices” already exist in the organization - and how they might be improved or modified by applying “agile” approaches.
Numerical predictions of EML (electromagnetic launcher) system performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schnurr, N.M.; Kerrisk, J.F.; Davidson, R.F.
1987-01-01
The performance of an electromagnetic launcher (EML) depends on a large number of parameters, including the characteristics of the power supply, rail geometry, rail and insulator material properties, injection velocity, and projectile mass. EML system performance is frequently limited by structural or thermal effects in the launcher (railgun). A series of computer codes has been developed at the Los Alamos National Laboratory to predict EML system performance and to determine the structural and thermal constraints on barrel design. These codes include FLD, a two-dimensional electrostatic code used to calculate the high-frequency inductance gradient and surface current density distribution for the rails; TOPAZRG, a two-dimensional finite-element code that simultaneously analyzes thermal and electromagnetic diffusion in the rails; and LARGE, a code that predicts the performance of the entire EML system. The NIKE2D code, developed at the Lawrence Livermore National Laboratory, is used to perform structural analyses of the rails. These codes have been instrumental in the design of the Lethality Test System (LTS) at Los Alamos, which has an ultimate goal of accelerating a 30-g projectile to a velocity of 15 km/s. The capabilities of the individual codes and the coupling of these codes to perform a comprehensive analysis are discussed in relation to the LTS design. Numerical predictions are compared with experimental data and presented for the LTS prototype tests.
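A lumped-parameter estimate of railgun performance can be written down from the force relation F = (1/2)L'I^2, which uses the inductance gradient mentioned above; the sketch below integrates this with a placeholder current pulse and projectile mass and is far simpler than the coupled FLD/TOPAZRG/LARGE analysis described in the abstract.

```python
# Lumped-parameter railgun performance estimate using F = 0.5 * L' * I(t)^2,
# integrated with simple Euler steps. The current pulse, inductance gradient,
# and projectile mass are placeholders, not LTS design values, and this is
# far simpler than the coupled code analysis described above.
L_prime = 0.45e-6   # inductance gradient (H/m), placeholder
mass = 0.030        # projectile mass (kg)
dt = 1.0e-6         # time step (s)

def current(t):
    # placeholder pulse: ramp to 2 MA over 0.1 ms, hold to 0.6 ms, then zero
    if t < 1.0e-4:
        return 2.0e6 * t / 1.0e-4
    return 2.0e6 if t < 6.0e-4 else 0.0

t, v, x = 0.0, 0.0, 0.0
while t < 1.0e-3:
    force = 0.5 * L_prime * current(t) ** 2   # accelerating force (N)
    v += force / mass * dt
    x += v * dt
    t += dt

print(f"muzzle velocity ~ {v / 1000:.1f} km/s after {x:.2f} m of travel")
```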
Design and System Implications of a Family of Wideband HF Data Waveforms
2010-09-01
code rates (i.e. 8/9, 9/10) will be used to attain the highest data rates for surface wave links. Very high puncturing of convolutional codes can...Communication Links”, Edition 1, North Atlantic Treaty Organization, 2009. [14] Yasuda, Y., Kashiki, K., Hirata, Y. “High- Rate Punctured Convolutional Codes ...length 7 convolutional code that has been used for over two decades in 110A. In addition, repetition coding and puncturing was
NASA Technical Reports Server (NTRS)
Lin, Shu; Rhee, Dojun; Rajpal, Sandeep
1993-01-01
This report presents a low-complexity and high-performance concatenated coding scheme for high-speed satellite communications. In this proposed scheme, the NASA Standard Reed-Solomon (RS) code over GF(2^8) is used as the outer code and the second-order Reed-Muller (RM) code of Hamming distance 8 is used as the inner code. The RM inner code has a very simple trellis structure and is decoded with the soft-decision Viterbi decoding algorithm. It is shown that the proposed concatenated coding scheme achieves an error performance which is comparable to that of the NASA TDRS concatenated coding scheme, in which the NASA Standard rate-1/2 convolutional code of constraint length 7 and d_free = 10 is used as the inner code. However, the proposed RM inner code has much smaller decoding complexity, less decoding delay, and much higher decoding speed. Consequently, the proposed concatenated coding scheme is suitable for reliable high-speed satellite communications, and it may be considered as an alternate coding scheme for the NASA TDRS system.
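A second-order Reed-Muller code with minimum distance 8 corresponds to RM(2,5), a (32,16) code; the sketch below constructs its generator matrix from all monomials of degree at most two in five binary variables and encodes one example message over GF(2). The trellis-based soft-decision Viterbi decoding used in the report is not reproduced here.

```python
# Builds the generator matrix of the second-order Reed-Muller code RM(2,5)
# (length 32, dimension 16, minimum distance 8, consistent with the inner
# code described above) from all monomials of degree <= 2 in five binary
# variables, and encodes one example message over GF(2).
import numpy as np
from itertools import combinations

m = 5
points = np.array([[(i >> k) & 1 for k in range(m)] for i in range(2 ** m)])

rows = [np.ones(2 ** m, dtype=int)]                       # degree-0 monomial
for r in (1, 2):
    for idx in combinations(range(m), r):
        rows.append(np.prod(points[:, idx], axis=1))      # x_i or x_i * x_j
G = np.array(rows) % 2                                    # 16 x 32 generator matrix

msg = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0])
codeword = msg @ G % 2
print(G.shape, codeword)
```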
Coding performance of the Probe-Orbiter-Earth communication link
NASA Technical Reports Server (NTRS)
Divsalar, D.; Dolinar, S.; Pollara, F.
1993-01-01
The coding performance of the Probe-Orbiter-Earth communication link is analyzed and compared for several cases. It is assumed that the coding system consists of a convolutional code at the Probe, a quantizer and another convolutional code at the Orbiter, and two cascaded Viterbi decoders or a combined decoder on the ground.
NASA Technical Reports Server (NTRS)
Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)
2008-01-01
An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA) codes. Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.
Di Giulio, Massimo
2017-02-07
Whereas it is extremely easy to prove that "if the biosynthetic relationships between amino acids were fundamental in the structuring of the genetic code, then their physico-chemical properties might also be revealed in the genetic code table"; it is, on the contrary, impossible to prove that "if the physico-chemical properties of amino acids were fundamental in the structuring of the genetic code, then the presence of the biosynthetic relationships between amino acids should not be revealed in the genetic code". And, given that in the genetic code table are mirrored both the biosynthetic relationships between amino acids and their physico-chemical properties, all this would be a test that would falsify the physico-chemical theories of the origin of the genetic code. That is to say, if the physico-chemical properties of amino acids had a fundamental role in organizing the genetic code, then we would not have duly revealed the presence - in the genetic code - of the biosynthetic relationships between amino acids, and on the contrary this has been observed. Therefore, this falsifies the physico-chemical theories of genetic code origin. Whereas, the coevolution theory of the origin of the genetic code would be corroborated by this analysis, because it would be able to give a description of evolution of the genetic code more coherent with the indisputable empirical observations that link both the biosynthetic relationships of amino acids and their physico-chemical properties to the evolutionary organization of the genetic code.
Low-density parity-check codes for volume holographic memory systems.
Pishro-Nik, Hossein; Rahnavard, Nazanin; Ha, Jeongseok; Fekri, Faramarz; Adibi, Ali
2003-02-10
We investigate the application of low-density parity-check (LDPC) codes in volume holographic memory (VHM) systems. We show that a carefully designed irregular LDPC code has a very good performance in VHM systems. We optimize high-rate LDPC codes for the nonuniform error pattern in holographic memories to reduce the bit error rate extensively. The prior knowledge of noise distribution is used for designing as well as decoding the LDPC codes. We show that these codes have a superior performance to that of Reed-Solomon (RS) codes and regular LDPC counterparts. Our simulation shows that we can increase the maximum storage capacity of holographic memories by more than 50 percent if we use irregular LDPC codes with soft-decision decoding instead of conventionally employed RS codes with hard-decision decoding. The performance of these LDPC codes is close to the information theoretic capacity.
Genomics dataset on unclassified published organism (patent US 7547531).
Khan Shawan, Mohammad Mahfuz Ali; Hasan, Md Ashraful; Hossain, Md Mozammel; Hasan, Md Mahmudul; Parvin, Afroza; Akter, Salina; Uddin, Kazi Rasel; Banik, Subrata; Morshed, Mahbubul; Rahman, Md Nazibur; Rahman, S M Badier
2016-12-01
Nucleotide (DNA) sequence analysis provides important clues regarding the characteristics and taxonomic position of an organism, and it is therefore crucial for learning about the hierarchical classification of that particular organism. This dataset (patent US 7547531) was chosen to simplify the complex raw data buried in undisclosed DNA sequences, which helps to open doors for new collaborations. In this dataset, a total of 48 unidentified DNA sequences from patent US 7547531 were selected and their complete sequences were retrieved from the NCBI BioSample database. Quick response (QR) codes of those DNA sequences were constructed with the DNA BarID tool. The QR code is useful for the identification and comparison of isolates with other organisms. The AT/GC content of the DNA sequences was determined using the ENDMEMO GC Content Calculator, which indicates their stability at different temperatures. The highest GC content was observed in GP445188 (62.5%), followed by GP445198 (61.8%) and GP445189 (59.44%), while the lowest was in GP445178 (24.39%). In addition, the New England BioLabs (NEB) database was used to identify the cleavage code indicating the 5', 3' and blunt ends, and the enzyme code indicating the methylation sites of the DNA sequences was also shown. These data will be helpful for the construction of the organisms' hierarchical classification, determination of their phylogenetic and taxonomic position and revelation of their molecular characteristics.
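The GC-content figures quoted above were obtained with the ENDMEMO calculator; the short function below shows the underlying calculation on a placeholder sequence (not one of the patent US 7547531 sequences).

```python
# Simple GC-content calculation of the kind reported above; the sequence
# here is a placeholder, not one of the patent US 7547531 sequences.
def gc_content(seq):
    seq = seq.upper()
    gc = sum(seq.count(base) for base in "GC")
    return 100.0 * gc / len(seq)

print(f"GC content: {gc_content('ATGCGGCCATATGCGCGTTA'):.2f}%")
```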
Maximum likelihood decoding analysis of accumulate-repeat-accumulate codes
NASA Technical Reports Server (NTRS)
Abbasfar, A.; Divsalar, D.; Yao, K.
2004-01-01
In this paper, the performance of repeat-accumulate codes with maximum-likelihood (ML) decoding is analyzed and compared to random codes by means of very tight bounds. Some simple codes are shown to perform very close to the Shannon limit with maximum-likelihood decoding.
MONTE CARLO STUDY OF THE CARDIAC ABSORBED DOSE DURING X-RAY EXAMINATION OF AN ADULT PATIENT.
Kadri, O; Manai, K; Alfuraih, A
2016-12-01
The computational voxel phantom 'High-Definition Reference Korean-Man (HDRK-Man)' was implemented into the Monte Carlo transport toolkit Geant4. The voxel model, adjusted to the Reference Korean Man, is 171 cm in height and 68 kg in weight and is composed of ∼30 million voxels whose size is 1.981 × 1.981 × 2.0854 mm³. The Geant4 code was then used to compute the dose conversion coefficients (DCCs), expressed in absorbed dose per air kerma free in air, for >30 tissues and organs, including almost all organs required in the new recommendation of ICRP 103, due to a broad parallel beam of monoenergetic photons impinging in the antero-posterior direction with energy ranging from 10 to 150 keV. The computed DCCs of different organs are found to be in good agreement with data published using other simulation codes. Also, the influence of patient size on DCC values was investigated for a representative body size of the adult Korean patient population. The study was performed using five different sizes covering the range of 0.8-1.2 magnification of the original HDRK-Man. It focussed on the computation of the DCC for the human heart. Moreover, the provided DCCs were used to present an analytical parameterisation for the calculation of the cardiac absorbed dose for any arbitrary X-ray spectrum and for those patient sizes. Thus, the present work can be considered as an enhancement of the continuous studies performed by medical physicists as part of quality control tests and radiation protection dosimetry.
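The analytical parameterisation itself is given in the paper; the sketch below only shows the generic spectrum-weighting step for using such DCCs, with hypothetical DCC values, a hypothetical spectrum, and an assumed measured air kerma.

```python
# Generic spectrum-weighting step for using dose conversion coefficients
# (DCCs): fold an energy-dependent DCC curve with a normalized air-kerma
# spectrum and scale by the measured air kerma free-in-air. The DCC values
# and spectrum below are hypothetical placeholders, not the HDRK-Man results
# or the analytical parameterisation given in the paper.
import numpy as np

energy_keV = np.array([20, 30, 40, 50, 60, 80, 100, 150])
dcc_heart = np.array([0.05, 0.25, 0.60, 0.90, 1.05, 1.15, 1.20, 1.25])  # Gy/Gy, placeholder
spectrum = np.array([0.02, 0.10, 0.22, 0.25, 0.20, 0.13, 0.06, 0.02])    # relative air kerma per bin

spectrum = spectrum / spectrum.sum()          # normalize spectral weights
air_kerma_mGy = 2.0                           # assumed measured air kerma free-in-air

heart_dose_mGy = air_kerma_mGy * np.sum(dcc_heart * spectrum)
print(f"estimated cardiac absorbed dose ~ {heart_dose_mGy:.2f} mGy")
```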
Code of Federal Regulations, 2012 CFR
2012-04-01
... from social security taxes by certain tax-exempt organizations. 31.3121(k)-4 Section 31.3121(k)-4... Contributions Act (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-4 Constructive... organization did not file a valid waiver certificate under section 3121(k)(1) of the Internal Revenue Code of...
Code of Federal Regulations, 2014 CFR
2014-04-01
... exempt organization before August 1, 1956. 31.3121(k)-3 Section 31.3121(k)-3 Internal Revenue INTERNAL... (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-3 Request for coverage of... section 3121(k), or under section 1426(l) of the Internal Revenue Code of 1939, may request after July 31...
Code of Federal Regulations, 2011 CFR
2011-04-01
... from social security taxes by certain tax-exempt organizations. 31.3121(k)-4 Section 31.3121(k)-4... Contributions Act (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-4 Constructive... organization did not file a valid waiver certificate under section 3121(k)(1) of the Internal Revenue Code of...
Code of Federal Regulations, 2013 CFR
2013-04-01
... from social security taxes by certain tax-exempt organizations. 31.3121(k)-4 Section 31.3121(k)-4... Contributions Act (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-4 Constructive... organization did not file a valid waiver certificate under section 3121(k)(1) of the Internal Revenue Code of...
Code of Federal Regulations, 2012 CFR
2012-04-01
... exempt organization before August 1, 1956. 31.3121(k)-3 Section 31.3121(k)-3 Internal Revenue INTERNAL... (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-3 Request for coverage of... section 3121(k), or under section 1426(l) of the Internal Revenue Code of 1939, may request after July 31...
Code of Federal Regulations, 2014 CFR
2014-04-01
... from social security taxes by certain tax-exempt organizations. 31.3121(k)-4 Section 31.3121(k)-4... Contributions Act (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-4 Constructive... organization did not file a valid waiver certificate under section 3121(k)(1) of the Internal Revenue Code of...
Code of Federal Regulations, 2013 CFR
2013-04-01
... exempt organization before August 1, 1956. 31.3121(k)-3 Section 31.3121(k)-3 Internal Revenue INTERNAL... (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-3 Request for coverage of... section 3121(k), or under section 1426(l) of the Internal Revenue Code of 1939, may request after July 31...
Code of Federal Regulations, 2011 CFR
2011-04-01
... exempt organization before August 1, 1956. 31.3121(k)-3 Section 31.3121(k)-3 Internal Revenue INTERNAL... (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-3 Request for coverage of... section 3121(k), or under section 1426(l) of the Internal Revenue Code of 1939, may request after July 31...
Further Development of a Model for Rod Ricochet
2007-02-01
Segletes, Steven B.
Confronting trade-offs in health care: Harvard Pilgrim Health Care's organizational ethics program.
Sabin, James E; Cochran, David
2007-01-01
Patients, providers, and policy leaders need a new moral compass to guide them in the turbulent U.S. health care system. Task forces have proposed excellent ethical codes, but these have been seen as too abstract to provide guidance at the front lines. Harvard Pilgrim Health Care's ten-year experience with an organizational ethics program suggests ways in which health care organizations can strengthen transparency, consumer focus, and overall ethical performance and contribute to the national health policy dialogue.
2015-04-01
...and execution of Performance Review Tool; organization, coding, and transcribing of collected data; analysis of qualitative survey and quantitative... University of Wisconsin System, Madison, WI 53715-1218. Annual report, April 2015, prepared for the U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland 21702-5012.
Morphodynamics and Geology of the Southeastern Virginia Shelf: False Cape Shoals Area (Phase 2)
2001-09-30
Code 322PO. Randolph A. McBride, Ph.D., Assistant Professor of Geology, Environmental Science and Policy, MS 5F2, George Mason University, Fairfax... Geology Laboratory at George Mason University. In addition, a Ph.D. student in Environmental Science and Policy, who is working under the direction
Rotorcraft Use in Disaster Relief and Mass Casualty Incidents - Case Studies
1990-06-01
...disaster preparedness agencies for use in the integration of local helicopter assets into the disaster planning process; and 3) produce a color video tape promoting the need for and the use of rotorcraft and heliports in disaster relief.
Radiative Augmented Combustion
1988-03-01
This is the Final Report on research on Radiative Augmented Combustion conducted at M.L. ENERGIA, Inc. (monitoring organization: AFOSR/NA). It was a... the first two annual reports prior to this one. The entire research program was performed at ENERGIA, Inc., Princeton, New Jersey, with Dr. Moshe Lavid
A trend analysis of surgical operations under a global payment system in Tehran, Iran (2005–2015)
Goudari, Faranak Behzadi; Rashidian, Arash; Arab, Mohammad; Mahmoudi, Mahmood
2018-01-01
Background The global payment system is an early per-case payment system in Iran that covers 60 commonly used surgical operations, for which payment is based on the average cost per case. Objective The aim of the study was to determine whether the trend of global operations decreased, increased, or remained unchanged. Methods In this retrospective longitudinal study, data on the 60 primary global surgery codes were gathered from the Tehran Health Insurance Organization for each month of the ten-year period 2005-2015. Out of the 60 surgery codes, acceptable data were available for only 46 codes, based on the insurance documents sent by medical centers. A quantitative time-series analysis using a regression model in STATA software v.11 was performed. Results Some global surgery codes had an upward trend and some a downward one. Of the N codes, N83, N20, N28, N63, and N93 had an upward trend (p<0.05), and N32, N43, N81 and N90 showed a significant downward trend (p<0.05). Similarly, all H codes except H18 had a significant upward trend (p<0.000). K codes including K45, K56 and K81 likewise showed an increasing trend. S codes showed both increasing and decreasing trends. However, none of the O codes changed over time. Other global surgery codes such as C61, E07, M51, L60, J98 (p<0.000), I84 (p<0.031) and I86 (p<0.000) showed upward or downward trends. The overall trend of global surgeries was significantly upward (B=24.26109, p<0.000). Conclusion The varying trends of global surgeries can partly reflect the behavior of service providers seeking to increase their profits and minimize their costs. PMID:29765576
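As an illustration of the kind of trend fit reported (the study itself used regression analysis in STATA v.11), the sketch below fits a linear monthly trend to synthetic counts; the numbers are placeholders, not Tehran Health Insurance Organization data.

```python
# Illustrative linear-trend fit on monthly counts of a surgery code; the
# counts below are synthetic placeholders, not the study's insurance data.
import numpy as np

months = np.arange(120)                                   # ten years of months
counts = 200 + 24.3 / 12 * months + np.random.default_rng(1).normal(0, 15, 120)

slope, intercept = np.polyfit(months, counts, 1)
residuals = counts - (slope * months + intercept)
se = np.sqrt(residuals.var(ddof=2) / ((months - months.mean()) ** 2).sum())
t_stat = slope / se
print(f"monthly trend B = {slope:.2f} (t = {t_stat:.1f})")
```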
Révész, Kinga M.; Doctor, Daniel H.
2014-01-01
The purposes of the Reston Stable Isotope Laboratory (RSIL) lab codes 1851 and 1852 are to determine the total carbon mass and the ratio of the stable isotopes of carbon (δ13C) for total dissolved inorganic carbon (DIC, lab code 1851) and total nonpurgeable dissolved organic carbon (DOC, lab code 1852) in aqueous samples. The analysis procedure is automated according to a method that utilizes a total carbon analyzer as a peripheral sample preparation device for analysis of carbon dioxide (CO2) gas by a continuous-flow isotope ratio mass spectrometer (CF-IRMS). The carbon analyzer produces CO2 and determines the carbon mass in parts per million (ppm) of DIC and DOC in each sample separately, and the CF-IRMS determines the carbon isotope ratio of the produced CO2. This configuration provides a fully automated analysis of total carbon mass and δ13C with no operator intervention, additional sample preparation, or other manual analysis. To determine the DIC, the carbon analyzer transfers a specified sample volume to a heated (70 °C) reaction vessel with a preprogrammed volume of 10% phosphoric acid (H3PO4), which allows the carbonate and bicarbonate species in the sample to dissociate to CO2. The CO2 from the reacted sample is subsequently purged with a flow of helium gas that sweeps the CO2 through an infrared CO2 detector and quantifies the CO2. The CO2 is then carried through a high-temperature (650 °C) scrubber reactor, a series of water traps, and ultimately to the inlet of the mass spectrometer. For the analysis of total dissolved organic carbon, the carbon analyzer performs a second step on the sample in the heated reaction vessel during which a preprogrammed volume of sodium persulfate (Na2S2O8) is added, and the hydroxyl radicals oxidize the organics to CO2. Samples containing 2 ppm to 30,000 ppm of carbon are analyzed. The precision of the carbon isotope analysis is within 0.3 per mill for DIC, and within 0.5 per mill for DOC.
Iterative demodulation and decoding of coded non-square QAM
NASA Technical Reports Server (NTRS)
Li, L.; Divsalar, D.; Dolinar, S.
2003-01-01
Simulation results show that, with iterative demodulation and decoding, coded NS-8QAM performs 0.5 dB better than standard 8QAM and 0.7 dB better than 8PSK at BER = 10^-5, when the FEC code is the (15, 11) Hamming code concatenated with a rate-1 accumulator code, while coded NS-32QAM performs 0.25 dB better than standard 32QAM.
Biphasic patterns of diversification and the emergence of modules
Mittenthal, Jay; Caetano-Anollés, Derek; Caetano-Anollés, Gustavo
2012-01-01
The intricate molecular and cellular structure of organisms converts energy to work, which builds and maintains structure. Evolving structure implements modules, in which parts are tightly linked. Each module performs characteristic functions. In this work we propose that a module can emerge through two phases of diversification of parts. Early in the first phase of this biphasic pattern, the parts have weak linkage—they interact weakly and associate variously. The parts diversify and compete. Under selection for performance, interactions among the parts increasingly constrain their structure and associations. As many variants are eliminated, parts self-organize into modules with tight linkage. Linkage may increase in response to exogenous stresses as well as endogenous processes. In the second phase of diversification, variants of the module and its functions evolve and become new parts for a new cycle of generation of higher-level modules. This linkage hypothesis can interpret biphasic patterns in the diversification of protein domain structure, RNA and protein shapes, and networks in metabolism, codes, and embryos, and can explain hierarchical levels of structural organization that are widespread in biology. PMID:22891076
Advanced coding and modulation schemes for TDRSS
NASA Technical Reports Server (NTRS)
Harrell, Linda; Kaplan, Ted; Berman, Ted; Chang, Susan
1993-01-01
This paper describes the performance of the Ungerboeck and pragmatic 8-Phase Shift Key (PSK) Trellis Code Modulation (TCM) coding techniques, with and without a (255,223) Reed-Solomon outer code, as they are used for Tracking and Data Relay Satellite System (TDRSS) S-Band and Ku-Band return services. The performance of these codes at high data rates is compared to uncoded Quadrature PSK (QPSK) and rate-1/2 convolutionally coded QPSK in the presence of Radio Frequency Interference (RFI), self-interference, and hardware distortions. This paper shows that the outer Reed-Solomon code is necessary to achieve a 10^-5 Bit Error Rate (BER) with an acceptable level of degradation in the presence of RFI. This paper also shows that the TCM codes, with or without the Reed-Solomon outer code, do not perform well in the presence of self-interference. In fact, the uncoded QPSK signal performs better than the TCM coded signal in the self-interference situation considered in this analysis. Finally, this paper shows that the Eb/N0 degradation due to TDRSS hardware distortions is approximately 1.3 dB with a TCM coded signal or a rate-1/2 convolutionally coded QPSK signal and is 3.2 dB with an uncoded QPSK signal.
Wetmore, Kelly M.; Price, Morgan N.; Waters, Robert J.; Lamson, Jacob S.; He, Jennifer; Hoover, Cindi A.; Blow, Matthew J.; Bristow, James; Butland, Gareth
2015-01-01
ABSTRACT Transposon mutagenesis with next-generation sequencing (TnSeq) is a powerful approach to annotate gene function in bacteria, but existing protocols for TnSeq require laborious preparation of every sample before sequencing. Thus, the existing protocols are not amenable to the throughput necessary to identify phenotypes and functions for the majority of genes in diverse bacteria. Here, we present a method, random bar code transposon-site sequencing (RB-TnSeq), which increases the throughput of mutant fitness profiling by incorporating random DNA bar codes into Tn5 and mariner transposons and by using bar code sequencing (BarSeq) to assay mutant fitness. RB-TnSeq can be used with any transposon, and TnSeq is performed once per organism instead of once per sample. Each BarSeq assay requires only a simple PCR, and 48 to 96 samples can be sequenced on one lane of an Illumina HiSeq system. We demonstrate the reproducibility and biological significance of RB-TnSeq with Escherichia coli, Phaeobacter inhibens, Pseudomonas stutzeri, Shewanella amazonensis, and Shewanella oneidensis. To demonstrate the increased throughput of RB-TnSeq, we performed 387 successful genome-wide mutant fitness assays representing 130 different bacterium-carbon source combinations and identified 5,196 genes with significant phenotypes across the five bacteria. In P. inhibens, we used our mutant fitness data to identify genes important for the utilization of diverse carbon substrates, including a putative d-mannose isomerase that is required for mannitol catabolism. RB-TnSeq will enable the cost-effective functional annotation of diverse bacteria using mutant fitness profiling. PMID:25968644
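The core of a BarSeq-style fitness estimate is a normalized log2 ratio of bar-code counts after versus before selective growth; the sketch below shows that step with placeholder counts, whereas the published RB-TnSeq pipeline additionally aggregates strains per gene and applies further normalization.

```python
# Core of a BarSeq-style fitness calculation: each strain's fitness is a
# normalized log2 ratio of its bar-code count after selective growth to its
# count in the start sample. Counts and pseudocount handling here are
# placeholders; the published RB-TnSeq pipeline also aggregates strains per
# gene and applies further normalization.
import numpy as np

start_counts = np.array([150, 40, 800, 10, 300], dtype=float)   # time-zero sample
end_counts = np.array([30, 90, 1600, 2, 310], dtype=float)      # after growth on a carbon source

pseudo = 1.0
raw = np.log2((end_counts + pseudo) / (start_counts + pseudo))
fitness = raw - np.median(raw)        # center so a typical strain has fitness ~0
print(np.round(fitness, 2))
```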
Public domain optical character recognition
NASA Astrophysics Data System (ADS)
Garris, Michael D.; Blue, James L.; Candela, Gerald T.; Dimmick, Darrin L.; Geist, Jon C.; Grother, Patrick J.; Janet, Stanley A.; Wilson, Charles L.
1995-03-01
A public domain document processing system has been developed by the National Institute of Standards and Technology (NIST). The system is a standard reference form-based handprint recognition system for evaluating optical character recognition (OCR), and it is intended to provide a baseline of performance on an open application. The system's source code, training data, performance assessment tools, and type of forms processed are all publicly available. The system recognizes the handprint entered on handwriting sample forms like the ones distributed with NIST Special Database 1. From these forms, the system reads hand-printed numeric fields, upper and lowercase alphabetic fields, and unconstrained text paragraphs composed of words from a limited-size dictionary. The modular design of the system makes it useful for component evaluation and comparison, training and testing set validation, and multiple system voting schemes. The system contains a number of significant contributions to OCR technology, including an optimized probabilistic neural network (PNN) classifier that operates a factor of 20 faster than traditional software implementations of the algorithm. The source code for the recognition system is written in C and is organized into 11 libraries. In all, there are approximately 19,000 lines of code supporting more than 550 subroutines. Source code is provided for form registration, form removal, field isolation, field segmentation, character normalization, feature extraction, character classification, and dictionary-based postprocessing. The recognition system has been successfully compiled and tested on a host of UNIX workstations. This paper gives an overview of the recognition system's software architecture, including descriptions of the various system components along with timing and accuracy statistics.
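The classifier family named above can be illustrated with a small Parzen-window probabilistic neural network in NumPy. This is a toy sketch on synthetic 2-D data, not NIST's optimized C implementation, and the kernel width is a hypothetical choice.

```python
# Toy probabilistic neural network (Parzen-window classifier) -- the classifier
# family mentioned above, not NIST's optimized implementation.
import numpy as np

def pnn_classify(train_x, train_y, test_x, sigma=1.0):
    """Pick the class whose average Gaussian-kernel response is largest."""
    classes = np.unique(train_y)
    preds = []
    for x in test_x:
        d2 = np.sum((train_x - x) ** 2, axis=1)          # squared distances
        k = np.exp(-d2 / (2.0 * sigma**2))                # Gaussian kernel
        preds.append(classes[np.argmax([k[train_y == c].mean() for c in classes])])
    return np.array(preds)

rng = np.random.default_rng(0)
x0, x1 = rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))
X, y = np.vstack([x0, x1]), np.array([0] * 50 + [1] * 50)
print(pnn_classify(X, y, np.array([[0.2, 0.1], [2.8, 3.1]])))   # expect [0 1]
```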
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, F.; Brown, K.; Flach, G.
The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems, including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows-based, graphical, object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been demonstrated successfully (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was developed to link GoldSim with external codes (Smith III et al. 2010). The DLL uses a list of code inputs provided by GoldSim to create an input file for the external application, runs the external code, and returns a list of outputs (read from files created by the external application) back to GoldSim. In this way GoldSim provides: (1) a unified user interface to the applications, (2) the capability of coupling selected codes in a synergistic manner, and (3) the capability of performing probabilistic uncertainty analysis with the codes. GoldSim is made available by the GoldSim Technology Group as a free 'Player' version that allows running but not editing GoldSim models. The Player version makes the software readily available to a wider community of users who wish to use the CBP application but do not have a license for GoldSim.
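The DLL interface described above follows a common write-run-read coupling pattern. The sketch below shows that pattern in Python under assumed file formats and a hypothetical executable name; the actual CBP interface is a compiled DLL called directly by GoldSim.

```python
# Sketch of the coupling pattern described above (write inputs, run the
# external code, read outputs back).  The executable and file names are
# hypothetical placeholders, not the CBP implementation.
import subprocess

def run_external_code(inputs, exe="cement_model.exe",
                      in_file="model.in", out_file="model.out"):
    # 1. Write the parameter list handed over by the driver to an input file.
    with open(in_file, "w") as f:
        for name, value in inputs.items():
            f.write(f"{name} = {value}\n")
    # 2. Launch the external application and wait for it to finish.
    subprocess.run([exe, in_file], check=True)
    # 3. Read "name value" pairs back from the output file for the driver.
    outputs = {}
    with open(out_file) as f:
        for line in f:
            name, value = line.split()
            outputs[name] = float(value)
    return outputs
```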
Toward performance portability of the Albany finite element analysis code using the Kokkos library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demeshko, Irina; Watkins, Jerry; Tezaur, Irina K.
2018-02-05
Performance portability on heterogeneous high-performance computing (HPC) systems is a major challenge faced today by code developers: parallel code needs to be executed correctly as well as with high performance on machines with different architectures, operating systems, and software libraries. The finite element method (FEM) is a popular and flexible method for discretizing partial differential equations arising in a wide variety of scientific, engineering, and industrial applications that require HPC. This paper presents some preliminary results pertaining to our development of a performance-portable implementation of the FEM-based Albany code. Performance portability is achieved using the Kokkos library. We present performance results for the Aeras global atmosphere dynamical core module in Albany. Finally, numerical experiments show that our single code implementation gives reasonable performance across three multicore/many-core architectures: NVIDIA Graphics Processing Units (GPUs), Intel Xeon Phis, and multicore CPUs.
45 CFR 74.42 - Codes of conduct.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., AND COMMERCIAL ORGANIZATIONS Post-Award Requirements Procurement Standards § 74.42 Codes of conduct... the gift is an unsolicited item of nominal value. The standards of conduct shall provide for...
Residential photovoltaic module and array requirements study
NASA Technical Reports Server (NTRS)
Nearhoof, S. L.; Oster, J. R.
1979-01-01
Design requirements for photovoltaic modules and arrays used in residential applications were identified. Building codes and referenced standards were reviewed for their applicability to residential photovoltaic array installations. Four installation types were identified - integral (replaces roofing), direct (mounted on top of roofing), stand-off (mounted away from roofing), and rack (for flat or low slope roofs, or ground mounted). Installation costs were developed for these mounting types as a function of panel/module size. Studies were performed to identify optimum module shapes and sizes and operating voltage cost drivers. It is concluded that there are no perceived major obstacles to the use of photovoltaic modules in residential arrays. However, there is no applicable building code category for residential photovoltaic modules and arrays and additional work with standards writing organizations is needed to develop residential module and array requirements.
A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling the cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for the exploration of various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
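The kind of inputs listed above (array base addresses and loop bounds) can drive a very small cache model. The sketch below simulates a direct-mapped cache over the address stream of a matrix-matrix multiply; all parameters are hypothetical and the model is far simpler than the paper's methodology.

```python
# Toy cache model: replay the address stream of C[i][j] += A[i][k] * B[k][j]
# against a direct-mapped cache built from base addresses and loop bounds.
# Every parameter here is a hypothetical choice, not the paper's tool.
def simulate_direct_mapped(n=64, line=64, lines=512, elem=8,
                           base_a=0x100000, base_b=0x200000, base_c=0x300000):
    tags = [None] * lines
    misses = accesses = 0

    def touch(addr):
        nonlocal misses, accesses
        accesses += 1
        idx = (addr // line) % lines          # cache set index
        tag = addr // (line * lines)          # remaining high bits
        if tags[idx] != tag:
            tags[idx] = tag
            misses += 1

    for i in range(n):
        for j in range(n):
            for k in range(n):
                touch(base_a + (i * n + k) * elem)   # A[i][k]
                touch(base_b + (k * n + j) * elem)   # B[k][j]
                touch(base_c + (i * n + j) * elem)   # C[i][j]
    return misses / accesses

print(f"estimated miss ratio: {simulate_direct_mapped():.3f}")
```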
Some partial-unit-memory convolutional codes
NASA Technical Reports Server (NTRS)
Abdel-Ghaffar, K.; Mceliece, R. J.; Solomon, G.
1991-01-01
The results of a study on a class of error correcting codes called partial unit memory (PUM) codes are presented. This class of codes, though not entirely new, has until now remained relatively unexplored. The possibility of using the well developed theory of block codes to construct a large family of promising PUM codes is shown. The performance of several specific PUM codes is compared with that of the Voyager standard (2, 1, 6) convolutional code. It was found that these codes can outperform the Voyager code with little or no increase in decoder complexity. This suggests that there may very well be PUM codes that can be used for deep space telemetry that offer both increased performance and decreased implementation complexity over current coding systems.
NASA Astrophysics Data System (ADS)
Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.
2010-04-01
An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance of generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work has been contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system model was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding with minimal computational complexity.
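For context only, the sketch below evaluates the classical standard Gaussian approximation for asynchronous DS-CDMA BER in AWGN with K users and processing gain N. It is not the arbitrarily accurate analysis of the paper and ignores the Rayleigh-fading aspect; the parameter values are hypothetical.

```python
# Not the paper's analysis -- just the classical standard Gaussian
# approximation (SGA) for asynchronous DS-CDMA in AWGN, for rough context:
#   Pb ~= Q( [ (K-1)/(3N) + N0/(2Eb) ]^(-1/2) )
# with K users, processing gain N, and per-bit SNR Eb/N0.
from math import erfc, sqrt

def q(x):
    return 0.5 * erfc(x / sqrt(2.0))

def ber_sga(num_users, gain, ebn0_db):
    ebn0 = 10 ** (ebn0_db / 10.0)
    return q(1.0 / sqrt((num_users - 1) / (3.0 * gain) + 1.0 / (2.0 * ebn0)))

for k in (1, 10, 30):                       # hypothetical loading levels
    print(f"K={k:2d}, N=64, Eb/N0=10 dB -> BER ~ {ber_sga(k, 64, 10):.2e}")
```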
NASA Astrophysics Data System (ADS)
Nitadori, Keigo; Makino, Junichiro; Hut, Piet
2006-12-01
The main performance bottleneck of gravitational N-body codes is the force calculation between two particles. We have succeeded in speeding up this pair-wise force calculation by factors between 2 and 10, depending on the code and the processor on which the code is run. These speed-ups were obtained by writing highly fine-tuned code for x86_64 microprocessors. Any existing N-body code running on these chips can easily incorporate our assembly code programs. In the current paper, we present an outline of our overall approach, which we illustrate with one specific example: the use of a Hermite scheme for a direct N^2-type integration on a single 2.0 GHz Athlon 64 processor, for which we obtain an effective performance of 4.05 Gflops for double-precision accuracy. In subsequent papers, we will discuss other variations, including the combinations of N log N codes, single-precision implementations, and performance on other microprocessors.
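The pair-wise force loop referred to above can be written in a few lines of NumPy for reference. The paper's speed-ups come from hand-tuned x86_64 assembly, which is not reproduced here, and the softening length used below is a hypothetical choice.

```python
# Reference direct-summation force kernel (O(N^2)), in plain NumPy -- the hot
# loop the paper tunes in assembly, not their optimized implementation.
import numpy as np

def accelerations(pos, mass, eps=1e-3):
    """Gravitational accelerations in G = 1 units, with softening eps."""
    dx = pos[None, :, :] - pos[:, None, :]          # (N, N, 3) pairwise offsets
    r2 = np.sum(dx * dx, axis=-1) + eps**2          # softened squared distances
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                   # drop self-interaction
    return np.einsum("ij,j,ijk->ik", inv_r3, mass, dx)

rng = np.random.default_rng(1)
pos = rng.standard_normal((256, 3))
mass = np.full(256, 1.0 / 256)
print(accelerations(pos, mass).shape)               # (256, 3)
```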
Insertion of operation-and-indicate instructions for optimized SIMD code
Eichenberger, Alexander E; Gara, Alan; Gschwind, Michael K
2013-06-04
Mechanisms are provided for inserting indicated instructions for tracking and indicating exceptions in the execution of vectorized code. A portion of first code is received for compilation. The portion of first code is analyzed to identify non-speculative instructions performing designated non-speculative operations in the first code that are candidates for replacement by replacement operation-and-indicate instructions that perform the designated non-speculative operations and further perform an indication operation for indicating any exception conditions corresponding to special exception values present in vector register inputs to the replacement operation-and-indicate instructions. The replacement is performed and second code is generated based on the replacement of the at least one non-speculative instruction. The data processing system executing the compiled code is configured to store special exception values in vector output registers, in response to a speculative instruction generating an exception condition, without initiating exception handling.
NASA Technical Reports Server (NTRS)
Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction, and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. parallelizing tools and compiler evaluation; 2. code cleanup and serial optimization using automated scripts; 3. development of a code generator for performance prediction; 4. automated partitioning; 5. automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.
45 CFR 12.3 - General policies.
Code of Federal Regulations, 2012 CFR
2012-10-01
... with State or local zoning restrictions, building codes, or similar limitations. (e) Organizations...-exempt under section 501(c)(3) of the Internal Revenue Code of 1954. (c) Real property will be requested...
45 CFR 12.3 - General policies.
Code of Federal Regulations, 2014 CFR
2014-10-01
... with State or local zoning restrictions, building codes, or similar limitations. (e) Organizations...-exempt under section 501(c)(3) of the Internal Revenue Code of 1954. (c) Real property will be requested...
45 CFR 12.3 - General policies.
Code of Federal Regulations, 2011 CFR
2011-10-01
... with State or local zoning restrictions, building codes, or similar limitations. (e) Organizations...-exempt under section 501(c)(3) of the Internal Revenue Code of 1954. (c) Real property will be requested...
45 CFR 12.3 - General policies.
Code of Federal Regulations, 2010 CFR
2010-10-01
... with State or local zoning restrictions, building codes, or similar limitations. (e) Organizations...-exempt under section 501(c)(3) of the Internal Revenue Code of 1954. (c) Real property will be requested...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-23
... Offshore Drilling Units AGENCY: Coast Guard, DHS. ACTION: Notice of availability. SUMMARY: The Coast Guard...), Code for the Construction and Equipment of Mobile Offshore Drilling Units, 2009 (2009 MODU Code). CG...: Background and Purpose Foreign documented MODUs engaged in any offshore activity associated with the...
A Code of Ethics for All Adult Educators?
ERIC Educational Resources Information Center
Wood, George S., Jr.
1996-01-01
Offers a code of ethics for adult educators, outlining ethical responsibilities to society, to learners, to the sponsoring organization and other stakeholders, and to the profession. Stresses that this is more of a framework than a code, and adult educators can use it to reflect systematically upon the specifics of practice. (SK)
40 CFR 52.1570 - Identification of plan.
Code of Federal Regulations, 2012 CFR
2012-07-01
... regulation, section 7:1-3.1 of New Jersey Air Pollution Control Code, submitted on November 20, 1973, by the... regulation, section 7:27-2.1 of the New Jersey Air Pollution Control Code, submitted on November 19, 1975, by... and Prohibition of Air Pollution by Volatile Organic Substances,” New Jersey Administrative Code (N.J...
40 CFR 52.1570 - Identification of plan.
Code of Federal Regulations, 2014 CFR
2014-07-01
... regulation, section 7:1-3.1 of New Jersey Air Pollution Control Code, submitted on November 20, 1973, by the... regulation, section 7:27-2.1 of the New Jersey Air Pollution Control Code, submitted on November 19, 1975, by... and Prohibition of Air Pollution by Volatile Organic Substances,” New Jersey Administrative Code (N.J...
40 CFR 52.1570 - Identification of plan.
Code of Federal Regulations, 2013 CFR
2013-07-01
... regulation, section 7:1-3.1 of New Jersey Air Pollution Control Code, submitted on November 20, 1973, by the... regulation, section 7:27-2.1 of the New Jersey Air Pollution Control Code, submitted on November 19, 1975, by... and Prohibition of Air Pollution by Volatile Organic Substances,” New Jersey Administrative Code (N.J...
International Code of Marketing of Breast-Milk Substitutes.
ERIC Educational Resources Information Center
World Health Organization, Geneva (Switzerland).
The World Health Organization's final draft of the "International Code of Marketing of Breast-milk Substitutes" is presented in its entirety. Recognizing that breast-feeding is an unequalled way of providing ideal food for the healthy growth and development of infants, the Code's aim is to contribute to the safe and adequate nutrition of…
Perspective: a culture of respect, part 2: creating a culture of respect.
Leape, Lucian L; Shore, Miles F; Dienstag, Jules L; Mayer, Robert J; Edgman-Levitan, Susan; Meyer, Gregg S; Healy, Gerald B
2012-07-01
Creating a culture of respect is the essential first step in a health care organization's journey to becoming a safe, high-reliability organization that provides a supportive and nurturing environment and a workplace that enables staff to engage wholeheartedly in their work. A culture of respect requires that the institution develop effective methods for responding to episodes of disrespectful behavior while also initiating the cultural changes needed to prevent such episodes from occurring. Both responding to and preventing disrespect are major challenges for the organization's leader, who must create the preconditions for change, lead in establishing and enforcing policies, enable frontline worker engagement, and facilitate the creation of a safe learning environment. When disrespectful behavior occurs, it must be addressed consistently and transparently. Central to an effective response is a code of conduct that establishes unequivocally the expectation that everyone is entitled to be treated with courtesy, honesty, respect, and dignity. The code must be enforced fairly through a clear and explicit process and applied consistently regardless of rank or station. Creating a culture of respect requires action on many fronts: modeling respectful conduct; educating students, physicians, and nonphysicians on appropriate behavior; conducting performance evaluations to identify those in need of help; providing counseling and training when needed; and supporting frontline changes that increase the sense of fairness, transparency, collaboration, and individual responsibility.
Error-Rate Bounds for Coded PPM on a Poisson Channel
NASA Technical Reports Server (NTRS)
Moision, Bruce; Hamkins, Jon
2009-01-01
Equations for computing tight bounds on error rates for coded pulse-position modulation (PPM) on a Poisson channel at high signal-to-noise ratio have been derived. These equations and elements of the underlying theory are expected to be especially useful in designing codes for PPM optical communication systems. The equations and the underlying theory apply, more specifically, to a case in which a) At the transmitter, a linear outer code is concatenated with an inner code that includes an accumulator and a bit-to-PPM-symbol mapping (see figure) [this concatenation is known in the art as "accumulate-PPM" (abbreviated "APPM")]; b) The transmitted signal propagates on a memoryless binary-input Poisson channel; and c) At the receiver, near-maximum-likelihood (ML) decoding is effected through an iterative process. Such a coding/modulation/decoding scheme is a variation on the concept of turbo codes, which have complex structures, such that an exact analytical expression for the performance of a particular code is intractable. However, techniques for accurately estimating the performances of turbo codes have been developed. The performance of a typical turbo code includes (1) a "waterfall" region consisting of a steep decrease of error rate with increasing signal-to-noise ratio (SNR) at low to moderate SNR, and (2) an "error floor" region with a less steep decrease of error rate with increasing SNR at moderate to high SNR. The techniques used heretofore for estimating performance in the waterfall region have differed from those used for estimating performance in the error-floor region. For coded PPM, prior to the present derivations, equations for accurate prediction of the performance of coded PPM at high SNR did not exist, so that it was necessary to resort to time-consuming simulations in order to make such predictions. The present derivation makes it unnecessary to perform such time-consuming simulations.
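For orientation, the sketch below Monte Carlo-estimates the symbol error rate of uncoded M-ary PPM on a Poisson channel (one signal slot with mean ns+nb photon counts, the remaining slots with mean nb). The paper's contribution is the analytic bound for the coded (APPM) case, which is not reproduced here, and the photon-count parameters below are hypothetical.

```python
# Quick Monte Carlo check of uncoded M-ary PPM on a Poisson channel: the
# receiver picks the slot with the largest count, breaking ties at random.
# Context only -- not the paper's analytic bounds for coded PPM.
import numpy as np

def ppm_symbol_error(M=16, ns=5.0, nb=0.2, trials=200_000, seed=0):
    rng = np.random.default_rng(seed)
    counts = rng.poisson(nb, size=(trials, M))        # background in every slot
    counts[:, 0] += rng.poisson(ns, size=trials)      # symbol 0 transmitted
    best = counts.max(axis=1)
    ties = (counts == best[:, None]).sum(axis=1)
    correct_hit = counts[:, 0] == best
    p_correct = np.where(correct_hit, 1.0 / ties, 0.0).mean()  # random tie-break
    return 1.0 - p_correct

print(f"PPM symbol error rate ~ {ppm_symbol_error():.3e}")
```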
Bounds on Block Error Probability for Multilevel Concatenated Codes
NASA Technical Reports Server (NTRS)
Lin, Shu; Moorthy, Hari T.; Stojanovic, Diana
1996-01-01
Maximum likelihood decoding of long block codes is not feasible due to its large complexity. Some classes of codes are shown to be decomposable into multilevel concatenated codes (MLCC). For these codes, multistage decoding provides a good trade-off between performance and complexity. In this paper, we derive an upper bound on the probability of block error for MLCC. We use this bound to evaluate the difference in performance between different decompositions of some codes. The examples given show that a significant reduction in complexity can be achieved by increasing the number of decoding stages. The resulting performance degradation varies for different decompositions. A guideline is given for finding good m-level decompositions.
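The paper's bound for multilevel concatenated codes is not reproduced here; as a reminder of the basic ingredient, the sketch below evaluates the standard union bound on block error probability for soft-decision maximum-likelihood decoding over an AWGN channel, using the (7,4) Hamming code's weight distribution as a small stand-in example.

```python
# Standard union bound on block error for soft-decision ML decoding on AWGN,
# computed from a code's weight distribution A_d:
#   P_B <= sum_d A_d * Q( sqrt(2 * d * R * Eb/N0) )
# Shown on the (7,4) Hamming code, not the MLCC decompositions of the paper.
from math import erfc, sqrt

def q(x):
    return 0.5 * erfc(x / sqrt(2.0))

def union_bound(weights, n, k, ebn0_db):
    rate, ebn0 = k / n, 10 ** (ebn0_db / 10.0)
    return sum(a_d * q(sqrt(2.0 * d * rate * ebn0))
               for d, a_d in weights.items())

hamming74 = {3: 7, 4: 7, 7: 1}          # nonzero-weight codewords of (7,4)
for snr in (4, 6, 8):
    print(f"Eb/N0 = {snr} dB -> P_block <= {union_bound(hamming74, 7, 4, snr):.2e}")
```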
Brian Roy Lockhart; Ralph D. Nyland
2004-01-01
Professional ethics involve statements by a professional organization to guide the behavior of its members, and to help them determine acceptable and unacceptable behavior in a given situation. Most, if not all, natural resource organizations have a Code of Ethics. How to incorporate them across the curriculum and in individual courses of a natural resources program is a...
Rady, Mohamed Y; Verheijde, Joseph L
2014-06-02
End-of-life organ donation is controversial in Islam. The controversy stems from: (1) scientifically flawed medical criteria of death determination; (2) invasive perimortem procedures for preserving transplantable organs; and (3) incomplete disclosure of information to consenting donors and families. Data from a survey of Muslims residing in Western countries have shown that the interpretation of religious scriptures and advice of faith leaders were major barriers to willingness for organ donation. Transplant advocates have proposed corrective interventions: (1) reinterpreting religious scriptures, (2) reeducating faith leaders, and (3) utilizing media campaigns to overcome religious barriers in Muslim communities. This proposal disregards the intensifying scientific, legal, and ethical controversies in Western societies about the medical criteria of death determination in donors. It would also violate the dignity and inviolability of human life which are pertinent values incorporated in the Islamic moral code. Reinterpreting religious scriptures to serve the utilitarian objectives of a controversial end-of-life practice, perceived to be socially desirable, transgresses the Islamic moral code. It may also have deleterious practical consequences, as donors can suffer harm before death. The negative normative consequences of utilitarian secular moral reasoning reset the Islamic moral code upholding the sanctity and dignity of human life.
PMID:24888748
Fidelity of the Integrated Force Method Solution
NASA Technical Reports Server (NTRS)
Hopkins, Dale; Halford, Gary; Coroneos, Rula; Patnaik, Surya
2002-01-01
The theory of strain compatibility in the solid mechanics discipline had remained incomplete since St. Venant's 'strain formulation' in 1876. We have addressed the compatibility condition both in the continuum and in the discrete system. This has led to the formulation of the Integrated Force Method (IFM). A dual Integrated Force Method with displacement as the primal variable has also been formulated. A modest finite element code (IFM/Analyzers) based on the IFM theory has been developed. For a set of standard test problems, the IFM results were compared with the stiffness method solutions and the MSC/Nastran code. For these problems, IFM outperformed the existing methods. Superior IFM performance is attributed to the simultaneous satisfaction of the equilibrium equations and the compatibility condition. The MSC/Nastran organization expressed reluctance to accept the high fidelity IFM solutions. This report discusses the solutions to the examples. No inaccuracy was detected in the IFM solutions. A stiffness method code can, with a small programming effort, be improved to reap the many IFM benefits when implemented with the IFMD elements. Dr. Halford conducted a peer review of the Integrated Force Method; the reviewers' responses are included.
International variation in the definition of 'main condition' in ICD-coded health data.
Quan, H; Moskal, L; Forster, A J; Brien, S; Walker, R; Romano, P S; Sundararajan, V; Burnand, B; Henriksson, G; Steinum, O; Droesler, S; Pincus, H A; Ghali, W A
2014-10-01
Hospital-based medical records are abstracted to create International Classification of Disease (ICD) coded discharge health data in many countries. The 'main condition' is not defined in a consistent manner internationally. Some countries employ a 'reason for admission' rule as the basis for the main condition, while other countries employ a 'resource use' rule. A few countries have recently transitioned from one of these approaches to the other. The definition of 'main condition' in such ICD data matters when it is used to define a disease cohort to assign diagnosis-related groups and to perform risk adjustment. We propose a method of harmonizing the international definition to enable researchers and international organizations using ICD-coded health data to aggregate or compare hospital care and outcomes across countries in a consistent manner. Inter-observer reliability of alternative harmonization approaches should be evaluated before finalizing the definition and adopting it worldwide. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
Comparison of simple sequence repeats in 19 Archaea.
Trivedi, S
2006-12-05
All organisms that have been studied until now have been found to have differential distribution of simple sequence repeats (SSRs), with more SSRs in intergenic than in coding sequences. SSR distribution was investigated in Archaea genomes where complete chromosome sequences of 19 Archaea were analyzed with the program SPUTNIK to find di- to penta-nucleotide repeats. The number of repeats was determined for the complete chromosome sequences and for the coding and non-coding sequences. Different from what has been found for other groups of organisms, there is an abundance of SSRs in coding regions of the genome of some Archaea. Dinucleotide repeats were rare and CG repeats were found in only two Archaea. In general, trinucleotide repeats are the most abundant SSR motifs; however, pentanucleotide repeats are abundant in some Archaea. Some of the tetranucleotide and pentanucleotide repeat motifs are organism specific. In general, repeats are short and CG-rich repeats are present in Archaea having a CG-rich genome. Among the 19 Archaea, SSR density was not correlated with genome size or with optimum growth temperature. Pentanucleotide density had an inverse correlation with the CG content of the genome.
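An illustrative scan in the spirit of the analysis above (the study itself used the SPUTNIK program): find perfect di- to penta-nucleotide repeats with a regular expression. The minimum-copy threshold and the demo sequence are hypothetical.

```python
# Illustrative SSR scan for perfect di- to penta-nucleotide repeats; the
# repeat threshold is a hypothetical choice, not the study's SPUTNIK settings.
import re

def find_ssrs(seq, min_repeats=4):
    hits = []
    for motif_len in range(2, 6):                       # di- to penta-nucleotide
        pattern = re.compile(r"((?:[ACGT]{%d}))\1{%d,}" % (motif_len, min_repeats - 1))
        for m in pattern.finditer(seq):
            hits.append((m.start(), m.group(1), len(m.group(0)) // motif_len))
    return hits

demo = "TTGACACACACACAGGTAGCAGCAGCAGCTTACGTACGTACGTACGTA"   # toy sequence
for start, motif, copies in find_ssrs(demo):
    print(f"pos {start}: ({motif}) x {copies}")
```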
Global Magnetohydrodynamic Simulation Using High Performance FORTRAN on Parallel Computers
NASA Astrophysics Data System (ADS)
Ogino, T.
High Performance Fortran (HPF) is one of the modern and common techniques used to achieve high performance parallel computation. We have translated a 3-dimensional magnetohydrodynamic (MHD) simulation code of the Earth's magnetosphere from VPP Fortran to HPF/JA on the Fujitsu VPP5000/56 vector-parallel supercomputer; the MHD code was fully vectorized and fully parallelized in VPP Fortran. The overall performance and capability of the HPF MHD code were shown to be almost comparable to those of the VPP Fortran version. A 3-dimensional global MHD simulation of the Earth's magnetosphere was performed at a speed of over 400 Gflops, an efficiency of 76.5% relative to the catalog peak performance of the VPP5000/56 in vector and parallel computation. We have concluded that fluid and MHD codes that are fully vectorized and fully parallelized in VPP Fortran can be translated with relative ease to HPF/JA, and that a code in HPF/JA may be expected to perform comparably to the same code written in VPP Fortran.
NASA Technical Reports Server (NTRS)
Tsuchiya, T.; Murthy, S. N. B.
1982-01-01
A computer code is presented for the prediction of off-design axial flow compressor performance with water ingestion. Four processes were considered to account for the aero-thermo-mechanical interactions during operation with air-water droplet mixture flow: (1) blade performance change, (2) centrifuging of water droplets, (3) heat and mass transfer between the gaseous and the liquid phases, and (4) droplet size redistribution due to break-up. Stage and compressor performance are obtained by a stage stacking procedure using representative velocity diagrams at rotor inlet and outlet mean radii. The Code has options for performance estimation with (1) gas mixtures and (2) gas-water droplet mixtures, and therefore can take into account the humidity present in ambient conditions. A test case illustrates the method of using the Code. The Code follows closely the methodology and architecture of the NASA-STGSTK Code for the estimation of axial-flow compressor performance with air flow.
7 CFR 1160.114 - Eligible organization.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Eligible organization. 1160.114 Section 1160.114... Order Definitions § 1160.114 Eligible organization. Eligible organization means an organization eligible... organization pursuant to section 501(c) (3), (5), or (6) of the Internal Revenue Code (26 U.S.C. 501(c) (3), (5...
Binary weight distributions of some Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Pollara, F.; Arnold, S.
1992-01-01
The binary weight distributions of the (7,5) and (15,9) Reed-Solomon (RS) codes and their duals are computed using the MacWilliams identities. Several mappings of symbols to bits are considered and those offering the largest binary minimum distance are found. These results are then used to compute bounds on the soft-decoding performance of these codes in the presence of additive Gaussian noise. These bounds are useful for finding large binary block codes with good performance and for verifying the performance obtained by specific soft-decoding algorithms presently under development.
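The MacWilliams identity used above can be sketched for binary codes as follows. The example checks the (7,4) Hamming code, whose dual (the (7,3) simplex code) has all nonzero words of weight 4; this is a textbook stand-in, not the Reed-Solomon symbol-to-bit mappings studied in the paper.

```python
# Binary MacWilliams identity via Krawtchouk polynomials:
#   B_j = (1/2^k) * sum_i A_i * K_j(i),
#   K_j(i) = sum_l (-1)^l * C(i, l) * C(n - i, j - l)
# Checked on the (7,4) Hamming code; its dual is the (7,3) simplex code.
from math import comb

def dual_weights(A, n, k):
    def kraw(j, i):
        return sum((-1) ** l * comb(i, l) * comb(n - i, j - l) for l in range(j + 1))
    return [sum(A[i] * kraw(j, i) for i in range(n + 1)) // 2 ** k
            for j in range(n + 1)]

hamming74 = [1, 0, 0, 7, 7, 0, 0, 1]            # A_0..A_7 of the (7,4) code
print(dual_weights(hamming74, 7, 4))            # expect [1, 0, 0, 0, 7, 0, 0, 0]
```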
Long distance quantum communication with quantum Reed-Solomon codes
NASA Astrophysics Data System (ADS)
Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang; Jianggroup Team
We study the construction of quantum Reed-Solomon codes from classical Reed-Solomon codes and show that they achieve the capacity of the quantum erasure channel for multi-level quantum systems. We extend the application of quantum Reed-Solomon codes to long distance quantum communication, investigate the local resource overhead needed for the functioning of one-way quantum repeaters with these codes, and numerically identify the parameter regime where these codes perform better than the known quantum polynomial codes and quantum parity codes. Finally, we discuss the implementation of these codes in time-bin photonic states of qubits and qudits, respectively, and optimize the performance for one-way quantum repeaters.
15 CFR 14.42 - Codes of conduct.
Code of Federal Regulations, 2010 CFR
2010-01-01
... ORGANIZATIONS Post-Award Requirements Procurement Standards § 14.42 Codes of conduct. The recipient shall... standards for situations in which the financial interest is not substantial or the gift is an unsolicited...
20 CFR 435.42 - Codes of conduct.
Code of Federal Regulations, 2010 CFR
2010-04-01
... ORGANIZATIONS Post-Award Requirements Procurement Standards § 435.42 Codes of conduct. The recipient must... set standards for situations in which the financial interest is not substantial or the gift is an...
28 CFR 70.42 - Codes of conduct.
Code of Federal Regulations, 2010 CFR
2010-07-01
...-PROFIT ORGANIZATIONS Post-Award Requirements Procurement Standards § 70.42 Codes of conduct. The..., recipients may set standards for situations in which the financial interest is not substantial or the gift is...
The Geometric Organizer: A Study Technique.
ERIC Educational Resources Information Center
Derr, Alice M.; Peters, Chris L.
1986-01-01
The geometric organizer, a multisensory technique using visual mnemonic devices that key information to color-coded geometric shapes, can help learning disabled students read, organize, and study information in content subject textbooks. (CL)
Agency-wide Quality System Documents
Quality specifications for EPA organizations as defined by EPA Directives are internal policy documents that apply only to EPA organizations. The Code of Federal Regulations defines specifications for extramural agreements with non-EPA organizations.
The organization of conspecific face space in nonhuman primates
Parr, Lisa A.; Taubert, Jessica; Little, Anthony C.; Hancock, Peter J. B.
2013-01-01
Humans and chimpanzees demonstrate numerous cognitive specializations for processing faces, but comparative studies with monkeys suggest that these may be the result of recent evolutionary adaptations. The present study utilized the novel approach of face space, a powerful theoretical framework used to understand the representation of face identity in humans, to further explore species differences in face processing. According to the theory, faces are represented by vectors in a multidimensional space, the centre of which is defined by an average face. Each dimension codes features important for describing a face’s identity, and vector length codes the feature’s distinctiveness. Chimpanzees and rhesus monkeys discriminated male and female conspecifics’ faces, rated by humans for their distinctiveness, using a computerized task. Multidimensional scaling analyses showed that the organization of face space was similar between humans and chimpanzees. Distinctive faces had the longest vectors and were the easiest for chimpanzees to discriminate. In contrast, distinctiveness did not correlate with the performance of rhesus monkeys. The feature dimensions for each species’ face space were visualized and described using morphing techniques. These results confirm species differences in the perceptual representation of conspecific faces, which are discussed within an evolutionary framework. PMID:22670823
A computational search for box C/D snoRNA genes in the Drosophila melanogaster genome.
Accardo, M C; Giordano, E; Riccardo, S; Digilio, F A; Iazzetti, G; Calogero, R A; Furia, M
2004-12-12
In eukaryotes, the family of non-coding RNA genes includes a number of genes encoding small nucleolar RNAs (mainly C/D and H/ACA snoRNAs), which act as guides in the maturation or post-transcriptional modification of target RNA molecules. Since in Drosophila melanogaster (Dm) only a few examples of snoRNAs have been identified so far by cDNA library screening, integration of the molecular data with in silico identification of these types of genes could throw light on their organization in the Dm genome. We have performed a computational screening of the Dm genome for C/D snoRNA genes, followed by experimental validation of the putative candidates. Few of the 26 confirmed snoRNAs had been recognized by cDNA library analysis. Organization of the Dm genome was also found to be more variegated than previously suspected, with snoRNA genes nested in both the introns and exons of protein-coding genes. This finding suggests the presence of additional mechanisms of snoRNA biogenesis based on the alternative production of overlapping mRNA/snoRNA molecules. Additional information is available at http://www.bioinformatica.unito.it/bioinformatics/snoRNAs.
Kim, Seungill; Kim, Myung-Shin; Kim, Yong-Min; Yeom, Seon-In; Cheong, Kyeongchae; Kim, Ki-Tae; Jeon, Jongbum; Kim, Sunggil; Kim, Do-Sun; Sohn, Seong-Han; Lee, Yong-Hwan; Choi, Doil
2015-01-01
The onion (Allium cepa L.) is one of the most widely cultivated and consumed vegetable crops in the world. Although a considerable amount of onion transcriptome data has been deposited into public databases, the sequences of the protein-coding genes are not accurate enough to be used, owing to non-coding sequences intermixed with the coding sequences. We generated a high-quality, annotated onion transcriptome from de novo sequence assembly and intensive structural annotation using the integrated structural gene annotation pipeline (ISGAP), which identified 54,165 protein-coding genes among 165,179 assembled transcripts totalling 203.0 Mb by eliminating the intron sequences. ISGAP performed reliable annotation, recognizing accurate gene structures based on reference proteins, and ab initio gene models of the assembled transcripts. Integrative functional annotation and gene-based SNP analysis revealed a whole biological repertoire of genes and transcriptomic variation in the onion. The method developed in this study provides a powerful tool for the construction of reference gene sets for organisms based solely on de novo transcriptome data. Furthermore, the reference genes and their variation described here for the onion represent essential tools for molecular breeding and gene cloning in Allium spp. PMID:25362073
Maljovec, D.; Liu, S.; Wang, B.; ...
2015-07-14
Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment, for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.
One-way quantum repeaters with quantum Reed-Solomon codes
NASA Astrophysics Data System (ADS)
Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang
2018-05-01
We show that quantum Reed-Solomon codes constructed from classical Reed-Solomon codes can approach the capacity of the quantum erasure channel of d-level systems for large dimension d. We study the performance of one-way quantum repeaters with these codes and obtain a significant improvement in key generation rate compared to previously investigated encoding schemes with quantum parity codes and quantum polynomial codes. We also compare the three generations of quantum repeaters using quantum Reed-Solomon codes and identify parameter regimes where each generation performs the best.
Optimized atom position and coefficient coding for matching pursuit-based image compression.
Shoa, Alireza; Shirani, Shahram
2009-12-01
In this paper, we propose a new encoding algorithm for matching pursuit image coding. We show that coding performance is improved when correlations between atom positions and atom coefficients are both used in encoding. We find the optimum tradeoff between efficient atom position coding and efficient atom coefficient coding and optimize the encoder parameters. Our proposed algorithm outperforms the existing coding algorithms designed for matching pursuit image coding. Additionally, we show that our algorithm results in better rate distortion performance than JPEG 2000 at low bit rates.
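For context, a minimal 1-D matching pursuit over a random unit-norm dictionary is sketched below; it produces the (atom position, coefficient) pairs that an encoder like the one above would then entropy-code. The dictionary, signal, and stopping rule are illustrative only, not the paper's image-coding setup.

```python
# Minimal 1-D matching pursuit: greedily pick the best-correlated atom and
# subtract its contribution, yielding (atom index, coefficient) pairs.
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=5):
    """dictionary: columns are unit-norm atoms."""
    residual = signal.astype(float).copy()
    atoms = []
    for _ in range(n_atoms):
        corr = dictionary.T @ residual
        idx = int(np.argmax(np.abs(corr)))
        coef = float(corr[idx])
        residual -= coef * dictionary[:, idx]
        atoms.append((idx, coef))
    return atoms, residual

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)                  # unit-norm atoms
x = 3.0 * D[:, 10] - 2.0 * D[:, 200] + 0.05 * rng.standard_normal(64)
picked, res = matching_pursuit(x, D, n_atoms=4)
print(picked[:2], f"residual norm {np.linalg.norm(res):.3f}")
```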
New primary renal diagnosis codes for the ERA-EDTA
Venkat-Raman, Gopalakrishnan; Tomson, Charles R.V.; Gao, Yongsheng; Cornet, Ronald; Stengel, Benedicte; Gronhagen-Riska, Carola; Reid, Chris; Jacquelinet, Christian; Schaeffner, Elke; Boeschoten, Els; Casino, Francesco; Collart, Frederic; De Meester, Johan; Zurriaga, Oscar; Kramar, Reinhard; Jager, Kitty J.; Simpson, Keith
2012-01-01
The European Renal Association-European Dialysis and Transplant Association (ERA-EDTA) Registry has produced a new set of primary renal diagnosis (PRD) codes that are intended for use by affiliated registries. It is designed specifically for use in renal centres and registries but is aligned with international coding standards supported by the WHO (International Classification of Diseases) and the International Health Terminology Standards Development Organization (SNOMED Clinical Terms). It is available as supplementary material to this paper and free on the internet for non-commercial, clinical, quality improvement and research use, and by agreement with the ERA-EDTA Registry for use by commercial organizations. Conversion between the old and the new PRD codes is possible. The new codes are very flexible and will be actively managed to keep them up-to-date and to ensure that renal medicine can remain at the forefront of the electronic revolution in medicine, epidemiology research and the use of decision support systems to improve the care of patients. PMID:23175621
24 CFR 84.42 - Codes of conduct.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Procurement Standards § 84.42 Codes of... substantial or the gift is an unsolicited item of nominal value. The standards of conduct shall provide for...
22 CFR 145.42 - Code of conduct.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Procurement Standards § 145.42 Code of... substantial or the gift is an unsolicited item of nominal value. The standards of conduct shall provide for...
38 CFR 49.42 - Codes of conduct.
Code of Federal Regulations, 2010 CFR
2010-07-01
... NON-PROFIT ORGANIZATIONS Post-Award Requirements Procurement Standards § 49.42 Codes of conduct. The..., recipients may set standards for situations in which the financial interest is not substantial or the gift is...
32 CFR 32.42 - Codes of conduct.
Code of Federal Regulations, 2010 CFR
2010-07-01
... NON-PROFIT ORGANIZATIONS Post-Award Requirements Procurement Standards § 32.42 Codes of conduct. The..., recipients may set standards for situations in which the financial interest is not substantial or the gift is...
14 CFR 1260.142 - Codes of conduct.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., Hospitals, and Other Non-Profit Organizations Procurement Standards § 1260.142 Codes of conduct. The..., recipients may set standards for situations in which the financial interest is not substantial or the gift is...
40 CFR 30.42 - Codes of conduct.
Code of Federal Regulations, 2010 CFR
2010-07-01
... NON-PROFIT ORGANIZATIONS Post-Award Requirements Procurement Standards § 30.42 Codes of conduct. The..., recipients may set standards for situations in which the financial interest is not substantial or the gift is...
Carbon source-dependent expansion of the genetic code in bacteria
Prat, Laure; Heinemann, Ilka U.; Aerni, Hans R.; Rinehart, Jesse; O’Donoghue, Patrick; Söll, Dieter
2012-01-01
Despite the fact that the genetic code is known to vary between organisms in rare cases, it is believed that in the lifetime of a single cell the code is stable. We found Acetohalobium arabaticum cells grown on pyruvate genetically encode 20 amino acids, but in the presence of trimethylamine (TMA), A. arabaticum dynamically expands its genetic code to 21 amino acids including pyrrolysine (Pyl). A. arabaticum is the only known organism that modulates the size of its genetic code in response to its environment and energy source. The gene cassette pylTSBCD, required to biosynthesize and genetically encode UAG codons as Pyl, is present in the genomes of 24 anaerobic archaea and bacteria. Unlike archaeal Pyl-decoding organisms that constitutively encode Pyl, we observed that A. arabaticum controls Pyl encoding by down-regulating transcription of the entire Pyl operon under growth conditions lacking TMA, to the point where no detectable Pyl-tRNAPyl is made in vivo. Pyl-decoding archaea adapted to an expanded genetic code by minimizing TAG codon frequency to typically ∼5% of ORFs, whereas Pyl-decoding bacteria (∼20% of ORFs contain in-frame TAGs) regulate Pyl-tRNAPyl formation and translation of UAG by transcriptional deactivation of genes in the Pyl operon. We further demonstrate that Pyl encoding occurs in a bacterium that naturally encodes the Pyl operon, and identified Pyl residues by mass spectrometry in A. arabaticum proteins including two methylamine methyltransferases. PMID:23185002
Self-organized Evaluation of Dynamic Hand Gestures for Sign Language Recognition
NASA Astrophysics Data System (ADS)
Buciu, Ioan; Pitas, Ioannis
Two main theories exist with respect to face encoding and representation in the human visual system (HVS). The first one refers to the dense (holistic) representation of the face, where faces have "holon"-like appearance. The second one claims that a more appropriate face representation is given by a sparse code, where only a small fraction of the neural cells corresponding to face encoding is activated. Theoretical and experimental evidence suggest that the HVS performs face analysis (encoding, storing, face recognition, facial expression recognition) in a structured and hierarchical way, where both representations have their own contribution and goal. According to neuropsychological experiments, it seems that encoding for face recognition, relies on holistic image representation, while a sparse image representation is used for facial expression analysis and classification. From the computer vision perspective, the techniques developed for automatic face and facial expression recognition fall into the same two representation types. Like in Neuroscience, the techniques which perform better for face recognition yield a holistic image representation, while those techniques suitable for facial expression recognition use a sparse or local image representation. The proposed mathematical models of image formation and encoding try to simulate the efficient storing, organization and coding of data in the human cortex. This is equivalent with embedding constraints in the model design regarding dimensionality reduction, redundant information minimization, mutual information minimization, non-negativity constraints, class information, etc. The presented techniques are applied as a feature extraction step followed by a classification method, which also heavily influences the recognition results.
Light Infantry in the Defense of Urban Europe.
1986-12-14
Unaligned instruction relocation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.; Sura, Zehra N.
2018-01-23
In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.
Volume accumulator design analysis computer codes
NASA Technical Reports Server (NTRS)
Whitaker, W. D.; Shimazaki, T. T.
1973-01-01
The computer codes VANEP and VANES were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kWe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAUs under conditions of possible modes of failure which still permit continued system operation.
Performance analysis of parallel gravitational N-body codes on large GPU clusters
NASA Astrophysics Data System (ADS)
Huang, Si-Yi; Spurzem, Rainer; Berczik, Peter
2016-01-01
We compare the performance of two very different parallel gravitational N-body codes for astrophysical simulations on large Graphics Processing Unit (GPU) clusters, both of which are pioneers in their own fields as well as on certain mutual scales - NBODY6++ and Bonsai. We carry out benchmarks of the two codes by analyzing their performance, accuracy and efficiency through the modeling of structure decomposition and timing measurements. We find that both codes are heavily optimized to leverage the computational potential of GPUs as their performance has approached half of the maximum single precision performance of the underlying GPU cards. With such performance we predict that a speed-up of 200 - 300 can be achieved when up to 1k processors and GPUs are employed simultaneously. We discuss the quantitative information about comparisons of the two codes, finding that in the same cases Bonsai adopts larger time steps as well as larger relative energy errors than NBODY6++, typically ranging from 10 - 50 times larger, depending on the chosen parameters of the codes. Although the two codes are built for different astrophysical applications, in specified conditions they may overlap in performance at certain physical scales, thus allowing the user to choose either one by fine-tuning parameters accordingly.
EUGENE'HOM: A generic similarity-based gene finder using multiple homologous sequences.
Foissac, Sylvain; Bardou, Philippe; Moisan, Annick; Cros, Marie-Josée; Schiex, Thomas
2003-07-01
EUGENE'HOM is a gene prediction software for eukaryotic organisms based on comparative analysis. EUGENE'HOM is able to take into account multiple homologous sequences from more or less closely related organisms. It integrates the results of TBLASTX analysis, splice site and start codon prediction and a robust coding/non-coding probabilistic model which allows EUGENE'HOM to handle sequences from a variety of organisms. The current target of EUGENE'HOM is plant sequences. The EUGENE'HOM web site is available at http://genopole.toulouse.inra.fr/bioinfo/eugene/EuGeneHom/cgi-bin/EuGeneHom.pl.
Performance evaluation of MPEG internet video coding
NASA Astrophysics Data System (ADS)
Luo, Jiajia; Wang, Ronggang; Fan, Kui; Wang, Zhenyu; Li, Ge; Wang, Wenmin
2016-09-01
Internet Video Coding (IVC) has been developed in MPEG by combining well-known existing technology elements and new coding tools with royalty-free declarations. In June 2015, the IVC project was approved as ISO/IEC 14496-33 (MPEG-4 Internet Video Coding). It is believed that this standard can be highly beneficial for video services in the Internet domain. This paper evaluates the objective and subjective performances of IVC by comparing it against Web Video Coding (WVC), Video Coding for Browsers (VCB) and AVC High Profile. Experimental results show that IVC's compression performance is approximately equal to that of the AVC High Profile for typical operational settings, both for streaming and low-delay applications, and is better than WVC and VCB.
Throughput of Coded Optical CDMA Systems with AND Detectors
NASA Astrophysics Data System (ADS)
Memon, Kehkashan A.; Umrani, Fahim A.; Umrani, A. W.; Umrani, Naveed A.
2012-09-01
Conventional detection techniques used in optical code-division multiple access (OCDMA) systems are not optimal and result in poor bit error rate performance. This paper analyzes the coded performance of optical CDMA systems with AND detectors for enhanced throughput efficiencies and improved error rate performance. The results show that the use of AND detectors significantly improves the performance of an optical channel.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-04
... participants with low volumes of deposits have elected to use ``peel-off'' adhesive bar code labels instead of... printers or ``peel-off'' bar code labels. Effective October 8, 2010, DTC retired the outdated and unsupported SNA ticket print stream and the use of ``peel-off'' adhesive bar code labels. Participants...
Status of Metric Conversion A Survey of U.S. Standards Writing Organizations.
1982-05-01
Boiler and Pressure Vessel Code. 7...to and consistent with metrication of the ASME Boiler and Pressure Vessel Code. The Electrical Apparatus Service Association is a trade association...metrication of TEMA Standards will be compatible to and consistent with metrication of the ASME Boiler and Pressure Vessel Code. TEMA's metrication
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-29
... of a code of ethics by an investment advisor to include, at a minimum: (i) Standards of business... any violations of the code of ethics promptly to the chief compliance officer (``CCO'') or, provided the CCO also receives reports of all violations, to other persons designated in the code of ethics...
Lexical–Semantic Organization in Bilingual Children: Evidence From a Repeated Word Association Task
Sheng, Li; McGregor, Karla K.; Marian, Viorica
2007-01-01
Purpose This study examined lexical–semantic organization of bilingual children in their 2 languages and in relation to monolingual age-mates. Method Twelve Mandarin–English bilingual and 12 English monolingual children generated 3 associations to each of 36 words. Responses were coded as paradigmatic (dog–cat) or syntagmatic (dog–bark). Results Within the bilingual group, word association performance was comparable and correlated between 1st and 2nd languages. Bilingual and monolingual children demonstrated similar patterns of responses, but subtle group differences were also revealed. When between-group comparisons were made on English measures, there was a bilingual advantage in paradigmatic responding during the 1st elicitation and for verbs. Conclusion Results support previous studies in finding parallel development in bilinguals’ 1st- and 2nd-language lexical–semantic skills and provide preliminary evidence that bilingualism may enhance paradigmatic organization of the semantic lexicon. PMID:16787896
Neural Coding Mechanisms in Gustation.
1980-09-15
world is composed of four primary tastes (sweet, sour, salty, and bitter), and that each of these is carried by a separate and private neural line, thus...ted sweet-sour-salty-bitter types. The mathematical method of analysis was hierarchical cluster analysis based on the responses of many neurons (20 to... Subject terms: taste, neural coding, neural organization, stimulus organization, olfaction.
A suite of MATLAB-based computational tools for automated analysis of COPAS Biosort data
Morton, Elizabeth; Lamitina, Todd
2010-01-01
Complex Object Parametric Analyzer and Sorter (COPAS) devices are large-object, fluorescence-capable flow cytometers used for high-throughput analysis of live model organisms, including Drosophila melanogaster, Caenorhabditis elegans, and zebrafish. The COPAS is especially useful in C. elegans high-throughput genome-wide RNA interference (RNAi) screens that utilize fluorescent reporters. However, analysis of data from such screens is relatively labor-intensive and time-consuming. Currently, there are no computational tools available to facilitate high-throughput analysis of COPAS data. We used MATLAB to develop algorithms (COPAquant, COPAmulti, and COPAcompare) to analyze different types of COPAS data. COPAquant reads single-sample files, filters and extracts values and value ratios for each file, and then returns a summary of the data. COPAmulti reads 96-well autosampling files generated with the ReFLX adapter, performs sample filtering, graphs features across both wells and plates, performs some common statistical measures for hit identification, and outputs results in graphical formats. COPAcompare performs a correlation analysis between replicate 96-well plates. For many parameters, thresholds may be defined through a simple graphical user interface (GUI), allowing our algorithms to meet a variety of screening applications. In a screen for regulators of stress-inducible GFP expression, COPAquant dramatically accelerated data analysis and allowed us to rapidly move from raw data to hit identification. Because the COPAS file structure is standardized and our MATLAB code is freely available, our algorithms should be extremely useful for analysis of COPAS data from multiple platforms and organisms. The MATLAB code is freely available at our web site (www.med.upenn.edu/lamitinalab/downloads.shtml). PMID:20569218
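The published tools are written in MATLAB, but the core filter-and-summarize step that COPAquant performs on single-sample files can be sketched in a few lines of Python. The column names (TOF, EXT, GFP), the tab-delimited format and the thresholds below are assumptions for illustration only, not the actual file schema or defaults of the tools.

```python
# Hedged sketch of a COPAS-style filter-and-ratio step; columns and thresholds hypothetical.
import csv

def copas_summary(path, tof_min=100, ext_min=50):
    """Filter objects by size/extinction and report the mean GFP/TOF ratio."""
    ratios = []
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh, delimiter="\t"):
            tof, ext, gfp = (float(row[k]) for k in ("TOF", "EXT", "GFP"))
            if tof >= tof_min and ext >= ext_min:   # drop debris and small objects
                ratios.append(gfp / tof)            # normalize fluorescence by object size
    mean = sum(ratios) / len(ratios) if ratios else float("nan")
    return {"n": len(ratios), "mean_gfp_per_tof": mean}
```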
CFD Code Development for Combustor Flows
NASA Technical Reports Server (NTRS)
Norris, Andrew
2003-01-01
During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation and code application. For model development, this has included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics and assumed PDF work. Many of these models were then implemented in the code, and in addition many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models and turbulence-chemistry interaction methodology. Validation of all new models and code improvements was also performed, and the code was applied to the ZCET program and the NPSS GEW combustor program. Several important items remain under development, including the NOx post processing, assumed PDF model development and chemical kinetic development. It is expected that this work will continue under the new grant.
Automated Concurrent Blackboard System Generation in C++
NASA Technical Reports Server (NTRS)
Kaplan, J. A.; McManus, J. W.; Bynum, W. L.
1999-01-01
In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX(trademark) workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.
2011-03-01
This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. The accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that the users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements.
Based on the gap analysis results, we have made the following recommendations for the code selection and code development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.
Layered Wyner-Ziv video coding.
Xu, Qian; Xiong, Zixiang
2006-12-01
Following recent theoretical works on successive Wyner-Ziv coding (WZC), we propose a practical layered Wyner-Ziv video coder using the DCT, nested scalar quantization, and irregular LDPC code based Slepian-Wolf coding (or lossless source coding with side information at the decoder). Our main novelty is to use the base layer of a standard scalable video coder (e.g., MPEG-4/H.26L FGS or H.263+) as the decoder side information and perform layered WZC for quality enhancement. Similar to FGS coding, there is no performance difference between layered and monolithic WZC when the enhancement bitstream is generated in our proposed coder. Using an H.26L coded version as the base layer, experiments indicate that WZC gives slightly worse performance than FGS coding when the channel (for both the base and enhancement layers) is noiseless. However, when the channel is noisy, extensive simulations of video transmission over wireless networks conforming to the CDMA2000 1X standard show that H.26L base layer coding plus Wyner-Ziv enhancement layer coding are more robust against channel errors than H.26L FGS coding. These results demonstrate that layered Wyner-Ziv video coding is a promising new technique for video streaming over wireless networks.
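A minimal sketch of the nested scalar quantization idea used as the Wyner-Ziv building block described above: the encoder transmits only the coset index of a fine quantizer, and the decoder resolves the ambiguity with its side information (for example, the base-layer reconstruction). The step size and coset count below are illustrative, not the paper's parameters.

```python
# Toy nested scalar quantization for Wyner-Ziv coding (parameters are illustrative).

def wz_encode(x, step=1.0, cosets=4):
    q = round(x / step)             # fine quantizer index
    return q % cosets               # transmit only the coset index (few bits)

def wz_decode(coset, side_info, step=1.0, cosets=4):
    # pick the reconstruction in this coset nearest the decoder's side information
    base = round(side_info / step)
    best = min(range(base - cosets, base + cosets + 1),
               key=lambda q: abs(q * step - side_info) if q % cosets == coset else float("inf"))
    return best * step

x, y = 7.3, 7.0                     # source sample and correlated side information
print(wz_decode(wz_encode(x), y))   # reconstructs 7.0, consistent with both
```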
Investigating the use of quick response codes in the gross anatomy laboratory.
Traser, Courtney J; Hoffman, Leslie A; Seifert, Mark F; Wilson, Adam B
2015-01-01
The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student performance, and evaluated whether performance could be explained by the frequency of QR code usage. Question prompts and QR codes tagged on cadaveric specimens and models were available for four weeks as learning aids to medical (n = 155) and doctor of physical therapy (n = 39) students. Each QR code provided answers to posed questions in the form of embedded text or hyperlinked web pages. Students' perceptions were gathered using a formative questionnaire and practical examination scores were used to assess potential gains in student achievement. Overall, students responded positively to the use of QR codes in the gross anatomy laboratory as 89% (57/64) agreed the codes augmented their learning of anatomy. The users' most noticeable objection to using QR codes was the reluctance to bring their smartphones into the gross anatomy laboratory. A comparison between the performance of QR code users and non-users was found to be nonsignificant (P = 0.113), and no significant gains in performance (P = 0.302) were observed after the intervention. Learners welcomed the implementation of QR code technology in the gross anatomy laboratory, yet this intervention had no apparent effect on practical examination performance. © 2014 American Association of Anatomists.
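For readers who want to reproduce the intervention, a tag like those placed on the specimens can be generated with the third-party Python `qrcode` package (installable via pip as qrcode[pil]); the URL below is a hypothetical example, not one used in the study.

```python
# Generate a printable QR tag for a cadaveric specimen (URL is a placeholder).
import qrcode

img = qrcode.make("https://example.edu/anatomy/brachial-plexus-answer")
img.save("brachial_plexus_tag.png")   # print and affix next to the tagged structure
```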
Posttest analysis of the FFTF inherent safety tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Padilla, A. Jr.; Claybrook, S.W.
Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.
Zebra: An advanced PWR lattice code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, L.; Wu, H.; Zheng, Y.
2012-07-01
This paper presents an overview of ZEBRA, an advanced PWR lattice code developed at the NECP laboratory at Xi'an Jiaotong University. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is developed based on the sub-group method. The transport solver is the Auto-MOC code, a self-developed code based on the Method of Characteristics and the customization of AutoCAD software. The whole code is well organized in a modular software structure. Numerical results obtained during validation of the code demonstrate that it has good precision and high efficiency. (authors)
Wartime Tracking of Class I Surface Shipments from Production or Procurement to Destination
1992-04-01
Sollie, Annet; Sijmons, Rolf H; Lindhout, Dick; van der Ploeg, Ans T; Rubio Gozalbo, M Estela; Smit, G Peter A; Verheijen, Frans; Waterham, Hans R; van Weely, Sonja; Wijburg, Frits A; Wijburg, Rudolph; Visser, Gepke
2013-07-01
Data sharing is essential for a better understanding of genetic disorders. Good phenotype coding plays a key role in this process. Unfortunately, the two most widely used coding systems in medicine, ICD-10 and SNOMED-CT, lack information necessary for the detailed classification and annotation of rare and genetic disorders. This prevents the optimal registration of such patients in databases and thus data-sharing efforts. To improve care and to facilitate research for patients with metabolic disorders, we developed a new coding system for metabolic diseases with a dedicated group of clinical specialists. Next, we compared the resulting codes with those in ICD and SNOMED-CT. No matches were found in 76% of cases in ICD-10 and in 54% in SNOMED-CT. We conclude that there are sizable gaps in the SNOMED-CT and ICD coding systems for metabolic disorders. There may be similar gaps for other classes of rare and genetic disorders. We have demonstrated that expert groups can help in addressing such coding issues. Our coding system has been made available to the ICD and SNOMED-CT organizations as well as to the Orphanet and HPO organizations for further public application and updates will be published online (www.ddrmd.nl and www.cineas.org). © 2013 WILEY PERIODICALS, INC.
Leveraging organismal biology to forecast the effects of climate change.
Buckley, Lauren B; Cannistra, Anthony F; John, Aji
2018-04-26
Despite the pressing need for accurate forecasts of ecological and evolutionary responses to environmental change, commonly used modelling approaches exhibit mixed performance because they omit many important aspects of how organisms respond to spatially and temporally variable environments. Integrating models based on organismal phenotypes at the physiological, performance and fitness levels can improve model performance. We summarize current limitations of environmental data and models and discuss potential remedies. The paper reviews emerging techniques for sensing environments at fine spatial and temporal scales, accounting for environmental extremes, and capturing how organisms experience the environment. Intertidal mussel data illustrate biologically important aspects of environmental variability. We then discuss key challenges in translating environmental conditions into organismal performance including accounting for the varied timescales of physiological processes, for responses to environmental fluctuations including the onset of stress and other thresholds, and for how environmental sensitivities vary across lifecycles. We call for the creation of phenotypic databases to parameterize forecasting models and advocate for improved sharing of model code and data for model testing. We conclude with challenges in organismal biology that must be solved to improve forecasts over the next decade. Keywords: acclimation, biophysical models, ecological forecasting, extremes, microclimate, spatial and temporal variability.
Al Jawaldeh, Ayoub; Sayed, Ghada
2018-04-05
Optimal breastfeeding practices and appropriate complementary feeding improve child health, survival and development. The countries of the Eastern Mediterranean Region have made significant strides in formulation and implementation of legislation to protect and promote breastfeeding based on The International Code of Marketing of Breast-milk Substitutes (the Code) and subsequent relevant World Health Assembly resolutions. To assess the implementation of the Code in the Region. Assessment was conducted by the World Health Organization (WHO) Regional Office for the Eastern Mediterranean using a WHO standard questionnaire. Seventeen countries in the Region have enacted legislation to protect breastfeeding. Only 6 countries have comprehensive legislation or other legal measures reflecting all or most provisions of the Code; 4 countries have legal measures incorporating many provisions of the Code; 7 countries have legal measures that contain a few provisions of the Code; 4 countries are currently studying the issue; and only 1 country has no measures in place. Further analysis of the legislation found that the text of articles in the laws fully reflected the Code articles in only 6 countries. Most countries need to revisit and amend existing national legislation to implement fully the Code and relevant World Health Assembly resolutions, supported by systematic monitoring and reporting. Copyright © World Health Organization (WHO) 2018. Some rights reserved. This work is available under the CC BY-NC-SA 3.0 IGO license (https://creativecommons.org/licenses/by-nc-sa/3.0/igo).
MO-A-213-01: 2015 Economics Update Part 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dirksen, B.
2015-06-15
The purpose of this session is to introduce attendees to the healthcare reimbursement system and how it applies to the clinical work of a Medical Physicist. This will include general information about the different categories of payers and payees, how work is described by CPT© codes, and how various payers set values for this work in different clinical settings. 2015 is a year of significant changes to the payment system. Many CPT© codes have been deleted and replaced with new CPT© codes. These codes define some of the most common work performed in our clinics including treatment planning and delivery. This presentation will describe what work is encompassed in these codes and will give attendees an overview of the changes for 2015 as they apply to radiation oncology. Finally, some insight into what can be expected during 2016 will be presented. This includes what information is typically released by the Centers for Medicaid & Medicare Services (CMS) during the year and how we as an organization respond. This will include ways members can interact with the AAPM professional economics committee and other resources members may find helpful. Learning Objectives: Basics of how Medicare is structured and how reimbursement rates are set. Basic understanding of proposed changes to the 2016 Medicare rules. What resources are available from the AAPM and how to interact with the professional economics committee. Ownership in pxAlpha, LLC, a medical device start up company.
MO-A-213-02: 2015 Economics Update Part 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fontenot, J.
2015-06-15
The session abstract is identical to that of MO-A-213-01: 2015 Economics Update Part 1, above.
NASA Technical Reports Server (NTRS)
Shambayati, Shervin
2001-01-01
To evaluate the performance of strong channel codes in the presence of imperfect carrier phase tracking for residual-carrier BPSK modulation, this paper develops an approximate 'brick wall' model that is independent of the channel code type at high data rates. It is shown that this approximation is reasonably accurate (less than 0.7 dB for low FERs for the (1784,1/6) code and less than 0.35 dB for low FERs for the (5920,1/6) code). Based on the approximation's accuracy, it is concluded that the effects of imperfect carrier tracking are more or less independent of the channel code type for strong channel codes. Therefore, the advantage that one strong channel code has over another with perfect carrier tracking translates to nearly the same advantage under imperfect carrier tracking conditions. This allows link designers to incorporate the projected performance of strong channel codes into their design tables without worrying about their behavior in the face of imperfect carrier phase tracking.
Tse, Tamara; Carey, Leeanne; Cadilhac, Dominique; Koh, Gerald Choon-Huat; Baum, Carolyn
2016-10-01
Aim: To examine how Australia, Singapore and the United States of America (USA) match to the World Stroke Organization Global Stroke Services health system monitoring indicators (HSI). Design: Descriptive comparative study. Participants: The health systems of Australia, Singapore and the USA. Outcome measures: Published data available from each country were mapped to the 10 health system monitoring indicators proposed by the World Stroke Organization. Results: Most health system monitoring indicators were at least partially met in each country. Thrombolytic agents were available for use in acute stroke. Stroke guidelines and stroke registry data were available in all three countries. Stroke incidence, prevalence, and mortality rates were available but at non-uniform times post-stroke. The International Classification of Disease 9 or 10 coding systems are used in all three countries. Standardized clinical audits are routine in Australia and the USA, but not in Singapore. The use of the modified Rankin Scale is collected sub-acutely but not at one year post-stroke in all three countries. Conclusions: The three developed countries are performing well against the World Stroke Organization health system monitoring indicators for acute and sub-acute stroke care. However, improvements in stroke risk assessment and at one-year post-stroke outcome measurement are needed.
Optimizing fusion PIC code performance at scale on Cori Phase 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koskela, T. S.; Deslippe, J.
In this paper we present the results of optimizing the performance of the gyrokinetic full-f fusion PIC code XGC1 on the Cori Phase Two Knights Landing system. The code has undergone substantial development to enable the use of vector instructions in its most expensive kernels within the NERSC Exascale Science Applications Program. We study the single-node performance of the code on an absolute scale using the roofline methodology to guide optimization efforts. We have obtained 2x speedups in single node performance due to enabling vectorization and performing memory layout optimizations. On multiple nodes, the code is shown to scale well up to 4000 nodes, near half the size of the machine. We discuss some communication bottlenecks that were identified and resolved during the work.
An OpenMI Implementation of a Water Resources System using Simple Script Wrappers
NASA Astrophysics Data System (ADS)
Steward, D. R.; Aistrup, J. A.; Kulcsar, L.; Peterson, J. M.; Welch, S. M.; Andresen, D.; Bernard, E. A.; Staggenborg, S. A.; Bulatewicz, T.
2013-12-01
This team has developed an adaptation of the Open Modelling Interface (OpenMI) that utilizes Simple Script Wrappers. Code is made OpenMI compliant through organization within three modules that initialize, perform time steps, and finalize results. A configuration file is prepared that specifies the variables a model expects to receive as input and those it will make available as output. An example is presented for groundwater, economic, and agricultural production models in the High Plains Aquifer region of Kansas. Our models use the programming environments in Scilab and Matlab, along with legacy Fortran code, and our Simple Script Wrappers can also use Python. These models are collectively run within this interdisciplinary framework from initial conditions into the future. It will be shown that, by applying model constraints to one model, the impact of changes on the water resources system may be assessed.
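The three-module organization described above can be sketched as follows. The function names, exchange variables and the toy groundwater balance are assumptions for illustration only, not the project's actual wrapper API or models.

```python
# Hedged sketch of an initialize / time-step / finalize script (names hypothetical).
state = {}

def initialize(config):
    # declare what the model consumes and produces, as the configuration file would
    state["inputs"] = {"pumping_rate": 0.0}
    state["outputs"] = {"water_level": 100.0}

def perform_time_step(inputs):
    # one annual step of a toy groundwater balance driven by another linked model
    state["outputs"]["water_level"] -= 0.01 * inputs["pumping_rate"]
    return state["outputs"]

def finalize():
    return state["outputs"]

initialize({})
for year in range(3):
    print(perform_time_step({"pumping_rate": 50.0}))
finalize()
```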
Characteristic evaluation of a Lithium-6 loaded neutron coincidence spectrometer.
Hayashi, M; Kaku, D; Watanabe, Y; Sagara, K
2007-01-01
Characteristics of a (6)Li-loaded neutron coincidence spectrometer were investigated from both measurements and Monte Carlo simulations. The spectrometer consists of three (6)Li-glass scintillators embedded in a liquid organic scintillator BC-501A, which can detect selectively neutrons that deposit the total energy in the BC-501A using a coincidence signal generated from the capture event of thermalised neutrons in the (6)Li-glass scintillators. The relative efficiency and the energy response were measured using 4.7, 7.2 and 9.0 MeV monoenergetic neutrons. The measured ones were compared with the Monte Carlo calculations performed by combining the neutron transport code PHITS and the scintillator response calculation code SCINFUL. The experimental light output spectra were in good agreement with the calculated ones in shape. The energy dependence of the detection efficiency was reproduced by the calculation. The response matrices for 1-10 MeV neutrons were finally obtained.
Simulating a transmon implementation of the surface code, Part II
NASA Astrophysics Data System (ADS)
O'Brien, Thomas; Tarasinski, Brian; Rol, Adriaan; Bultink, Niels; Fu, Xiang; Criger, Ben; Dicarlo, Leonardo
The majority of quantum error correcting circuit simulations use Pauli error channels, as they can be efficiently calculated. This raises two questions: what is the effect of more complicated physical errors on the logical qubit error rate, and how much more efficient can decoders become when accounting for realistic noise? To answer these questions, we design a minimal weight perfect matching decoder parametrized by a physically motivated noise model and test it on the full density matrix simulation of Surface-17, a distance-3 surface code. We compare performance against other decoders, for a range of physical parameters. Particular attention is paid to realistic sources of error for transmon qubits in a circuit QED architecture, and the requirements for real-time decoding via an FPGA. Research funded by the Foundation for Fundamental Research on Matter (FOM), the Netherlands Organization for Scientific Research (NWO/OCW), IARPA, an ERC Synergy Grant, the China Scholarship Council, and Intel Corporation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Continuing the tradition established in prior years, this panel encompasses one of the broadest ranges of topics and issues of any panel at the Summer Study. It includes papers addressing all sectors, low-income residential to industrial, and views energy efficiency from many perspectives including programmatic, evaluation, codes, standards, legislation, technical transfer, economic development, and least-cost planning. The papers represent work being performed in most geographic regions of the United States and in the international arena, specifically Thailand, China, Europe, and Scandinavia. This delightful smorgasbord has been organized, based on general content area, into the following eight sessions: (1) new directions for low-income weatherization; (2) pursuing efficiency through legislation and standards; (3) international perspectives on energy efficiency; (4) technical transfer strategies; (5) government energy policy; (6) commercial codes and standards; (7) innovative programs; and, (8) state-of-the-art review. For these conference proceedings, individual papers are processed separately for the Energy Data Base.
Can a senior house officer's time be used more effectively?
Mitchell, J; Hayhurst, C; Robinson, S M
2004-09-01
To determine the amount of time senior house officers (SHO) spent performing tasks that could be delegated to a technician or administrative assistant and therefore to quantify the expected benefit that could be obtained by employing such physicians' assistants (PA). SHOs working in the emergency department were observed for one week by pre-clinical students who had been trained to code and time each task performed by SHOs. Activity was grouped into four categories (clinical, technical, administrative, and other). Those activities in the technical and administrative categories were those we believed could be performed by a PA. The SHOs worked 430 hours in total, of which only 25 hours were not coded due to lack of an observer. Of the 405 hours observed 86.2% of time was accounted for by the various codes. The process of taking a history and examining patients accounted for an average of 22% of coded time. Writing the patient's notes accounted for an average of 20% of coded time. Discussion with relatives and patients accounted for 4.7% of coded time and performing procedures accounted for 5.2% of coded time. On average across all shifts, 15% of coded time was spent doing either technical or administrative tasks. In this department an average of 15% of coded SHOs working time was spent performing administrative and technical tasks, rising to 17% of coded time during a night shift. This is equivalent to an average time of 78 minutes per 10 hour shift/SHO. Most tasks included in these categories could be performed by PAs thus potentially decreasing patient waiting times, improving risk management, allowing doctors to spend more time with their patients, and possibly improving doctors' training.
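As a quick arithmetic check of the quoted figure, 15% of coded time on a 10-hour shift with 86.2% of time coded works out to roughly 78 minutes:

```python
# Sanity check of the 78-minutes-per-shift figure quoted above.
shift_min = 10 * 60                 # 10-hour shift
coded_min = 0.862 * shift_min       # 86.2% of observed time was captured by the codes
delegable = 0.15 * coded_min        # 15% of coded time was technical/administrative
print(round(delegable))             # ~78 minutes per SHO per shift
```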
Design and optimization of a portable LQCD Monte Carlo code using OpenACC
NASA Astrophysics Data System (ADS)
Bonati, Claudio; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Calore, Enrico; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele
The present panorama of HPC architectures is extremely heterogeneous, ranging from traditional multi-core CPU processors, supporting a wide class of applications but delivering moderate computing performance, to many-core Graphics Processor Units (GPUs), exploiting aggressive data-parallelism and delivering higher performances for streaming computing applications. In this scenario, code portability (and performance portability) become necessary for easy maintainability of applications; this is very relevant in scientific computing where code changes are very frequent, making it tedious and prone to error to keep different code versions aligned. In this work, we present the design and optimization of a state-of-the-art production-level LQCD Monte Carlo application, using the directive-based OpenACC programming model. OpenACC abstracts parallel programming to a descriptive level, relieving programmers from specifying how codes should be mapped onto the target architecture. We describe the implementation of a code fully written in OpenACC, and show that we are able to target several different architectures, including state-of-the-art traditional CPUs and GPUs, with the same code. We also measure performance, evaluating the computing efficiency of our OpenACC code on several architectures, comparing with GPU-specific implementations and showing that a good level of performance-portability can be reached.
Initial Low-Reynolds Number Iced Aerodynamic Performance for CRM Wing
NASA Technical Reports Server (NTRS)
Woodard, Brian; Diebold, Jeff; Broeren, Andy; Potapczuk, Mark; Lee, Sam; Bragg, Michael
2015-01-01
NASA, FAA, ONERA, and other partner organizations have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large scale, three-dimensional swept wings. These are extremely complex phenomena important to the design, certification and safe operation of small and large transport aircraft. There is increasing demand to balance trade-offs in aircraft efficiency, cost and noise that tend to compete directly with allowable performance degradations over an increasing range of icing conditions. Computational fluid dynamics codes have reached a level of maturity that they are being proposed by manufacturers for use in certification of aircraft for flight in icing. However, sufficient high-quality data to evaluate their performance on iced swept wings are not currently available in the public domain and significant knowledge gaps remain.
COMPARISON OF ORGAN DOSES IN HUMAN PHANTOMS: VARIATIONS DUE TO BODY SIZE AND POSTURE.
Feng, Xu; Xiang-Hong, Jia; Qian, Liu; Xue-Jun, Yu; Zhan-Chun, Pan; Chun-Xin, Yang
2017-04-20
Organ dose calculations performed using human phantoms can provide estimates of astronauts' health risks due to cosmic radiation. However, the characteristics of such phantoms strongly affect the estimation precision. To investigate organ dose variations with body size and posture in human phantoms, a non-uniform rational B-spline boundary surfaces model was constructed based on cryosection images. This model was used to establish four phantoms with different body size and posture parameters, whose organ parameters were changed simultaneously and which were voxelised with 4 × 4 × 4 mm3 resolution. Then, using a Monte Carlo transport code, the organ doses caused by ≤500 MeV isotropic incident protons were calculated. The dose variations due to body size differences within a certain range were negligible, and the doses received in crouching and standing-up postures were similar. Therefore, a standard Chinese phantom could be established, and posture changes cannot effectively protect astronauts during solar particle events. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Business Ethics and Your Organisation.
ERIC Educational Resources Information Center
Drummond, John
1990-01-01
Good ethics are good business. Top management should be committed to a code of ethics based on a true participative process. The organization should be willing to commit resources for training to ensure proper implementation of the code. (SK)
Processor-in-memory-and-storage architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeBenedictis, Erik
A method and apparatus for performing reliable general-purpose computing. Each sub-core of a plurality of sub-cores of a processor core processes a same instruction at a same time. A code analyzer receives a plurality of residues that represents a code word corresponding to the same instruction and an indication of whether the code word is a memory address code or a data code from the plurality of sub-cores. The code analyzer determines whether the plurality of residues are consistent or inconsistent. The code analyzer and the plurality of sub-cores perform a set of operations based on whether the code word is a memory address code or a data code and a determination of whether the plurality of residues are consistent or inconsistent.
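The residue-consistency idea can be illustrated with a toy redundant residue number system: a value is carried as residues modulo several pairwise-coprime moduli plus one redundant residue, and a mismatch between the reconstruction and the redundant residue flags a fault. The moduli and check scheme below are illustrative choices, not those of the patented architecture.

```python
# Toy redundant-residue consistency check (moduli are illustrative; Python 3.8+).
from math import prod

MODULI = (5, 7, 9)        # pairwise-coprime "information" moduli
REDUNDANT = 11            # extra modulus used only for checking

def to_residues(x):
    return tuple(x % m for m in MODULI) + (x % REDUNDANT,)

def crt(residues, moduli):
    """Chinese-remainder reconstruction of x from its residues."""
    M, x = prod(moduli), 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)
    return x % M

def consistent(residues):
    x = crt(residues[:-1], MODULI)
    return x % REDUNDANT == residues[-1]    # mismatch flags a corrupted residue

word = to_residues(123)
print(consistent(word))                                        # True
print(consistent(word[:-1] + ((word[-1] + 1) % REDUNDANT,)))   # False: fault detected
```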
National Combustion Code Parallel Performance Enhancements
NASA Technical Reports Server (NTRS)
Quealy, Angela; Benyo, Theresa (Technical Monitor)
2002-01-01
The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. The unstructured grid, reacting flow code uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC code to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This report describes recent parallel processing modifications to NCC that have improved the parallel scalability of the code, enabling a two hour turnaround for a 1.3 million element fully reacting combustion simulation on an SGI Origin 2000.
Performance of the OVERFLOW-MLP and LAURA-MLP CFD Codes on the NASA Ames 512 CPU Origin System
NASA Technical Reports Server (NTRS)
Taft, James R.
2000-01-01
The shared memory Multi-Level Parallelism (MLP) technique, developed last year at NASA Ames has been very successful in dramatically improving the performance of important NASA CFD codes. This new and very simple parallel programming technique was first inserted into the OVERFLOW production CFD code in FY 1998. The OVERFLOW-MLP code's parallel performance scaled linearly to 256 CPUs on the NASA Ames 256 CPU Origin 2000 system (steger). Overall performance exceeded 20.1 GFLOP/s, or about 4.5x the performance of a dedicated 16 CPU C90 system. All of this was achieved without any major modification to the original vector based code. The OVERFLOW-MLP code is now in production on the inhouse Origin systems as well as being used offsite at commercial aerospace companies. Partially as a result of this work, NASA Ames has purchased a new 512 CPU Origin 2000 system to further test the limits of parallel performance for NASA codes of interest. This paper presents the performance obtained from the latest optimization efforts on this machine for the LAURA-MLP and OVERFLOW-MLP codes. The Langley Aerothermodynamics Upwind Relaxation Algorithm (LAURA) code is a key simulation tool in the development of the next generation shuttle, interplanetary reentry vehicles, and nearly all "X" plane development. This code sustains about 4-5 GFLOP/s on a dedicated 16 CPU C90. At this rate, expected workloads would require over 100 C90 CPU years of computing over the next few calendar years. It is not feasible to expect that this would be affordable or available to the user community. Dramatic performance gains on cheaper systems are needed. This code is expected to be perhaps the largest consumer of NASA Ames compute cycles per run in the coming year. The OVERFLOW CFD code is extensively used in the government and commercial aerospace communities to evaluate new aircraft designs. It is one of the largest consumers of NASA supercomputing cycles and large simulations of highly resolved full aircraft are routinely undertaken. Typical large problems might require 100s of Cray C90 CPU hours to complete. The dramatic performance gains with the 256 CPU steger system are exciting. Obtaining results in hours instead of months is revolutionizing the way in which aircraft manufacturers are looking at future aircraft simulation work. Figure 2 below is a current state of the art plot of OVERFLOW-MLP performance on the 512 CPU Lomax system. As can be seen, the chart indicates that OVERFLOW-MLP continues to scale linearly with CPU count up to 512 CPUs on a large 35 million point full aircraft RANS simulation. At this point performance is such that a fully converged simulation of 2500 time steps is completed in less than 2 hours of elapsed time. Further work over the next few weeks will improve the performance of this code even further. The LAURA code has been converted to the MLP format as well. This code is currently being optimized for the 512 CPU system. Performance statistics indicate that the goal of 100 GFLOP/s will be achieved by year's end. This amounts to 20x the 16 CPU C90 result and strongly demonstrates the viability of the new parallel systems rapidly solving very large simulations in a production environment.
ERIC Educational Resources Information Center
Geigle, Bryce A.
2014-01-01
The aim of this thesis is to investigate and present the status of student synthesis with color coded formula writing for grade level six through twelve, and to make recommendations for educators to teach writing structure through a color coded formula system in order to increase classroom engagement and lower students' affect. The thesis first…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-04
.... This Rule specifically requires the adoption of a code of ethics by an investment advisor to include... requiring supervised persons to report any violations of the code of ethics promptly to the chief compliance... designated in the code of ethics; and (v) provisions requiring the investment advisor to provide each of the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1994-05-26
The Circular calls the attention of Coast Guard field units, marine surveyors, shippers and carriers of nuclear materials to the International Maritime Organization (IMO) Code for the Safe Carriage of Irradiated Nuclear Fuel, Plutonium and High-Level Radioactive Wastes in Flasks on Board Ships (IMO Resolution A.748(18)).
Sociological and Communication-Theoretical Perspectives on the Commercialization of the Sciences
NASA Astrophysics Data System (ADS)
Leydesdorff, Loet
2013-10-01
Both self-organization and organization are important for the further development of the sciences: the two dynamics condition and enable each other. Commercial and public considerations can interact and "interpenetrate" in historical organization; different codes of communication are then "recombined". However, self-organization in the symbolically generalized codes of communication can be expected to operate at the global level. The Triple Helix model allows for both a neo-institutional appreciation in terms of historical networks of university-industry-government relations and a neo-evolutionary interpretation in terms of three functions: (1) novelty production, (2) wealth generation, and (3) political control. Using this model, one can appreciate both subdynamics. The mutual information in three dimensions enables us to measure the trade-off between organization and self-organization as a possible synergy. The question of optimization between commercial and public interests in the different sciences can thus be made empirical.
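The "mutual information in three dimensions" referred to here is the signed measure T = Hx + Hy + Hz - Hxy - Hxz - Hyz + Hxyz; negative values are read as synergy among the three subdynamics. The sketch below computes it from joint counts; the toy observations are invented for illustration only.

```python
# Three-dimensional mutual information from joint observations (toy data).
from collections import Counter
from math import log2

def H(counts):
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values() if c)

# joint observations of (novelty, wealth, control) categories for some unit of analysis
obs = [("u", "i", "g"), ("u", "i", "g"), ("u", "f", "g"), ("a", "i", "l"), ("a", "f", "l")]

def marginal(idx):
    return Counter(tuple(o[i] for i in idx) for o in obs)

Hx, Hy, Hz = (H(marginal((i,))) for i in range(3))
Hxy, Hxz, Hyz = H(marginal((0, 1))), H(marginal((0, 2))), H(marginal((1, 2)))
Hxyz = H(marginal((0, 1, 2)))

T = Hx + Hy + Hz - Hxy - Hxz - Hyz + Hxyz   # negative T indicates synergy
print(round(T, 3))
```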
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kloc, Malgorzata; Bilinski, Szczepan; Dougherty, Matthew T.
2007-05-01
Recent studies discovered a novel structural role of RNA in maintaining the integrity of the mitotic spindle and cellular cytoskeleton. In Xenopus laevis, non-coding Xlsirts and coding VegT RNAs play a structural role in anchoring localized RNAs, maintaining the organization of the cytokeratin cytoskeleton and germinal granules in the oocyte vegetal cortex and in subsequent development of the germline in the embryo. We studied the ultrastructural effects of antisense oligonucleotide driven ablation of Xlsirts and VegT RNAs on the organization of the cytokeratin, germ plasm and other components of the vegetal cortex. We developed a novel method to immunolabel and visualize cytokeratin at the electron microscopy level, which allowed us to reconstruct the ultrastructural organization of the cytokeratin network relative to the components of the vegetal cortex in Xenopus oocytes. The removal of Xlsirts and VegT RNAs not only disrupts the cytokeratin cytoskeleton but also has a profound transcript-specific effect on the anchoring and distribution of germ plasm islands and their germinal granules and the arrangement of yolk platelets within the vegetal cortex. We suggest that the cytokeratin cytoskeleton plays a role in anchoring of germ plasm islands within the vegetal cortex and germinal granules within the germ plasm islands.
Validation of the NCC Code for Staged Transverse Injection and Computations for a RBCC Combustor
NASA Technical Reports Server (NTRS)
Ajmani, Kumud; Liu, Nan-Suey
2005-01-01
The NCC code was validated for a case involving staged transverse injection into Mach 2 flow behind a rearward-facing step, with comparisons against experimental data and solutions from the FPVortex code. The code was then used to perform computations to study fuel-air mixing for the combustor of a candidate rocket-based combined cycle engine geometry. Comparisons with a one-dimensional analysis and a three-dimensional code (VULCAN) were performed to assess the qualitative and quantitative performance of the NCC solver.
Extended Range Underwater Loudhailer for Port Security Applications
2006-06-01
Performing Organization Code: Project No. 5903. ...used in the audio market. The name 'RCA' derives from the Radio Corporation of America, which introduced the design, by the early 1940s, to allow... [Figure 9: spectrum band level (dB re 1 µPa) versus test range (yds), June 2005 test.]
1984-12-01
Statistics from the Operation of the Low-Level Wind Shear Alert System (LLWAS) During the JAWS Project: An Interim Report. National Center for Atmospheric Research, Boulder, CO, December 1984.
2006-12-01
COL Timothy A. Mitchener, DC USA. North Atlantic Treaty Organization (NATO) Standardization Agreement (STANAG), 5th edition, coding scheme (see P.J. Amoroso, G.S. Smith, and N.S. Bell: Qualitative assessment of cause...).
1984-04-01
Louis A. D'Aulerio, Naval Air Development Center (Code 6032), Warminster, PA 18974, and David A. Fender, KETRON, Inc., Warminster, PA 18974. Contract N62269-81-Z-0206.
2016-08-01
[Standard Form 298 report documentation page (OMB No. 0704-0188); Cost & Performance Report.]
zorder-lib: Library API for Z-Order Memory Layout
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nowell, Lucy; Bethel, Edward W.
2015-04-01
This document describes the motivation for, elements of, and use of the zorder-lib, a library API that implements organization of and access to data in memory using either a-order (also known as "row-major" order) or z-order memory layouts. The primary motivation for this work is to improve the performance of many types of data-intensive codes by increasing both spatial and temporal locality of memory accesses. The basic idea is that the cost associated with accessing a datum is less when it is nearby in either space or time.
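The central idea above, interleaving the bits of the array indices so that elements that are close in 2-D space stay close in memory, can be shown in a few lines of code. The sketch below is a generic Morton-index illustration under that interpretation; it is not the zorder-lib API, whose function names and layouts are defined in the report itself.

```python
# Minimal sketch of z-order (Morton) indexing for a 2-D array, illustrating the
# general idea behind z-order memory layouts; this is NOT the zorder-lib API.

def morton_index(i: int, j: int, bits: int = 16) -> int:
    """Interleave the bits of (i, j) to form a z-order (Morton) index."""
    z = 0
    for b in range(bits):
        z |= ((i >> b) & 1) << (2 * b + 1)   # row bits go to odd positions
        z |= ((j >> b) & 1) << (2 * b)       # column bits go to even positions
    return z

# Lay out a small 4x4 array in z-order and read one element back.
n = 4
flat = [0] * (n * n)
for i in range(n):
    for j in range(n):
        flat[morton_index(i, j)] = 10 * i + j   # store each value at its Morton slot

print(flat[morton_index(2, 3)])  # -> 23
```

Accessing `flat[morton_index(i, j)]` instead of `flat[i * n + j]` changes only the address computation; the payoff is that a small 2-D neighborhood maps to a small range of addresses, which is the locality argument made in the abstract.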
On the error statistics of Viterbi decoding and the performance of concatenated codes
NASA Technical Reports Server (NTRS)
Miller, R. L.; Deutsch, L. J.; Butman, S. A.
1981-01-01
Computer simulation results are presented on the performance of convolutional codes of constraint lengths 7 and 10 concatenated with the (255, 223) Reed-Solomon code (a proposed NASA standard). These results indicate that as much as 0.8 dB can be gained by concatenating this Reed-Solomon code with a (10, 1/3) convolutional code, instead of the (7, 1/2) code currently used by the DSN. A mathematical model of Viterbi decoder burst-error statistics is developed and is validated through additional computer simulations.
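The reason an outer (255, 223) Reed-Solomon code pairs well with a Viterbi-decoded inner code is that the RS code corrects up to t = 16 eight-bit symbol errors per block, so Viterbi error bursts matter only through the number of distinct symbols they touch. The sketch below illustrates that bookkeeping with a toy, hypothetical burst model; it is not the validated burst-error statistics model developed in the paper.

```python
# Toy illustration: the (255, 223) Reed-Solomon outer code corrects up to
# T = 16 eight-bit symbol errors per block, so decoder error bursts only
# matter through how many symbols they touch.
import random

random.seed(1)
N, T = 255, 16                  # RS block length and correctable symbol errors
BITS = 8                        # bits per RS symbol
BLOCK_BITS = N * BITS

def rs_block_fails(bursts):
    """bursts: list of (start_bit, length); True if more than T distinct symbols are hit."""
    hit = set()
    for start, length in bursts:
        for bit in range(start, min(start + length, BLOCK_BITS)):
            hit.add(bit // BITS)
    return len(hit) > T

# Hypothetical burst statistics (illustrative only, not from the paper).
trials, fails = 10_000, 0
for _ in range(trials):
    n_bursts = random.randint(0, 4)
    bursts = [(random.randrange(BLOCK_BITS), random.randint(1, 40))
              for _ in range(n_bursts)]
    fails += rs_block_fails(bursts)
print(f"block error rate under the toy burst model: {fails / trials:.4f}")
```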
NASA Technical Reports Server (NTRS)
Logan, Terry G.
1994-01-01
The purpose of this study is to investigate the performance of integral equation computations using a numerical source field-panel method in a massively parallel processing (MPP) environment. A comparative study of the computational performance of the MPP CM-5 computer and a conventional Cray Y-MP supercomputer for a three-dimensional flow problem is made. A serial FORTRAN code is converted into a parallel CM-FORTRAN code. Performance results are obtained on the CM-5 with 32, 62, and 128 nodes, along with results on the Cray Y-MP with a single processor. The comparison indicates that the parallel CM-FORTRAN code performs close to, or better than, the equivalent serial FORTRAN code for some cases.
Performance of concatenated Reed-Solomon trellis-coded modulation over Rician fading channels
NASA Technical Reports Server (NTRS)
Moher, Michael L.; Lodge, John H.
1990-01-01
A concatenated coding scheme for providing very reliable data over mobile-satellite channels at power levels similar to those used for vocoded speech is described. The outer code is a shortened Reed-Solomon code which provides error detection as well as error correction capabilities. The inner code is a 1-D 8-state trellis code applied independently to both the inphase and quadrature channels. To achieve the full error correction potential of this inner code, the code symbols are multiplexed with a pilot sequence which is used to provide dynamic channel estimation and coherent detection. The implementation structure of this scheme is discussed and its performance is estimated.
Statistical mechanics of broadcast channels using low-density parity-check codes.
Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David
2003-03-01
We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.
KilBride, A L; Mason, S A; Honeyman, P C; Pritchard, D G; Hepple, S; Green, L E
2012-02-11
Animal health (AH) defines the outcome of their inspections of livestock holdings as full compliance with the legislation and welfare code (A), compliance with the legislation but not the code (B), non-compliance with legislation but no pain, distress or suffering obvious in the animals (C) or evidence of unnecessary pain or unnecessary distress (D). The aim of the present study was to investigate whether membership of farm assurance or organic certification schemes was associated with compliance with animal welfare legislation as inspected by AH. Participating schemes provided details of their members, past and present, and these records were matched against inspection data from AH. Multivariable multilevel logistic binomial models were built to investigate the association between compliance with legislation and membership of a farm assurance/organic scheme. The percentage of inspections coded A, B, C or D was 37.1, 35.6, 20.2 and 7.1 per cent, respectively. Once adjusted for year, country, enterprise, herd size and reason for inspection, there was a pattern of significantly reduced risk of codes C and D compared with A and B, in certified enterprises compared with the enterprises that were not known to be certified in all species.
Migration of the Gaudi and LHCb software repositories from CVS to Subversion
NASA Astrophysics Data System (ADS)
Clemencic, M.; Degaudenzi, H.; LHCb Collaboration
2011-12-01
A common code repository is of primary importance in a distributed development environment such as large HEP experiments. CVS (Concurrent Versions System) has been used in the past years at CERN for the hosting of shared software repositories, among which were the repositories for the Gaudi Framework and the LHCb software projects. Many developers around the world produced alternative systems to share code and revisions among several developers, mainly to overcome the limitations in CVS, and CERN has recently started a new service for code hosting based on the version control system Subversion. The differences between CVS and Subversion and the way the code was organized in Gaudi and LHCb CVS repositories required careful study and planning of the migration. Special care was used to define the organization of the new Subversion repository. To avoid as much as possible disruption in the development cycle, the migration has been gradual with the help of tools developed explicitly to hide the differences between the two systems. The principles guiding the migration steps, the organization of the Subversion repository and the tools developed will be presented, as well as the problems encountered both from the librarian and the user points of view.
NASA Astrophysics Data System (ADS)
Rivier, Leonard Gilles
Using an efficient parallel code solving the primitive equations of atmospheric dynamics, the jet structure of a Jupiter-like atmosphere is modeled. In the first part of this thesis, a parallel spectral code solving both the shallow water equations and the multi-level primitive equations of atmospheric dynamics is built. The implementation of this code, called BOB, is done so that it runs effectively on an inexpensive cluster of workstations. A one-dimensional decomposition and transposition method ensuring load balancing among processes is used. The Legendre transform is cache-blocked. Computing the Legendre polynomials "on the fly" in the spectral method produces a lower memory footprint and enables high-resolution runs on machines with relatively small memory. Performance studies are done using a cluster of workstations located at the National Center for Atmospheric Research (NCAR). BOB's performance is compared to the parallel benchmark code PSTSWM and the dynamical core of NCAR's CCM3.6.6. In both cases, the comparison favors BOB. In the second part of this thesis, the primitive equation version of the code described in part I is used to study the formation of organized zonal jets and equatorial superrotation in a planetary atmosphere where the parameters are chosen to best model the upper atmosphere of Jupiter. Two levels are used in the vertical and only large-scale forcing is present. The model is forced towards a baroclinically unstable flow, so that eddies are generated by baroclinic instability. We consider several types of forcing, acting on either the temperature or the momentum field. We show that only under very specific parametric conditions do zonally elongated structures form and persist, resembling the jet structure observed near the cloud-top level (1 bar) on Jupiter. We also study the effect of an equatorial heat source, meant to be a crude representation of the effect of the deep convective planetary interior onto the outer atmospheric layer. We show that such heat forcing is able to produce strong equatorial superrotating winds, one of the most striking features of the Jovian circulation.
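The "compute on the fly" strategy mentioned above trades recomputation for memory: instead of storing a table of Legendre polynomial values, each value is regenerated from a three-term recurrence as needed. The sketch below shows that recurrence for ordinary Legendre polynomials (the spectral transform itself uses associated Legendre functions, but the recurrence idea is the same); it is an illustration of the technique, not code from BOB.

```python
# Sketch of "compute on the fly" evaluation of Legendre polynomials P_n(x)
# via the standard three-term recurrence:
#   (n + 1) P_{n+1}(x) = (2n + 1) x P_n(x) - n P_{n-1}(x)

def legendre_values(x: float, n_max: int):
    """Return [P_0(x), ..., P_{n_max}(x)] without storing any precomputed table."""
    values = [1.0, x][: n_max + 1]
    for n in range(1, n_max):
        values.append(((2 * n + 1) * x * values[n] - n * values[n - 1]) / (n + 1))
    return values

print(legendre_values(0.5, 4))  # P_2(0.5) = -0.125, P_3(0.5) = -0.4375, ...
```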
Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara
2013-09-01
The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, professional ethics codes are defined worldwide based on human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethics codes. The aim of this study is to assess knowledge and performance of nursing ethics codes from nurses' and patients' perspectives. A descriptive study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, independent t-tests, ANOVA, and Pearson correlation coefficients, in SPSS 13. Most of the nurses were female, married, and educated to the BS degree; 86.4% of them were aware of the ethics codes, and 91.9% of nurses and 41.8% of patients reported that nurses respect the ethics codes. Nurses' and patients' perspectives on the ethics codes differed significantly. A significant relationship was found between nurses' knowledge of ethics codes and job satisfaction and complaints about ethical performance. According to the results, attention to teaching ethics codes in the nursing curriculum for students and continuing education for staff is proposed; on the other hand, recognizing failures of the health system, optimizing nursing care, informing patients about nursing ethics codes, promoting patient rights, and achieving patient satisfaction can minimize the differences between the two perspectives.
Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara
2013-01-01
Introduction: The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, professional ethics codes are defined worldwide based on human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethics codes. The aim of this study is to assess knowledge and performance of nursing ethics codes from nurses' and patients' perspective. Methods: A descriptive study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, independent t-tests, ANOVA, and Pearson correlation coefficients, in SPSS 13. Results: Most of the nurses were female, married, and educated to the BS degree; 86.4% of them were aware of the ethics codes, and 91.9% of nurses and 41.8% of patients reported that nurses respect the ethics codes. Nurses' and patients' perspectives on the ethics codes differed significantly. A significant relationship was found between nurses' knowledge of ethics codes and job satisfaction and complaints about ethical performance. Conclusion: According to the results, attention to teaching ethics codes in the nursing curriculum for students and continuing education for staff is proposed; on the other hand, recognizing failures of the health system, optimizing nursing care, informing patients about nursing ethics codes, promoting patient rights, and achieving patient satisfaction can minimize the differences between the two perspectives. PMID:25276730
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael I.; Hart, Philip R.
2016-02-16
Appendix G, the Performance Rating Method in ASHRAE Standard 90.1 has been updated to make two significant changes for the 2016 edition, to be published in October of 2016. First, it allows Appendix G to be used as a third path for compliance with the standard in addition to rating beyond code building performance. This prevents modelers from having to develop separate building models for code compliance and beyond code programs. Using this new version of Appendix G to show compliance with the 2016 edition of the standard, the proposed building design needs to have a performance cost index (PCI) less than targets shown in a new table based on building type and climate zone. The second change is that the baseline design is now fixed at a stable level of performance set approximately equal to the 2004 code. Rather than changing the stringency of the baseline with each subsequent edition of the standard, compliance with new editions will simply require a reduced PCI (a PCI of zero is a net-zero building). Using this approach, buildings of any era can be rated using the same method. The intent is that any building energy code or beyond code program can use this methodology and merely set the appropriate PCI target for their needs. This report discusses the process used to set performance criteria for compliance with ASHRAE Standard 90.1-2016 and suggests a method for demonstrating compliance with other codes and beyond code programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael I.; Hart, Philip R.
2016-03-01
Appendix G, the Performance Rating Method in ASHRAE Standard 90.1 has been updated to make two significant changes for the 2016 edition, to be published in October of 2016. First, it allows Appendix G to be used as a third path for compliance with the standard in addition to rating beyond code building performance. This prevents modelers from having to develop separate building models for code compliance and beyond code programs. Using this new version of Appendix G to show compliance with the 2016 edition of the standard, the proposed building design needs to have a performance cost index (PCI) less than targets shown in a new table based on building type and climate zone. The second change is that the baseline design is now fixed at a stable level of performance set approximately equal to the 2004 code. Rather than changing the stringency of the baseline with each subsequent edition of the standard, compliance with new editions will simply require a reduced PCI (a PCI of zero is a net-zero building). Using this approach, buildings of any era can be rated using the same method. The intent is that any building energy code or beyond code program can use this methodology and merely set the appropriate PCI target for their needs. This report discusses the process used to set performance criteria for compliance with ASHRAE Standard 90.1-2016 and suggests a method for demonstrating compliance with other codes and beyond code programs.
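The compliance test described in the two entries above reduces to comparing a computed PCI against a tabulated target. The sketch below assumes, for illustration, that PCI is the ratio of proposed to baseline building performance cost and uses made-up targets; the real targets are tabulated in Standard 90.1-2016 by building type and climate zone.

```python
# Minimal sketch of the Appendix G compliance check described above, under the
# assumption that PCI is the ratio of proposed to baseline building performance
# cost. The target values below are made up for illustration, not from 90.1-2016.

PCI_TARGETS = {            # hypothetical (building_type, climate_zone) -> PCI target
    ("office", "4A"): 0.60,
    ("school", "5B"): 0.57,
}

def performance_cost_index(proposed_cost: float, baseline_cost: float) -> float:
    return proposed_cost / baseline_cost      # PCI = 0 would be a net-zero building

def complies(building_type: str, climate_zone: str,
             proposed_cost: float, baseline_cost: float) -> bool:
    pci = performance_cost_index(proposed_cost, baseline_cost)
    return pci <= PCI_TARGETS[(building_type, climate_zone)]

print(complies("office", "4A", proposed_cost=54_000.0, baseline_cost=100_000.0))  # True
```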
Identifying Vasopressor and Inotrope Use for Health Services Research
Fawzy, Ashraf; Bradford, Mark; Lindenauer, Peter K.
2016-01-01
Rationale: Identifying vasopressor and inotrope (vasopressor) use from administrative claims data may provide an important resource to study the epidemiology of shock. Objectives: Determine accuracy of identifying vasopressor use using International Classification of Disease, Ninth Revision, Clinical Modification (ICD-9-CM) coding. Methods: Using administrative data enriched with pharmacy billing files (Premier, Inc., Charlotte, NC), we identified two cohorts: adult patients admitted with a diagnosis of sepsis from 2010 to 2013 or pulmonary embolism (PE) from 2008 to 2011. Vasopressor administration was obtained using pharmacy billing files (dopamine, dobutamine, epinephrine, milrinone, norepinephrine, phenylephrine, vasopressin) and compared with ICD-9-CM procedure code for vasopressor administration (00.17). We estimated performance characteristics of the ICD-9-CM code and compared patients’ characteristics and mortality rates according to vasopressor identification method. Measurements and Main Results: Using either pharmacy data or the ICD-9-CM procedure code, 29% of 541,144 patients in the sepsis cohort and 5% of 81,588 patients in the PE cohort were identified as receiving a vasopressor. In the sepsis cohort, the ICD-9-CM procedure code had low sensitivity (9.4%; 95% confidence interval, 9.2–9.5), which increased over time. Results were similar in the PE cohort (sensitivity, 5.8%; 95% confidence interval, 5.1–6.6). The ICD-9-CM code exhibited high specificity in the sepsis (99.8%) and PE (100%) cohorts. However, patients identified as receiving vasopressors by ICD-9-CM code had significantly higher unadjusted in-hospital mortality, had more acute organ failures, and were more likely hospitalized in the Northeast and West. Conclusions: The ICD-9-CM procedure code for vasopressor administration has low sensitivity and selects for higher severity of illness in studies of shock. Temporal changes in sensitivity would likely make longitudinal shock surveillance using ICD-9-CM inaccurate. PMID:26653145
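The sensitivity and specificity figures reported above come from a standard 2x2 comparison of the ICD-9-CM procedure code against the pharmacy-billing reference. The sketch below shows that computation with made-up counts, not the study's data.

```python
# Sketch of how the ICD-9-CM code's performance characteristics are computed
# against the pharmacy-billing reference standard; the counts are illustrative.

def sensitivity_specificity(tp: int, fp: int, fn: int, tn: int):
    sensitivity = tp / (tp + fn)   # fraction of true vasopressor use flagged by the code
    specificity = tn / (tn + fp)   # fraction of non-use correctly left unflagged
    return sensitivity, specificity

# Hypothetical 2x2 table: rows = ICD-9-CM 00.17 present/absent,
# columns = vasopressor present/absent in the pharmacy billing file.
sens, spec = sensitivity_specificity(tp=90, fp=10, fn=860, tn=9040)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```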
Performance optimisations for distributed analysis in ALICE
NASA Astrophysics Data System (ADS)
Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.
2014-06-01
Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better-performing ALICE analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ditmars, J.D.; Walbridge, E.W.; Rote, D.M.
1983-10-01
Repository performance assessment is analysis that identifies events and processes that might affect a repository system for isolation of radioactive waste, examines their effects on barriers to waste migration, and estimates the probabilities of their occurrence and their consequences. In 1983 Battelle Memorial Institute's Office of Nuclear Waste Isolation (ONWI) prepared two plans - one for performance assessment for a waste repository in salt and one for verification and validation of performance assessment technology. At the request of the US Department of Energy's Salt Repository Project Office (SRPO), Argonne National Laboratory reviewed those plans and prepared this report to advise SRPO of specific areas where ONWI's plans for performance assessment might be improved. This report presents a framework for repository performance assessment that clearly identifies the relationships among the disposal problems, the processes underlying the problems, the tools for assessment (computer codes), and the data. In particular, the relationships among important processes and 26 model codes available to ONWI are indicated. A common suggestion for computer code verification and validation is the need for specific and unambiguous documentation of the results of performance assessment activities. A major portion of this report consists of status summaries of 27 model codes indicated as potentially useful by ONWI. The code summaries focus on three main areas: (1) the code's purpose, capabilities, and limitations; (2) status of the elements of documentation and review essential for code verification and validation; and (3) proposed application of the code for performance assessment of salt repository systems. 15 references, 6 figures, 4 tables.
Soft-decision decoding techniques for linear block codes and their error performance analysis
NASA Technical Reports Server (NTRS)
Lin, Shu
1996-01-01
The first paper presents a new minimum-weight trellis-based soft-decision iterative decoding algorithm for binary linear block codes. The second paper derives an upper bound on the probability of block error for multilevel concatenated codes (MLCC); the bound evaluates the difference in performance for different decompositions of some codes. The third paper investigates the bit error probability for maximum likelihood decoding of binary linear codes. The fourth and final paper included in this report concerns the construction of multilevel concatenated block modulation codes using a multilevel concatenation scheme for the frequency non-selective Rayleigh fading channel.
EUGÈNE'HOM: a generic similarity-based gene finder using multiple homologous sequences
Foissac, Sylvain; Bardou, Philippe; Moisan, Annick; Cros, Marie-Josée; Schiex, Thomas
2003-01-01
EUGÈNE'HOM is a gene prediction software for eukaryotic organisms based on comparative analysis. EUGÈNE'HOM is able to take into account multiple homologous sequences from more or less closely related organisms. It integrates the results of TBLASTX analysis, splice site and start codon prediction and a robust coding/non-coding probabilistic model which allows EUGÈNE'HOM to handle sequences from a variety of organisms. The current target of EUGÈNE'HOM is plant sequences. The EUGÈNE'HOM web site is available at http://genopole.toulouse.inra.fr/bioinfo/eugene/EuGeneHom/cgi-bin/EuGeneHom.pl. PMID:12824408
Color Coding Organic Chemicals for Inventory Control.
ERIC Educational Resources Information Center
Wystrach, V. P.; George, Babu
1985-01-01
Describes a system in which organic chemicals are recoded for inventory control and reshelving purposes. The system works well in undergraduate organic chemistry or biology laboratories but can be expanded to handle a larger and more complicated inventory. (JN)
Comparison of memory thresholds for planar qudit geometries
NASA Astrophysics Data System (ADS)
Marks, Jacob; Jochym-O'Connor, Tomas; Gheorghiu, Vlad
2017-11-01
We introduce and analyze a new type of decoding algorithm called general color clustering, based on renormalization group methods, to be used in qudit color codes. The performance of this decoder is analyzed under a generalized bit-flip error model, and is used to obtain the first memory threshold estimates for qudit 6-6-6 color codes. The proposed decoder is compared with similar decoding schemes for qudit surface codes as well as the current leading qubit decoders for both sets of codes. We find that, as with surface codes, clustering performs sub-optimally for qubit color codes, giving a threshold of 5.6% compared to the 8.0% obtained through surface projection decoding methods. However, the threshold rate increases by up to 112% for large qudit dimensions, plateauing around 11.9%. All the analysis is performed using QTop, a new open-source software for simulating and visualizing topological quantum error correcting codes.
Operational rate-distortion performance for joint source and channel coding of images.
Ruf, M J; Modestino, J W
1999-01-01
This paper describes a methodology for evaluating the operational rate-distortion behavior of combined source and channel coding schemes with particular application to images. In particular, we demonstrate use of the operational rate-distortion function to obtain the optimum tradeoff between source coding accuracy and channel error protection under the constraint of a fixed transmission bandwidth for the investigated transmission schemes. Furthermore, we develop information-theoretic bounds on performance for specific source and channel coding systems and demonstrate that our combined source-channel coding methodology applied to different schemes results in operational rate-distortion performance which closely approach these theoretical limits. We concentrate specifically on a wavelet-based subband source coding scheme and the use of binary rate-compatible punctured convolutional (RCPC) codes for transmission over the additive white Gaussian noise (AWGN) channel. Explicit results for real-world images demonstrate the efficacy of this approach.
Jardine, Bartholomew; Raymond, Gary M; Bassingthwaighte, James B
2015-01-01
The Modular Program Constructor (MPC) is an open-source Java based modeling utility, built upon JSim's Mathematical Modeling Language (MML) ( http://www.physiome.org/jsim/) that uses directives embedded in model code to construct larger, more complicated models quickly and with less error than manually combining models. A major obstacle in writing complex models for physiological processes is the large amount of time it takes to model the myriad processes taking place simultaneously in cells, tissues, and organs. MPC replaces this task with code-generating algorithms that take model code from several different existing models and produce model code for a new JSim model. This is particularly useful during multi-scale model development where many variants are to be configured and tested against data. MPC encodes and preserves information about how a model is built from its simpler model modules, allowing the researcher to quickly substitute or update modules for hypothesis testing. MPC is implemented in Java and requires JSim to use its output. MPC source code and documentation are available at http://www.physiome.org/software/MPC/.
Two dimension MDW OCDMA code cross-correlation for reduction of phase induced intensity noise
NASA Astrophysics Data System (ADS)
Ahmed, Israa Sh.; Aljunid, Syed A.; Nordin, Junita M.; Dulaimi, Layth A. Khalil Al; Matem, Rima
2017-11-01
In this paper, we first review the 2-D MDW code cross-correlation equations and table, which are improved significantly by using code correlation properties. These codes can be used in synchronous optical CDMA systems for multiple-access interference cancellation and maximum suppression of phase-induced intensity noise (PIIN). Low Psr is due to the reduction of interference noise induced by 2-D MDW code PIIN suppression. A high data rate increases the BER, requires high effective power, and severely deteriorates system performance. The 2-D W/T MDW code has excellent system performance, with PIIN suppressed as much as possible at the optimum Psr and a high data bit rate. The 2-D MDW code shows better tolerance to PIIN, with enhanced system performance, in comparison to others. We prove by numerical analysis that PIIN is maximally suppressed by the MDW code through the minimizing property of cross-correlation, in comparison to the 2-D PDC and 2-D MQC OCDMA code schemes.
Is phonology bypassed in normal or dyslexic development?
Pennington, B F; Lefly, D L; Van Orden, G C; Bookman, M O; Smith, S D
1987-01-01
A pervasive assumption in most accounts of normal reading and spelling development is that phonological coding is important early in development but is subsequently superseded by faster, orthographic coding which bypasses phonology. We call this assumption, which derives from dual process theory, the developmental bypass hypothesis. The present study tests four specific predictions of the developmental bypass hypothesis by comparing dyslexics and nondyslexics from the same families in a cross-sectional design. The four predictions are: 1) that phonological coding skill develops early in normal readers and soon reaches asymptote, whereas orthographic coding skill has a protracted course of development; 2) that the correlation of adult reading or spelling performance with phonological coding skill is considerably less than the correlation with orthographic coding skill; 3) that dyslexics who are mainly deficient in phonological coding skill should be able to bypass this deficit and eventually close the gap in reading and spelling performance; and 4) that the greatest differences between dyslexics and developmental controls on measures of phonological coding skill should be observed early rather than late in development. None of the four predictions of the developmental bypass hypothesis were upheld. Phonological coding skill continued to develop in nondyslexics until adulthood. It accounted for a substantial (32-53 percent) portion of the variance in reading and spelling performance in adult nondyslexics, whereas orthographic coding skill did not account for a statistically reliable portion of this variance. The dyslexics differed little across age in phonological coding skill, but made linear progress in orthographic coding skill, surpassing spelling-age (SA) controls by adulthood. Nonetheless, they did not close the gap in reading and spelling performance. Finally, dyslexics were significantly worse than SA (and Reading Age [RA]) controls in phonological coding skill only in adulthood.
A scintillator-based approach to monitor secondary neutron production during proton therapy.
Clarke, S D; Pryser, E; Wieger, B M; Pozzi, S A; Haelg, R A; Bashkirov, V A; Schulte, R W
2016-11-01
The primary objective of this work is to measure the secondary neutron field produced by an uncollimated proton pencil beam impinging on different tissue-equivalent phantom materials using organic scintillation detectors. Additionally, the Monte Carlo code mcnpx-PoliMi was used to simulate the detector response for comparison to the measured data. Comparison of the measured and simulated data will validate this approach for monitoring secondary neutron dose during proton therapy. Proton beams of 155- and 200-MeV were used to irradiate a variety of phantom materials and secondary particles were detected using organic liquid scintillators. These detectors are sensitive to fast neutrons and gamma rays: pulse shape discrimination was used to classify each detected pulse as either a neutron or a gamma ray. The mcnpx-PoliMi code was used to simulate the secondary neutron field produced during proton irradiation of the same tissue-equivalent phantom materials. An experiment was performed at the Loma Linda University Medical Center proton therapy research beam line and corresponding models were created using the mcnpx-PoliMi code. The authors' analysis showed agreement between the simulations and the measurements. The simulated detector response can be used to validate the simulations of neutron and gamma doses on a particular beam line with or without a phantom. The authors have demonstrated a method of monitoring the neutron component of the secondary radiation field produced by therapeutic protons. The method relies on direct detection of secondary neutrons and gamma rays using organic scintillation detectors. These detectors are sensitive over the full range of biologically relevant neutron energies above 0.5 MeV and allow effective discrimination between neutron and photon dose. Because the detector system is portable, the described system could be used in the future to evaluate secondary neutron and gamma doses on various clinical beam lines for commissioning and prospective data collection in pediatric patients treated with proton therapy.
33 CFR 143.207 - Requirements for foreign MODUs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the International Maritime Organization (IMO, formerly the Inter-Governmental Maritime Consultative Organization or IMCO) Code for Construction and Equipment of Mobile Offshore Drilling Units (IMO Assembly...
Metal Hydrides, MOFs, and Carbon Composites as Space Radiation Shielding Mitigators
NASA Technical Reports Server (NTRS)
Atwell, William; Rojdev, Kristina; Liang, Daniel; Hill, Matthew
2014-01-01
Recently, metal hydrides and MOFs (Metal-Organic Framework/microporous organic polymer composites - for their hydrogen and methane storage capabilities) have been studied with applications in fuel cell technology. We have investigated a dual-use of these materials and carbon composites (CNT-HDPE) to include space radiation shielding mitigation. In this paper we present the results of a detailed study where we have analyzed 64 materials. We used the Band fit spectra for the combined 19-24 October 1989 solar proton events as the input source term radiation environment. These computational analyses were performed with the NASA high energy particle transport/dose code HZETRN. Through this analysis we have identified several of the materials that have excellent radiation shielding properties and the details of this analysis will be discussed further in the paper.
CFD Code Survey for Thrust Chamber Application
NASA Technical Reports Server (NTRS)
Gross, Klaus W.
1990-01-01
In the quest to find analytical reference codes, responses from a questionnaire are presented which portray the current computational fluid dynamics (CFD) program status and capability at various organizations, characterizing liquid rocket thrust chamber flow fields. Sample cases are identified to examine the ability, operational condition, and accuracy of the codes. To select the best suited programs for accelerated improvements, evaluation criteria are being proposed.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-13
...-adviser are subject to the provisions of Rule 204A-1 under the Advisers Act relating to codes of ethics. This Rule requires investment advisers to adopt a code of ethics that reflects the fiduciary nature of... specifically requires the adoption of a code of ethics by an investment advisor to include, at a minimum: (i...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-08
... Advisers Act Rule 204A-1. This Rule specifically requires the adoption of a code of ethics by an investment...) provisions requiring supervised persons to report any violations of the code of ethics promptly to the chief... designated in the code of ethics; and (v) provisions requiring the investment advisor to provide each of the...
Schaffler, James J.; Isely, J.J.
2001-01-01
This study demonstrates that coded wire tags can be used to mark certain insect larvae without adverse effects on maturation, and that tags are retained through the adult phase in high enough proportion for practical application. Coded wire tags also offer the benefit that marked organisms can be identified to the batch or individual level.
ERIC Educational Resources Information Center
Hickok, Gregory
2012-01-01
Speech recognition is an active process that involves some form of predictive coding. This statement is relatively uncontroversial. What is less clear is the source of the prediction. The dual-stream model of speech processing suggests that there are two possible sources of predictive coding in speech perception: the motor speech system and the…
NASA Technical Reports Server (NTRS)
Lou, John; Ferraro, Robert; Farrara, John; Mechoso, Carlos
1996-01-01
An analysis is presented of several factors influencing the performance of a parallel implementation of the UCLA atmospheric general circulation model (AGCM) on massively parallel computer systems. Several modifications to the original parallel AGCM code aimed at improving its numerical efficiency, interprocessor communication cost, load balance, and issues affecting single-node code performance are discussed.
Performance measures for transform data coding.
NASA Technical Reports Server (NTRS)
Pearl, J.; Andrews, H. C.; Pratt, W. K.
1972-01-01
This paper develops performance criteria for evaluating transform data coding schemes under computational constraints. Computational constraints that conform with the proposed basis-restricted model give rise to suboptimal coding efficiency characterized by a rate-distortion relation R(D) similar in form to the theoretical rate-distortion function. Numerical examples of this performance measure are presented for Fourier, Walsh, Haar, and Karhunen-Loeve transforms.
Users manual and modeling improvements for axial turbine design and performance computer code TD2-2
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1992-01-01
Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.
The organic inventory of primitive meteorites
NASA Astrophysics Data System (ADS)
Martins, Zita
Carbonaceous meteorites are primitive samples that provide crucial information about the genesis and evolution of the solar system. This class of meteorites also has a rich organic inventory, which may have contributed the first prebiotic building blocks of life to the early Earth. We have studied the soluble organic inventory of several CR and CM meteorites, using high performance liquid chromatography with UV fluorescence detection (HPLC-FD), gas chromatography-mass spectrometry (GC-MS) and gas chromatography-combustion-isotope ratio mass spectrometry (GC-C-IRMS). Our target organic molecules include amino acids, nucleobases and polycyclic aromatic hydrocarbons (PAHs), among others. CR chondrites contain the highest amino acid concentration ever detected in a meteorite. The degree of aqueous alteration amongst this class of meteorites seems to be responsible for the amino acid distribution. Pioneering compound-specific carbon isotope measurements of nucleobases present in carbonaceous chondrites show that these compounds have a non-terrestrial origin. This suggests that components of the genetic code may have had a crucial role in life's origin. Investigating the abundances, distribution and isotopic composition of organic molecules in primitive meteorites significantly improves our knowledge of the chemistry of the early solar system, and the resources available for the first living organisms on Earth.
GROUND-WATER MODEL TESTING: SYSTEMATIC EVALUATION AND TESTING OF CODE FUNCTIONALITY AND PERFORMANCE
Effective use of ground-water simulation codes as management decision tools requires the establishment of their functionality, performance characteristics, and applicability to the problem at hand. This is accomplished through application of a systematic code-testing protocol and...
Mean Line Pump Flow Model in Rocket Engine System Simulation
NASA Technical Reports Server (NTRS)
Veres, Joseph P.; Lavelle, Thomas M.
2000-01-01
A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and computer run time, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.
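A mean line analysis of the kind implemented in PUMPA ultimately rests on velocity triangles at the blade inlet and exit. The sketch below shows the ideal head rise from the Euler turbomachinery equation with a lumped hydraulic efficiency; it is a generic illustration of mean line analysis, not the PUMPA code's loss model.

```python
# Minimal mean-line sketch of the ideal head rise of a single pump stage from the
# Euler turbomachinery equation, H_ideal = (U2*Cu2 - U1*Cu1) / g. This is a generic
# illustration, not PUMPA's actual model (which adds diffusion-system losses,
# off-design effects, and multistage bookkeeping).

G = 9.81  # gravitational acceleration, m/s^2

def euler_head(u1: float, cu1: float, u2: float, cu2: float) -> float:
    """Ideal (loss-free) head rise in metres from mean-line velocity triangles."""
    return (u2 * cu2 - u1 * cu1) / G

def actual_head(h_ideal: float, hydraulic_efficiency: float) -> float:
    """Apply a lumped hydraulic efficiency to account for internal losses."""
    return hydraulic_efficiency * h_ideal

h_id = euler_head(u1=15.0, cu1=0.0, u2=60.0, cu2=45.0)   # axial inflow, Cu1 = 0
print(f"ideal head = {h_id:.1f} m, actual ≈ {actual_head(h_id, 0.85):.1f} m")
```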
Coordinated design of coding and modulation systems
NASA Technical Reports Server (NTRS)
Massey, J. L.
1976-01-01
Work on partial unit memory codes continued; it was shown that for a given virtual state complexity, the maximum free distance over the class of all convolutional codes is achieved within the class of unit memory codes. The effect of phase-lock loop (PLL) tracking error on coding system performance was studied using the channel cut-off rate as the measure of quality of a modulation system. Optimum modulation signal sets for a non-white Gaussian channel were considered using a heuristic selection rule based on a water-filling argument. The use of error-correcting codes to perform data compression by the technique of syndrome source coding was researched, and a weight-and-error-locations scheme was developed that is closely related to LDSC coding.
33 CFR 146.205 - Requirements for foreign MODUs.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) The operating standards for mobile offshore drilling units contained in the International Maritime Organization (IMO, formerly the Inter-Governmental Maritime Consultative Organization or IMCO) Code for the...
Low-Density Parity-Check (LDPC) Codes Constructed from Protographs
NASA Astrophysics Data System (ADS)
Thorpe, J.
2003-08-01
We introduce a new class of low-density parity-check (LDPC) codes constructed from a template called a protograph. The protograph serves as a blueprint for constructing LDPC codes of arbitrary size whose performance can be predicted by analyzing the protograph. We apply standard density evolution techniques to predict the performance of large protograph codes. Finally, we use a randomized search algorithm to find good protographs.
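Protograph codes are conventionally expanded by "copy and permute" (lifting): each edge of the small protograph is replicated Z times and reconnected through a Z x Z permutation. The sketch below shows that expansion using circulant permutations; the base matrix and shift values are arbitrary examples, not protographs produced by the randomized search described above.

```python
# Minimal "copy-and-permute" (lifting) sketch: each nonzero entry of a small
# protograph base matrix is replaced by a Z x Z circulant permutation, giving the
# parity-check matrix of a larger LDPC code.
import numpy as np

def lift_protograph(base: np.ndarray, shifts: np.ndarray, Z: int) -> np.ndarray:
    """Expand a binary base matrix into an (m*Z) x (n*Z) parity-check matrix."""
    m, n = base.shape
    H = np.zeros((m * Z, n * Z), dtype=np.uint8)
    identity = np.eye(Z, dtype=np.uint8)
    for i in range(m):
        for j in range(n):
            if base[i, j]:
                # circulant permutation: identity rolled by the chosen shift
                H[i*Z:(i+1)*Z, j*Z:(j+1)*Z] = np.roll(identity, shifts[i, j], axis=1)
    return H

base   = np.array([[1, 1, 1, 0],
                   [0, 1, 1, 1]], dtype=np.uint8)
shifts = np.array([[0, 1, 2, 0],
                   [0, 3, 1, 2]])
H = lift_protograph(base, shifts, Z=4)
print(H.shape)                 # (8, 16)
print(H.sum(axis=0))           # column weights follow the protograph's degrees
```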
A GPU-accelerated implicit meshless method for compressible flows
NASA Astrophysics Data System (ADS)
Zhang, Jia-Le; Ma, Zhi-Hua; Chen, Hong-Quan; Cao, Cheng
2018-05-01
This paper develops a recently proposed GPU based two-dimensional explicit meshless method (Ma et al., 2014) by devising and implementing an efficient parallel LU-SGS implicit algorithm to further improve the computational efficiency. The capability of the original 2D meshless code is extended to deal with 3D complex compressible flow problems. To resolve the inherent data dependency of the standard LU-SGS method, which causes thread-racing conditions destabilizing numerical computation, a generic rainbow coloring method is presented and applied to organize the computational points into different groups by painting neighboring points with different colors. The original LU-SGS method is modified and parallelized accordingly to perform calculations in a color-by-color manner. The CUDA Fortran programming model is employed to develop the key kernel functions to apply boundary conditions, calculate time steps, evaluate residuals as well as advance and update the solution in the temporal space. A series of two- and three-dimensional test cases including compressible flows over single- and multi-element airfoils and a M6 wing are carried out to verify the developed code. The obtained solutions agree well with experimental data and other computational results reported in the literature. Detailed analysis on the performance of the developed code reveals that the developed CPU based implicit meshless method is at least four to eight times faster than its explicit counterpart. The computational efficiency of the implicit method could be further improved by ten to fifteen times on the GPU.
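The rainbow coloring described above removes the data dependency of LU-SGS sweeps by ensuring that no computational point shares a color with any of its neighbors, so all points of one color can be updated concurrently. The sketch below is a minimal greedy coloring under that interpretation, not the authors' exact algorithm.

```python
# Minimal greedy graph-coloring sketch in the spirit of the "rainbow coloring"
# described above: assign each point the smallest color not used by any of its
# already-colored neighbors, so points of one color carry no data dependence on
# each other and can be updated in parallel, color by color.

def rainbow_coloring(neighbors: dict) -> dict:
    colors = {}
    for point in neighbors:                       # any deterministic ordering works
        used = {colors[n] for n in neighbors[point] if n in colors}
        color = 0
        while color in used:
            color += 1
        colors[point] = color
    return colors

# Tiny point cloud: 0-1-2 form a chain, 3 touches 1.
adjacency = {0: [1], 1: [0, 2, 3], 2: [1], 3: [1]}
print(rainbow_coloring(adjacency))   # -> {0: 0, 1: 1, 2: 0, 3: 0}

# An LU-SGS-style sweep would then loop over colors, updating all points of the
# current color concurrently (e.g. one GPU kernel launch per color).
```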
Numerical investigation of design and operational parameters on CHI spheromak performance
NASA Astrophysics Data System (ADS)
O'Bryan, J. B.; Romero-Talamas, C. A.; Woodruff, S.
2016-10-01
Nonlinear, extended-MHD computation with the NIMROD code is used to explore magnetic self-organization and performance with respect to externally controllable parameters in spheromaks formed with coaxial helicity injection. The goal of this study is to inform the design and operational parameters of a proposed proof-of-principle spheromak experiment. The calculations explore multiple distinct phases of evolution (including adiabatic magnetic compression), which must be optimized separately. Results indicate that modest changes to the design and operation of past experiments, e.g. SSPX [E.B. Hooper et al. PPCF 2012], could have significantly improved the plasma-current injector coupling efficiency and performance, particularly with respect to peak temperature and lifetime. Though we frequently characterize performance relative to SSPX, we are also exploring fundamentally different designs and modes of operation, e.g. flux compression. This work is supported by DARPA under Grant No. N66001-14-1-4044.
LOINC, a universal standard for identifying laboratory observations: a 5-year update.
McDonald, Clement J; Huff, Stanley M; Suico, Jeffrey G; Hill, Gilbert; Leavelle, Dennis; Aller, Raymond; Forrey, Arden; Mercer, Kathy; DeMoor, Georges; Hook, John; Williams, Warren; Case, James; Maloney, Pat
2003-04-01
The Logical Observation Identifier Names and Codes (LOINC) database provides a universal code system for reporting laboratory and other clinical observations. Its purpose is to identify observations in electronic messages such as Health Level Seven (HL7) observation messages, so that when hospitals, health maintenance organizations, pharmaceutical manufacturers, researchers, and public health departments receive such messages from multiple sources, they can automatically file the results in the right slots of their medical records, research, and/or public health systems. For each observation, the database includes a code (of which 25 000 are laboratory test observations), a long formal name, a "short" 30-character name, and synonyms. The database comes with a mapping program called Regenstrief LOINC Mapping Assistant (RELMA(TM)) to assist the mapping of local test codes to LOINC codes and to facilitate browsing of the LOINC results. Both LOINC and RELMA are available at no cost from http://www.regenstrief.org/loinc/. The LOINC medical database carries records for >30 000 different observations. LOINC codes are being used by large reference laboratories and federal agencies, e.g., the CDC and the Department of Veterans Affairs, and are part of the Health Insurance Portability and Accountability Act (HIPAA) attachment proposal. Internationally, they have been adopted in Switzerland, Hong Kong, Australia, and Canada, and by the German national standards organization, the Deutsches Instituts für Normung. Laboratories should include LOINC codes in their outbound HL7 messages so that clinical and research clients can easily integrate these results into their clinical and research repositories. Laboratories should also encourage instrument vendors to deliver LOINC codes in their instrument outputs and demand LOINC codes in HL7 messages they get from reference laboratories to avoid the need to lump so many referral tests under the "send out lab" code.
WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code
NASA Astrophysics Data System (ADS)
Mendygral, P. J.; Radcliffe, N.; Kandalla, K.; Porter, D.; O'Neill, B. J.; Nolting, C.; Edmon, P.; Donnert, J. M. F.; Jones, T. W.
2017-02-01
We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.
Kim, Seungill; Kim, Myung-Shin; Kim, Yong-Min; Yeom, Seon-In; Cheong, Kyeongchae; Kim, Ki-Tae; Jeon, Jongbum; Kim, Sunggil; Kim, Do-Sun; Sohn, Seong-Han; Lee, Yong-Hwan; Choi, Doil
2015-02-01
The onion (Allium cepa L.) is one of the most widely cultivated and consumed vegetable crops in the world. Although a considerable amount of onion transcriptome data has been deposited into public databases, the sequences of the protein-coding genes are not accurate enough to be used, owing to non-coding sequences intermixed with the coding sequences. We generated a high-quality, annotated onion transcriptome from de novo sequence assembly and intensive structural annotation using the integrated structural gene annotation pipeline (ISGAP), which identified 54,165 protein-coding genes among 165,179 assembled transcripts totalling 203.0 Mb by eliminating the intron sequences. ISGAP performed reliable annotation, recognizing accurate gene structures based on reference proteins, and ab initio gene models of the assembled transcripts. Integrative functional annotation and gene-based SNP analysis revealed a whole biological repertoire of genes and transcriptomic variation in the onion. The method developed in this study provides a powerful tool for the construction of reference gene sets for organisms based solely on de novo transcriptome data. Furthermore, the reference genes and their variation described here for the onion represent essential tools for molecular breeding and gene cloning in Allium spp. © The Author 2014. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
Rao, Anoop; Wiley, Meg; Iyengar, Sridhar; Nadeau, Dan; Carnevale, Julie
2010-01-01
Background Studies have shown that controlling blood glucose can reduce the onset and progression of the long-term microvascular and neuropathic complications associated with the chronic course of diabetes mellitus. Improved glycemic control can be achieved by frequent testing combined with changes in medication, exercise, and diet. Technological advancements have enabled improvements in analytical accuracy of meters, and this paper explores two such parameters to which that accuracy can be attributed. Methods Four blood glucose monitoring systems (with or without dynamic electrochemistry algorithms, codeless or requiring coding prior to testing) were evaluated and compared with respect to their accuracy. Results Altogether, 108 blood glucose values were obtained for each system from 54 study participants and compared with the reference values. The analysis depicted in the International Organization for Standardization table format indicates that the devices with dynamic electrochemistry and the codeless feature had the highest proportion of acceptable results overall (System A, 101/103). Results were significant when compared at the 10% bias level with meters that were codeless and utilized static electrochemistry (p = .017) or systems that had static electrochemistry but needed coding (p = .008). Conclusions Analytical performance of these blood glucose meters differed significantly depending on their technologic features. Meters that utilized dynamic electrochemistry and did not require coding were more accurate than meters that used static electrochemistry or required coding. PMID:20167178
Rao, Anoop; Wiley, Meg; Iyengar, Sridhar; Nadeau, Dan; Carnevale, Julie
2010-01-01
Studies have shown that controlling blood glucose can reduce the onset and progression of the long-term microvascular and neuropathic complications associated with the chronic course of diabetes mellitus. Improved glycemic control can be achieved by frequent testing combined with changes in medication, exercise, and diet. Technological advancements have enabled improvements in analytical accuracy of meters, and this paper explores two such parameters to which that accuracy can be attributed. Four blood glucose monitoring systems (with or without dynamic electrochemistry algorithms, codeless or requiring coding prior to testing) were evaluated and compared with respect to their accuracy. Altogether, 108 blood glucose values were obtained for each system from 54 study participants and compared with the reference values. The analysis depicted in the International Organization for Standardization table format indicates that the devices with dynamic electrochemistry and the codeless feature had the highest proportion of acceptable results overall (System A, 101/103). Results were significant when compared at the 10% bias level with meters that were codeless and utilized static electrochemistry (p = .017) or systems that had static electrochemistry but needed coding (p = .008). Analytical performance of these blood glucose meters differed significantly depending on their technologic features. Meters that utilized dynamic electrochemistry and did not require coding were more accurate than meters that used static electrochemistry or required coding. 2010 Diabetes Technology Society.
Low Density Parity Check Codes Based on Finite Geometries: A Rediscovery and More
NASA Technical Reports Server (NTRS)
Kou, Yu; Lin, Shu; Fossorier, Marc
1999-01-01
Low density parity check (LDPC) codes with iterative decoding based on belief propagation achieve astonishing error performance close to the Shannon limit. No algebraic or geometric method for constructing these codes has been reported and they are largely generated by computer search. As a result, encoding of long LDPC codes is in general very complex. This paper presents two classes of high rate LDPC codes whose constructions are based on finite Euclidean and projective geometries, respectively. These classes of codes are cyclic and have good constraint parameters and minimum distances. The cyclic structure allows the use of linear feedback shift registers for encoding. These finite geometry LDPC codes achieve very good error performance with either soft-decision iterative decoding based on belief propagation or Gallager's hard-decision bit flipping algorithm. These codes can be punctured or extended to obtain other good LDPC codes. A generalization of these codes is also presented.
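Gallager's hard-decision bit-flipping algorithm mentioned above is simple enough to sketch: compute the parity-check syndrome, count how many failed checks each bit participates in, and flip the most-suspect bits until all checks are satisfied. The toy example below uses a (7,4) Hamming parity-check matrix as a stand-in, not one of the finite-geometry codes of the paper, and flips the argmax set of bits, which is one common variant of the algorithm.

```python
# Minimal sketch of a hard-decision bit-flipping decoder: repeatedly flip the
# bits involved in the largest number of unsatisfied parity checks.
import numpy as np

def bit_flip_decode(H: np.ndarray, r: np.ndarray, max_iters: int = 50) -> np.ndarray:
    x = r.copy()
    for _ in range(max_iters):
        syndrome = H @ x % 2                      # 1 marks an unsatisfied check
        if not syndrome.any():
            break                                 # all checks satisfied
        failures = H.T @ syndrome                 # failed checks touching each bit
        worst = failures.max()
        if worst == 0:
            break
        x = (x + (failures == worst)) % 2         # flip the most-suspect bits
    return x

# (7,4) Hamming parity-check matrix as a tiny stand-in example.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
received = np.zeros(7, dtype=int)
received[2] = 1                                   # single bit error on the all-zero codeword
print(bit_flip_decode(H, received))               # -> all zeros again
```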
A Response Surface Methodology for Bi-Level Integrated System Synthesis (BLISS)
NASA Technical Reports Server (NTRS)
Altus, Troy David; Sobieski, Jaroslaw (Technical Monitor)
2002-01-01
The report describes a new method for optimization of engineering systems such as aerospace vehicles whose design must harmonize a number of subsystems and various physical phenomena, each represented by a separate computer code, e.g., aerodynamics, structures, propulsion, performance, etc. To represent the system's internal couplings, the codes receive output from other codes as part of their inputs. The system analysis and optimization task is decomposed into subtasks that can be executed concurrently, each subtask conducted using local state and design variables and holding constant a set of the system-level design variables. The subtask results are stored in the form of Response Surfaces (RS) fitted in the space of the system-level variables, to be used as subtask surrogates in a system-level optimization whose purpose is to optimize the system objective(s) and to reconcile the system's internal couplings. By virtue of decomposition and execution concurrency, the method enables a broad work front in the organization of an engineering project involving a number of specialty groups that might be geographically dispersed, and it exploits the contemporary computing technology of massively concurrent and distributed processing. The report includes a demonstration test case of a supersonic business jet design.
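The response-surface surrogates at the core of BLISS can be illustrated with a plain least-squares quadratic fit over sampled system-level design points. The sketch below uses made-up design samples and a made-up "expensive" subtask output; it shows the surrogate idea only, not the report's actual fitting procedure.

```python
# Minimal sketch of the response-surface idea: fit a quadratic surrogate to
# subtask outputs sampled over the system-level design variables, then query the
# cheap surrogate inside the system-level optimization.
import numpy as np

def quadratic_features(X: np.ndarray) -> np.ndarray:
    """[1, x1, x2, x1^2, x1*x2, x2^2] for each 2-D design point."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x1 * x2, x2**2])

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 2))                        # sampled system-level designs
y = 3.0 + 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1]     # "expensive" subtask output

coeffs, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)

x_query = np.array([[0.2, -0.4]])
print(quadratic_features(x_query) @ coeffs)                     # surrogate prediction at a new design
```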
NASA Astrophysics Data System (ADS)
Stoddard, M. A.; Etienne, L.; Fournier, M.; Pelot, R.; Beveridge, L.
2016-04-01
Maritime traffic volume in the Arctic is growing for several reasons: climate change is reducing ice extent, duration, and thickness, while economic drivers are inducing growth in resource-extraction traffic, community size (affecting resupply), and adventure tourism. This dynamic situation, coupled with harsh weather, variable operating conditions, remoteness, and a lack of straightforward emergency response options, demands robust risk management processes. The requirements for risk management for polar ship operations are specified in the new International Maritime Organization (IMO) International Code for Ships Operating in Polar Waters (Polar Code). The goal of the Polar Code is to provide for safe ship operations and protection of the polar environment by addressing the risks present in polar waters. Risk management is supported by evidence-based models, including threat identification (types and frequency of hazards), exposure levels, and receptor characterization. Most of the information used to perform risk management in polar waters is obtained in situ, but it is increasingly being augmented with open-access remote sensing information. In this paper we focus on the use of open-access historical ice charts as an integral part of northern navigation, especially for route planning and evaluation.
HERCULES: A Pattern Driven Code Transformation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing
2012-01-01
New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist separate the two concerns, which improves code maintenance and facilitates performance optimization. The system combines three technologies: code patterns, transformation scripts, and compiler plugins, to provide the scientist with an environment for quickly implementing code transformations that suit their needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation, and an initial evaluation of HERCULES.
Performance analysis of optical wireless communication system based on two-fold turbo code
NASA Astrophysics Data System (ADS)
Chen, Jun; Huang, Dexiu; Yuan, Xiuhua
2005-11-01
Optical wireless communication (OWC) is beginning to emerge in the telecommunications market as a strategy to meet last-mile demand, owing to its unique combination of features. Turbo codes offer impressive near-Shannon-limit error-correcting performance, and twofold turbo codes have recently been introduced as the least complex member of the multifold turbo code family. In this paper, we first present the mathematical model of the signal and of the optical wireless channel with fading, together with a bit-error-rate model that accounts for scintillation; we then propose a new turbo-coding method for use in an OWC system. The results show that the OWC system achieves a better BER curve with the twofold turbo code than with a common turbo code.
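For orientation, a toy Monte Carlo estimate of the uncoded bit error rate of an intensity-modulated link under log-normal scintillation is sketched below; it stands in for the paper's more detailed channel and scintillation models and does not reproduce the twofold turbo decoding. The function name and parameter values are assumptions.

```python
import numpy as np
from scipy.special import erfc

# Toy Monte Carlo estimate of the uncoded BER of an intensity-modulated FSO link
# under log-normal scintillation.  The conditional Gaussian-noise BER and the
# unit-mean log-normal irradiance are simplifying assumptions; the paper's
# channel and twofold-turbo models are more detailed.

def ook_ber_lognormal(snr_db, sigma_ln=0.3, samples=200_000, seed=1):
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    # log-normal irradiance normalized to unit mean
    irradiance = rng.lognormal(mean=-0.5 * sigma_ln ** 2, sigma=sigma_ln, size=samples)
    # conditional BER for each fading realization, then average over the fading
    ber = 0.5 * erfc(irradiance * np.sqrt(snr) / np.sqrt(2.0))
    return ber.mean()

for snr_db in (6, 10, 14):
    print(snr_db, "dB ->", ook_ber_lognormal(snr_db))
```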
Arthur, Jennifer; Bahran, Rian; Hutchinson, Jesson; ...
2018-06-14
Historically, radiation transport codes have treated fission emissions as uncorrelated. In reality, the particles emitted by both spontaneous and induced fissions are correlated in time, energy, angle, and multiplicity. This work validates the performance of various current Monte Carlo codes that take into account the underlying correlated physics of fission neutrons, specifically neutron multiplicity distributions. The performance of 4 Monte Carlo codes - MCNP®6.2, MCNP®6.2/FREYA, MCNP®6.2/CGMF, and PoliMi - was assessed using neutron multiplicity benchmark experiments. In addition, MCNP®6.2 simulations were run using JEFF-3.2 and JENDL-4.0, rather than ENDF/B-VII.1, data for 239Pu and 240Pu. The sensitive benchmark parameters that in this work represent the performance of each correlated fission multiplicity Monte Carlo code include the singles rate, the doubles rate, leakage multiplication, and Feynman histograms. Although it is difficult to determine which radiation transport code shows the best overall performance in simulating subcritical neutron multiplication inference benchmark measurements, it is clear that correlations exist between the underlying nuclear data utilized by (or generated by) the various codes and the correlated neutron observables of interest. This could prove useful in nuclear data validation and evaluation applications, in which a particular moment of the neutron multiplicity distribution is of more interest than the other moments. It is also quite clear that, because transport is handled by MCNP®6.2 in 3 of the 4 codes, with the 4th code (PoliMi) being based on an older version of MCNP®, the differences in correlated neutron observables of interest are most likely due to the treatment of fission event generation in each of the different codes, as opposed to the radiation transport.
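As a rough illustration of one of the benchmark observables, the sketch below forms a Feynman-type variance-to-mean statistic from a list of detection times; the benchmark analyses themselves use dedicated list-mode tooling, and the gate widths and event rates here are arbitrary.

```python
import numpy as np

# Illustrative sketch of how a Feynman histogram quantity is formed from a list
# of neutron detection times: counts are binned into gates of width T and the
# excess variance-to-mean ratio Y(T) = var/mean - 1 is computed.

def feynman_y(times, gate_widths):
    times = np.sort(np.asarray(times))
    results = {}
    for T in gate_widths:
        edges = np.arange(times[0], times[-1], T)
        counts, _ = np.histogram(times, bins=edges)
        results[T] = counts.var() / counts.mean() - 1.0
    return results

# Poisson (uncorrelated) arrivals should give Y close to 0 at every gate width;
# correlated fission chains produce Y > 0.
rng = np.random.default_rng(2)
poisson_times = np.cumsum(rng.exponential(scale=1e-4, size=50_000))
print(feynman_y(poisson_times, gate_widths=[1e-3, 1e-2]))
```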
Dai, Lengshi; Shinn-Cunningham, Barbara G
2016-01-01
Listeners with normal hearing thresholds (NHTs) differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in the cortex. Yet this sensory representation also depends on the coding fidelity of the peripheral auditory system, so both factors may contribute to individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials (ERPs) from the scalp (reflecting cortical responses to sound), and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited by fine stimulus details versus by control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. These results support the hypothesis that differences in behavioral abilities amongst listeners with NHTs can arise from both subcortical coding differences and differences in attentional control, depending on stimulus characteristics and task demands.
Increasing Flexibility in Energy Code Compliance: Performance Packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Philip R.; Rosenberg, Michael I.
Energy codes and standards have provided significant increases in building efficiency over the last 38 years, since the first national energy code was published in late 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. As the code matures, the prescriptive path becomes more complicated and also more restrictive. It is likely that an approach that considers the building as an integrated system will be necessary to achieve the next real gains in building efficiency. Performance code paths are increasing in popularity; however, there remains a significant design team overhead in following the performance path, especially for smaller buildings. This paper focuses on the development of one alternative format, prescriptive packages. A method to develop building-specific prescriptive packages is reviewed, based on multiple runs of prototypical building models that feed a parametric decision analysis to determine a set of packages with equivalent energy performance. The approach is designed to be cost-effective and flexible for the design team while achieving a desired level of energy efficiency performance. A demonstration of the approach based on mid-sized office buildings with two HVAC system types is shown, along with a discussion of potential applicability in the energy code process.
NASA Technical Reports Server (NTRS)
Divsalar, D.; Pollara, F.
1995-01-01
In this article, we design new turbo codes that can achieve near-Shannon-limit performance. The design criterion for random interleavers is based on maximizing the effective free distance of the turbo code, i.e., the minimum output weight of codewords due to weight-2 input sequences. An upper bound on the effective free distance of a turbo code is derived. This upper bound can be achieved if the feedback connection of the convolutional codes uses primitive polynomials. We review multiple turbo codes (parallel concatenation of q convolutional codes), which increase the so-called 'interleaving gain' as q and the interleaver size increase, and a suitable decoder structure derived from an approximation to the maximum a posteriori probability decision rule. We develop new rate 1/3, 2/3, 3/4, and 4/5 constituent codes to be used in the turbo encoder structure. These codes, with 2 to 32 states, are designed using primitive polynomials. The resulting turbo codes have rates b/n (b = 1, 2, 3, 4 and n = 2, 3, 4, 5, 6), and include random interleavers for better asymptotic performance. These codes are suitable for deep-space communications with low throughput and for near-Earth communications where high throughput is desirable. The performance of these codes is within 1 dB of the Shannon limit at a bit-error rate of 10(exp -6) for throughputs from 1/15 up to 4 bits/s/Hz.
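The primitivity requirement on the feedback polynomial can be checked directly. The sketch below is illustrative rather than taken from the article: it tests whether a degree-m binary polynomial is primitive by checking that the multiplicative order of x modulo p(x) is 2^m - 1, with polynomials encoded as integer bit masks.

```python
# Sketch of a primitivity test for a binary feedback polynomial p(x), encoded as
# an integer bit mask (bit i = coefficient of x^i), e.g. 0b101001 = x^5 + x^3 + 1.
# A degree-m polynomial is primitive when the smallest n with x^n = 1 (mod p(x))
# is n = 2^m - 1.  Function names are illustrative, not from the article.

def poly_mod(a, p):
    """Remainder of a(x) divided by p(x), coefficients over GF(2)."""
    dp = p.bit_length() - 1
    while a.bit_length() - 1 >= dp:
        a ^= p << (a.bit_length() - 1 - dp)
    return a

def is_primitive(p):
    m = p.bit_length() - 1
    x_power = poly_mod(2, p)                    # the polynomial x, reduced mod p
    for n in range(1, 2 ** m):
        if x_power == 1:                        # x^n = 1 (mod p)
            return n == 2 ** m - 1              # primitive only if this happens first at 2^m - 1
        x_power = poly_mod(x_power << 1, p)     # multiply by x and reduce
    return False

print(is_primitive(0b111))      # x^2 + x + 1                     -> True
print(is_primitive(0b101001))   # x^5 + x^3 + 1                   -> True
print(is_primitive(0b1111))     # x^3 + x^2 + x + 1 = (x + 1)^3   -> False
```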
Concatenated Coding Using Trellis-Coded Modulation
NASA Technical Reports Server (NTRS)
Thompson, Michael W.
1997-01-01
In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and Reed-Solomon (RS) coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, we see that TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for a similar concatenated scheme that uses a convolutional code. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.
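As a rough check of the quoted expansion figures, the arithmetic below uses the common (255, 223) Reed-Solomon code purely as an example; the report does not state which RS parameters were used.

```python
# Illustrative arithmetic only: with TCM absorbing the inner redundancy into the
# signal constellation, the bandwidth expansion of the concatenated scheme comes
# from the RS outer code alone.  The (255, 223) code is a common choice used
# here purely as an example; the report does not specify its RS parameters.
n, k = 255, 223
expansion = n / k - 1
print(f"RS({n},{k}) bandwidth expansion: {expansion:.1%}")   # about 14%, within the quoted 10-50%
```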
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Contractor. The individual or organization with whom the borrower enters into a contract for construction or...)(i)(E) of this subpart. (2) Voluntary national model building codes (model codes). Comprehensive... Department of Housing and Urban Development (HUD) Minimum Property Standards for Housing, Handbook 4910.1...
Transient analysis techniques in performing impact and crash dynamic studies
NASA Technical Reports Server (NTRS)
Pifko, A. B.; Winter, R.
1989-01-01
Because of the emphasis being placed on crashworthiness as a design requirement, increasing demands are being made by various organizations to analyze a wide range of complex structures that must perform safely when subjected to severe impact loads, such as those generated in a crash event. The ultimate goal of crashworthiness design and analysis is to produce vehicles with the ability to reduce the dynamic forces experienced by the occupants to specified levels, while maintaining a survivable envelope around them during a specified crash event. DYCAST is a nonlinear structural dynamic finite element computer code that started from the PLANS system of finite element programs for static nonlinear structural analysis. The essential features of DYCAST are outlined.
Hyperspectral IASI L1C Data Compression.
García-Sobrino, Joaquín; Serra-Sagristà, Joan; Bartrina-Rapesta, Joan
2017-06-16
The Infrared Atmospheric Sounding Interferometer (IASI), implemented on the MetOp satellite series, represents a significant step forward in atmospheric forecasting and weather understanding. The instrument provides infrared soundings of unprecedented accuracy and spectral resolution to derive humidity and atmospheric temperature profiles, as well as some of the chemical components playing a key role in climate monitoring. IASI collects rich spectral information, which results in large amounts of data (about 16 gigabytes per day). Efficient compression techniques are required for both transmission and storage of such huge volumes of data. This study reviews the performance of several state-of-the-art coding standards and techniques for IASI L1C data compression. The discussion embraces lossless, near-lossless, and lossy compression. Several spectral transforms, essential to achieving improved coding performance given the high spectral redundancy inherent to IASI products, are also discussed. Illustrative results are reported for a set of 96 IASI L1C orbits acquired over a full year (4 orbits per month for each of IASI-A and IASI-B from July 2013 to June 2014). Furthermore, this survey provides organized data and facts to assist future research and the atmospheric scientific community.
High Performance Input/Output for Parallel Computer Systems
NASA Technical Reports Server (NTRS)
Ligon, W. B.
1996-01-01
The goal of our project is to study the I/O characteristics of parallel applications used in Earth science data processing systems such as Regional Data Centers (RDCs) or EOSDIS. Our approach is to study the runtime behavior of typical programs and the effect of key parameters of the I/O subsystem, both under simulation and with direct experimentation on parallel systems. Our three-year activity has focused on two items: developing a test bed that facilitates experimentation with parallel I/O, and studying representative programs from the Earth science data processing application domain. The Parallel Virtual File System (PVFS) has been developed for use on a number of platforms, including the Tiger Parallel Architecture Workbench (TPAW) simulator, the Intel Paragon, a cluster of DEC Alpha workstations, and the Beowulf system (at CESDIS). PVFS provides considerable flexibility in configuring I/O in a UNIX-like environment. Access to key performance parameters facilitates experimentation. We have studied several key applications from levels 1, 2, and 3 of the typical RDC processing scenario, including instrument calibration and navigation, image classification, and numerical modeling codes. We have also considered large-scale scientific database codes used to organize image data.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain Electromagnetic Code Version B is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version B code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file, a discussion of radar cross section computations, a discussion of some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
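To indicate the flavor of the technique, a minimal one-dimensional FDTD leapfrog update in normalized units is sketched below; the Penn State code is of course a full three-dimensional scattering code with far more machinery, and none of the names or parameters here come from it.

```python
import numpy as np

# Minimal one-dimensional FDTD sketch of the leapfrog update at the heart of the
# method (free space, normalized units).  The Penn State Version B code is a full
# three-dimensional scattering code; this is only an illustration of the technique.

nz, nsteps = 200, 400
ez = np.zeros(nz)          # electric field samples
hy = np.zeros(nz)          # magnetic field samples
courant = 0.5              # Courant number (stability requires <= 1 in 1-D)

for step in range(nsteps):
    # update H from the spatial difference of E, then E from the difference of H
    hy[:-1] += courant * (ez[1:] - ez[:-1])
    ez[1:]  += courant * (hy[1:] - hy[:-1])
    # soft Gaussian source in the middle of the grid
    ez[nz // 2] += np.exp(-0.5 * ((step - 40) / 12.0) ** 2)

print("peak |Ez| after propagation:", np.abs(ez).max())
```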
Classification Techniques for Digital Map Compression
1989-03-01
classification improved the performance of the K-means classification algorithm resulting in a compression of 8.06:1 with Lempel-Ziv coding. Run-length coding... compression performance are run-length coding [2], [8] and Lempel-Ziv coding [10], [11]. These techniques are chosen because they are most efficient when...investigated. After the classification, some standard file compression methods, such as Lempel-Ziv and run-length encoding, were applied to the
Extracellular Matrix Induced Integrin Signal Transduction and Breast Cancer Invasion.
1995-10-01
Metalloproteinase, breast, mammary, integrin, collagen, RGDS, matrilysin, breast cancer. ...areas of necrosis in the center of the tumor; a portion of the mammary gland can be seen in the lower right. The matrilysin in situ showed
ERIC Educational Resources Information Center
Nakashian, Mary
2008-01-01
Researchers from the Mailman School of Public Health at Columbia University prepared a case study of CODES (Community Outreach and Development Efforts Save). CODES is a coalition of 35 people and organizations in northern Manhattan committed to promoting safe streets, parks and schools. The case study analyzed the factors that prompted CODES'…
ERIC Educational Resources Information Center
World Health Organization, Copenhagen (Denmark). Regional Office for Europe.
For various reasons, several countries have had difficulty implementing the International Code of Marketing of Breast-milk Substitutes. To address those problems, a meeting was convened under the auspices of the World Health Organization. Specific purposes of the meeting were to inform member states about the Code and to develop national…
Evaluation of the DRAGON code for VHTR design analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taiwo, T. A.; Kim, T. K.; Nuclear Engineering Division
2006-01-12
This letter report summarizes three activities that were undertaken in FY 2005 to gather information on the DRAGON code and to perform limited evaluations of the code's performance when used in the analysis of Very High Temperature Reactor (VHTR) designs. These activities include: (1) use of the code to model the fuel elements of the helium-cooled and liquid-salt-cooled VHTR designs, with results compared to those from another deterministic lattice code (WIMS8) and a Monte Carlo code (MCNP); (2) a preliminary assessment of the nuclear data library currently used with the code and of libraries provided by the IAEA WIMS-D4 Library Update Project (WLUP); and (3) a DRAGON workshop held to discuss the code's capabilities for modeling the VHTR.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Tammie Renee; Tretiak, Sergei
2017-01-06
Understanding and controlling excited state dynamics lies at the heart of all our efforts to design photoactive materials with desired functionality. This tailor-design approach has become the standard for many technological applications (e.g., solar energy harvesting), including the design of organic conjugated electronic materials with applications in photovoltaic and light-emitting devices. Over the years, our team has developed efficient LANL-based codes to model the relevant photophysical processes following photoexcitation (spatial energy transfer, excitation localization/delocalization, and/or charge separation). The developed approach allows the non-radiative relaxation to be followed on up to ~10 ps timescales for large realistic molecules (hundreds of atoms in size) in a realistic solvent dielectric environment. The Collective Electronic Oscillator (CEO) code is used to compute electronic excited states, and the Non-adiabatic Excited State Molecular Dynamics (NA-ESMD) code is used to follow the non-adiabatic dynamics on multiple coupled Born-Oppenheimer potential energy surfaces. Our preliminary NA-ESMD simulations have revealed key photoinduced mechanisms controlling competing interactions and relaxation pathways in complex materials, including organic conjugated polymer materials, and have provided a detailed understanding of photochemical products and intermediates and of the internal conversion process during the initiation of energetic materials. This project will use the LANL-based CEO and NA-ESMD codes to model nonradiative relaxation in organic and energetic materials. The NA-ESMD and CEO codes belong to a class of electronic structure/quantum chemistry codes that require large memory and a “long-queue-few-core” distribution of resources in order to make useful progress. The NA-ESMD simulations are trivially parallelizable, requiring ~300 processors for up to one week of runtime to reach a meaningful restart point.
Performance Bounds on Two Concatenated, Interleaved Codes
NASA Technical Reports Server (NTRS)
Moision, Bruce; Dolinar, Samuel
2010-01-01
A method has been developed of computing bounds on the performance of a code comprised of two linear binary codes generated by two encoders serially concatenated through an interleaver. Originally intended for use in evaluating the performances of some codes proposed for deep-space communication links, the method can also be used in evaluating the performances of short-block-length codes in other applications. The method applies, more specifically, to a communication system in which the following processes take place: At the transmitter, the original binary information that one seeks to transmit is first processed by an encoder into an outer code (Co) characterized by, among other things, a pair of numbers (n, k), where n (n > k) is the total number of code bits associated with k information bits and n - k bits are used for correcting or at least detecting errors. Next, the outer code is processed through either a block or a convolutional interleaver. In the block interleaver, the words of the outer code are processed in blocks of I words. In the convolutional interleaver, the interleaving operation is performed bit-wise in N rows with delays that are multiples of B bits. The output of the interleaver is processed through a second encoder to obtain an inner code (Ci) characterized by (ni, ki). The output of the inner code is transmitted over an additive white Gaussian noise channel characterized by a symbol signal-to-noise ratio (SNR) Es/No and a bit SNR Eb/No. At the receiver, an inner decoder generates estimates of bits. Depending on whether a block or a convolutional interleaver is used at the transmitter, the sequence of estimated bits is processed through a block or a convolutional de-interleaver, respectively, to obtain estimates of code words. Then the estimates of the code words are processed through an outer decoder, which generates estimates of the original information along with flags indicating which estimates are presumed to be correct and which are found to be erroneous. From the perspective of the present method, the topic of major interest is the performance of the communication system as quantified in the word-error rate and the undetected-error rate as functions of the SNRs and the total latency of the interleaver and inner code. The method is embodied in equations that describe bounds on these functions. Throughout the derivation of the equations that embody the method, it is assumed that the decoder for the outer code corrects any error pattern of t or fewer errors, detects any error pattern of s or fewer errors, may detect some error patterns of more than s errors, and does not correct any patterns of more than t errors. Because a mathematically complete description of the equations that embody the method and of the derivation of the equations would greatly exceed the space available for this article, it must suffice to summarize by reporting that the derivation includes consideration of several complex issues, including relationships between latency and memory requirements for block and convolutional codes, burst error statistics, enumeration of error-event intersections, and effects of different interleaving depths. In a demonstration, the method was used to calculate bounds on the performances of several communication systems, each based on serial concatenation of a (63,56) expurgated Hamming code with a convolutional inner code through a convolutional interleaver.
The bounds calculated by use of the method were compared with results of numerical simulations of performances of the systems to show the regions where the bounds are tight (see figure).
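A much simpler calculation than the report's bounds can still illustrate the bounded-distance assumption stated above: if the bits leaving the inner decoder were independent with error probability p (the situation the interleaver is meant to approximate), the word-error rate of an (n, k) outer code correcting up to t errors is bounded by the probability of more than t errors in n bits. The parameter choices below are illustrative only.

```python
from math import comb

# A far simpler illustration than the report's bounds: under the bounded-distance
# assumption (the outer decoder corrects any pattern of t or fewer errors and no
# pattern of more than t), and assuming independent bit errors with probability p
# out of the inner decoder, the word-error rate of an (n, k) outer code is at most
# the probability of more than t errors in n bits.

def word_error_bound(n, t, p):
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(t + 1, n + 1))

# Example with the (63, 56) outer code mentioned above, taking t = 1
# (single-error correction) and an assumed 1e-3 bit-error rate from the inner decoder.
print(word_error_bound(63, 1, 1e-3))
```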
The Lewis heat pipe code with application to SP-100 GES heat pipes
NASA Astrophysics Data System (ADS)
Baker, Karl W.; Tower, Leonard K.
The NASA Lewis Research Center has a thermal management program supporting SP-100 goals, which includes heat pipe radiator development. As part of the program, Lewis has elected to prepare an in-house heat pipe code tailored to the needs of its SP-100 staff to supplement codes from other sources. The latter, designed to meet the needs of the originating organizations, were deemed not entirely appropriate for use at Lewis. However, a review of their features proved most beneficial in the design of the Lewis code.
Current and anticipated uses of thermal-hydraulic codes in Germany
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teschendorff, V.; Sommer, F.; Depisch, F.
1997-07-01
In Germany, one third of the electrical power is generated by nuclear plants. ATHLET and S-RELAP5 are successfully applied for safety analyses of the existing PWR and BWR reactors and possible future reactors, e.g. EPR. Continuous development and assessment of thermal-hydraulic codes are necessary in order to meet present and future needs of licensing organizations, utilities, and vendors. Desired improvements include thermal-hydraulic models, multi-dimensional simulation, computational speed, interfaces to coupled codes, and code architecture. Real-time capability will be essential for application in full-scope simulators. Comprehensive code validation and quantification of uncertainties are prerequisites for future best-estimate analyses.
Investigation of CSRZ code in FSO communication
NASA Astrophysics Data System (ADS)
Zhang, Zhike; Chang, Mingchao; Zhu, Ninghua; Liu, Yu
2018-02-01
A cost-effective carrier-suppressed return-to-zero (CSRZ) code generation scheme is proposed, employing a directly modulated laser (DML) module operating at a 1.5 μm wavelength. The performance of the CSRZ signal over a free-space optical (FSO) link is then studied by simulation. The results show that atmospheric turbulence degrades the transmission performance; however, owing to its lower average transmit power and higher spectral efficiency, the CSRZ signal achieves a better amplitude suppression ratio than the non-return-to-zero (NRZ) code.
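To illustrate the carrier-suppression property itself (not the paper's DML-based transmitter), the sketch below builds RZ and CSRZ baseband waveforms and compares the spectral line at the carrier; bit counts, pulse shape, and sampling are arbitrary choices.

```python
import numpy as np

# Rough numerical illustration of carrier suppression: RZ pulses whose optical
# phase alternates by pi between adjacent bit slots cancel the spectral line at
# the carrier (the DC bin of the baseband envelope).  This is not the paper's
# DML-based transmitter; the waveform parameters are arbitrary.

rng = np.random.default_rng(3)
bits = rng.integers(0, 2, 512)
samples_per_bit = 32
t = (np.arange(samples_per_bit) + 0.5) / samples_per_bit

pulse = np.sin(np.pi * t) ** 2                                   # RZ pulse shape in one bit slot
rz   = np.concatenate([b * pulse for b in bits])                 # ordinary RZ
csrz = np.concatenate([b * pulse * (-1) ** k                     # alternate the sign slot by slot
                       for k, b in enumerate(bits)])

def carrier_line(field):
    spectrum = np.abs(np.fft.fft(field))
    return spectrum[0]          # DC bin = carrier line of the baseband field

print("carrier line, RZ  :", carrier_line(rz))
print("carrier line, CSRZ:", carrier_line(csrz))   # much smaller for CSRZ
```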
Wetmore, Kelly M.; Price, Morgan N.; Waters, Robert J.; ...
2015-05-12
Transposon mutagenesis with next-generation sequencing (TnSeq) is a powerful approach to annotate gene function in bacteria, but existing protocols for TnSeq require laborious preparation of every sample before sequencing. Thus, the existing protocols are not amenable to the throughput necessary to identify phenotypes and functions for the majority of genes in diverse bacteria. Here, we present a method, random bar code transposon-site sequencing (RB-TnSeq), which increases the throughput of mutant fitness profiling by incorporating random DNA bar codes into Tn5 and mariner transposons and by using bar code sequencing (BarSeq) to assay mutant fitness. RB-TnSeq can be used with any transposon, and TnSeq is performed once per organism instead of once per sample. Each BarSeq assay requires only a simple PCR, and 48 to 96 samples can be sequenced on one lane of an Illumina HiSeq system. We demonstrate the reproducibility and biological significance of RB-TnSeq with Escherichia coli, Phaeobacter inhibens, Pseudomonas stutzeri, Shewanella amazonensis, and Shewanella oneidensis. To demonstrate the increased throughput of RB-TnSeq, we performed 387 successful genome-wide mutant fitness assays representing 130 different bacterium-carbon source combinations and identified 5,196 genes with significant phenotypes across the five bacteria. In P. inhibens, we used our mutant fitness data to identify genes important for the utilization of diverse carbon substrates, including a putative D-mannose isomerase that is required for mannitol catabolism. RB-TnSeq will enable the cost-effective functional annotation of diverse bacteria using mutant fitness profiling. A large challenge in microbiology is the functional assessment of the millions of uncharacterized genes identified by genome sequencing. Transposon mutagenesis coupled to next-generation sequencing (TnSeq) is a powerful approach to assign phenotypes and functions to genes. However, the current strategies for TnSeq are too laborious to be applied to hundreds of experimental conditions across multiple bacteria. Here, we describe an approach, random bar code transposon-site sequencing (RB-TnSeq), which greatly simplifies the measurement of gene fitness by using bar code sequencing (BarSeq) to monitor the abundance of mutants. We performed 387 genome-wide fitness assays across five bacteria and identified phenotypes for over 5,000 genes. RB-TnSeq can be applied to diverse bacteria and is a powerful tool to annotate uncharacterized genes using phenotype data.
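A minimal sketch of the BarSeq fitness arithmetic follows: each barcoded strain receives a log2 ratio of its counts in a condition sample versus the time-zero sample, and gene fitness is the mean over the gene's insertion strains. The toy table, gene names, and pseudocount are invented; the published pipeline adds normalization and extensive quality filtering.

```python
import numpy as np
import pandas as pd

# Minimal sketch of the BarSeq fitness arithmetic: each barcoded strain gets a
# log2 ratio of its counts in the condition sample versus the time-zero sample,
# and gene fitness is the mean over that gene's strains.  Gene names and counts
# below are invented; the real pipeline adds normalization and quality filters.

counts = pd.DataFrame({
    "gene":      ["manA", "manA", "manA", "xylB", "xylB"],
    "t0":        [120,    95,     210,    80,     60],
    "condition": [15,     9,      30,     160,    130],
})

pseudo = 0.5   # avoid log of zero for strains that drop out entirely
counts["strain_fitness"] = np.log2(
    (counts["condition"] + pseudo) / (counts["t0"] + pseudo)
)
gene_fitness = counts.groupby("gene")["strain_fitness"].mean()
print(gene_fitness)   # negative values flag genes whose mutants drop out in this condition
```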