Phase II Evaluation of Clinical Coding Schemes
Campbell, James R.; Carpenter, Paul; Sneiderman, Charles; Cohn, Simon; Chute, Christopher G.; Warren, Judith
1997-01-01
Abstract Objective: To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). Methods: The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for “parent” and “child” codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mappings from the published scheme were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. Results: SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14, *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%, *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%, *p < .004) associated with a loss of clarity. Conclusion: No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from less clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record. PMID:9147343
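As a rough illustration of how the completeness comparison above can be re-checked, the sketch below reconstructs approximate counts from the rounded percentages and the 1929 source records and runs a chi-square test of independence. This is not the authors' statistical method; the counts and the choice of test are assumptions made only for illustration.

    import numpy as np
    from scipy.stats import chi2_contingency

    # Approximate re-check of the completeness comparison, assuming all three
    # schemes were scored on the same 1929 source records; counts are rebuilt
    # from rounded percentages, so the p-value is only indicative.
    n = 1929
    completeness = {"SNOMED": 0.70, "READ": 0.57, "UMLS": 0.50}
    table = np.array([[round(p * n), n - round(p * n)] for p in completeness.values()])
    chi2, p_value, dof, _ = chi2_contingency(table)
    print(chi2, p_value, dof)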
Source Methodology for Turbofan Noise Prediction (SOURCE3D Technical Documentation)
NASA Technical Reports Server (NTRS)
Meyer, Harold D.
1999-01-01
This report provides the analytical documentation for the SOURCE3D Rotor Wake/Stator Interaction Code. It derives the equations for the rotor scattering coefficients and stator source vector and scattering coefficients that are needed for use in the TFANS (Theoretical Fan Noise Design/Prediction System). SOURCE3D treats the rotor and stator as isolated source elements. TFANS uses this information, along with scattering coefficients for inlet and exit elements, and provides complete noise solutions for turbofan engines. SOURCE3D is composed of a collection of FORTRAN programs that have been obtained by extending the approach of the earlier V072 Rotor Wake/Stator Interaction Code. Similar to V072, it treats the rotor and stator as a collection of blades and vanes having zero thickness and camber contained in an infinite, hardwall annular duct. SOURCE3D adds important features to the V072 capability: a rotor element, swirl flow and vorticity waves, actuator disks for flow turning, and combined rotor/actuator disk and stator/actuator disk elements. These items allow reflections from the rotor, frequency scattering, and mode trapping, thus providing more complete noise predictions than previously. The code has been thoroughly verified through comparison with D.B. Hanson's CUP2D two-dimensional code using a narrow annulus test case.
Creation and Delivery of New Superpixelized DIRBE Map Products
NASA Technical Reports Server (NTRS)
Weiland, J.
1998-01-01
Phase 1 called for the following tasks: (1) completion of code to generate intermediate files containing the individual DIRBE observations which would be used to make the superpixelized maps; (2) completion of code necessary to generate the maps themselves; and (3) quality control on test-case maps in the form of point-source extraction and photometry. Items 1 and 2 are well in hand and the tested code is nearly complete. A few test maps have been generated for the tests mentioned in item 3. Map generation is not in production mode yet.
Automated Concurrent Blackboard System Generation in C++
NASA Technical Reports Server (NTRS)
Kaplan, J. A.; McManus, J. W.; Bynum, W. L.
1999-01-01
In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX(trademark) workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.
NASA Astrophysics Data System (ADS)
Carrasco, D.; Trenti, M.; Mutch, S.; Oesch, P. A.
2018-06-01
The luminosity function is a fundamental observable for characterising how galaxies form and evolve throughout the cosmic history. One key ingredient to derive this measurement from the number counts in a survey is the characterisation of the completeness and redshift selection functions for the observations. In this paper, we present GLACiAR, an open-source Python tool available on GitHub to estimate the completeness and selection functions in galaxy surveys. The code is tailored for multiband imaging surveys aimed at searching for high-redshift galaxies through the Lyman-break technique, but it can be applied broadly. The code generates artificial galaxies that follow Sérsic profiles with different indices and with customisable size, redshift, and spectral energy distribution properties, adds them to input images, and measures the recovery rate. To illustrate this new software tool, we apply it to quantify the completeness and redshift selection functions for J-dropout sources (redshift z ~ 10 galaxies) in the Hubble Space Telescope Brightest of Reionizing Galaxies Survey. Our comparison with a previous completeness analysis on the same dataset shows overall agreement, but also highlights how different modelling assumptions for the artificial sources can impact completeness estimates.
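The injection-recovery idea that GLACiAR implements can be sketched in a few lines: add artificial sources of known brightness to the science image, rerun detection, and record the fraction recovered. The sketch below is a generic illustration only and does not use GLACiAR's actual API; the inject and detect callables stand in for the artificial-galaxy generator and the survey's source-extraction step.

    import numpy as np

    def completeness_by_magnitude(image, inject, detect, mags, n_trials=100):
        """Estimate the recovery fraction as a function of input magnitude.

        image  : 2-D science image (numpy array)
        inject : function(image, mag, x, y) -> image with one artificial source added
        detect : function(image) -> list of (x, y) detections
        mags   : grid of input magnitudes to test
        """
        completeness = []
        for mag in mags:
            recovered = 0
            for _ in range(n_trials):
                x0 = np.random.uniform(0, image.shape[1])
                y0 = np.random.uniform(0, image.shape[0])
                trial = inject(image.copy(), mag, x0, y0)
                # a source counts as recovered if a detection lands within 3 pixels
                if any(np.hypot(x - x0, y - y0) < 3.0 for x, y in detect(trial)):
                    recovered += 1
            completeness.append(recovered / n_trials)
        return np.array(completeness)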
Finding Resolution for the Responsible Transparency of Economic Models in Health and Medicine.
Padula, William V; McQueen, Robert Brett; Pronovost, Peter J
2017-11-01
The Second Panel on Cost-Effectiveness in Health and Medicine recommendations for conduct, methodological practices, and reporting of cost-effectiveness analyses leave a number of questions unanswered with respect to the implementation of a transparent, open source code interface for economic models. Making economic model source code openly available could be positive and progressive for the field; however, several unintended consequences of such a system should first be considered before it is fully implemented. First, there is the concern regarding intellectual property rights that modelers have to their analyses. Second, the open source code could make analyses more accessible to inexperienced modelers, leading to inaccurate or misinterpreted results. We propose several resolutions to these concerns. The field should establish a licensing system for open source code such that the model originators maintain control of the code use and grant permissions to other investigators who wish to use it. The field should also be more forthcoming towards the teaching of cost-effectiveness analysis in medical and health services education so that providers and other professionals are familiar with economic modeling and able to conduct analyses with open source code. These types of unintended consequences need to be fully considered before the field is prepared to move forward into an era of model transparency with open source code.
Source Code Stylometry Improvements in Python
2017-12-14
person can be identified via their handwriting or an author identified by their style or prose, programmers can be identified by their code...to say, picking 1 author out of a known complete set. However, expanded open-world classification and multiauthor classification have also been
Phylogenetic Network for European mtDNA
Finnilä, Saara; Lehtonen, Mervi S.; Majamaa, Kari
2001-01-01
The sequence in the first hypervariable segment (HVS-I) of the control region has been used as a source of evolutionary information in most phylogenetic analyses of mtDNA. Population genetic inference would benefit from a better understanding of the variation in the mtDNA coding region, but, thus far, complete mtDNA sequences have been rare. We determined the nucleotide sequence in the coding region of mtDNA from 121 Finns, by conformation-sensitive gel electrophoresis and subsequent sequencing and by direct sequencing of the D loop. Furthermore, 71 sequences from our previous reports were included, so that the samples represented all the mtDNA haplogroups present in the Finnish population. We found a total of 297 variable sites in the coding region, which allowed the compilation of unambiguous phylogenetic networks. The D loop harbored 104 variable sites, and, in most cases, these could be localized within the coding-region networks, without discrepancies. Interestingly, many homoplasies were detected in the coding region. Nucleotide variation in the rRNA and tRNA genes was 6%, and that in the third nucleotide positions of structural genes amounted to 22% of that in the HVS-I. The complete networks enabled the relationships between the mtDNA haplogroups to be analyzed. Phylogenetic networks based on the entire coding-region sequence in mtDNA provide a rich source for further population genetic studies, and complete sequences make it easier to differentiate between disease-causing mutations and rare polymorphisms. PMID:11349229
NASA Technical Reports Server (NTRS)
Kilgore, W. Allen; Balakrishna, S.
1991-01-01
The 0.3 m Transonic Cryogenic Tunnel (TCT) microcomputer-based controller has been operating for several thousand hours in a safe and efficient manner. A complete listing is provided of the source codes for the tunnel controller and tunnel simulator. Also included is a listing of all the variables used in these programs. Several changes made to the controller are described. These changes improve the controller's ease of use and safety.
Hoelzer, Simon; Schweiger, Ralf K.; Dudeck, Joachim
2003-01-01
With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or “semantically associated” parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach. PMID:12807813
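A document-oriented representation of a classification such as ICD-10 can be illustrated with a few lines of Python building a small hierarchical XML fragment. The element and attribute names below are invented for illustration and do not follow the CEN/TC 251 schema or the authors' actual markup.

    import xml.etree.ElementTree as ET

    # Hypothetical, simplified markup for one ICD-10 fragment; tag names,
    # attributes, and rubric wording are illustrative placeholders only.
    chapter = ET.Element("Class", kind="chapter", code="X")
    block = ET.SubElement(chapter, "Class", kind="block", code="J09-J18")
    ET.SubElement(block, "Rubric", lang="en").text = "Influenza and pneumonia"
    category = ET.SubElement(block, "Class", kind="category", code="J10")
    ET.SubElement(category, "Rubric", lang="en").text = "Influenza due to identified influenza virus"
    print(ET.tostring(chapter, encoding="unicode"))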
Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code
NASA Astrophysics Data System (ADS)
Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.
2015-12-01
WEC-Sim is an open source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at the Oregon State University's Directional Wave Basin at Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be presented. These simulations highlight the code features included in the latest release of WEC-Sim (v1.2), including: wave directionality, nonlinear hydrostatics and hydrodynamics, user-defined wave elevation time-series, state space radiation, and WEC-Sim compatibility with BEMIO (an open source AQWA/WAMIT/NEMOH coefficient parser).
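The Cummins impulse-response formulation that WEC-Sim solves in 6 degrees of freedom can be illustrated for a single degree of freedom. The sketch below is not WEC-Sim code; it is a minimal explicit-Euler integration of the 1-DOF Cummins equation, with the radiation memory term evaluated as a discrete convolution, assuming the excitation force, radiation kernel, added mass at infinite frequency, and hydrostatic stiffness are supplied.

    import numpy as np

    def cummins_1dof(F_exc, K, M, A_inf, C, dt):
        """Integrate (M + A_inf)*a(t) + conv(K, v)(t) + C*x(t) = F_exc(t).

        F_exc : excitation force time series (array, sampled at dt)
        K     : radiation impulse-response kernel sampled at dt (array)
        """
        n = len(F_exc)
        x = np.zeros(n)
        v = np.zeros(n)
        a = np.zeros(n)
        for i in range(1, n):
            # radiation memory term: discrete convolution of K with past velocities
            m = min(i, len(K))
            radiation = np.sum(K[:m] * v[i - 1::-1][:m]) * dt
            a[i] = (F_exc[i] - radiation - C * x[i - 1]) / (M + A_inf)
            v[i] = v[i - 1] + a[i] * dt
            x[i] = x[i - 1] + v[i] * dt
        return x, v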
Back to the Source, or It's A You-Bet-Your-Business Game!
ERIC Educational Resources Information Center
Galvin, Wayne W.
1987-01-01
Many administrators are signing contracts for software products that leave their institutions completely unprotected in the event of a default by the vendor. It is proper for a customer to include contractual provisions whereby they may gain legal access to the program source code. (MLW)
FEMFLOW3D; a finite-element program for the simulation of three-dimensional aquifers; version 1.0
Durbin, Timothy J.; Bond, Linda D.
1998-01-01
This document also includes model validation, source code, and example input and output files. Model validation was performed using four test problems. For each test problem, the results of a model simulation with FEMFLOW3D were compared with either an analytic solution or the results of an independent numerical approach. The source code, written in the ANSI X3.9-1978 FORTRAN standard, and the complete input and output of an example problem are listed in the appendixes.
Particle model of a cylindrical inductively coupled ion source
NASA Astrophysics Data System (ADS)
Ippolito, N. D.; Taccogna, F.; Minelli, P.; Cavenago, M.; Veltri, P.
2017-08-01
In spite of the wide use of RF sources, a complete understanding of the mechanisms regulating the RF coupling of the plasma is still lacking, so self-consistent simulations of the involved physics are highly desirable. For this reason we are developing a 2.5D fully kinetic Particle-In-Cell Monte-Carlo-Collision (PIC-MCC) model of a cylindrical ICP-RF source, keeping the time step of the simulation small enough to resolve the plasma frequency scale. The grid cell dimension is currently about seven times larger than the average Debye length because of the large computational demand of the code; it will be scaled down in the next phase of the development of the code. The filling gas is xenon, in order to minimize the time spent in the MCC collision module in this first stage of development. The results presented here are preliminary, with the code already showing good robustness. The final goal will be the modeling of the NIO1 (Negative Ion Optimization phase 1) source, operating in Padua at Consorzio RFX.
Wilson, Reda J; O'Neil, M E; Ntekop, E; Zhang, Kevin; Ren, Y
2014-01-01
Calculating accurate estimates of cancer survival is important for various analyses of cancer patient care and prognosis. Current US survival rates are estimated based on data from the National Cancer Institute's (NCI's) Surveillance, Epidemiology, and End Results (SEER) program, covering approximately 28 percent of the US population. The National Program of Cancer Registries (NPCR) covers about 96 percent of the US population. Using a population-based database with greater US population coverage to calculate survival rates at the national, state, and regional levels can further enhance the effective monitoring of cancer patient care and prognosis in the United States. The first step is to establish the coding completeness and coding quality of the NPCR data needed for calculating survival rates and conducting related validation analyses. Using data from the NPCR-Cancer Surveillance System (CSS) from 1995 through 2008, we assessed coding completeness and quality on 26 data elements that are needed to calculate cancer relative survival estimates and conduct related analyses. Data elements evaluated consisted of demographic, follow-up, prognostic, and cancer identification variables. Analyses were performed showing trends of these variables by diagnostic year, state of residence at diagnosis, and cancer site. Mean overall percent coding completeness by each NPCR central cancer registry, averaged across all data elements and diagnosis years, ranged from 92.3 percent to 100 percent. Results showing the mean percent coding completeness for the relative survival-related variables in NPCR data are presented. All data elements but one had a mean coding completeness greater than 90 percent, as did the mean completeness by data item group type. Statistically significant differences in coding completeness were found in the ICD revision number, cause of death, vital status, and date of last contact variables when comparing diagnosis years. The majority of data items had a coding quality greater than 90 percent, with exceptions found in cause of death, follow-up source, SEER Summary Stage 1977, and SEER Summary Stage 2000. Percent coding completeness and quality are very high for variables in the NPCR-CSS that are covariates for calculating relative survival. NPCR provides the opportunity to calculate relative survival estimates that may be more generalizable to the US population.
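A coding-completeness tabulation of the kind described above reduces to counting non-missing, non-"unknown" values per data element and diagnosis year. The pandas sketch below is illustrative only; the column names and unknown codes are assumptions, not the actual NPCR-CSS layout.

    import pandas as pd

    # Percent coding completeness (share of non-missing, non-"unknown" values)
    # for each data element by diagnosis year. Column names are hypothetical.
    def coding_completeness(df, elements, year_col="dx_year", unknown_codes=("9", "99", "")):
        records = []
        for element in elements:
            known = ~df[element].isna() & ~df[element].astype(str).isin(unknown_codes)
            pct = known.groupby(df[year_col]).mean() * 100
            for year, value in pct.items():
                records.append({"data_element": element, "year": year, "pct_complete": value})
        return pd.DataFrame(records)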
Study of information transfer optimization for communication satellites
NASA Technical Reports Server (NTRS)
Odenwalder, J. P.; Viterbi, A. J.; Jacobs, I. M.; Heller, J. A.
1973-01-01
The results are presented of a study of source coding, modulation/channel coding, and systems techniques for application to teleconferencing over high data rate digital communication satellite links. Simultaneous transmission of video, voice, data, and/or graphics is possible in various teleconferencing modes and one-way, two-way, and broadcast modes are considered. A satellite channel model including filters, limiter, a TWT, detectors, and an optimized equalizer is treated in detail. A complete analysis is presented for one set of system assumptions which exclude nonlinear gain and phase distortion in the TWT. Modulation, demodulation, and channel coding are considered, based on an additive white Gaussian noise channel model which is an idealization of an equalized channel. Source coding with emphasis on video data compression is reviewed, and the experimental facility utilized to test promising techniques is fully described.
Uncertainty Analysis Principles and Methods
2007-09-01
error source. The Data Processor converts binary coded numbers to values, performs D/A curve fitting and applies any correction factors that may be...describes the stages or modules involved in the measurement process. We now need to identify all relevant error sources and develop the mathematical...sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden
MELCOR/CONTAIN LMR Implementation Report - FY16 Progress.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.
2016-11-01
This report describes the progress of the CONTAIN-LMR sodium physics and chemistry models to be implemented in MELCOR 2.1. In the past three years, the implementation included the addition of sodium equations of state and sodium properties from two different sources. The first source is based on the previous work done by Idaho National Laboratory by modifying MELCOR to include liquid lithium equation of state as a working fluid to model the nuclear fusion safety research. The second source uses properties generated for the SIMMER code. The implemented modeling has been tested and results are reported in this document. In addition, the CONTAIN-LMR code was derived from an early version of the CONTAIN code, and many physical models that were developed since this early version of CONTAIN are not available in this early code version. Therefore, CONTAIN 2 has been updated with the sodium models in CONTAIN-LMR as CONTAIN2-LMR, which may be used to provide code-to-code comparison with CONTAIN-LMR and MELCOR when the sodium chemistry models from CONTAIN-LMR have been completed. Both the spray fire and pool fire chemistry routines from CONTAIN-LMR have been integrated into MELCOR 2.1, and debugging and testing are in progress. Because MELCOR only models the equation of state for liquid and gas phases of the coolant, a modeling gap still exists when dealing with experiments or accident conditions that take place when the ambient temperature is below the freezing point of sodium. An alternative method is under investigation to overcome this gap. We are no longer working on the separate branch from the main branch of MELCOR 2.1 since the major modeling of MELCOR 2.1 has been completed. At the current stage, the newly implemented sodium chemistry models will be a part of the main MELCOR release version (MELCOR 2.2). This report will discuss the accomplishments and issues relating to the implementation. Also, we will report on the planned completion of all remaining tasks in the upcoming FY2017, including the atmospheric chemistry model and sodium-concrete interaction model implementation.
Reporting of occupational injury and illness in the semiconductor manufacturing industry.
McCurdy, S A; Schenker, M B; Samuels, S J
1991-01-01
In the United States, occupational illness and injury cases meeting specific reporting criteria are recorded on company Occupational Safety and Health Administration (OSHA) 200 logs; case description data are submitted to participating state agencies for coding and entry in the national Supplementary Data System (SDS). We evaluated completeness of reporting (the percentage of reportable cases that were recorded in the company OSHA 200 log) in the semiconductor manufacturing industry by reviewing company health clinic records for 1984 of 10 manufacturing sites of member companies of a national semiconductor manufacturing industry trade association. Of 416 randomly selected work-related cases, 101 met OSHA reporting criteria. Reporting completeness was 60 percent and was lowest for occupational illnesses (44 percent). Case-description data from 150 reported cases were submitted twice to state coding personnel to evaluate coding reliability. Reliability was high (kappa 0.82-0.93) for "nature," "affected body part," "source," and "type" variables. Coding for the SDS appears reliable; reporting completeness may be improved by use of a stepwise approach by company personnel responsible for reporting decisions.
The Simple Video Coder: A free tool for efficiently coding social video data.
Barto, Daniel; Bird, Clark W; Hamilton, Derek A; Fink, Brandi C
2017-08-01
Videotaping of experimental sessions is a common practice across many disciplines of psychology, ranging from clinical therapy, to developmental science, to animal research. Audio-visual data are a rich source of information that can be easily recorded; however, analysis of the recordings presents a major obstacle to project completion. Coding behavior is time-consuming and often requires ad-hoc training of a student coder. In addition, existing software is either prohibitively expensive or cumbersome, which leaves researchers with inadequate tools to quickly process video data. We offer the Simple Video Coder: free, open-source software for behavior coding that is flexible in accommodating different experimental designs, is intuitive for students to use, and produces outcome measures of event timing, frequency, and duration. Finally, the software also offers extraction tools to splice video into coded segments suitable for training future human coders or for use as input for pattern classification algorithms.
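The outcome measures mentioned above (event timing, frequency, and duration) can be derived from a coded event log with a few lines of pandas. This is a generic sketch, not the Simple Video Coder's output format; the behavior, onset_s, and offset_s column names are assumptions.

    import pandas as pd

    # Derive per-behavior frequency and duration statistics from a coded event log.
    # Column names are hypothetical placeholders for whatever the coder exports.
    def summarize_events(events: pd.DataFrame) -> pd.DataFrame:
        events = events.assign(duration_s=events["offset_s"] - events["onset_s"])
        return (events.groupby("behavior")["duration_s"]
                      .agg(frequency="count", total_duration_s="sum", mean_duration_s="mean")
                      .reset_index())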
EUPDF: An Eulerian-Based Monte Carlo Probability Density Function (PDF) Solver. User's Manual
NASA Technical Reports Server (NTRS)
Raju, M. S.
1998-01-01
EUPDF is an Eulerian-based Monte Carlo PDF solver developed for application with sprays, combustion, parallel computing and unstructured grids. It is designed to be massively parallel and could easily be coupled with any existing gas-phase flow and spray solvers. The solver accommodates the use of an unstructured mesh with mixed elements of either triangular, quadrilateral, and/or tetrahedral type. The manual provides the user with the coding required to couple the PDF code to any given flow code and a basic understanding of the EUPDF code structure as well as the models involved in the PDF formulation. The source code of EUPDF will be available with the release of the National Combustion Code (NCC) as a complete package.
A Qualitative Analysis of the Navy’s HSI Billet Structure
2008-06-01
of information is estimated to average 1 hour per response, including the time for reviewing instruction, searching existing data sources, gathering...and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other...subspecialty code. The research results support the hypothesis that the work requirements of the July 2007 data set of 4600P-coded billets (billets
47 CFR 11.54 - EAS operation during a National Level emergency.
Code of Federal Regulations, 2010 CFR
2010-10-01
... licensees and DBS providers may choose their two EAS sources, one of which must be a PEP station. (2... header codes for a national emergency. (3) After completing the above transmission procedures, key EAS...
47 CFR 11.54 - EAS operation during a National Level emergency.
Code of Federal Regulations, 2011 CFR
2011-10-01
... licensees and DBS providers may choose their two EAS sources, one of which must be a PEP station. (2... header codes for a national emergency. (3) After completing the above transmission procedures, key EAS...
User's manual for PRESTO: A computer code for the performance of regenerative steam turbine cycles
NASA Technical Reports Server (NTRS)
Fuller, L. C.; Stovall, T. K.
1979-01-01
Standard turbine cycles for baseload power plants and cycles with such additional features as process steam extraction and induction and feedwater heating by external heat sources may be modeled. Peaking and high back pressure cycles are also included. The code's methodology is to use the expansion line efficiencies, exhaust loss, leakages, mechanical losses, and generator losses to calculate the heat rate and generator output. A general description of the code is given as well as the instructions for input data preparation. Appended are two complete example cases.
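The bookkeeping behind a heat-rate calculation of this kind can be summarized in a few lines: net electrical output is shaft power minus mechanical and generator losses, and heat rate is heat input divided by net output. The sketch below is illustrative only and is not PRESTO's algorithm, which also accounts for expansion-line efficiencies, exhaust loss, and leakages.

    # Illustrative bookkeeping only (not PRESTO's methodology): net generator output
    # is gross shaft power minus mechanical and generator losses, and heat rate is
    # the cycle heat input divided by net electrical output, in Btu/kWh.
    def heat_rate(heat_input_btu_per_hr, shaft_power_kw, mech_loss_kw, gen_loss_kw):
        net_output_kw = shaft_power_kw - mech_loss_kw - gen_loss_kw
        return heat_input_btu_per_hr / net_output_kw

    # Example with made-up numbers: ~8,889 Btu/kWh
    print(heat_rate(4.0e9, 455_000.0, 2_000.0, 3_000.0))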
Synchrotron Radiation Workshop (SRW)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chubar, O.; Elleaume, P.
2013-03-01
"Synchrotron Radiation Workshop" (SRW) is a physical optics computer code for calculation of detailed characteristics of Synchrotron Radiation (SR) generated by relativistic electrons in magnetic fields of arbitrary configuration and for simulation of the radiation wavefront propagation through optical systems of beamlines. Frequency-domain near-field methods are used for the SR calculation, and the Fourier-optics based approach is generally used for the wavefront propagation simulation. The code enables both fully- and partially-coherent radiation propagation simulations in steady-state and in frequency-/time-dependent regimes. With these features, the code has already proven its utility for a large number of applications in infrared, UV, soft and hard X-ray spectral range, in such important areas as analysis of spectral performances of new synchrotron radiation sources, optimization of user beamlines, development of new optical elements, source and beamline diagnostics, and even complete simulation of SR based experiments. Besides the SR applications, the code can be efficiently used for various simulations involving conventional lasers and other sources. SRW versions interfaced to Python and to IGOR Pro (WaveMetrics), as well as cross-platform library with C API, are available.
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates using a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, widely used in multiphase flows and still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach and up to 50% improvement in total simulated time with the latter for the demonstration cases and target HPC systems employed.
NASA Technical Reports Server (NTRS)
Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)
2000-01-01
This report describes work performed on Contract NAS3-27720 AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.
FRAGS: estimation of coding sequence substitution rates from fragmentary data
Swart, Estienne C; Hide, Winston A; Seoighe, Cathal
2004-01-01
Background Rates of substitution in protein-coding sequences can provide important insights into evolutionary processes that are of biomedical and theoretical interest. Increased availability of coding sequence data has enabled researchers to estimate more accurately the coding sequence divergence of pairs of organisms. However the use of different data sources, alignment protocols and methods to estimate substitution rates leads to widely varying estimates of key parameters that define the coding sequence divergence of orthologous genes. Although complete genome sequence data are not available for all organisms, fragmentary sequence data can provide accurate estimates of substitution rates provided that an appropriate and consistent methodology is used and that differences in the estimates obtainable from different data sources are taken into account. Results We have developed FRAGS, an application framework that uses existing, freely available software components to construct in-frame alignments and estimate coding substitution rates from fragmentary sequence data. Coding sequence substitution estimates for human and chimpanzee sequences, generated by FRAGS, reveal that methodological differences can give rise to significantly different estimates of important substitution parameters. The estimated substitution rates were also used to infer upper-bounds on the amount of sequencing error in the datasets that we have analysed. Conclusion We have developed a system that performs robust estimation of substitution rates for orthologous sequences from a pair of organisms. Our system can be used when fragmentary genomic or transcript data is available from one of the organisms and the other is a completely sequenced genome within the Ensembl database. As well as estimating substitution statistics our system enables the user to manage and query alignment and substitution data. PMID:15005802
HAL/S-360 compiler test activity report
NASA Technical Reports Server (NTRS)
Helmers, C. T.
1974-01-01
The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.
NASA Astrophysics Data System (ADS)
Wang, Rongjiang; Heimann, Sebastian; Zhang, Yong; Wang, Hansheng; Dahm, Torsten
2017-09-01
A hybrid method is proposed to calculate complete synthetic seismograms based on a spherically symmetric and self-gravitating Earth with a multilayered structure of atmosphere, ocean, mantle, liquid core and solid core. For large wavelengths, a numerical scheme is used to solve the geodynamic boundary-value problem without any approximation on the deformation and gravity coupling. With decreasing wavelength, the gravity effect on the deformation becomes negligible and the analytical propagator scheme can be used. Many useful approaches are used to overcome the numerical problems that may arise in both analytical and numerical schemes. Some of these approaches have been established in the seismological community and the others are developed for the first time. Based on the stable and efficient hybrid algorithm, an all-in-one code QSSP is implemented to cover the complete spectrum of seismological interests. The performance of the code is demonstrated by various tests including the curvature effect on teleseismic body and surface waves, the appearance of multiple reflected, teleseismic core phases, the gravity effect on long period surface waves and free oscillations, the simulation of near-field displacement seismograms with the static offset, the coupling of tsunami and infrasound waves, and free oscillations of the solid Earth, the atmosphere and the ocean. QSSP is open source software that can be used as a stand-alone FORTRAN code or may be applied in combination with a Python toolbox to calculate and handle Green's function databases for efficient coding of source inversion problems.
Open source clustering software.
de Hoon, M J L; Imoto, S; Nolan, J; Miyano, S
2004-06-12
We have implemented k-means clustering, hierarchical clustering and self-organizing maps in a single multipurpose open-source library of C routines, callable from other C and C++ programs. Using this library, we have created an improved version of Michael Eisen's well-known Cluster program for Windows, Mac OS X and Linux/Unix. In addition, we generated a Python and a Perl interface to the C Clustering Library, thereby combining the flexibility of a scripting language with the speed of C. The C Clustering Library and the corresponding Python C extension module Pycluster were released under the Python License, while the Perl module Algorithm::Cluster was released under the Artistic License. The GUI code Cluster 3.0 for Windows, Macintosh and Linux/Unix, as well as the corresponding command-line program, were released under the same license as the original Cluster code. The complete source code is available at http://bonsai.ims.u-tokyo.ac.jp/mdehoon/software/cluster. Alternatively, Algorithm::Cluster can be downloaded from CPAN, while Pycluster is also available as part of the Biopython distribution.
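A minimal usage sketch of the Python interface described above is shown below, assuming the Pycluster extension module is installed; the keyword names follow the C Clustering Library documentation and may differ between versions.

    import numpy as np
    from Pycluster import kcluster  # Python wrapper of the C Clustering Library

    data = np.random.rand(50, 4)  # 50 items with 4 features each
    # k-means with 3 clusters and 10 random restarts (npass); the Euclidean
    # distance is the library default. Returns the cluster assignment, the
    # within-cluster error, and how often the best solution was found.
    clusterid, error, nfound = kcluster(data, nclusters=3, npass=10)
    print(clusterid, error, nfound)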
SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)
NASA Technical Reports Server (NTRS)
Manteufel, R.
1994-01-01
The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)
NASA Technical Reports Server (NTRS)
Merwarth, P. D.
1994-01-01
The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
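The core idea of SAP (count occurrences of statement types and combine them with user-supplied weights into a figure of complexity) can be sketched briefly. The Python below is a toy illustration, not a translation of the Fortran program; the keyword list, weights, and file name are placeholders for what SAP reads from its keyword and statistical-weight files.

    # Toy illustration of the SAP idea: count statement types in FORTRAN source
    # and combine the counts into a single weighted figure of complexity.
    def complexity(fortran_lines, weights):
        counts = {key: 0 for key in weights}
        for line in fortran_lines:
            stmt = line.strip().upper()
            for key in weights:
                if stmt.startswith(key):
                    counts[key] += 1
                    break
        figure = sum(weights[key] * counts[key] for key in counts)
        return counts, figure

    # Hypothetical file name and weights, chosen only for the example.
    counts, figure = complexity(open("module.f").readlines(),
                                weights={"IF": 2.0, "DO": 2.0, "GOTO": 4.0, "CALL": 1.0})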
Bracken, Robert E.
2004-01-01
A subroutine (FFTDC2) coded in Fortran 77 is described, which performs a Fast Fourier Transform or Discrete Fourier Transform together with necessary conditioning steps of trend removal, extension, and windowing. The source code for the entire library of required subroutines is provided with the digital release of this report. But, there is only one required entry point, the subroutine call to FFTDC2; all the other subroutines are operationally transparent to the user. Complete instructions for use of FFTDC2.F (as well as for all the other subroutines) and some practical theoretical discussions are included as comments at the beginning of the source code. This subroutine is intended to be an efficient tool for the programmer in a variety of production-level signal-processing applications.
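The conditioning steps named above (trend removal, extension, and windowing) ahead of the transform can be illustrated with NumPy. This is a generic sketch, not a translation of FFTDC2; the window is applied here before zero-padding, which may differ from the subroutine's exact ordering, and the Hann window and power-of-two extension are assumptions.

    import numpy as np

    def conditioned_fft(signal, dt):
        """Detrend, window, and zero-pad a real signal before an FFT."""
        n = len(signal)
        # trend removal: subtract a least-squares straight line
        t = np.arange(n)
        slope, intercept = np.polyfit(t, signal, 1)
        detrended = signal - (slope * t + intercept)
        # windowing: taper the ends with a Hann window
        tapered = detrended * np.hanning(n)
        # extension: zero-pad to the next power of two and transform
        nfft = 1 << (n - 1).bit_length()
        spectrum = np.fft.rfft(tapered, n=nfft)
        freqs = np.fft.rfftfreq(nfft, d=dt)
        return freqs, spectrum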
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yidong Xia; Mitch Plummer; Robert Podgorney
2016-02-01
Performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that 1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle for the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and 3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that the heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite element based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercially available, this new open-source code has demonstrated a code development strategy that aims to provide an unparalleled easiness for user-customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.
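The reported sensitivity to water mass flow rate follows from the usual thermal-power relation P = m_dot * c_p * (T_production - T_injection), converted to electric power with an assumed efficiency. The sketch below is a back-of-envelope illustration, not FALCON code; the numbers and the 10% conversion efficiency are assumptions.

    # Back-of-envelope relation behind the sensitivity to flow rate (not FALCON code):
    # thermal power extracted by the circulating water, and a notional electric
    # output assuming a fixed conversion efficiency.
    def egs_power(mass_flow_kg_s, t_production_c, t_injection_c,
                  cp_j_kg_k=4180.0, efficiency=0.10):
        p_thermal_w = mass_flow_kg_s * cp_j_kg_k * (t_production_c - t_injection_c)
        return p_thermal_w, p_thermal_w * efficiency

    # Example with assumed values: ~21.7 MW thermal, ~2.2 MW electric.
    p_th, p_el = egs_power(mass_flow_kg_s=40.0, t_production_c=200.0, t_injection_c=70.0)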
InterProScan 5: genome-scale protein function classification
Jones, Philip; Binns, David; Chang, Hsin-Yu; Fraser, Matthew; Li, Weizhong; McAnulla, Craig; McWilliam, Hamish; Maslen, John; Mitchell, Alex; Nuka, Gift; Pesseat, Sebastien; Quinn, Antony F.; Sangrador-Vegas, Amaia; Scheremetjew, Maxim; Yong, Siew-Yit; Lopez, Rodrigo; Hunter, Sarah
2014-01-01
Motivation: Robust large-scale sequence analysis is a major challenge in modern genomic science, where biologists are frequently trying to characterize many millions of sequences. Here, we describe a new Java-based architecture for the widely used protein function prediction software package InterProScan. Developments include improvements and additions to the outputs of the software and the complete reimplementation of the software framework, resulting in a flexible and stable system that is able to use both multiprocessor machines and/or conventional clusters to achieve scalable distributed data analysis. InterProScan is freely available for download from the EMBL-EBI FTP site and the open source code is hosted at Google Code. Availability and implementation: InterProScan is distributed via FTP at ftp://ftp.ebi.ac.uk/pub/software/unix/iprscan/5/ and the source code is available from http://code.google.com/p/interproscan/. Contact: http://www.ebi.ac.uk/support or interhelp@ebi.ac.uk or mitchell@ebi.ac.uk PMID:24451626
Time synchronized video systems
NASA Technical Reports Server (NTRS)
Burnett, Ron
1994-01-01
The idea of synchronizing multiple video recordings to some type of 'range' time has been tried with varying degrees of success in the past. Combining this requirement with existing time code standards (SMPTE) and the new innovations in desktop multimedia, however, has afforded an opportunity to increase the flexibility and usefulness of such efforts without adding costs over the traditional data recording and reduction systems. The concept described can use IRIG, GPS or a battery-backed internal clock as the master time source. By converting that time source to Vertical Interval Time Code or Longitudinal Time Code, both in accordance with the SMPTE standards, the user will obtain a tape that contains machine/computer readable time code suitable for use with editing equipment that is available off-the-shelf. Accuracy on playback is then determined by the playback system chosen by the user. Accuracies of +/- 2 frames are common among inexpensive systems and complete frame accuracy is more a matter of the user's budget than the capability of the recording system.
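Converting elapsed time from a master source (IRIG, GPS, or an internal clock) into an SMPTE-style timecode is a simple integer decomposition at the chosen frame rate. The sketch below is a minimal non-drop-frame illustration; drop-frame bookkeeping, which SMPTE time code also defines for 29.97 fps material, is omitted.

    # Convert elapsed seconds from a master time source into an SMPTE-style
    # HH:MM:SS:FF string at a given frame rate (non-drop-frame only).
    def to_smpte(elapsed_seconds: float, fps: int = 30) -> str:
        total_frames = int(round(elapsed_seconds * fps))
        frames = total_frames % fps
        seconds = (total_frames // fps) % 60
        minutes = (total_frames // (fps * 60)) % 60
        hours = total_frames // (fps * 3600)
        return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

    print(to_smpte(3725.5))  # -> "01:02:05:15"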
2010-09-01
the time for reviewing instruction, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the...Col Terry Smith for his time spent serving as the second reader of this thesis...I. INTRODUCTION...However, this is the first time that concatenated coding in combination with noise-normalization has been considered...B. THESIS OUTLINE The thesis
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...
2017-03-20
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates using a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, whichmore » is an open-source computational fluid dynamics code that has evolved over several decades, widely used in multiphase flows and still being developed by the National Energy Technology Laboratory. Two different modernization approaches,‘bottom-up’ and ‘top-down’, are illustrated. Here, preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the latter were achieved for the demonstration cases and target HPC systems employed.« less
MELCOR/CONTAIN LMR Implementation Report-Progress FY15
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphries, Larry L.; Louie, David L.Y.
2016-01-01
This report describes the progress of the CONTAIN-LMR sodium physics and chemistry models to be implemented into MELCOR 2.1. It also describes the progress in implementing these models into CONTAIN 2. In the past two years, the implementation included the addition of sodium equations of state and sodium properties from two different sources. The first source is based on previous work done by Idaho National Laboratory, which modified MELCOR to include a liquid lithium equation of state as a working fluid to model nuclear fusion safety research. The second source uses properties generated for the SIMMER code. Testing and results from this implementation of sodium properties are given. In addition, the CONTAIN-LMR code was derived from an early version of the CONTAIN code. Many physical models that were developed since this early version of CONTAIN are not captured by that early code version. Therefore, CONTAIN 2 is being updated with the sodium models in CONTAIN-LMR in order to facilitate verification of these models with the MELCOR code. Although CONTAIN 2, which represents the latest development of CONTAIN, now contains many of the sodium-specific models, this work is not complete due to challenges from the lower cell architecture in CONTAIN 2, which is different from CONTAIN-LMR. This implementation should be completed in the coming year, while sodium models from CONTAIN-LMR are being integrated into MELCOR. For testing, CONTAIN decks have been developed for verification and validation use. In terms of implementing the sodium models into MELCOR, a separate sodium model branch was created for this document. Because of massive development in the mainstream MELCOR 2.1 code and the requirement to merge the latest code version into this branch, the integration of the sodium models was re-directed to implement the sodium chemistry models first. This change led to delays in the actual implementation. To aid the future implementation of sodium models, a new sodium chemistry package was created. Reporting on the implementation of the sodium chemistry is therefore discussed in this report.
NASA Technical Reports Server (NTRS)
Cross, James H., II
1990-01-01
The study, formulation, and generation of structures for Ada (GRASP/Ada) are discussed in this second phase report of a three phase effort. Various graphical representations that can be extracted or generated from source code are described and categorized with focus on reverse engineering. The overall goal is to provide the foundation for a CASE (computer-aided software engineering) environment in which reverse engineering and forward engineering (development) are tightly coupled. Emphasis is on a subset of architectural diagrams that can be generated automatically from source code, with the control structure diagram (CSD) included for completeness.
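As a rough illustration of deriving a structure outline automatically from source code, the sketch below walks a parse tree and prints a nested outline of control constructs. It uses Python's own ast module on Python source rather than Ada, so it is only a stand-in for the CSD-style diagrams discussed above:

# Illustrative only: derive a simple control-structure outline from source
# code automatically, in the spirit of generating diagrams from code.
# Uses Python's ast module on Python source (not Ada); names are ours.
import ast

def outline(src):
    lines = []
    def visit(node, depth):
        # Record only the control constructs we care about, indented by nesting.
        if isinstance(node, (ast.FunctionDef, ast.If, ast.For, ast.While, ast.Try)):
            lines.append("  " * depth + type(node).__name__)
            depth += 1
        for child in ast.iter_child_nodes(node):
            visit(child, depth)
    visit(ast.parse(src), 0)
    return lines

example = """
def f(xs):
    for x in xs:
        if x > 0:
            print(x)
"""
print("\n".join(outline(example)))  # FunctionDef / For / If, indented by nesting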
Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language
NASA Astrophysics Data System (ADS)
Heaphy, R. T.; Burke, M. P.; Love, J. T.
2015-12-01
Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems have limitations in their ability to integrate with modern software and hardware and to leverage parallel computing, which have left voids in optimization, pre-, and post-processing tools. Advances in technology and in our scientific understanding of environmental processes over the last 30 years mandate that upgrades be made to HSPF to allow it to evolve and continue to be a premier tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert the code to a modern, widely accepted, open-source, high-performance computing (HPC) language; and (2) convert model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages such as C and FORTRAN, the integration of Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and has shown good agreement and similar execution times while using the Numba compiler. Continued verification of the accuracy of the converted code against more complex legacy applications, and improvement of execution times by incorporating an intelligent network change detection tool, is currently underway; preliminary results will be presented.
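The two conversion ideas described above can be sketched in a few lines of Python. The routine below is hypothetical (not actual HSPF code): a simple linear-reservoir loop is JIT-compiled with Numba and its output is written to HDF5 with h5py:

# Hypothetical sketch of the two conversion ideas described above:
# (1) JIT-compile a time-stepping water-balance loop with Numba, and
# (2) store the result in HDF5. This is not actual HSPF code.
import numpy as np
import h5py
from numba import njit

@njit
def simple_storage_routing(inflow, k):
    # Linear-reservoir style loop: storage is depleted by fraction k each step.
    out = np.empty_like(inflow)
    s = 0.0
    for t in range(inflow.size):
        s = s * (1.0 - k) + inflow[t]
        out[t] = s * k
    return out

inflow = np.random.rand(24 * 365)                 # hourly inflow, one year
outflow = simple_storage_routing(inflow, 0.05)    # compiled on first call

with h5py.File("results.h5", "w") as f:
    f.create_dataset("outflow", data=outflow, compression="gzip")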
NASA Technical Reports Server (NTRS)
Bade, W. L.; Yos, J. M.
1975-01-01
The present, third volume of the final report is a programmer's manual for the code. It provides a listing of the FORTRAN 4 source program; a complete glossary of FORTRAN symbols; a discussion of the purpose and method of operation of each subroutine (including mathematical analyses of special algorithms); and a discussion of the operation of the code on IBM/360 and UNIVAC 1108 systems, including required control cards and the overlay structure used to accommodate the code to the limited core size of the 1108. In addition, similar information is provided to document the programming of the NOZFIT code, which is employed to set up nozzle profile curvefits for use in NATA.
NASA Astrophysics Data System (ADS)
Larmat, C. S.; Delorey, A.; Rougier, E.; Knight, E. E.; Steedman, D. W.; Bradley, C. R.
2017-12-01
This presentation reports numerical modeling efforts to improve knowledge of the processes that affect seismic wave generation and propagation from underground explosions, with a focus on Rg waves. The numerical model is based on the coupling of hydrodynamic simulation codes (Abaqus, CASH and HOSS) with a 3D full waveform propagation code, SPECFEM3D. Validation datasets are provided by the Source Physics Experiment (SPE), which is a series of highly instrumented chemical explosions at the Nevada National Security Site with yields from 100 kg to 5000 kg. A first series of explosions in a granite emplacement has just been completed and a second series in an alluvium emplacement is planned for 2018. The long-term goal of this research is to review and improve current seismic source models (e.g. Mueller & Murphy, 1971; Denny & Johnson, 1991) with first-principles calculations provided by the coupled-codes capability. The hydrodynamic codes, Abaqus, CASH and HOSS, model the shocked, hydrodynamic region via equations of state for the explosive, borehole stemming and jointed/weathered granite. A new material model for unconsolidated alluvium materials has been developed and validated with past nuclear explosions, including the 10 kT 1965 Merlin event (Perret, 1971; Perret and Bass, 1975). We use the efficient Spectral Element Method code SPECFEM3D (e.g. Komatitsch, 1998; 2002) and Geologic Framework Models to model the evolution of the wavefield as it propagates across 3D complex structures. The coupling interface is a series of grid points of the SEM mesh situated at the edge of the hydrodynamic code domain. We will present validation tests and waveforms modeled for several SPE tests which provide evidence that the damage processes happening in the vicinity of the explosions create secondary seismic sources. These sources interfere with the original explosion moment and reduce the apparent seismic moment at the origin of Rg waves by up to 20%.
An Optically Implemented Kalman Filter Algorithm.
1983-12-01
...are completely specified for the correlation stage to perform the required correlation in real time, and the filter stage to perform the linear...performance analyses indicated an enhanced ability of the nonadaptive filter to track a realistic distant point source target with an error standard...
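The snippet above refers to a filter stage performing linear estimation on correlation-stage output. For orientation, here is a minimal Python sketch of the standard discrete Kalman predict/update cycle for a 1-D position/velocity tracker; it is not the optical implementation described in the report, and all matrices are illustrative:

# Minimal sketch of the discrete Kalman filter cycle such a filter stage
# performs (1-D position/velocity tracker); not the optical implementation.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition
H = np.array([[1.0, 0.0]])                 # we measure position only
Q = 1e-3 * np.eye(2)                       # process noise covariance
R = np.array([[0.25]])                     # measurement noise covariance

x = np.zeros((2, 1))                       # state estimate [pos, vel]
P = np.eye(2)                              # estimate covariance

def kalman_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with scalar position measurement z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([[z]]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [0.9, 2.1, 2.9, 4.2, 5.0]:        # noisy position measurements
    x, P = kalman_step(x, P, z)
print(x.ravel())                           # estimated [position, velocity]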
Hand Gesture Data Collection Procedure Using a Myo Armband for Machine Learning
2015-09-01
...data using a Myo armband. The source code for this work is included as an Appendix. SUBJECT TERMS: Myo, Machine Learning, Classifier, Data...development in multiple platforms (e.g., Windows, iOS, Android, etc.) and many languages (e.g., Java, C++, C#, Lua, etc.). For the data collection
MELCOR/CONTAIN LMR Implementation Report. FY14 Progress
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphries, Larry L; Louie, David L.Y.
2014-10-01
This report describes the preliminary implementation of the sodium thermophysical properties and the design documentation for the sodium models of CONTAIN-LMR to be implemented into MELCOR 2.1. In the past year, the implementation included two separate sets of sodium properties from two different sources. The first source is based on previous work done by Idaho National Laboratory, which modified MELCOR to include a liquid lithium equation of state as a working fluid to model nuclear fusion safety research. To minimize the impact on MELCOR, the implementation of the fusion safety database (FSD) was done by using detection of the data input file as a way of invoking the FSD. The FSD methodology has been adapted for the current work, but it may be subject to modification as the project continues. The second source uses properties generated for the SIMMER code. Preliminary testing and results from this implementation of sodium properties are given. This year, the design document for the CONTAIN-LMR sodium models, such as the two-condensable option, sodium spray fire, and sodium pool fire, is being developed. This design document is intended to serve as a guide for the MELCOR implementation. In addition, the CONTAIN-LMR code used was based on an earlier version of the CONTAIN code; many physical models that were developed since this early version of CONTAIN may not be captured by the code. Although CONTAIN 2, which represents the latest development of CONTAIN, contains some sodium-specific models, these are not complete, so CONTAIN 2 with all sodium models implemented from CONTAIN-LMR should be used as a comparison code for MELCOR. This implementation should be completed early next year, while sodium models from CONTAIN-LMR are being integrated into MELCOR. For testing, CONTAIN decks have been developed for verification and validation use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curry, Matthew L.; Ferreira, Kurt Brian; Pedretti, Kevin Thomas Tauke
2012-03-01
This report documents thirteen of Sandia's contributions to the Computational Systems and Software Environment (CSSE) within the Advanced Simulation and Computing (ASC) program between fiscal years 2009 and 2012. It describes their impact on ASC applications. Most contributions are implemented in lower software levels, allowing for application improvement without source code changes. Improvements are identified in such areas as reduced run time, characterizing power usage, and Input/Output (I/O). Other experiments are more forward looking, demonstrating potential bottlenecks using mini-application versions of the legacy codes and simulating their network activity on Exascale-class hardware. The purpose of this report is to prove that the team has completed milestone 4467-Demonstration of a Legacy Application's Path to Exascale. Cielo is expected to be the last capability system on which existing ASC codes can run without significant modifications. This assertion will be tested to determine where the breaking point is for an existing highly scalable application. The goal is to stretch the performance boundaries of the application by applying recent CSSE R&D in areas such as resilience, power, I/O, visualization services, SMARTMAP, lightweight kernels (LWKs), virtualization, simulation, and feedback loops. Dedicated system time reservations and/or CCC allocations will be used to quantify the impact of system-level changes to extend the life and performance of the ASC code base. Finally, a simulation of anticipated exascale-class hardware will be performed using SST to supplement the calculations. Determine where the breaking point is for an existing highly scalable application: Chapter 15 presented the CSSE work that sought to identify the breaking point in two ASC legacy applications, Charon and CTH. Their mini-app versions were also employed to complete the task. There is no single breaking point, as more than one issue was found with the two codes. The results were that applications can expect to encounter performance issues related to the computing environment, system software, and algorithms. Careful profiling of runtime performance will be needed to identify the source of an issue, in strong combination with knowledge of system software and application source code.
TANDEM: matching proteins with tandem mass spectra.
Craig, Robertson; Beavis, Ronald C
2004-06-12
Tandem mass spectra obtained from fragmenting peptide ions contain some peptide sequence specific information, but often there is not enough information to sequence the original peptide completely. Several proprietary software applications have been developed to attempt to match the spectra with a list of protein sequences that may contain the sequence of the peptide. The application TANDEM was written to provide the proteomics research community with a set of components that can be used to test new methods and algorithms for performing this type of sequence-to-data matching. The source code and binaries for this software are available at http://www.proteome.ca/opensource.html, for Windows, Linux and Macintosh OSX. The source code is made available under the Artistic License, from the authors.
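A greatly simplified Python sketch of sequence-to-spectrum matching is shown below: it computes theoretical singly charged b/y ion masses for a candidate peptide and counts observed peaks matched within a tolerance. This is only the basic idea, not the X! Tandem scoring model; the peak list and mass table are illustrative:

# Simplified sketch of sequence-to-spectrum matching: count observed peaks
# explained by the theoretical singly-charged b/y ions of a candidate
# peptide. Not the X! Tandem scoring model; masses/peaks are illustrative.
MONO = {  # monoisotopic residue masses (subset, Da)
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "L": 113.08406, "K": 128.09496, "E": 129.04259,
    "F": 147.06841, "R": 156.10111,
}
H2O, PROTON = 18.01056, 1.00728

def theoretical_ions(peptide):
    masses = [MONO[aa] for aa in peptide]
    b = [sum(masses[:i]) + PROTON for i in range(1, len(masses))]
    y = [sum(masses[i:]) + H2O + PROTON for i in range(1, len(masses))]
    return b + y

def count_matches(peptide, peaks, tol=0.5):
    ions = theoretical_ions(peptide)
    return sum(any(abs(p - m) <= tol for m in ions) for p in peaks)

observed = [227.10, 373.21, 500.00]      # toy peak list (m/z)
print(count_matches("PEPK", observed))   # -> 2 peaks explained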
Chronic myelogenous leukemia in eastern Pennsylvania: an assessment of registry reporting.
Mertz, Kristen J; Buchanich, Jeanine M; Washington, Terri L; Irvin-Barnwell, Elizabeth A; Woytowitz, Donald V; Smith, Roy E
2015-01-01
Chronic myelogenous leukemia (CML) has been reportable to the Pennsylvania Cancer Registry (PCR) since the 1980s, but the completeness of reporting is unknown. This study assessed CML reporting in eastern Pennsylvania where a cluster of another myeloproliferative neoplasm was previously identified. Cases were identified from 2 sources: 1) PCR case reports for residents of Carbon, Luzerne, or Schuylkill County with International Classification of Diseases for Oncology, Third Edition (ICD-O-3) codes 9875 (CML, BCR-ABL+), 9863 (CML, NOS), and 9860 (myeloid leukemia) and date of diagnosis 2001-2009, and 2) review of billing records at hematology practices. Participants were interviewed and their medical records were reviewed by board-certified hematologists. PCR reports included 99 cases coded 9875 or 9863 and 9 cases coded 9860; 2 additional cases were identified by review of billing records. Of the 110 identified cases, 93 were mailed consent forms, 23 consented, and 12 medical records were reviewed. Hematologists confirmed 11 of 12 reviewed cases as CML cases; all 11 confirmed cases were BCR/ABL positive, but only 1 was coded as positive (code 9875). Very few unreported CML cases were identified, suggesting relatively complete reporting to the PCR. Cases reviewed were accurately diagnosed, but ICD-O-3 coding often did not reflect BCR-ABL-positive tests. Cancer registry abstracters should look for these test results and code accordingly.
NASA Technical Reports Server (NTRS)
Ancheta, T. C., Jr.
1976-01-01
A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
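A tiny Python sketch of the syndrome idea over GF(2): each 7-bit source block is treated as an error pattern and only its 3-bit syndrome s = Hx (mod 2) is kept, using the (7,4) Hamming parity-check matrix. Decoding returns the minimum-weight pattern with that syndrome, which is exact whenever the block contains at most one 1 (the typical case for a sparse memoryless source). This is an illustration of the general technique, not the universal scheme of the paper:

# Tiny sketch of syndrome source coding over GF(2): keep only the 3-bit
# syndrome of each 7-bit source block, using the (7,4) Hamming parity-check
# matrix, whose columns are the binary representations of 1..7.
import numpy as np

H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

def encode(block7):
    return H @ block7 % 2                      # 3 compressed bits per 7 source bits

def decode(syndrome):
    # Minimum-weight pattern with this syndrome: the all-zero block, or a
    # single 1 at the position whose binary representation equals the syndrome.
    block = np.zeros(7, dtype=np.uint8)
    idx = int(syndrome[0]) + 2 * int(syndrome[1]) + 4 * int(syndrome[2])
    if idx:
        block[idx - 1] = 1
    return block

x = np.array([0, 0, 0, 0, 1, 0, 0], dtype=np.uint8)  # sparse source block
s = encode(x)
print(s, decode(s))                            # x recovered exactly from 3 bits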
Gamma-ray spectroscopy: The diffuse galactic glow
NASA Technical Reports Server (NTRS)
Hartmann, Dieter H.
1991-01-01
The goal of this project is the development of a numerical code that provides statistical models of the sky distribution of gamma-ray lines due to the production of radioactive isotopes by ongoing Galactic nucleosynthesis. We are particularly interested in quasi-steady emission from novae, supernovae, and stellar winds, but continuum radiation and transient sources must also be considered. We have made significant progress during the first half period of this project and expect the timely completion of a code that can be applied to Oriented Scintillation Spectrometer Experiment (OSSE) Galactic plane survey data.
Cross-terminology mapping challenges: a demonstration using medication terminological systems.
Saitwal, Himali; Qing, David; Jones, Stephen; Bernstam, Elmer V; Chute, Christopher G; Johnson, Todd R
2012-08-01
Standardized terminological systems for biomedical information have provided considerable benefits to biomedical applications and research. However, practical use of this information often requires mapping across terminological systems-a complex and time-consuming process. This paper demonstrates the complexity and challenges of mapping across terminological systems in the context of medication information. It provides a review of medication terminological systems and their linkages, then describes a case study in which we mapped proprietary medication codes from an electronic health record to SNOMED CT and the UMLS Metathesaurus. The goal was to create a polyhierarchical classification system for querying an i2b2 clinical data warehouse. We found that three methods were required to accurately map the majority of actively prescribed medications. Only 62.5% of source medication codes could be mapped automatically. The remaining codes were mapped using a combination of semi-automated string comparison with expert selection, and a completely manual approach. Compound drugs were especially difficult to map: only 7.5% could be mapped using the automatic method. General challenges to mapping across terminological systems include (1) the availability of up-to-date information to assess the suitability of a given terminological system for a particular use case, and to assess the quality and completeness of cross-terminology links; (2) the difficulty of correctly using complex, rapidly evolving, modern terminologies; (3) the time and effort required to complete and evaluate the mapping; (4) the need to address differences in granularity between the source and target terminologies; and (5) the need to continuously update the mapping as terminological systems evolve. Copyright © 2012 Elsevier Inc. All rights reserved.
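The tiered mapping approach described above (automatic exact matching, then semi-automated string comparison with expert review) can be sketched as follows; the drug code tables and names are hypothetical:

# Sketch of the tiered mapping approach described above: map source drug
# codes to a target terminology by exact name match first, then propose
# near matches by string similarity for expert review. Tables are hypothetical.
import difflib

source_codes = {"RX001": "acetaminophen 325 mg tablet",
                "RX002": "acetaminophen 325mg tab",
                "RX003": "amoxicillin/clavulanate 875-125 mg"}
target_terms = {"T100": "acetaminophen 325 mg tablet",
                "T200": "amoxicillin / clavulanate 875 mg-125 mg oral tablet"}

exact = {name.lower(): code for code, name in target_terms.items()}
auto_mapped, needs_review = {}, {}

for s_code, s_name in source_codes.items():
    key = s_name.lower()
    if key in exact:                              # tier 1: automatic exact match
        auto_mapped[s_code] = exact[key]
    else:                                         # tier 2: suggest candidates
        scored = [(difflib.SequenceMatcher(None, key, t.lower()).ratio(), c)
                  for c, t in target_terms.items()]
        needs_review[s_code] = max(scored)        # best (similarity, target_code)

print(auto_mapped)     # e.g. {'RX001': 'T100'}
print(needs_review)    # candidates awaiting expert selection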
Cross-terminology mapping challenges: A demonstration using medication terminological systems
Saitwal, Himali; Qing, David; Jones, Stephen; Bernstam, Elmer; Chute, Christopher G.; Johnson, Todd R.
2015-01-01
Standardized terminological systems for biomedical information have provided considerable benefits to biomedical applications and research. However, practical use of this information often requires mapping across terminological systems—a complex and time-consuming process. This paper demonstrates the complexity and challenges of mapping across terminological systems in the context of medication information. It provides a review of medication terminological systems and their linkages, then describes a case study in which we mapped proprietary medication codes from an electronic health record to SNOMED-CT and the UMLS Metathesaurus. The goal was to create a polyhierarchical classification system for querying an i2b2 clinical data warehouse. We found that three methods were required to accurately map the majority of actively prescribed medications. Only 62.5% of source medication codes could be mapped automatically. The remaining codes were mapped using a combination of semi-automated string comparison with expert selection, and a completely manual approach. Compound drugs were especially difficult to map: only 7.5% could be mapped using the automatic method. General challenges to mapping across terminological systems include (1) the availability of up-to-date information to assess the suitability of a given terminological system for a particular use case, and to assess the quality and completeness of cross-terminology links; (2) the difficulty of correctly using complex, rapidly evolving, modern terminologies; (3) the time and effort required to complete and evaluate the mapping; (4) the need to address differences in granularity between the source and target terminologies; and (5) the need to continuously update the mapping as terminological systems evolve. PMID:22750536
44 CFR 65.6 - Revision of base flood elevation determinations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... when discharges change as a result of the use of an alternative methodology or data for computing flood... land use regulation. (ii) It must be well-documented including source codes and user's manuals. (iii... projects that may effect map changes when they are completed. (4) The datum and date of releveling of...
[Fatal occupational accidents: updating of data from a mortality register].
Mantero, Silvia; Baldasseroni, A; Chellini, Elisabetta; Giovanetti, Lucia
2005-01-01
In Italy, almost one thousand deaths due to occupational accidents are usually registered by INAIL each year. Case registration by INAIL has merely administrative purposes and therefore it is necessary to use other sources for case ascertainment in order to better estimate the real number of deaths related to occupational accidents, as shown also by previous papers. Evaluation of the contribution of another data source, namely the Tuscany Regional Mortality Registry, to obtain the correct figure for occupational accident deaths through the use of a place-of-occurrence notation on the death certificate. Cases that occurred in residents in Tuscany in 2000-2001 were considered. They were identified from: a) the Tuscany Regional Mortality Registry (RMR) using the E code of the ICD IX code of death, the year and place of occurrence; b) the INAIL archive using the year of event, the type of definition and management. The INAIL source was without doubt the most informative but was only 51% complete, whereas the RMR source, although less informative, was more complete (82.4%) and allowed identification of cases not registered by INAIL, that had occurred for instance in the Armed Forces and in the National Railway Company. However, the vast majority of RMR extra-cases occurred in subjects aged 65+, in agriculture and in the building industry. It is currently possible to plan a systematic linkage of the two sources due to the new possibilities that are available: the place-of-occurrence in the death certificate and the availability of individual data in the INAIL source.
Syndrome source coding and its universal generalization
NASA Technical Reports Server (NTRS)
Ancheta, T. C., Jr.
1975-01-01
A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A universal generalization of syndrome-source-coding is formulated which provides robustly-effective, distortionless, coding of source ensembles.
MetaJC++: A flexible and automatic program transformation technique using meta framework
NASA Astrophysics Data System (ADS)
Beevi, Nadera S.; Reghu, M.; Chitraprasad, D.; Vinodchandra, S. S.
2014-09-01
A compiler is a tool to translate abstract code containing natural language terms to machine code. Meta compilers are available to compile more than one language. We have developed a meta framework that intends to combine two dissimilar programming languages, namely C++ and Java, to provide a flexible object oriented programming platform for the user. Suitable constructs from both languages have been combined, thereby forming a new and stronger Meta-Language. The framework is developed using the compiler writing tools Flex and Yacc to design the front end of the compiler. The lexer and parser have been developed to accommodate the complete keyword set and syntax set of both languages. Two intermediate representations are used in the translation of the source program to machine code. An Abstract Syntax Tree is used as a high level intermediate representation that preserves the hierarchical properties of the source program. A new machine-independent stack-based byte-code has also been devised to act as a low level intermediate representation. The byte-code is organised into an output class file that can be used to produce an interpreted output. The results, especially in the sphere of providing C++ concepts in Java, give an insight into the potential strong features of the resultant meta-language.
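A toy Python sketch of the two intermediate representations mentioned above: a small abstract syntax tree for an arithmetic expression is lowered to a flat stack-based byte-code and then interpreted. The formats are invented for illustration and are not the MetaJC++ representations:

# Toy sketch: a small AST lowered to a flat stack-based byte-code and then
# interpreted. Formats are ours, not MetaJC++'s; only + and * are handled.
from dataclasses import dataclass

@dataclass
class Num:
    value: float

@dataclass
class BinOp:
    op: str
    left: object
    right: object

def compile_expr(node, code):
    # Post-order walk of the AST emits operands before their operator.
    if isinstance(node, Num):
        code.append(("PUSH", node.value))
    else:
        compile_expr(node.left, code)
        compile_expr(node.right, code)
        code.append(("OP", node.op))
    return code

def run(code):
    # Interpret the byte-code on an operand stack.
    stack = []
    for kind, arg in code:
        if kind == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if arg == "+" else a * b)
    return stack.pop()

tree = BinOp("+", Num(2), BinOp("*", Num(3), Num(4)))   # 2 + 3 * 4
bytecode = compile_expr(tree, [])
print(bytecode, "=>", run(bytecode))                     # ... => 14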
NASA Astrophysics Data System (ADS)
Kwon, N.; Gentle, J.; Pierce, S. A.
2015-12-01
Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a valid problem in the 21st century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase for Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a java-based wrapper with loosely federated commercial and open source components. If successfully revitalized, GWDSS will be useful for both practical applications as a teaching tool and case study for groundwater management, as well as informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded Ontosoft project, digital documentation of GWDSS is underway, from conception to development, deployment, characterization, integration, composition, and dissemination through open source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance for the original case study to serve as a live demonstration of the decision support tool, assuring the original version is usable, and 2) an open version of the codebase, executable installation files, and developer guide available via an open repository, assuring the source for the application is accessible with version control and potential for new branch developments. Finally, metadata about the software has been completed within the OntoSoft portal to provide descriptive curation, make GWDSS searchable, and complete documentation of the scientific software lifecycle.
NASA Astrophysics Data System (ADS)
Selker, J. S.; Roques, C.; Higgins, C. W.; Good, S. P.; Hut, R.; Selker, A.
2015-12-01
The confluence of 3-Dimensional printing, low-cost solid-state sensors, low-cost, low-power digital controllers (e.g., Arduinos); and open-source publishing (e.g., Github) is poised to transform environmental sensing. The Open-Source Published Environmental Sensing (OPENS) laboratory has launched and is available for all to use. OPENS combines cutting edge technologies and makes them available to the global environmental sensing community. OPENS includes a Maker lab space where people may collaborate in person or virtually via an on-line forum for the publication and discussion of environmental sensing technology (Corvallis, Oregon, USA, please feel free to request a free reservation for space and equipment use). The physical lab houses a test-bed for sensors, as well as a complete classical machine shop, 3-D printers, electronics development benches, and workstations for code development. OPENS will provide a web-based formal publishing framework wherein global students and scientists can peer-review publish (with DOI) novel and evolutionary advancements in environmental sensor systems. This curated and peer-reviewed digital collection will include complete sets of "printable" parts and operating computer code for sensing systems. The physical lab will include all of the machines required to produce these sensing systems. These tools can be addressed in person or virtually, creating a truly global venue for advancement in monitoring earth's environment and agricultural systems. In this talk we will present an example of the process of design and publication of the design and data from the OPENS-Permeameter. The publication includes 3-D printing code, Arduino (or other control/logging platform) operational code; sample data sets, and a full discussion of the design set in the scientific context of previous related devices. Editors for the peer-review process are currently sought - contact John.Selker@Oregonstate.edu or Clement.Roques@Oregonstate.edu.
JBioWH: an open-source Java framework for bioinformatics data integration
Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor
2013-01-01
The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database scheme and includes JAVA API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh PMID:23846595
JBioWH: an open-source Java framework for bioinformatics data integration.
Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor
2013-01-01
The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database scheme and includes JAVA API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh.
Rosetta3: An Object-Oriented Software Suite for the Simulation and Design of Macromolecules
Leaver-Fay, Andrew; Tyka, Michael; Lewis, Steven M.; Lange, Oliver F.; Thompson, James; Jacak, Ron; Kaufman, Kristian; Renfrew, P. Douglas; Smith, Colin A.; Sheffler, Will; Davis, Ian W.; Cooper, Seth; Treuille, Adrien; Mandell, Daniel J.; Richter, Florian; Ban, Yih-En Andrew; Fleishman, Sarel J.; Corn, Jacob E.; Kim, David E.; Lyskov, Sergey; Berrondo, Monica; Mentzer, Stuart; Popović, Zoran; Havranek, James J.; Karanicolas, John; Das, Rhiju; Meiler, Jens; Kortemme, Tanja; Gray, Jeffrey J.; Kuhlman, Brian; Baker, David; Bradley, Philip
2013-01-01
We have recently completed a full re-architecturing of the Rosetta molecular modeling program, generalizing and expanding its existing functionality. The new architecture enables the rapid prototyping of novel protocols by providing easy to use interfaces to powerful tools for molecular modeling. The source code of this rearchitecturing has been released as Rosetta3 and is freely available for academic use. At the time of its release, it contained 470,000 lines of code. Counting currently unpublished protocols at the time of this writing, the source includes 1,285,000 lines. Its rapid growth is a testament to its ease of use. This document describes the requirements for our new architecture, justifies the design decisions, sketches out central classes, and highlights a few of the common tasks that the new software can perform. PMID:21187238
NASA Technical Reports Server (NTRS)
Gezari, Daniel Y.; Schmitz, Marion; Pitts, Patricia S.; Mead, Jaylee M.
1993-01-01
The Far Infrared Supplement contains a subset of the data in the full Catalog of Infrared Observations (all observations at wavelengths greater than 4.6 microns). The Catalog of Infrared Observations (CIO), NASA RP-1294, is a compilation of infrared astronomical observational data obtained from an extensive literature search of scientific journals and major astronomical catalogs and surveys. The literature search is complete for years 1965 through 1990 in this third edition. The catalog contains about 210,000 observations of roughly 20,000 individual sources, and supporting appendices. The expanded third edition contains coded IRAS 4-band data for all CIO sources detected by IRAS. The appendices include an atlas of infrared source positions (also included in this volume), two bibliographies of catalog listings, and an atlas of infrared spectral ranges. The complete CIO database is available to qualified users in printed, microfiche, and magnetic tape formats.
Catalog of Infrared Observations, Third Edition
NASA Technical Reports Server (NTRS)
Gezari, Daniel Y.; Schmitz, Marion; Pitts, Patricia S.; Mead, Jaylee M.
1993-01-01
The Far Infrared Supplement contains a subset of the data in the full Catalog of Infrared Observations (all observations at wavelengths greater than 4.6 microns). The Catalog of Infrared Observations (CIO), NASA RP-1294, is a compilation of infrared astronomical observational data obtained from an extensive literature search of scientific journals and major astronomical catalogs and surveys. The literature search is complete for years 1965 through 1990 in this Third Edition. The Catalog contains about 210,000 observations of roughly 20,000 individual sources and supporting appendices. The expanded Third Edition contains coded IRAS 4-band data for all CIO sources detected by IRAS. The appendices include an atlas of infrared source positions (also included in this volume), two bibliographies of Catalog listings, and an atlas of infrared spectral ranges. The complete CIO database is available to qualified users in printed, microfiche, and magnetic-tape formats.
OpenSQUID: A Flexible Open-Source Software Framework for the Control of SQUID Electronics
Jaeckel, Felix T.; Lafler, Randy J.; Boyd, S. T. P.
2013-02-06
Commercially available computer-controlled SQUID electronics are usually delivered with software providing a basic user interface for adjustment of SQUID tuning parameters, such as bias current, flux offset, and feedback loop settings. However, in a research context it would often be useful to be able to modify this code and/or to have full control over all these parameters from researcher-written software. In the case of the STAR Cryoelectronics PCI/PFL family of SQUID control electronics, the supplied software contains modules for automatic tuning and noise characterization, but does not provide an interface for user code. On the other hand, the Magnicon SQUIDViewer software package includes a public application programming interface (API), but lacks auto-tuning and noise characterization features. To overcome these and other limitations, we are developing an "open-source" framework for controlling SQUID electronics which should provide maximal interoperability with user software, a unified user interface for electronics from different manufacturers, and a flexible platform for the rapid development of customized SQUID auto-tuning and other advanced features. Finally, we have completed a first implementation for the STAR Cryoelectronics hardware and have made the source code for this ongoing project available to the research community on SourceForge (http://opensquid.sourceforge.net) under the GNU public license.
Simonaitis, Linas; McDonald, Clement J
2009-10-01
The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the in-patient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
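The coverage measurement described above (the share of total prescription volume whose product codes appear in a DKB mapping table) reduces to a weighted set-membership count; a minimal Python sketch with hypothetical codes and volumes:

# Sketch of the coverage measurement described above: the share of total
# prescription volume whose product codes appear in a drug knowledge base
# mapping table. Codes and counts are hypothetical.
from collections import Counter

rx_volume = Counter({"00093-0058-01": 500,    # prescriptions per product code
                     "00781-1506-10": 300,
                     "LOCAL-00017": 40,       # invented local code, not in any DKB
                     "00378-0208-01": 160})
dkb_codes = {"00093-0058-01", "00781-1506-10", "00378-0208-01"}

total = sum(rx_volume.values())
covered = sum(n for code, n in rx_volume.items() if code in dkb_codes)
print(f"coverage = {covered / total:.1%}")    # volume-weighted coverage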
Practices in Code Discoverability: Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.
2012-09-01
Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now has over 340 codes in it and continues to grow; in 2011, the ASCL added on average 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.
User's manual: Subsonic/supersonic advanced panel pilot code
NASA Technical Reports Server (NTRS)
Moran, J.; Tinoco, E. N.; Johnson, F. T.
1978-01-01
Sufficient instructions for running the subsonic/supersonic advanced panel pilot code were developed. This software was developed as a vehicle for numerical experimentation and it should not be construed to represent a finished production program. The pilot code is based on a higher order panel method using linearly varying source and quadratically varying doublet distributions for computing both linearized supersonic and subsonic flow over arbitrary wings and bodies. This user's manual contains complete input and output descriptions. A brief description of the method is given as well as practical instructions for proper configuration modeling. Computed results are also included to demonstrate some of the capabilities of the pilot code. The computer program is written in FORTRAN IV for the SCOPE 3.4.4 operating system of the Ames CDC 7600 computer. The program uses an overlay structure and thirteen disk files, and it requires approximately 132000 (octal) central memory words.
The radiation fields around a proton therapy facility: A comparison of Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Ottaviano, G.; Picardi, L.; Pillon, M.; Ronsivalle, C.; Sandri, S.
2014-02-01
A proton therapy test facility with a beam current lower than 10 nA on average, and an energy up to 150 MeV, is planned to be sited at the Frascati ENEA Research Center, in Italy. The accelerator is composed of a sequence of linear sections. The first one is a commercial 7 MeV proton linac, from which the beam is injected into a SCDTL (Side Coupled Drift Tube Linac) structure, reaching an energy of 52 MeV. A conventional CCL (Coupled Cavity Linac) with side coupling cavities then completes the accelerator. The linear structure has the important advantage that the main radiation losses during the acceleration process occur for protons with energies below 20 MeV, with a consequently low production of neutrons and secondary radiation. From the radiation protection point of view, the source of radiation for this facility is then almost completely located at the final target. Physical and geometrical models of the device have been developed and implemented into radiation transport computer codes based on the Monte Carlo method. The aim is the assessment of the radiation field around the main source in support of the safety analysis. For the assessment, independent researchers used two different Monte Carlo computer codes, named FLUKA (FLUktuierende KAskade) and MCNPX (Monte Carlo N-Particle eXtended). Both are general purpose tools for calculations of particle transport and interactions with matter, covering an extended range of applications including proton beam analysis. Nevertheless, each one utilizes its own nuclear cross section libraries and uses specific physics models for particle types and energies. The models implemented into the codes are described and the results are presented. The differences between the two calculations are reported and discussed, pointing out the disadvantages and advantages of each code in this specific application.
Communication Breakdown: DHS Operations During a Cyber Attack
2010-12-01
...SUBJECT TERMS: Presidential Directive, Malware, National Exercise, Quadrennial Homeland Security Review, Trusted Internet Connections, Zero-Day Exploits...
Revised Class IV Planning Factors
1997-01-01
...October 1996 meeting. SUBJECT TERMS: supply management, mobilization, planning factors, construction materials...
Nordic Cancer Registries - an overview of their procedures and data comparability.
Pukkala, Eero; Engholm, Gerda; Højsgaard Schmidt, Lise Kristine; Storm, Hans; Khan, Staffan; Lambe, Mats; Pettersson, David; Ólafsdóttir, Elínborg; Tryggvadóttir, Laufey; Hakanen, Tiina; Malila, Nea; Virtanen, Anni; Johannesen, Tom Børge; Larønningen, Siri; Ursin, Giske
2018-04-01
The Nordic Cancer Registries are among the oldest population-based registries in the world, with more than 60 years of complete coverage of what is now a combined population of 26 million. However, despite being the source of a substantial number of studies, there is no published paper comparing the different registries. Therefore, we did a systematic review to identify similarities and dissimilarities of the Nordic Cancer Registries, which could possibly explain some of the differences in cancer incidence rates across these countries. We describe and compare here the core characteristics of each of the Nordic Cancer Registries: (i) data sources; (ii) registered disease entities and deviations from IARC multiple cancer coding rules; (iii) variables and related coding systems. Major changes over time are described and discussed. All Nordic Cancer Registries represent a high quality standard in terms of completeness and accuracy of the registered data. Even though the information in the Nordic Cancer Registries in general can be considered more similar than any other collection of data from five different countries, there are numerous differences in registration routines, classification systems and inclusion of some tumors. These differences are important to be aware of when comparing time trends in the Nordic countries.
Assessment of Current Jet Noise Prediction Capabilities
NASA Technical Reports Server (NTRS)
Hunter, Craig A.; Bridges, James E.; Khavaran, Abbas
2008-01-01
An assessment was made of the capability of jet noise prediction codes over a broad range of jet flows, with the objective of quantifying current capabilities and identifying areas requiring future research investment. Three separate codes in NASA's possession, representative of two classes of jet noise prediction codes, were evaluated, one empirical and two statistical. The empirical code is the Stone Jet Noise Module (ST2JET) contained within the ANOPP aircraft noise prediction code. It is well documented, and represents the state of the art in semi-empirical acoustic prediction codes where virtual sources are attributed to various aspects of noise generation in each jet. These sources, in combination, predict the spectral directivity of a jet plume. A total of 258 jet noise cases were examined on the ST2JET code, each run requiring only fractions of a second to complete. Two statistical jet noise prediction codes were also evaluated, JeNo v1 and Jet3D. Fewer cases were run for the statistical prediction methods because they require substantially more resources, typically a Reynolds-Averaged Navier-Stokes solution of the jet, volume integration of the source statistical models over the entire plume, and a numerical solution of the governing propagation equation within the jet. In the evaluation process, substantial justification of experimental datasets used in the evaluations was made. In the end, none of the current codes can predict jet noise within experimental uncertainty. The empirical code came within 2 dB on a 1/3 octave spectral basis for a wide range of flows. The statistical code Jet3D was within experimental uncertainty at broadside angles for hot supersonic jets, but errors in peak frequency and amplitude put it out of experimental uncertainty at cooler, lower speed conditions. Jet3D did not predict changes in directivity in the downstream angles. The statistical code JeNo v1 was within experimental uncertainty predicting noise from cold subsonic jets at all angles, but did not predict changes with heating of the jet and did not account for directivity changes at supersonic conditions. Shortcomings addressed here give direction for future work relevant to the statistical-based prediction methods. A full report will be released as a chapter in a NASA publication assessing the state of the art in aircraft noise prediction.
Traceability Through Automatic Program Generation
NASA Technical Reports Server (NTRS)
Richardson, Julian; Green, Jeff
2003-01-01
Program synthesis is a technique for automatically deriving programs from specifications of their behavior. One of the arguments made in favour of program synthesis is that it allows one to trace from the specification to the program. One way in which traceability information can be derived is to augment the program synthesis system so that manipulations and calculations it carries out during the synthesis process are annotated with information on what the manipulations and calculations were and why they were made. This information is then accumulated throughout the synthesis process, at the end of which, every artifact produced by the synthesis is annotated with a complete history relating it to every other artifact (including the source specification) which influenced its construction. This approach requires modification of the entire synthesis system - which is labor-intensive and hard to do without influencing its behavior. In this paper, we introduce a novel, lightweight technique for deriving traceability from a program specification to the corresponding synthesized code. Once a program has been successfully synthesized from a specification, small changes are systematically made to the specification and the effects on the synthesized program observed. We have partially automated the technique and applied it in an experiment to one of our program synthesis systems, AUTOFILTER, and to the GNU C compiler, GCC. The results are promising: 1. Manual inspection of the results indicates that most of the connections derived from the source (a specification in the case of AUTOFILTER, C source code in the case of GCC) to its generated target (C source code in the case of AUTOFILTER, assembly language code in the case of GCC) are correct. 2. Around half of the lines in the target can be traced to at least one line of the source. 3. Small changes in the source often induce only small changes in the target.
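The lightweight perturbation technique can be sketched directly: mutate one source line at a time, regenerate the target, and record which baseline target lines change. The "generator" below is a toy stand-in for a real synthesizer or compiler such as AUTOFILTER or GCC:

# Sketch of the lightweight traceability idea described above: perturb one
# source line at a time, regenerate the target, and record which target
# lines change relative to the baseline. The generator is a toy stand-in.
import difflib

def generate(source_lines):
    # Toy generator: each source line yields one or two target lines.
    out = []
    for line in source_lines:
        out.append(f"LOAD {line.strip()}")
        if "=" in line:
            out.append("STORE result")
    return out

def changed_lines(baseline, regenerated):
    # Indices of baseline target lines affected by the perturbation.
    sm = difflib.SequenceMatcher(None, baseline, regenerated)
    hits = []
    for tag, i1, i2, _, _ in sm.get_opcodes():
        if tag != "equal":
            hits.extend(range(i1, i2))
    return hits

def trace(source_lines):
    baseline = generate(source_lines)
    links = {}
    for i, line in enumerate(source_lines):
        mutated = source_lines[:i] + [line + "  /*perturbed*/"] + source_lines[i + 1:]
        links[i] = changed_lines(baseline, generate(mutated))
    return links

print(trace(["x = 1", "y = x", "print(y)"]))  # source line -> influenced target lines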
Asres, Yihunie Hibstie; Mathuthu, Manny; Birhane, Marelgn Derso
2018-04-22
This study provides current evidence about cross-section production processes in the theoretical and experimental results of neutron-induced reactions on a uranium isotope over the projectile energy range of 1-100 MeV, in order to improve the reliability of nuclear simulation. In such fission reactions of 235U within nuclear reactors, a large amount of energy is released, enough to help satisfy worldwide energy needs without polluting processes as compared to other sources. The main objective of this work is to add to knowledge of the neutron-induced fission reactions on 235U by describing, analyzing and interpreting the theoretical cross sections obtained from the computer code COMPLET and comparing them with the experimental data obtained from EXFOR. The cross section values of 235U(n,2n)234U, 235U(n,3n)233U, 235U(n,γ)236U, and 235U(n,f) were obtained using the computer code COMPLET, and the corresponding experimental values were retrieved from EXFOR (IAEA). The theoretical results are compared with the experimental data taken from the EXFOR Data Bank. The computer code COMPLET has been used for the analysis with the same set of input parameters, and the graphs were plotted with the help of spreadsheet and Origin-8 software. The quantification of uncertainties stemming from both the experimental data and the computer code calculation plays a significant role in the final evaluated results. The calculated total cross sections were compared with the experimental data taken from EXFOR in the literature, and good agreement was found between the experimental and theoretical data. This comparison of the calculated data is analyzed and interpreted with tabulations and graphical descriptions, and the results are briefly discussed within the text of this research work. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
EUPDF-II: An Eulerian Joint Scalar Monte Carlo PDF Module : User's Manual
NASA Technical Reports Server (NTRS)
Raju, M. S.; Liu, Nan-Suey (Technical Monitor)
2004-01-01
EUPDF-II provides the solution for the species and temperature fields based on an evolution equation for the PDF (Probability Density Function) and is developed mainly for application with sprays, combustion, parallel computing, and unstructured grids. It is designed to be massively parallel and could easily be coupled with any existing gas-phase CFD and spray solvers. The solver accommodates the use of an unstructured mesh with mixed elements of triangular, quadrilateral, and/or tetrahedral type. The manual provides the user with an understanding of the various models involved in the PDF formulation, its code structure and solution algorithm, and various other issues related to parallelization and its coupling with other solvers. The source code of EUPDF-II will be available with the National Combustion Code (NCC) as a complete package.
Facilitating Internet-Scale Code Retrieval
ERIC Educational Resources Information Center
Bajracharya, Sushil Krishna
2010-01-01
Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…
Li, Juan; Chen, Fen; Sugiyama, Hiromu; Blair, David; Lin, Rui-Qing; Zhu, Xing-Quan
2015-07-01
In the present study, near-complete mitochondrial (mt) genome sequences for Schistosoma japonicum from different regions in the Philippines and Japan were amplified and sequenced. Comparisons among S. japonicum from the Philippines, Japan, and China revealed a geographically based length difference in mt genomes, but the mt genomic organization and gene arrangement were the same. Sequence differences among samples from the Philippines and all samples from the three endemic areas were 0.57-2.12 and 0.76-3.85 %, respectively. The most variable part of the mt genome was the non-coding region. In the coding portion of the genome, protein-coding genes varied more than rRNA genes and tRNAs. The near-complete mt genome sequences for Philippine specimens were identical in length (14,091 bp) which was 4 bp longer than those of S. japonicum samples from Japan and China. This indel provides a unique genetic marker for S. japonicum samples from the Philippines. Phylogenetic analyses based on the concatenated amino acids of 12 protein-coding genes showed that samples of S. japonicum clustered according to their geographical origins. The identified mitochondrial indel marker will be useful for tracing the source of S. japonicum infection in humans and animals in Southeast Asia.
Berben, Tom; Sorokin, Dimitry Y.; Ivanova, Natalia; ...
2015-11-19
Thioalkalivibrio paradoxus strain ARh 1 T is a chemolithoautotrophic, non-motile, Gram-negative bacterium belonging to the Gammaproteobacteria that was isolated from samples of haloalkaline soda lakes. It derives energy from the oxidation of reduced sulfur compounds and is notable for its ability to grow on thiocyanate as its sole source of electrons, sulfur and nitrogen. The full genome consists of 3,756,729 bp and comprises 3,500 protein-coding and 57 RNA-coding genes. Moreover, this organism was sequenced as part of the community science program at the DOE Joint Genome Institute.
Implementing the UCSD PASCAL system on the MODCOMP computer. [deep space network
NASA Technical Reports Server (NTRS)
Wolfe, T.
1980-01-01
The implementation of an interactive software development system (UCSD PASCAL) on the MODCOMP computer is discussed. The development of an interpreter for the MODCOMP II and the MODCOMP IV computers, written in MODCOMP II assembly language, is described. The complete Pascal programming system was run successfully on a MODCOMP II and MODCOMP IV under both the MAX II/III and MAX IV operating systems. The source code for an 8080 microcomputer version of the interpreter was used as the design for the MODCOMP interpreter. A mapping of the functions within the 8080 interpreter into MODCOMP II assembly language was the method used to code the interpreter.
Mosaic of coded aperture arrays
Fenimore, Edward E.; Cannon, Thomas M.
1980-01-01
The present invention pertains to a mosaic of coded aperture arrays which is capable of imaging off-axis sources with minimum detector size. Mosaics of the basic array pattern create a circular, or periodic, correlation of the object on a section of the picture plane. This section consists of elements of the central basic pattern as well as elements from neighboring patterns and is a cyclic version of the basic pattern. Since all object points contribute a complete cyclic version of the basic pattern, a section of the picture, which is the size of the basic aperture pattern, contains all the information necessary to image the object with no artifacts.
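The cyclic-correlation property described above is what makes decoding straightforward: correlating the recorded picture with a suitable decoding array recovers the object. A minimal numerical sketch (Python/NumPy) with a toy random aperture and a balanced decoding array is shown below; the aperture, decoder, and sizes are illustrative assumptions rather than the patented mosaic design.

    import numpy as np

    def circ_convolve(a, b):
        # Periodic (circular) convolution via FFTs.
        return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))

    def circ_correlate(a, b):
        # Periodic (circular) correlation via FFTs.
        return np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))

    rng = np.random.default_rng(0)
    aperture = rng.integers(0, 2, size=(16, 16)).astype(float)  # toy open/closed pattern
    decoder = 2.0 * aperture - 1.0                               # simple balanced decoder (assumption)

    scene = np.zeros((16, 16))
    scene[3, 5] = 1.0                                            # single off-axis point source
    shadowgram = circ_convolve(scene, aperture)                  # cyclic imaging model
    image = circ_correlate(shadowgram, decoder)                  # reconstruction peaks near (3, 5)
    print(np.unravel_index(np.argmax(image), image.shape))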
Joint source-channel coding for motion-compensated DCT-based SNR scalable video.
Kondi, Lisimachos P; Ishtiaq, Faisal; Katsaggelos, Aggelos K
2002-01-01
In this paper, we develop an approach toward joint source-channel coding for motion-compensated DCT-based scalable video coding and transmission. A framework for the optimal selection of the source and channel coding rates over all scalable layers is presented such that the overall distortion is minimized. The algorithm utilizes universal rate distortion characteristics which are obtained experimentally and show the sensitivity of the source encoder and decoder to channel errors. The proposed algorithm allocates the available bit rate between scalable layers and, within each layer, between source and channel coding. We present the results of this rate allocation algorithm for video transmission over a wireless channel using the H.263 Version 2 signal-to-noise ratio (SNR) scalable codec for source coding and rate-compatible punctured convolutional (RCPC) codes for channel coding. We discuss the performance of the algorithm with respect to the channel conditions, coding methodologies, layer rates, and number of layers.
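The rate-allocation step in such a scheme can be sketched as a small search: under a fixed total transmission rate, try each combination of per-layer source rate and RCPC channel-code rate and keep the combination with the smallest total distortion. The rate options and the distortion model below are toy stand-ins (assumptions), not the universal rate-distortion characteristics measured in the paper.

    from itertools import product

    source_rates = [0.25, 0.5, 1.0]                   # source bits/pixel per layer (toy values)
    rcpc = {8 / 9: 1e-2, 2 / 3: 1e-3, 1 / 2: 1e-5}    # RCPC code rate -> residual error probability

    def layer_distortion(r_src, p_err):
        # Stand-in model: distortion falls with source rate, rises with channel errors.
        return 2.0 ** (-2.0 * r_src) + 50.0 * p_err

    def allocate(total_rate, n_layers=2):
        # Exhaustive search over per-layer (source rate, channel rate) pairs.
        choices = [(rs, rc) for rs in source_rates for rc in rcpc]
        best = (float("inf"), None)
        for combo in product(choices, repeat=n_layers):
            tx_rate = sum(rs / rc for rs, rc in combo)            # channel coding expands the stream
            if tx_rate > total_rate:
                continue
            dist = sum(layer_distortion(rs, rcpc[rc]) for rs, rc in combo)
            if dist < best[0]:
                best = (dist, combo)
        return best

    print(allocate(total_rate=3.0))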
Synthetic neutron camera and spectrometer in JET based on AFSI-ASCOT simulations
NASA Astrophysics Data System (ADS)
Sirén, P.; Varje, J.; Weisen, H.; Koskela, T.; contributors, JET
2017-09-01
The ASCOT Fusion Source Integrator (AFSI) has been used to calculate neutron production rates and spectra corresponding to the JET 19-channel neutron camera (KN3) and the time-of-flight spectrometer (TOFOR) as ideal diagnostics, without detector-related effects. AFSI calculates fusion product distributions in 4D, based on Monte Carlo integration from arbitrary reactant distribution functions. The distribution functions were calculated by the ASCOT Monte Carlo particle orbit following code for thermal, NBI and ICRH particle reactions. Fusion cross-sections were defined based on the Bosch-Hale model and both DD and DT reactions have been included. Neutrons generated by AFSI-ASCOT simulations have already been applied as a neutron source of the Serpent neutron transport code in ITER studies. Additionally, AFSI has been selected as the main fusion product generator in the complete analysis calculation chain: ASCOT - AFSI - SERPENT (neutron and gamma transport Monte Carlo code) - APROS (system and power plant modelling code), which encompasses the plasma as an energy source, heat deposition in plant structures as well as cooling and balance-of-plant in DEMO applications and other reactor relevant analyses. This conference paper presents the first results and validation of the AFSI DD fusion model for different auxiliary heating scenarios (NBI, ICRH) with very different fast particle distribution functions. Both calculated quantities (production rates and spectra) have been compared with experimental data from KN3 and synthetic spectrometer data from the ControlRoom code. No unexplained differences have been observed. In future work, AFSI will be extended for synthetic gamma diagnostics and, additionally, AFSI will be used as part of the neutron transport calculation chain to model real diagnostics instead of ideal synthetic diagnostics for quantitative benchmarking.
Access Based Cost Estimation for Beddown Analysis
2006-03-23
logic. This research expands upon the existing research by using Visual Basic for Applications (VBA) to further customize and streamline the ... methods with the use of VBA. Calculations are completed either in underlying Form VBA code or through global modules accessible throughout the ... query and SQL referencing. Attempts were made where possible to align data structures with possible external sources to minimize import errors and
The Astrophysics Source Code Library: An Update
NASA Astrophysics Data System (ADS)
Allen, Alice; Nemiroff, R. J.; Shamir, L.; Teuben, P. J.
2012-01-01
The Astrophysics Source Code Library (ASCL), founded in 1999, takes an active approach to sharing astrophysical source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL moved to a new location in 2010; it now holds over 300 codes and continues to grow. In 2011, the ASCL (http://asterisk.apod.com/viewforum.php?f=35) has added an average of 19 new codes per month; we encourage scientists to submit their codes for inclusion. An advisory committee has been established to provide input and guide the development and expansion of its new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This presentation covers the history of the ASCL and examines the current state and benefits of the ASCL, the means of and requirements for including codes, and outlines its future plans.
Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J
2004-09-24
Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.
Authorship Attribution of Source Code
ERIC Educational Resources Information Center
Tennyson, Matthew F.
2013-01-01
Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…
Schroedinger’s code: Source code availability and transparency in astrophysics
NASA Astrophysics Data System (ADS)
Ryan, PW; Allen, Alice; Teuben, Peter
2018-01-01
Astronomers use software for their research, but how many of the codes they use are available as source code? We examined a sample of 166 papers from 2015 for clearly identified software use, then searched for source code for the software packages mentioned in these research papers. We categorized the software to indicate whether source code is available for download and whether there are restrictions to accessing it, and if source code was not available, whether some other form of the software, such as a binary, was. Over 40% of the source code for the software used in our sample was not available for download. As URLs have often been used as proxy citations for software, we also extracted URLs from one journal's 2015 research articles, removed those from certain long-term, reliable domains, and tested the remainder to determine what percentage of these URLs were still accessible in September and October, 2017.
NASA Technical Reports Server (NTRS)
1992-01-01
This document describes the Advanced Imaging System CCD based camera. The AIS1 camera system was developed at Photometric Ltd. in Tucson, Arizona as part of a Phase 2 SBIR contract No. NAS5-30171 from the NASA/Goddard Space Flight Center in Greenbelt, Maryland. The camera project was undertaken as a part of the Space Telescope Imaging Spectrograph (STIS) project. This document is intended to serve as a complete manual for the use and maintenance of the camera system. All the different parts of the camera hardware and software are discussed and complete schematics and source code listings are provided.
Liang, Su-Ying; Phillips, Kathryn A.; Wang, Grace; Keohane, Carol; Armstrong, Joanne; Morris, William M.; Haas, Jennifer S.
2012-01-01
Background Administrative claims and medical records are important data sources to examine healthcare utilization and outcomes. Little is known about identifying personalized medicine technologies in these sources. Objectives To describe agreement, sensitivity, and specificity of administrative claims compared to medical records for two pairs of targeted tests and treatments for breast cancer. Research Design Retrospective analysis of medical records linked to administrative claims from a large health plan. We examined whether agreement varied by factors that facilitate tracking in claims (coding and cost) and that enhance medical record completeness (records from multiple providers). Subjects Women (35 – 65 years) with incident breast cancer diagnosed in 2006–2007 (n=775). Measures Use of human epidermal growth factor receptor 2 (HER2) and gene expression profiling (GEP) testing, trastuzumab and adjuvant chemotherapy in claims and medical records. Results Agreement between claims and records was substantial for GEP, trastuzumab, and chemotherapy, and lowest for HER2 tests. GEP, an expensive test with unique billing codes, had higher agreement (91.6% vs. 75.2%), sensitivity (94.9% vs. 76.7%), and specificity (90.1% vs. 29.2%) than HER2, a test without unique billing codes. Trastuzumab, a treatment with unique billing codes, had slightly higher agreement (95.1% vs. 90%) and sensitivity (98.1% vs. 87.9%) than adjuvant chemotherapy. Conclusions Higher agreement and specificity were associated with services that had unique billing codes and high cost. Administrative claims may be sufficient for examining services with unique billing codes. Medical records provide better data for identifying tests lacking specific codes and for research requiring detailed clinical information. PMID:21422962
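As a worked illustration of the agreement, sensitivity, and specificity comparisons reported above, the short function below computes the three measures from a 2x2 cross-classification of claims against the medical record (taken as the reference). The counts in the example call are made up for illustration and are not the study's data.

    def claims_vs_record(tp, fp, fn, tn):
        # tp: use in both sources; fn: in record only; fp: in claims only; tn: in neither.
        total = tp + fp + fn + tn
        agreement = (tp + tn) / total        # both sources concur
        sensitivity = tp / (tp + fn)         # claims capture record-documented use
        specificity = tn / (tn + fp)         # claims correctly show non-use
        return agreement, sensitivity, specificity

    # Illustrative counts only.
    print(claims_vs_record(tp=150, fp=20, fn=8, tn=600))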
Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murata, K.K.; Williams, D.C.; Griffith, R.O.
1997-12-01
The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.
Pharmer: efficient and exact pharmacophore search.
Koes, David Ryan; Camacho, Carlos J
2011-06-27
Pharmacophore search is a key component of many drug discovery efforts. Pharmer is a new computational approach to pharmacophore search that scales with the breadth and complexity of the query, not the size of the compound library being screened. Two novel methods for organizing pharmacophore data, the Pharmer KDB-tree and Bloom fingerprints, enable Pharmer to perform an exact pharmacophore search of almost two million structures in less than a minute. In general, Pharmer is more than an order of magnitude faster than existing technologies. The complete source code is available under an open-source license at http://pharmer.sourceforge.net .
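One of the two data structures named above, the Bloom fingerprint, rests on a standard idea: hash each indexed feature into several bit positions, so a candidate can be rejected immediately if any bit required by the query is unset. The generic sketch below illustrates that screening idea; it is not Pharmer's actual implementation, and the feature strings are invented.

    import hashlib

    class BloomFingerprint:
        """Generic Bloom-filter fingerprint (illustrative, not Pharmer's layout)."""

        def __init__(self, n_bits=1024, n_hashes=4):
            self.n_bits, self.n_hashes, self.bits = n_bits, n_hashes, 0

        def _positions(self, item):
            for i in range(self.n_hashes):
                digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
                yield int(digest, 16) % self.n_bits

        def add(self, item):
            for p in self._positions(item):
                self.bits |= 1 << p

        def may_contain_all(self, items):
            # False means at least one queried feature is definitely absent.
            return all((self.bits >> p) & 1 for item in items for p in self._positions(item))

    fp = BloomFingerprint()
    for feature in ["aromatic@(1.2,0.0,3.4)", "donor@(2.0,1.1,0.3)"]:   # invented feature keys
        fp.add(feature)
    print(fp.may_contain_all(["aromatic@(1.2,0.0,3.4)"]))               # True: cannot be ruled out
    print(fp.may_contain_all(["acceptor@(9.9,9.9,9.9)"]))               # almost surely False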
Residential Segregation and the Availability of Primary Care Physicians
Gaskin, Darrell J; Dinwiddie, Gniesha Y; Chan, Kitty S; McCleary, Rachael R
2012-01-01
Objective To examine the association between residential segregation and geographic access to primary care physicians (PCPs) in metropolitan statistical areas (MSAs). Data Sources We combined zip code level data on primary care physicians from the 2006 American Medical Association master file with demographic, socioeconomic, and segregation measures from the 2000 U.S. Census. Our sample consisted of 15,465 zip codes located completely or partially in an MSA. Methods We defined PCP shortage areas as those zip codes with no PCP or a population to PCP ratio of >3,500. Using logistic regressions, we estimated the association between a zip code's odds of being a PCP shortage area and its minority composition and degree of segregation in its MSA. Principal Findings We found that odds of being a PCP shortage area were 67 percent higher for majority African American zip codes but 27 percent lower for majority Hispanic zip codes. The association varied with the degree of segregation. As the degree of segregation increased, the odds of being a PCP shortage area increased for majority African American zip codes; however, the converse was true for majority Hispanic and Asian zip codes. Conclusions Efforts to address PCP shortages should target African American communities especially in segregated MSAs. PMID:22524264
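The core estimation step described above can be sketched as a logistic regression of a binary shortage indicator on zip-code composition and segregation measures, with exponentiated coefficients read as odds ratios. The synthetic data below are assumptions used only to show the mechanics; they are not the study's Census/AMA data.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 500
    # Columns: majority-African-American indicator, majority-Hispanic indicator, MSA segregation index.
    X = np.column_stack([rng.integers(0, 2, n), rng.integers(0, 2, n), rng.random(n)])
    logit = -1.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.8 * X[:, 2]
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)   # 1 = PCP shortage area

    model = LogisticRegression().fit(X, y)
    odds_ratios = np.exp(model.coef_[0])                             # OR > 1 raises shortage odds
    print(dict(zip(["majority_black", "majority_hispanic", "segregation"], odds_ratios)))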
Measuring Diagnoses: ICD Code Accuracy
O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M
2005-01-01
Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999
Quality of narrative operative reports in pancreatic surgery
Wiebe, Meagan E.; Sandhu, Lakhbir; Takata, Julie L.; Kennedy, Erin D.; Baxter, Nancy N.; Gagliardi, Anna R.; Urbach, David R.; Wei, Alice C.
2013-01-01
Background Quality in health care can be evaluated using quality indicators (QIs). Elements contained in the surgical operative report are potential sources for QI data, but little is known about the completeness of the narrative operative report (NR). We evaluated the completeness of the NR for patients undergoing a pancreaticoduodenectomy. Methods We reviewed NRs for patients undergoing a pancreaticoduodenectomy over a 1-year period. We extracted 79 variables related to patient and narrator characteristics, process of care measures, surgical technique and oncology-related outcomes by document analysis. Data were coded and evaluated for completeness. Results We analyzed 74 NRs. The median number of variables reported was 43.5 (range 13–54). Variables related to surgical technique were most complete. Process of care and oncology-related variables were often omitted. Completeness of the NR was associated with longer operative duration. Conclusion The NRs were often incomplete and of poor quality. Important elements, including process of care and oncology-related data, were frequently missing. Thus, the NR is an inadequate data source for QI. Development and use of alternative reporting methods, including standardized synoptic operative reports, should be encouraged to improve documentation of care and serve as a measure of quality of surgical care. PMID:24067527
Quality of narrative operative reports in pancreatic surgery.
Wiebe, Meagan E; Sandhu, Lakhbir; Takata, Julie L; Kennedy, Erin D; Baxter, Nancy N; Gagliardi, Anna R; Urbach, David R; Wei, Alice C
2013-10-01
Quality in health care can be evaluated using quality indicators (QIs). Elements contained in the surgical operative report are potential sources for QI data, but little is known about the completeness of the narrative operative report (NR). We evaluated the completeness of the NR for patients undergoing a pancreaticoduodenectomy. We reviewed NRs for patients undergoing a pancreaticoduodenectomy over a 1-year period. We extracted 79 variables related to patient and narrator characteristics, process of care measures, surgical technique and oncology-related outcomes by document analysis. Data were coded and evaluated for completeness. We analyzed 74 NRs. The median number of variables reported was 43.5 (range 13-54). Variables related to surgical technique were most complete. Process of care and oncology-related variables were often omitted. Completeness of the NR was associated with longer operative duration. The NRs were often incomplete and of poor quality. Important elements, including process of care and oncology-related data, were frequently missing. Thus, the NR is an inadequate data source for QI. Development and use of alternative reporting methods, including standardized synoptic operative reports, should be encouraged to improve documentation of care and serve as a measure of quality of surgical care.
The sound of moving bodies. Ph.D. Thesis - Cambridge Univ.
NASA Technical Reports Server (NTRS)
Brentner, Kenneth Steven
1990-01-01
The importance of the quadrupole source term in the Ffowcs Williams-Hawkings (FWH) equation was addressed. The quadrupole source contains fundamental components of the complete fluid mechanics problem, which are ignored only at the risk of error. The results made it clear that any application of the acoustic analogy should begin with all of the source terms in the FWH theory. The direct calculation of the acoustic field as part of the complete unsteady fluid mechanics problem using CFD is considered. It was shown that aeroacoustic calculations can indeed be made with CFD codes. The results indicate that the acoustic field is the component of the computation most susceptible to numerical error. Therefore, the ability to measure the damping of acoustic waves is absolutely essential to the development of acoustic computations. Essential groundwork for a new approach to the problem of sound generation by moving bodies is presented. This new computational acoustic approach holds the promise of solving many problems hitherto pushed aside.
Pneumatically Modulated Liquid Delivery System for Nebulizers
2011-12-02
APPENDIX A: Complete Parts List. APPENDIX B: Source code for the Arduino Uno microcontroller (CD). ... implemented. The Arduino Uno is a well-established hobbyist microcontroller, focused on ease of use and teaching non-computer-programmers about embedded ... circuits. The Arduino Uno uses an Atmega328 microcontroller with thirteen digital TTL control lines, six 10-bit resolution 0-5 V analog inputs, TTL
2013-01-01
Background The objective was to examine feasibility of using hospital discharge register data for studying fire-related injuries. Methods The Finnish National Hospital Discharge Register (FHDR) was the database used to select relevant hospital discharge data to study usability and data quality issues. Patterns of E-coding were assessed, as well as prominent challenges in defining the incidence of injuries. Additionally, the issue of defining the relevant amount of hospital days accounted for in injury care was considered. Results Directly after the introduction of the ICD-10 classification system, in 1996, the completeness of E-coding was found to be poor, but to have improved dramatically around 2000 and thereafter. The scale of the challenges to defining the incidence of injuries was found to be manageable. In counting the relevant hospital days, psychiatric and long-term care were found to be the obvious and possible sources of overestimation. Conclusions The FHDR was found to be a feasible data source for studying fire-related injuries so long as potential challenges are acknowledged and taken into account. Hospital discharge data can be a unique and powerful means for injury research as issues of representativeness and coverage of traditional probability samples can frequently be completely avoided. PMID:23496937
PLUMED 2: New feathers for an old bird
NASA Astrophysics Data System (ADS)
Tribello, Gareth A.; Bonomi, Massimiliano; Branduardi, Davide; Camilloni, Carlo; Bussi, Giovanni
2014-02-01
Enhancing sampling and analyzing simulations are central issues in molecular simulation. Recently, we introduced PLUMED, an open-source plug-in that provides some of the most popular molecular dynamics (MD) codes with implementations of a variety of different enhanced sampling algorithms and collective variables (CVs). The rapid changes in this field, in particular new directions in enhanced sampling and dimensionality reduction together with new hardware, require a code that is more flexible and more efficient. We therefore present PLUMED 2 here—a complete rewrite of the code in an object-oriented programming language (C++). This new version introduces greater flexibility and greater modularity, which both extends its core capabilities and makes it far easier to add new methods and CVs. It also has a simpler interface with the MD engines and provides a single software library containing both tools and core facilities. Ultimately, the new code better serves the ever-growing community of users and contributors in coping with the new challenges arising in the field.
An experimental MOSFET approach to characterize (192)Ir HDR source anisotropy.
Toye, W C; Das, K R; Todd, S P; Kenny, M B; Franich, R D; Johnston, P N
2007-09-07
The dose anisotropy around a (192)Ir HDR source in a water phantom has been measured using MOSFETs as relative dosimeters. In addition, modeling using the EGSnrc code has been performed to provide a complete dose distribution consistent with the MOSFET measurements. Doses around the Nucletron 'classic' (192)Ir HDR source were measured for a range of radial distances from 5 to 30 mm within a 40 x 30 x 30 cm(3) water phantom, using a TN-RD-50 MOSFET dosimetry system with an active area of 0.2 mm by 0.2 mm. For each successive measurement a linear stepper capable of movement in intervals of 0.0125 mm re-positioned the MOSFET at the required radial distance, while a rotational stepper enabled angular displacement of the source at intervals of 0.9 degrees . The source-dosimeter arrangement within the water phantom was modeled using the standardized cylindrical geometry of the DOSRZnrc user code. In general, the measured relative anisotropy at each radial distance from 5 mm to 30 mm is in good agreement with the EGSnrc simulations, benchmark Monte Carlo simulation and TLD measurements where they exist. The experimental approach employing a MOSFET detection system of small size, high spatial resolution and fast read out capability allowed a practical approach to the determination of dose anisotropy around a HDR source.
Operational rate-distortion performance for joint source and channel coding of images.
Ruf, M J; Modestino, J W
1999-01-01
This paper describes a methodology for evaluating the operational rate-distortion behavior of combined source and channel coding schemes with particular application to images. In particular, we demonstrate use of the operational rate-distortion function to obtain the optimum tradeoff between source coding accuracy and channel error protection under the constraint of a fixed transmission bandwidth for the investigated transmission schemes. Furthermore, we develop information-theoretic bounds on performance for specific source and channel coding systems and demonstrate that our combined source-channel coding methodology applied to different schemes results in operational rate-distortion performance which closely approach these theoretical limits. We concentrate specifically on a wavelet-based subband source coding scheme and the use of binary rate-compatible punctured convolutional (RCPC) codes for transmission over the additive white Gaussian noise (AWGN) channel. Explicit results for real-world images demonstrate the efficacy of this approach.
The diagnosis related groups enhanced electronic medical record.
Müller, Marcel Lucas; Bürkle, Thomas; Irps, Sebastian; Roeder, Norbert; Prokosch, Hans-Ulrich
2003-07-01
The introduction of Diagnosis Related Groups as a basis for hospital payment in Germany announced essential changes in the hospital reimbursement practice. A hospital's economical survival will depend vitally on the accuracy and completeness of the documentation of DRG relevant data like diagnosis and procedure codes. In order to enhance physicians' coding compliance, an easy-to-use interface integrating coding tasks seamlessly into clinical routine had to be developed. A generic approach should access coding and clinical guidelines from different information sources. Within the Electronic Medical Record (EMR) a user interface ('DRG Control Center') for all DRG relevant clinical and administrative data has been built. A comprehensive DRG-related web site gives online access to DRG grouping software and an electronic coding expert. Both components are linked together using an application supporting bi-directional communication. Other web based services like a guideline search engine can be integrated as well. With the proposed method, the clinician gains quick access to context sensitive clinical guidelines for appropriate treatment of his/her patient and administrative guidelines for the adequate coding of the diagnoses and procedures. This paper describes the design and current implementation and discusses our experiences.
Continuation of research into language concepts for the mission support environment: Source code
NASA Technical Reports Server (NTRS)
Barton, Timothy J.; Ratner, Jeremiah M.
1991-01-01
Research into language concepts for the Mission Control Center is presented, together with the accompanying source code. The file contains the routines that allow source code files to be created and compiled. The build process assumes that all elements and the COMP exist in the current directory, and it places as much code generation as possible on the preprocessor. A summary is given of the source files as used and/or manipulated by the build routine.
Measuring diagnoses: ICD code accuracy.
O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M
2005-10-01
To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.
Martins, Renata Cristófani; Buchalla, Cassia Maria
2015-01-01
To prepare a dictionary in Portuguese for use in Iris and to evaluate its completeness for coding causes of death. Initially, a dictionary with all illnesses and injuries was created based on the International Classification of Diseases - tenth revision (ICD-10) codes. This dictionary was based on two sources: the electronic file of ICD-10 volume 1 and data from the Thesaurus of the International Classification of Primary Care (ICPC-2). Then, a sample of death certificates from the Program of Improvement of Mortality Information in São Paulo (PRO-AIM) was coded manually and by Iris version V4.0.34, and the causes of death were compared. Whenever Iris was not able to code the causes of death, adjustments were made to the dictionary. Iris was able to code all causes of death in 94.4% of the death certificates, but only 50.6% were coded directly, without adjustments. Among the death certificates that the software was unable to fully code, 89.2% had a diagnosis of external causes (chapter XX of ICD-10). This group of causes of death showed less agreement when comparing the coding by Iris to the manual one. The software performed well, but it needs adjustments and improvements to its dictionary. In upcoming versions of the software, its developers are trying to solve the problem with external causes of death.
Complete mitochondrial genome of the Yellownose skate: Zearaja chilensis (Rajiformes, Rajidae).
Jeong, Dageum; Lee, Youn-Ho
2016-01-01
The complete sequence of the mitochondrial DNA of a Yellownose skate, Zearaja chilensis, was determined for the first time. It is 16,909 bp in length, covering 2 rRNA, 22 tRNA and 13 protein-coding genes with the same gene order and structure as those of other Rajidae species. The L-strand has a low G content (14.3%) and a slightly high A + T content (58.9%). A strong codon usage bias against G (6.0%) is found at third codon positions. Twelve of the 13 protein-coding genes use ATG as the start codon, while COX1 starts with GTG. As for stop codons, only ND4 shows an incomplete stop codon TA. This is the first report of the mitogenome for a species in the genus Zearaja, providing a valuable source of genetic information on the evolution of the family Rajidae and the genus Zearaja, as well as for the establishment of a sustainable fishery management plan for the species.
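The composition and codon-position figures quoted above come from simple counting over the sequence; a minimal sketch of that bookkeeping is shown below. The toy sequence is an assumption for illustration, not the Z. chilensis mitogenome.

    from collections import Counter

    def base_composition(seq):
        # Fraction of each base and the combined A+T fraction of a sequence.
        counts = Counter(seq.upper())
        total = sum(counts[b] for b in "ACGT")
        fractions = {b: counts[b] / total for b in "ACGT"}
        return fractions, fractions["A"] + fractions["T"]

    def third_position_usage(cds):
        # Base usage at third codon positions of an in-frame coding sequence.
        thirds = [cds[i + 2] for i in range(0, len(cds) - len(cds) % 3, 3)]
        counts = Counter(thirds)
        return {b: counts[b] / len(thirds) for b in "ACGT"}

    toy_cds = "ATGACCTTAGGTTGA"    # toy in-frame sequence (starts with ATG, ends with a stop)
    print(base_composition(toy_cds))
    print(third_position_usage(toy_cds))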
Dynamical beam manipulation based on 2-bit digitally-controlled coding metasurface.
Huang, Cheng; Sun, Bo; Pan, Wenbo; Cui, Jianhua; Wu, Xiaoyu; Luo, Xiangang
2017-02-08
Recently, a concept of digital metamaterials has been proposed to manipulate field distribution through proper spatial mixtures of digital metamaterial bits. Here, we present a design of 2-bit digitally-controlled coding metasurface that can effectively modulate the scattered electromagnetic wave and realize different far-field beams. Each meta-atom of this metasurface integrates two pin diodes, and by tuning their operating states, the metasurface has four phase responses of 0, π/2, π, and 3π/2, corresponding to four basic digital elements "00", "01", "10", and "11", respectively. By designing the coding sequence of the above digital element array, the reflected beam can be arbitrarily controlled. The proposed 2-bit digital metasurface has been demonstrated to possess capability of achieving beam deflection, multi-beam and beam diffusion, and the dynamical switching of these different scattering patterns is completed by a programmable electric source.
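The beam-steering behaviour described above follows from ordinary array theory: each element contributes its geometric phase plus one of the four coded phases, so the coding sequence shapes the far-field pattern. The one-dimensional sketch below (Python/NumPy) computes the array factor of a coding sequence; the element spacing and the progressive "00 01 10 11" sequence are illustrative assumptions, not the fabricated metasurface.

    import numpy as np

    PHASES = {"00": 0.0, "01": np.pi / 2, "10": np.pi, "11": 3 * np.pi / 2}

    def array_factor(coding, d_over_lambda=0.5, n_angles=721):
        # Far-field magnitude of a 1-D row of elements carrying the coded reflection phases.
        phases = np.array([PHASES[c] for c in coding])
        n = np.arange(len(coding))
        theta = np.linspace(-np.pi / 2, np.pi / 2, n_angles)
        geometric = 2 * np.pi * d_over_lambda * np.outer(np.sin(theta), n)
        af = np.exp(1j * (geometric + phases)).sum(axis=1)
        return theta, np.abs(af)

    # A progressive phase gradient ("00", "01", "10", "11", ...) deflects the main beam.
    theta, af = array_factor(["00", "01", "10", "11"] * 4)
    print(round(float(np.degrees(theta[np.argmax(af)])), 1))   # roughly -30 degrees for this gradient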
Some practical universal noiseless coding techniques, part 2
NASA Technical Reports Server (NTRS)
Rice, R. F.; Lee, J. J.
1983-01-01
This report is an extension of earlier work (Part 1) which provided practical adaptive techniques for the efficient noiseless coding of a broad class of data sources characterized by only partially known and varying statistics (JPL Publication 79-22). The results here, while still claiming such general applicability, focus primarily on the noiseless coding of image data. A fairly complete and self-contained treatment is provided. Particular emphasis is given to the requirements of the forthcoming Voyager II encounters of Uranus and Neptune. Performance evaluations are supported both graphically and pictorially. Expanded definitions of the algorithms in Part 1 yield a computationally improved set of options for applications requiring efficient performance at entropies above 4 bits/sample. These expanded definitions include, as an important subset, a somewhat less efficient but extremely simple "FAST" compressor which will be used at the Voyager Uranus encounter. Additionally, options are provided which enhance performance when atypical data spikes may be present.
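The options described in this line of work build on the basic Rice code: a non-negative sample is split into a unary-coded quotient and k low-order remainder bits. The minimal single-parameter encoder/decoder below (Python) is a simplification offered for orientation; the report's adaptive algorithms select among such options on the fly and include further variants.

    def rice_encode(value, k):
        # Unary quotient (ones terminated by a zero), then k remainder bits.
        q, r = value >> k, value & ((1 << k) - 1)
        return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

    def rice_decode(bits, k):
        q = bits.index("0")                                 # length of the unary prefix
        r = int(bits[q + 1:q + 1 + k], 2) if k else 0
        return (q << k) | r, bits[q + 1 + k:]               # decoded value and remaining bits

    # Small mapped residuals dominate typical image data, so short codewords dominate too.
    stream = "".join(rice_encode(v, k=2) for v in [3, 0, 5, 1])
    print(stream)                                           # "011" + "000" + "1001" + "001"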
Dynamical beam manipulation based on 2-bit digitally-controlled coding metasurface
Huang, Cheng; Sun, Bo; Pan, Wenbo; Cui, Jianhua; Wu, Xiaoyu; Luo, Xiangang
2017-01-01
Recently, a concept of digital metamaterials has been proposed to manipulate field distribution through proper spatial mixtures of digital metamaterial bits. Here, we present a design of 2-bit digitally-controlled coding metasurface that can effectively modulate the scattered electromagnetic wave and realize different far-field beams. Each meta-atom of this metasurface integrates two pin diodes, and by tuning their operating states, the metasurface has four phase responses of 0, π/2, π, and 3π/2, corresponding to four basic digital elements “00”, “01”, “10”, and “11”, respectively. By designing the coding sequence of the above digital element array, the reflected beam can be arbitrarily controlled. The proposed 2-bit digital metasurface has been demonstrated to possess capability of achieving beam deflection, multi-beam and beam diffusion, and the dynamical switching of these different scattering patterns is completed by a programmable electric source. PMID:28176870
2003-04-01
generally considered to be passive data. Instead the genetic material should be capable of being algorithmic information, that is, program code or ...
Source Code Analysis Laboratory (SCALe) for Energy Delivery Systems
2010-12-01
the software for reevaluation. Once the reevaluation process is completed, CERT provides the client a report detailing the software's conformance ... Flagged Nonconformities (FNC): Mozilla Firefox version 2.0, TP/FNC ratio 6/12 (50%); Linux kernel version 2.6.15, TP/FNC ratio 10/126 (8%); Wine ... inappropriately tuned for analysis of the Linux kernel, which has anomalous results. Customizing SCALe to work with energy system software will help
NASA Technical Reports Server (NTRS)
Martini, W. R.
1981-01-01
A series of computer programs is presented, with full documentation, that simulates the transient behavior of a modern 4-cylinder Siemens-arrangement Stirling engine with burner and air preheater. Cold start, cranking, idling, acceleration through 3 gear changes and steady speed operation are simulated. Sample results and complete operating instructions are given. A full source code listing of all programs is included.
NASA Astrophysics Data System (ADS)
Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley
2018-05-01
We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal's 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best, 90% of them were available over our testing period.
McKenzie, Kirsten; Mitchell, Rebecca; Scott, Deborah Anne; Harrison, James Edward; McClure, Roderick John
2009-08-01
To examine the reliability of work-related activity coding for injury-related hospitalisations in Australia. A random sample of 4,373 injury-related hospital separations from 1 July 2002 to 30 June 2004 were obtained from a stratified random sample of 50 hospitals across four states in Australia. From this sample, cases were identified as work-related if they contained an ICD-10-AM work-related activity code (U73) allocated by either: (i) the original coder; (ii) an independent auditor, blinded to the original code; or (iii) a research assistant, blinded to both the original and auditor codes, who reviewed narrative text extracted from the medical record. The concordance of activity coding and number of cases identified as work-related using each method were compared. Of the 4,373 cases sampled, 318 cases were identified as being work-related using any of the three methods for identification. The original coder identified 217 and the auditor identified 266 work-related cases (68.2% and 83.6% of the total cases identified, respectively). Around 10% of cases were only identified through the text description review. The original coder and auditor agreed on the assignment of work-relatedness for 68.9% of cases. The best estimates of the frequency of hospital admissions for occupational injury underestimate the burden by around 32%. This is a substantial underestimate that has major implications for public policy, and highlights the need for further work on improving the quality and completeness of routine, administrative data sources for a more complete identification of work-related injuries.
You've Written a Cool Astronomy Code! Now What Do You Do with It?
NASA Astrophysics Data System (ADS)
Allen, Alice; Accomazzi, A.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P. J.; Wallin, J. F.
2014-01-01
Now that you've written a useful astronomy code for your soon-to-be-published research, you have to figure out what you want to do with it. Our suggestion? Share it! This presentation highlights the means and benefits of sharing your code. Make your code citable -- submit it to the Astrophysics Source Code Library and have it indexed by ADS! The Astrophysics Source Code Library (ASCL) is a free online registry of source codes of interest to astronomers and astrophysicists. With over 700 codes, it is continuing its rapid growth, with an average of 17 new codes a month. The editors seek out codes for inclusion; indexing by ADS improves the discoverability of codes and provides a way to cite codes as separate entries, especially codes without papers that describe them.
Finite elements numerical codes as primary tool to improve beam optics in NIO1
NASA Astrophysics Data System (ADS)
Baltador, C.; Cavenago, M.; Veltri, P.; Serianni, G.
2017-08-01
The RF negative ion source NIO1, built at Consorzio RFX in Padua (Italy), is aimed at investigating general issues of ion source physics in view of the full-size ITER injector MITICA, as well as DEMO-relevant solutions, like energy recovery and alternative neutralization systems, crucial for neutral beam injectors in future fusion experiments. NIO1 has been designed to produce 9 H- beamlets (in a 3x3 pattern) of 15 mA each at 60 keV, using a three-electrode system downstream of the plasma source. At the moment the source is at an early operational stage and only operation at low power and low beam energy is possible. In particular, NIO1 has a too-strong set of SmCo co-extraction electron suppression magnets (CESM) in the extraction grid (EG), which will be replaced by a weaker set of ferrite magnets. A completely new set of magnets will also be designed and mounted on the new EG that will be installed next year, replacing the present one. In this paper, the finite element code OPERA 3D is used to investigate the effects of the three sets of magnets on beamlet optics. A comparison of numerical results with measurements will be provided where possible.
Ultra-high-energy cosmic rays from low-luminosity active galactic nuclei
NASA Astrophysics Data System (ADS)
Duţan, Ioana; Caramete, Laurenţiu I.
2015-03-01
We investigate the production of ultra-high-energy cosmic rays (UHECRs) in relativistic jets from low-luminosity active galactic nuclei (LLAGN). We start by proposing a model for the UHECR contribution from the black holes (BHs) in LLAGN, which present a jet power P_j ≤ 10^46 erg s^-1. This is in contrast to the opinion that only high-luminosity AGN can accelerate particles to energies ≥ 50 EeV. We rewrite the equations which describe the synchrotron self-absorbed emission of a non-thermal particle distribution to obtain the observed radio flux density from sources with a flat-spectrum core and its relationship to the jet power. We found that the UHECR flux is dependent on the observed radio flux density, the distance to the AGN, and the BH mass, where the particle acceleration regions can be sustained by the magnetic energy extraction from the BH at the center of the AGN. We use a complete sample of 29 radio sources with a total flux density at 5 GHz greater than 0.5 Jy to make predictions for the maximum particle energy, luminosity, and flux of the UHECRs from nearby AGN. These predictions are then used in a semi-analytical code developed in Mathematica (SAM code) as inputs for the Monte-Carlo simulations to obtain the distribution of arrival directions at the Earth and the energy spectrum of the UHECRs, taking into account their deflection in the intergalactic magnetic fields. For comparison, we also use the CRPropa code with the same initial conditions as for the SAM code. Importantly, to calculate the energy spectrum we also include the weighting of the UHECR flux for each UHECR source. Next, we compare the energy spectrum of the UHECRs with that obtained by the Pierre Auger Observatory.
The random energy model in a magnetic field and joint source channel coding
NASA Astrophysics Data System (ADS)
Merhav, Neri
2008-09-01
We demonstrate that there is an intimate relationship between the magnetic properties of Derrida’s random energy model (REM) of spin glasses and the problem of joint source-channel coding in Information Theory. In particular, typical patterns of erroneously decoded messages in the coding problem have “magnetization” properties that are analogous to those of the REM in certain phases, where the non-uniformity of the distribution of the source in the coding problem plays the role of an external magnetic field applied to the REM. We also relate the ensemble performance (random coding exponents) of joint source-channel codes to the free energy of the REM in its different phases.
Frustrations among graduates of athletic training education programs.
Bowman, Thomas G; Dodge, Thomas M
2013-01-01
Although previous researchers have begun to identify sources of athletic training student stress, the specific reasons for student frustrations are not yet fully understood. It is important for athletic training administrators to understand sources of student frustration to provide a supportive learning environment. To determine the factors that lead to feelings of frustration while completing a professional athletic training education program (ATEP). Qualitative study. National Athletic Trainers' Association (NATA) accredited postprofessional education program. Fourteen successful graduates (12 women, 2 men) of accredited professional undergraduate ATEPs enrolled in an NATA-accredited postprofessional education program. We conducted semistructured interviews and analyzed data with a grounded theory approach using open, axial, and selective coding procedures. We negotiated over the coding scheme and performed peer debriefings and member checks to ensure trustworthiness of the results. Four themes emerged from the data: (1) Athletic training student frustrations appear to stem from the amount of stress involved in completing an ATEP, leading to anxiety and feelings of being overwhelmed. (2) The interactions students have with classmates, faculty, and preceptors can also be a source of frustration for athletic training students. (3) Monotonous clinical experiences often left students feeling disengaged. (4) Students questioned entering the athletic training profession because of the fear of work-life balance problems and low compensation. In order to reduce frustration, athletic training education programs should validate students' decisions to pursue athletic training and validate their contributions to the ATEP; provide clinical education experiences with graded autonomy; encourage positive personal interactions between students, faculty, and preceptors; and successfully model the benefits of a career in athletic training.
Update of GRASP/Ada reverse engineering tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1993-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional pretty printed Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype CSD generator (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Windows System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, two update phases were completed. Update'92 focused on the initial analysis of evaluation data collected from software engineering students at Auburn University and the addition of significant enhancements to the user interface. Update'93 (the current update) focused on the statistical analysis of the data collected in the previous update and preparation of Version 3.4 of the prototype for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical application. An overview of the GRASP/Ada project with an emphasis on the current update is provided.
Design of the low energy beam transport line for the China spallation neutron source
NASA Astrophysics Data System (ADS)
Li, Jin-Hai; Ouyang, Hua-Fu; Fu, Shi-Nian; Zhang, Hua-Shun; He, Wei
2008-03-01
The design of the China Spallation Neutron Source (CSNS) low-energy beam transport (LEBT) line, which is located between the ion source and the radio-frequency quadrupole (RFQ), has been completed with the TRACE3D code. The design aims at perfect matching, primary chopping, a small emittance growth and sufficient space for beam diagnostics. The line consists of three solenoids, three vacuum chambers, two steering magnets and a pre-chopper. The total length of the LEBT is about 1.74 m. This LEBT is designed to transfer a 20 mA pulsed H- beam from the ion source to the RFQ. An induction cavity is adopted as the pre-chopper. The electrostatic octupole steerer is discussed as a candidate.
2014-06-01
User Manual and Source Code for a LAMMPS Implementation of Constant Energy Dissipative Particle Dynamics (DPD-E), by James P. Larentzos ... Laboratory, Aberdeen Proving Ground, MD 21005-5069; report ARL-SR-290, June 2014; dates covered: September 2013-February 2014.
Analyzing and modeling gravity and magnetic anomalies using the SPHERE program and Magsat data
NASA Technical Reports Server (NTRS)
Braile, L. W.; Hinze, W. J.; Vonfrese, R. R. B. (Principal Investigator)
1981-01-01
Computer codes were completed, tested, and documented for analyzing magnetic anomaly vector components by equivalent point dipole inversion. The codes are intended for use in inverting the magnetic anomaly due to a spherical prism in a horizontal geomagnetic field and for recomputing the anomaly in a vertical geomagnetic field. Modeling of potential fields at satellite elevations that are derived from three dimensional sources by program SPHERE was made significantly more efficient by improving the input routines. A preliminary model of the Andean subduction zone was used to compute the anomaly at satellite elevations using both actual geomagnetic parameters and vertical polarization. Program SPHERE is also being used to calculate satellite level magnetic and gravity anomalies from the Amazon River Aulacogen.
pyOpenMS: a Python-based interface to the OpenMS mass-spectrometry algorithm library.
Röst, Hannes L; Schmitt, Uwe; Aebersold, Ruedi; Malmström, Lars
2014-01-01
pyOpenMS is an open-source, Python-based interface to the C++ OpenMS library, providing facile access to a feature-rich, open-source algorithm library for MS-based proteomics analysis. It contains Python bindings that allow raw access to the data structures and algorithms implemented in OpenMS, specifically those for file access (mzXML, mzML, TraML, mzIdentML among others), basic signal processing (smoothing, filtering, de-isotoping, and peak-picking) and complex data analysis (including label-free, SILAC, iTRAQ, and SWATH analysis tools). pyOpenMS thus allows fast prototyping and efficient workflow development in a fully interactive manner (using the interactive Python interpreter) and is also ideally suited for researchers not proficient in C++. In addition, our code to wrap a complex C++ library is completely open-source, allowing other projects to create similar bindings with ease. The pyOpenMS framework is freely available at https://pypi.python.org/pypi/pyopenms while the autowrap tool to create Cython code automatically is available at https://pypi.python.org/pypi/autowrap (both released under the 3-clause BSD licence). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
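As an illustration of the kind of scripting the abstract describes, the following minimal sketch loads an mzML file, smooths the profile spectra, and picks peaks. The file name is hypothetical, and the exact class and method names may vary slightly between pyOpenMS releases.

import pyopenms

exp = pyopenms.MSExperiment()
pyopenms.MzMLFile().load("sample.mzML", exp)        # hypothetical input file

gauss = pyopenms.GaussFilter()                      # smooth the raw profile spectra
gauss.filterExperiment(exp)

picked = pyopenms.MSExperiment()
pyopenms.PeakPickerHiRes().pickExperiment(exp, picked)   # centroid the smoothed data

for spectrum in picked:
    mz, intensity = spectrum.get_peaks()
    print(spectrum.getRT(), len(mz))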
Astronomy education and the Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, Alice; Nemiroff, Robert J.
2016-01-01
The Astrophysics Source Code Library (ASCL) is an online registry of source codes used in refereed astrophysics research. It currently lists nearly 1,200 codes and covers all aspects of computational astrophysics. How can this resource be of use to educators and to the graduate students they mentor? The ASCL serves as a discovery tool for codes that can be used for one's own research. Graduate students can also investigate existing codes to see how common astronomical problems are approached numerically in practice, and use these codes as benchmarks for their own solutions to these problems. Further, they can deepen their knowledge of software practices and techniques through examination of others' codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dixon, David A.; Hughes, Henry Grady
In this paper, we expand on previous validation work by Dixon and Hughes. That is, we present a more complete suite of validation results with respect to the well-known Lockwood energy deposition experiment. Lockwood et al. measured energy deposition in materials including beryllium, carbon, aluminum, iron, copper, molybdenum, tantalum, and uranium, for both single- and multi-layer 1-D geometries. Source configurations included mono-energetic, mono-directional electron beams with energies of 0.05-MeV, 0.1-MeV, 0.3-MeV, 0.5-MeV, and 1-MeV, in both normal and off-normal angles of incidence. These experiments are particularly valuable for validating electron transport codes, because they are closely represented by simulating pencil beams incident on 1-D semi-infinite slabs with and without material interfaces. Herein, we include total energy deposition and energy deposition profiles for the single-layer experiments reported by Lockwood et al. (a more complete multi-layer validation will follow in another report).
NASA Astrophysics Data System (ADS)
Ofek, Eran O.; Zackay, Barak
2018-04-01
Detection of templates (e.g., sources) embedded in low-number count Poisson noise is a common problem in astrophysics. Examples include source detection in X-ray images, γ-rays, UV, neutrinos, and search for clusters of galaxies and stellar streams. However, the solutions in the X-ray-related literature are sub-optimal in some cases by considerable factors. Using the lemma of Neyman–Pearson, we derive the optimal statistics for template detection in the presence of Poisson noise. We demonstrate that, for known template shape (e.g., point sources), this method provides higher completeness, for a fixed false-alarm probability value, compared with filtering the image with the point-spread function (PSF). In turn, we find that filtering by the PSF is better than filtering the image using the Mexican-hat wavelet (used by wavdetect). For some background levels, our method improves the sensitivity of source detection by more than a factor of two over the popular Mexican-hat wavelet filtering. This filtering technique can also be used for fast PSF photometry and flare detection; it is efficient and straightforward to implement. We provide an implementation in MATLAB. The development of a complete code that works on real data, including the complexities of background subtraction and PSF variations, is deferred for future publication.
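For illustration only (written from the generic Poisson likelihood-ratio form, not copied from the paper), a minimal sketch of the detection statistic that such a Neyman–Pearson argument yields for a known PSF template over a flat background; the flux, background level, and stamp size are made-up values.

import numpy as np

def poisson_llr_map(counts, psf, background, flux):
    # Per-pixel log-likelihood ratio of "background + flux*psf" versus "background only";
    # summing it over a PSF-sized stamp gives the statistic that is compared to a threshold.
    return counts * np.log1p(flux * psf / background) - flux * psf

# Toy 7x7 stamp: Gaussian PSF, flat background of 0.2 counts/pixel, trial flux of 20 counts.
y, x = np.mgrid[-3:4, -3:4]
psf = np.exp(-(x**2 + y**2) / (2 * 1.2**2))
psf /= psf.sum()
rng = np.random.default_rng(1)
counts = rng.poisson(0.2 + 20 * psf)
stat = poisson_llr_map(counts, psf, background=0.2, flux=20).sum()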
Data processing with microcode designed with source coding
McCoy, James A; Morrison, Steven E
2013-05-07
Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.
NASA Astrophysics Data System (ADS)
Panuzzo, P.; Li, J.; Caux, E.
2012-09-01
The Herschel Interactive Processing Environment (HIPE) was developed by the European Space Agency (ESA) in collaboration with NASA and the Herschel Instrument Control Centres, to provide the astronomical community with a complete environment to process and analyze the data gathered by the Herschel Space Observatory. One of the most important components of HIPE is the plotting system (named PlotXY) that we present here. With PlotXY it is easy to produce high-quality, publication-ready 2D plots. It provides a long list of features, with fully configurable components, and interactive zooming. The entire code of HIPE is written in Java and is open source, released under the GNU Lesser General Public License version 3. A new version of PlotXY is being developed to be independent of the HIPE code base; it is available to the software development community for inclusion in other projects at the URL http://code.google.com/p/jplot2d/.
The Sensitivity of Coded Mask Telescopes
NASA Technical Reports Server (NTRS)
Skinner, Gerald K.
2008-01-01
Simple formulae are often used to estimate the sensitivity of coded mask X-ray or gamma-ray telescopes, but these are strictly only applicable if a number of basic assumptions are met. Complications arise, for example, if a grid structure is used to support the mask elements, if the detector spatial resolution is not good enough to completely resolve all the detail in the shadow of the mask or if any of a number of other simplifying conditions are not fulfilled. We derive more general expressions for the Poisson-noise-limited sensitivity of astronomical telescopes using the coded mask technique, noting explicitly in what circumstances they are applicable. The emphasis is on using nomenclature and techniques that result in simple and revealing results. Where no convenient expression is available a procedure is given which allows the calculation of the sensitivity. We consider certain aspects of the optimisation of the design of a coded mask telescope and show that when the detector spatial resolution and the mask to detector separation are fixed, the best source location accuracy is obtained when the mask elements are equal in size to the detector pixels.
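As a purely illustrative example of the kind of "simple formula" the paper generalizes, the sketch below evaluates one idealized estimate (an assumption here, not the paper's result): source counts transmitted through a mask of open fraction f over the Poisson noise of all counts reaching the detector. The rates and exposure time are arbitrary.

import numpy as np

def point_source_snr(s, b, f=0.5, T=1.0e4):
    # Idealized estimate only: source counts through the mask over the Poisson noise
    # of all detected counts, for open fraction f and observation time T.
    return f * s * T / np.sqrt((f * s + b) * T)

print(point_source_snr(s=0.05, b=10.0))   # illustrative rates in counts/s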
Modeling activities on the negative-ion-based Neutral Beam Injectors of the Large Helical Device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agostinetti, P.; Antoni, V.; Chitarin, G.
2011-09-26
At the National Institute for Fusion Science (NIFS) large-scale negative ion sources have been widely used for the Neutral Beam Injectors (NBIs) mounted on the Large Helical Device (LHD), which is the world-largest superconducting helical system. These injectors have achieved outstanding performances in terms of beam energy, negative-ion current and optics, and represent a reference for the development of heating and current drive NBIs for ITER. In the framework of the support activities for the ITER NBIs, the PRIMA test facility, which includes an RF-driven ion source with a 100 keV accelerator (SPIDER) and a complete 1 MeV Neutral Beam system (MITICA), is under construction at Consorzio RFX in Padova. An experimental validation of the codes has been undertaken in order to prove the accuracy of the simulations and the soundness of the SPIDER and MITICA design. To this purpose, the whole set of codes has been applied to the LHD NBIs in a joint activity between Consorzio RFX and NIFS, with the goal of comparing and benchmarking the codes with the experimental data. A description of these modeling activities and a discussion of the main results obtained are reported in this paper.
Complete genome sequence of Desulfarculus baarsii type strain (2st14T)
Sun, Hui; Spring, Stefan; Lapidus, Alla; Davenport, Karen; Del Rio, Tijana Glavina; Tice, Hope; Nolan, Matt; Copeland, Alex; Cheng, Jan-Fang; Lucas, Susan; Tapia, Roxanne; Goodwin, Lynne; Pitluck, Sam; Ivanova, Natalia; Pagani, Ionna; Mavromatis, Konstantinos; Ovchinnikova, Galina; Pati, Amrita; Chen, Amy; Palaniappan, Krishna; Hauser, Loren; Chang, Yun-Juan; Jeffries, Cynthia D.; Detter, John C.; Han, Cliff; Rohde, Manfred; Brambilla, Evelyne; Göker, Markus; Woyke, Tanja; Bristow, Jim; Eisen, Jonathan A.; Markowitz, Victor; Hugenholtz, Philip; Kyrpides, Nikos C; Klenk, Hans-Peter; Land, Miriam
2010-01-01
Desulfarculus baarsii (Widdel 1981) Kuever et al. 2006 is the type and only species of the genus Desulfarculus, which represents the family Desulfarculaceae and the order Desulfarculales. This species is a mesophilic sulfate-reducing bacterium with the capability to oxidize acetate and fatty acids of up to 18 carbon atoms completely to CO2. The acetyl-CoA/CODH (Wood-Ljungdahl) pathway is used by this species for the complete oxidation of carbon sources and autotrophic growth on formate. The type strain 2st14T was isolated from a ditch sediment collected near the University of Konstanz, Germany. This is the first completed genome sequence of a member of the order Desulfarculales. The 3,655,731 bp long single replicon genome with its 3,303 protein-coding and 52 RNA genes is a part of the Genomic Encyclopedia of Bacteria and Archaea project. PMID:21304732
Adaptive distributed source coding.
Varodayan, David; Lin, Yao-Chung; Girod, Bernd
2012-05-01
We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
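A toy sketch of the syndrome-based binning idea underlying such Slepian-Wolf coding, with brute-force decoding standing in for the sum-product algorithm and no doping bits; the parity-check matrix and bit patterns are arbitrary examples, not the paper's construction.

import numpy as np
from itertools import product

H = np.array([[1, 1, 0, 0, 0],      # toy parity-check matrix: 5 source bits are
              [0, 1, 1, 0, 1],      # described by 3 syndrome bits (the "bin" index)
              [0, 0, 0, 1, 1]])

def encode(x):
    return H @ x % 2                # transmit only the syndrome, not the source bits

def decode(syndrome, side_info):
    # Brute-force stand-in for the sum-product decoder: pick the word in the bin
    # (i.e. with the same syndrome) closest in Hamming distance to the side information.
    best, best_d = None, None
    for cand in product([0, 1], repeat=H.shape[1]):
        cand = np.array(cand)
        if np.array_equal(H @ cand % 2, syndrome):
            d = int(np.sum(cand != side_info))
            if best is None or d < best_d:
                best, best_d = cand, d
    return best

x = np.array([1, 0, 1, 1, 0])       # source seen by the encoder
y = np.array([1, 0, 1, 0, 0])       # decoder's correlated side information (one bit flipped)
print(decode(encode(x), y))         # -> [1 0 1 1 0], i.e. x is recovered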
VHDL implementation of feature-extraction algorithm for the PANDA electromagnetic calorimeter
NASA Astrophysics Data System (ADS)
Guliyev, E.; Kavatsyuk, M.; Lemmens, P. J. J.; Tambave, G.; Löhner, H.; Panda Collaboration
2012-02-01
A simple, efficient, and robust feature-extraction algorithm, developed for the digital front-end electronics of the electromagnetic calorimeter of the PANDA spectrometer at FAIR, Darmstadt, is implemented in VHDL for a commercial 16 bit 100 MHz sampling ADC. The source code is available as an open-source project and is adaptable for other projects and sampling ADCs. Best performance with different types of signal sources can be achieved through flexible parameter selection. The on-line data processing in the FPGA enables the construction of an almost dead-time-free data acquisition system, which has been successfully evaluated as a first step towards building a complete trigger-less readout chain. Prototype setups are studied to determine the dead-time of the implemented algorithm, the rate of false triggering, timing performance, and event correlations.
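By way of illustration, and not the PANDA VHDL algorithm itself, a small sketch of the kind of features such a front end extracts from a sampled pulse: a baseline from pre-trigger samples, an amplitude above baseline, and an interpolated threshold-crossing time. The trace values and parameters are invented.

import numpy as np

def pulse_features(samples, n_baseline=8, threshold_frac=0.5):
    # Baseline from pre-trigger samples, amplitude above baseline, and a
    # constant-fraction-style timestamp from linear interpolation at the threshold crossing.
    baseline = samples[:n_baseline].mean()
    pulse = samples - baseline
    amplitude = pulse.max()
    thr = threshold_frac * amplitude
    i = int(np.argmax(pulse >= thr))    # first sample at or above threshold (assumed > 0)
    t = i - 1 + (thr - pulse[i - 1]) / (pulse[i] - pulse[i - 1])
    return baseline, amplitude, t

trace = np.array([100, 101, 99, 100, 100, 101, 100, 100,     # invented ADC counts
                  102, 130, 220, 320, 360, 340, 300, 260], dtype=float)
print(pulse_features(trace))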
The Particle Accelerator Simulation Code PyORBIT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorlov, Timofey V; Holmes, Jeffrey A; Cousineau, Sarah M
2015-01-01
The particle accelerator simulation code PyORBIT is presented. The structure, implementation, history, parallel and simulation capabilities, and future development of the code are discussed. The PyORBIT code is a new implementation and extension of algorithms of the original ORBIT code that was developed for the Spallation Neutron Source accelerator at the Oak Ridge National Laboratory. The PyORBIT code has a two-level structure. The upper level uses the Python programming language to control the flow of intensive calculations performed by the lower level code implemented in the C++ language. The parallel capabilities are based on MPI communications. PyORBIT is an open-source code accessible to the public through the Google Open Source Projects Hosting service.
2009-09-01
nuclear industry for conducting performance assessment calculations. The analytical FORTRAN code for the DNAPL source function, REMChlor, was...project. The first was to apply existing deterministic codes, such as T2VOC and UTCHEM, to the DNAPL source zone to simulate the remediation processes...but describe the spatial variability of source zones unlike one-dimensional flow and transport codes that assume homogeneity. The Lagrangian models
SU-E-T-278: Realization of Dose Verification Tool for IMRT Plan Based On DPM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Jinfeng; Cao, Ruifen; Dai, Yumei
Purpose: To build a Monte Carlo dose verification tool for IMRT plans by implementing an irradiation source model into the DPM code, and to extend the ability of DPM to calculate arbitrary incident angles and irregular-inhomogeneous fields. Methods: The virtual source and the energy spectrum unfolded from accelerator measurement data, combined with optimized intensity maps, were used to calculate the dose distribution of the irradiated irregular-inhomogeneous field. The irradiation source model of the accelerator was substituted by a grid-based surface source. The contour and the intensity distribution of the surface source were optimized by the ARTS (Accurate/Advanced Radiotherapy System) optimization module based on the tumor configuration. The weight of the emitter was decided by the grid intensity. The direction of the emitter was decided by the combination of the virtual source and the emitter emitting position. The photon energy spectrum unfolded from the accelerator measurement data was adjusted by compensating the contaminated electron source. For verification, measured data and a realistic clinical IMRT plan were compared with the DPM dose calculation. Results: The regular field was verified by comparison with the measured data, and the differences were acceptable (<2% inside the field, 2–3 mm in the penumbra). The dose calculation of the irregular field by DPM simulation was also compared with that of FSPB (Finite Size Pencil Beam), and the passing rate of the gamma analysis was 95.1% for peripheral lung cancer. The regular field and the irregular rotational field were all within the range of permitted error. The computing time for regular fields was less than 2 h, and the peripheral lung cancer test took 160 min. Through parallel processing, the adapted DPM could complete the calculation of an IMRT plan within half an hour. Conclusion: The adapted parallelized DPM code with the irradiation source model is faster than classic Monte Carlo codes. Its computational accuracy and speed satisfy clinical requirements, and it is expected to become a Monte Carlo dose verification tool for IMRT plans. Strategic Priority Research Program of the China Academy of Science (XDA03040000); National Natural Science Foundation of China (81101132)
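Since the abstract reports a gamma-analysis passing rate, a minimal 1-D sketch of the standard global gamma index (3%/3 mm criteria) is given below for orientation; the dose profiles are toy functions, not data from the study.

import numpy as np

def gamma_index_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=0.3):
    # Global 1-D gamma index: for each reference point take the minimum over the evaluated
    # distribution of sqrt((dx/dta)^2 + (dD/dd)^2); points with gamma <= 1 pass.
    dx = (x_eval[None, :] - x_ref[:, None]) / dta
    dD = (d_eval[None, :] - d_ref[:, None]) / (dd * d_ref.max())
    return np.sqrt(dx**2 + dD**2).min(axis=1)

x = np.linspace(0.0, 10.0, 101)                     # positions in cm
d_ref = np.exp(-((x - 5.0) / 2.0) ** 2)             # toy "measured" profile
d_eval = np.exp(-((x - 5.05) / 2.05) ** 2)          # toy "calculated" profile
gamma = gamma_index_1d(x, d_ref, x, d_eval, dd=0.03, dta=0.3)   # 3%/3 mm (0.3 cm)
passing_rate = 100.0 * np.mean(gamma <= 1.0)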
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Osborne, Richard N.
2005-01-01
The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexandrov, Boian S.; Lliev, Filip L.; Stanev, Valentin G.
This code is a toy (short) version of CODE-2016-83. From a general perspective, the code represents an unsupervised adaptive machine learning algorithm that allows efficient and high performance de-mixing and feature extraction of a multitude of non-negative signals mixed and recorded by a network of uncorrelated sensor arrays. The code identifies the number of the mixed original signals and their locations. Further, the code also allows deciphering of signals that have been delayed with respect to the mixing process in each sensor. This code is highly customizable and it can be efficiently used for fast macro-analyses of data. The code is applicable to a plethora of distinct problems: chemical decomposition, pressure transient decomposition, unknown sources/signal allocation, EM signal decomposition. An additional procedure for allocation of the unknown sources is incorporated in the code.
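As a rough illustration of non-negative signal de-mixing (only the basic factorization step; the described code additionally estimates the number of sources and handles delayed signals), the following sketch separates synthetic mixed sensor records with scikit-learn's NMF. All signals and dimensions are invented.

import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
S = np.vstack([np.sin(np.linspace(0, 8, 400)) + 1.0,   # three non-negative source signals
               rng.random(400),
               np.linspace(0, 1, 400)])
A = rng.random((6, 3))                                  # unknown mixing matrix, 6 sensors
X = A @ S                                               # recorded (mixed) sensor data

model = NMF(n_components=3, init="nndsvda", max_iter=1000)
W = model.fit_transform(X)      # estimated per-sensor mixing weights
H = model.components_           # estimated de-mixed source signals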
Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey
NASA Astrophysics Data System (ADS)
Guillemot, Christine; Siohan, Pierre
2005-12-01
Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper makes a survey of recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. This survey terminates by performance illustrations with real image and video decoding systems.
Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.
2013-12-01
Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the community land model, CLM, part of the open-source community earth system model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it's transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.
Perfect quantum multiple-unicast network coding protocol
NASA Astrophysics Data System (ADS)
Li, Dan-Dan; Gao, Fei; Qin, Su-Juan; Wen, Qiao-Yan
2018-01-01
In order to realize long-distance and large-scale quantum communication, it is natural to utilize quantum repeaters. For a general quantum multiple-unicast network, it has remained puzzling how to complete communication tasks perfectly with fewer resources such as registers. In this paper, we solve this problem. By applying quantum repeaters to the multiple-unicast communication problem, we give encoding-decoding schemes for source nodes, internal ones and target ones, respectively. Source-target nodes share EPR pairs by using our encoding-decoding schemes over the quantum multiple-unicast network. Furthermore, quantum communication can be accomplished perfectly via teleportation. Compared with existing schemes, our schemes can reduce resource consumption and realize long-distance transmission of quantum information.
Multidimensional incremental parsing for universal source coding.
Bae, Soo Hyun; Juang, Biing-Hwang
2008-10-01
A multidimensional incremental parsing algorithm (MDIP) for multidimensional discrete sources, as a generalization of the Lempel-Ziv coding algorithm, is investigated. It consists of three essential component schemes, maximum decimation matching, hierarchical structure of multidimensional source coding, and dictionary augmentation. As a counterpart of the longest match search in the Lempel-Ziv algorithm, two classes of maximum decimation matching are studied. Also, an underlying behavior of the dictionary augmentation scheme for estimating the source statistics is examined. For an m-dimensional source, m augmentative patches are appended into the dictionary at each coding epoch, thus requiring the transmission of a substantial amount of information to the decoder. The property of the hierarchical structure of the source coding algorithm resolves this issue by successively incorporating lower dimensional coding procedures in the scheme. In regard to universal lossy source coders, we propose two distortion functions, the local average distortion and the local minimax distortion with a set of threshold levels for each source symbol. For performance evaluation, we implemented three image compression algorithms based upon the MDIP; one is lossless and the others are lossy. The lossless image compression algorithm does not perform better than the Lempel-Ziv-Welch coding, but experimentally shows efficiency in capturing the source structure. The two lossy image compression algorithms are implemented using the two distortion functions, respectively. The algorithm based on the local average distortion is efficient at minimizing the signal distortion, but the images by the one with the local minimax distortion have a good perceptual fidelity among other compression algorithms. Our insights inspire future research on feature extraction of multidimensional discrete sources.
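For context, a minimal sketch of the one-dimensional incremental (LZ78-style) parsing that MDIP generalizes to multiple dimensions: each new phrase is the longest previously seen phrase plus one new symbol. The example string is arbitrary.

def lz78_parse(s):
    # Incremental parsing of a 1-D symbol string: each new phrase is the longest
    # previously seen phrase plus one innovation symbol.
    dictionary = {"": 0}
    phrases, w = [], ""
    for ch in s:
        if w + ch in dictionary:
            w += ch
        else:
            phrases.append((dictionary[w], ch))       # (index of matched phrase, new symbol)
            dictionary[w + ch] = len(dictionary)
            w = ""
    if w:                                             # leftover phrase already in the dictionary
        phrases.append((dictionary[w[:-1]], w[-1]))
    return phrases

print(lz78_parse("aababcabcd"))   # [(0, 'a'), (1, 'b'), (2, 'c'), (3, 'd')]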
Cloudy - simulating the non-equilibrium microphysics of gas and dust, and its observed spectrum
NASA Astrophysics Data System (ADS)
Ferland, Gary J.
2014-01-01
Cloudy is an open-source plasma/spectral simulation code, last described in the open-access journal Revista Mexicana (Ferland et al. 2013, 2013RMxAA..49..137F). The project goal is a complete simulation of the microphysics of gas and dust over the full range of density, temperature, and ionization that we encounter in astrophysics, together with a prediction of the observed spectrum. Cloudy is one of the more widely used theory codes in astrophysics with roughly 200 papers citing its documentation each year. It is developed by graduate students, postdocs, and an international network of collaborators. Cloudy is freely available on the web at trac.nublado.org, the user community can post questions on http://groups.yahoo.com/neo/groups/cloudy_simulations/info, and summer schools are organized to learn more about Cloudy and its use (http://cloud9.pa.uky.edu gary/cloudy/CloudySummerSchool/). The code’s widespread use is possible because of extensive automatic testing. It is exercised over its full range of applicability whenever the source is changed. Changes in predicted quantities are automatically detected along with any newly introduced problems. The code is designed to be autonomous and self-aware. It generates a report at the end of a calculation that summarizes any problems encountered along with suggestions of potentially incorrect boundary conditions. This self-monitoring is a core feature since the code is now often used to generate large MPI grids of simulations, making it impossible for a user to verify each calculation by hand. I will describe some challenges in developing a large physics code, with its many interconnected physical processes, many at the frontier of research in atomic or molecular physics, all in an open environment.
An Efficient Variable Length Coding Scheme for an IID Source
NASA Technical Reports Server (NTRS)
Cheung, K. -M.
1995-01-01
A scheme is examined for using two alternating Huffman codes to encode a discrete independent and identically distributed source with a dominant symbol. This combined strategy, or alternating runlength Huffman (ARH) coding, was found to be more efficient than ordinary coding in certain circumstances.
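As a hedged illustration of why a dominant symbol invites run-length treatment (this is not the paper's exact ARH construction), the sketch below rewrites a stream as (run length of the dominant symbol, next symbol) pairs; a pair of Huffman codes could then encode the run lengths and the residual symbols.

def runs_of_dominant(symbols, dominant):
    # Rewrite the stream as (run length of the dominant symbol, following symbol) pairs,
    # the kind of intermediate representation a run-length-plus-Huffman scheme would encode.
    out, run = [], 0
    for s in symbols:
        if s == dominant:
            run += 1
        else:
            out.append((run, s))
            run = 0
    if run:
        out.append((run, None))   # trailing run with no terminating symbol
    return out

print(runs_of_dominant("000010003000000200", "0"))
# -> [(4, '1'), (3, '3'), (6, '2'), (2, None)]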
Numerical simulation of flow through the Langley parametric scramjet engine
NASA Technical Reports Server (NTRS)
Srinivasan, Shivakumar; Kamath, Pradeep S.; Mcclinton, Charles R.
1989-01-01
The numerical simulation of a three-dimensional turbulent, reacting flow through the entire Langley parametric scramjet engine has been obtained using a piecewise elliptic approach. The last section in the combustor has been analyzed using a parabolized Navier-Stokes code. The facility nozzle flow was analyzed as a first step. The outflow conditions from the nozzle were chosen as the inflow conditions of the scramjet inlet. The nozzle and the inlet simulation were accomplished by solving the three-dimensional Navier-Stokes equations with a perfect gas assumption. The inlet solution downstream of the scramjet throat was used to provide inflow conditions for the combustor region. The first two regions of the combustor were analyzed using the MacCormack's explicit scheme. However, the source terms in the species equations were solved implicitly. The finite rate chemistry was modeled using the two-step reaction model of Rogers and Chinitz. A complete reaction model was used in the PNS code to solve the last combustor region. The numerical solutions provide an insight of the flow details in a complete hydrogen-fueled scramjet engine module.
Source Code Plagiarism--A Student Perspective
ERIC Educational Resources Information Center
Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.
2011-01-01
This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…
Recent advances in coding theory for near error-free communications
NASA Technical Reports Server (NTRS)
Cheung, K.-M.; Deutsch, L. J.; Dolinar, S. J.; Mceliece, R. J.; Pollara, F.; Shahshahani, M.; Swanson, L.
1991-01-01
Channel and source coding theories are discussed. The following subject areas are covered: large constraint length convolutional codes (the Galileo code); decoder design (the big Viterbi decoder); Voyager's and Galileo's data compression scheme; current research in data compression for images; neural networks for soft decoding; neural networks for source decoding; finite-state codes; and fractals for data compression.
NASA Technical Reports Server (NTRS)
Meyer, Harold D.
1999-01-01
This report provides a study of rotor and stator scattering using the SOURCE3D Rotor Wake/Stator Interaction Code. SOURCE3D is a quasi-three-dimensional computer program that uses three-dimensional acoustics and two-dimensional cascade load response theory to calculate rotor and stator modal reflection and transmission (scattering) coefficients. SOURCE3D is at the core of the TFaNS (Theoretical Fan Noise Design/Prediction System), developed for NASA, which provides complete fully coupled (inlet, rotor, stator, exit) noise solutions for turbofan engines. The reason for studying scattering is that we must first understand the behavior of the individual scattering coefficients provided by SOURCE3D, before eventually understanding the more complicated predictions from TFaNS. To study scattering, we have derived a large number of scattering curves for vane and blade rows. The curves are plots of output wave power divided by input wave power (in dB units) versus vane/blade ratio. Some of these plots are shown in this report. All of the plots are provided in a separate volume. To assist in understanding the plots, formulas have been derived for special vane/blade ratios for which wavefronts are either parallel or normal to rotor or stator chords. From the plots, we have found that, for the most part, there was strong transmission and weak reflection over most of the vane/blade ratio range for the stator. For the rotor, there was little transmission loss.
Hybrid concatenated codes and iterative decoding
NASA Technical Reports Server (NTRS)
Divsalar, Dariush (Inventor); Pollara, Fabrizio (Inventor)
2000-01-01
Several improved turbo code apparatuses and methods. The invention encompasses several classes: (1) A data source is applied to two or more encoders with an interleaver between the source and each of the second and subsequent encoders. Each encoder outputs a code element which may be transmitted or stored. A parallel decoder provides the ability to decode the code elements to derive the original source information d without use of a received data signal corresponding to d. The output may be coupled to a multilevel trellis-coded modulator (TCM). (2) A data source d is applied to two or more encoders with an interleaver between the source and each of the second and subsequent encoders. Each of the encoders outputs a code element. In addition, the original data source d is output from the encoder. All of the output elements are coupled to a TCM. (3) At least two data sources are applied to two or more encoders with an interleaver between each source and each of the second and subsequent encoders. The output may be coupled to a TCM. (4) At least two data sources are applied to two or more encoders with at least two interleavers between each source and each of the second and subsequent encoders. (5) At least one data source is applied to one or more serially linked encoders through at least one interleaver. The output may be coupled to a TCM. The invention includes a novel way of terminating a turbo coder.
Frustrations Among Graduates of Athletic Training Education Programs
Bowman, Thomas G; Dodge, Thomas M
2013-01-01
Context Although previous researchers have begun to identify sources of athletic training student stress, the specific reasons for student frustrations are not yet fully understood. It is important for athletic training administrators to understand sources of student frustration to provide a supportive learning environment. Objective To determine the factors that lead to feelings of frustration while completing a professional athletic training education program (ATEP). Design Qualitative study. Setting National Athletic Trainers' Association (NATA) accredited postprofessional education program. Patients or Other Participants Fourteen successful graduates (12 women, 2 men) of accredited professional undergraduate ATEPs enrolled in an NATA-accredited postprofessional education program. Data Collection and Analysis We conducted semistructured interviews and analyzed data with a grounded theory approach using open, axial, and selective coding procedures. We negotiated over the coding scheme and performed peer debriefings and member checks to ensure trustworthiness of the results. Results Four themes emerged from the data: (1) Athletic training student frustrations appear to stem from the amount of stress involved in completing an ATEP, leading to anxiety and feelings of being overwhelmed. (2) The interactions students have with classmates, faculty, and preceptors can also be a source of frustration for athletic training students. (3) Monotonous clinical experiences often left students feeling disengaged. (4) Students questioned entering the athletic training profession because of the fear of work-life balance problems and low compensation. Conclusions In order to reduce frustration, athletic training education programs should validate students' decisions to pursue athletic training and validate their contributions to the ATEP; provide clinical education experiences with graded autonomy; encourage positive personal interactions between students, faculty, and preceptors; and successfully model the benefits of a career in athletic training. PMID:23672328
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pointer, William David; Shaver, Dillon; Liu, Yang
The U.S. Department of Energy, Office of Nuclear Energy charges participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with the development of advanced modeling and simulation capabilities that can be used to address design, performance and safety challenges in the development and deployment of advanced reactor technology. The NEAMS has established a high impact problem (HIP) team to demonstrate the applicability of these tools to identification and mitigation of sources of steam generator flow induced vibration (SGFIV). The SGFIV HIP team is working to evaluate vibration sources in an advanced helical coil steam generator using computational fluid dynamics (CFD) simulations of the turbulent primary coolant flow over the outside of the tubes and CFD simulations of the turbulent multiphase boiling secondary coolant flow inside the tubes integrated with high resolution finite element method assessments of the tubes and their associated structural supports. This report summarizes the demonstration of a methodology for the multiphase boiling flow analysis inside the helical coil steam generator tube. A helical coil steam generator configuration has been defined based on the experiments completed by Politecnico di Milano in the SIET helical coil steam generator tube facility. Simulations of the defined problem have been completed using the Eulerian-Eulerian multi-fluid modeling capabilities of the commercial CFD code STAR-CCM+. Simulations suggest that the two phases will quickly stratify in the slightly inclined pipe of the helical coil steam generator. These results have been successfully benchmarked against both empirical correlations for pressure drop and simulations using an alternate CFD methodology, the dispersed phase mixture modeling capabilities of the open source CFD code Nek5000.
Gschwind, Michael K
2013-07-23
Mechanisms for aggressively optimizing computer code are provided. With these mechanisms, a compiler determines an optimization to apply to a portion of source code and determines if the optimization as applied to the portion of source code will result in unsafe optimized code that introduces a new source of exceptions being generated by the optimized code. In response to a determination that the optimization is an unsafe optimization, the compiler generates an aggressively compiled code version, in which the unsafe optimization is applied, and a conservatively compiled code version in which the unsafe optimization is not applied. The compiler stores both versions and provides them for execution. Mechanisms are provided for switching between these versions during execution in the event of a failure of the aggressively compiled code version. Moreover, predictive mechanisms are provided for predicting whether such a failure is likely.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zehtabian, M; Zaker, N; Sina, S
2015-06-15
Purpose: Different versions of the MCNP code are widely used for dosimetry purposes. The purpose of this study is to compare different versions of the MCNP codes in dosimetric evaluation of different brachytherapy sources. Methods: The TG-43 parameters such as dose rate constant, radial dose function, and anisotropy function of different brachytherapy sources, i.e. Pd-103, I-125, Ir-192, and Cs-137, were calculated in a water phantom. The results obtained by three versions of Monte Carlo codes (MCNP4C, MCNPX, MCNP5) were compared for low and high energy brachytherapy sources. Then the cross section library of the MCNP4C code was changed to ENDF/B-VI release 8, which is used in the MCNP5 and MCNPX codes. Finally, the TG-43 parameters obtained using the MCNP4C-revised code were compared with the other codes. Results: The results of these investigations indicate that for high energy sources, the differences in TG-43 parameters between the codes are less than 1% for Ir-192 and less than 0.5% for Cs-137. However, for low energy sources like I-125 and Pd-103, large discrepancies are observed in the g(r) values obtained by MCNP4C and the two other codes. The differences between g(r) values calculated using MCNP4C and MCNP5 at the distance of 6 cm were found to be about 17% and 28% for I-125 and Pd-103, respectively. The results obtained with MCNP4C-revised and MCNPX were similar. However, the maximum difference between the results obtained with the MCNP5 and MCNP4C-revised codes was 2% at 6 cm. Conclusion: The results indicate that using the MCNP4C code for dosimetry of low energy brachytherapy sources can cause large errors in the results. Therefore it is recommended not to use this code for low energy sources, unless its cross section library is changed. Since the results obtained with MCNP4C-revised and MCNPX were similar, it is concluded that the difference between MCNP4C and MCNPX is their cross section libraries.
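For readers unfamiliar with the TG-43 quantities being compared, a small sketch of the point-source TG-43 dose-rate calculation on the transverse axis is given below; the air-kerma strength, dose-rate constant, and g(r) table are invented illustrative numbers, not consensus data for any of the sources above.

import numpy as np

# Invented, purely illustrative inputs (not consensus data for any real source):
S_K = 5.0                                             # air-kerma strength, U
LAM = 0.965                                           # dose-rate constant, cGy / (h U)
r0 = 1.0                                              # reference distance, cm
r_grid = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 6.0])     # cm
g_grid = np.array([1.07, 1.00, 0.85, 0.70, 0.45, 0.35])  # radial dose function g(r)

def dose_rate_transverse(r):
    # TG-43 point-source approximation on the transverse axis (anisotropy F = 1):
    # D(r) = S_K * Lambda * (r0/r)^2 * g(r)
    return S_K * LAM * (r0 / r) ** 2 * np.interp(r, r_grid, g_grid)

print(dose_rate_transverse(2.0))   # cGy/h at 2 cm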
Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN
NASA Astrophysics Data System (ADS)
Frederick, J. M.; Hammond, G. E.
2017-12-01
Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing its negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continues to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
Parallelization of sequential Gaussian, indicator and direct simulation algorithms
NASA Astrophysics Data System (ADS)
Nunes, Ruben; Almeida, José A.
2010-08-01
Improving the performance and robustness of algorithms on new high-performance parallel computing architectures is a key issue in efficiently performing 2D and 3D studies with large amount of data. In geostatistics, sequential simulation algorithms are good candidates for parallelization. When compared with other computational applications in geosciences (such as fluid flow simulators), sequential simulation software is not extremely computationally intensive, but parallelization can make it more efficient and creates alternatives for its integration in inverse modelling approaches. This paper describes the implementation and benchmarking of a parallel version of the three classic sequential simulation algorithms: direct sequential simulation (DSS), sequential indicator simulation (SIS) and sequential Gaussian simulation (SGS). For this purpose, the source used was GSLIB, but the entire code was extensively modified to take into account the parallelization approach and was also rewritten in the C programming language. The paper also explains in detail the parallelization strategy and the main modifications. Regarding the integration of secondary information, the DSS algorithm is able to perform simple kriging with local means, kriging with an external drift and collocated cokriging with both local and global correlations. SIS includes a local correction of probabilities. Finally, a brief comparison is presented of simulation results using one, two and four processors. All performance tests were carried out on 2D soil data samples. The source code is completely open source and easy to read. It should be noted that the code is only fully compatible with Microsoft Visual C and should be adapted for other systems/compilers.
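To make the simulated algorithms concrete, here is a minimal 1-D sequential Gaussian simulation sketch (simple kriging with an assumed exponential covariance and standard-normal data); it mirrors the serial logic whose node-by-node data dependence is what makes parallelization non-trivial. All parameters and data values are illustrative.

import numpy as np

def sgs_1d(grid_x, data_x, data_v, a=10.0, sill=1.0, seed=0):
    # Minimal 1-D sequential Gaussian simulation: simple kriging with zero mean,
    # standard-normal data, exponential covariance with practical range a.
    rng = np.random.default_rng(seed)
    cov = lambda h: sill * np.exp(-3.0 * np.abs(h) / a)
    kx, kv = list(data_x), list(data_v)              # conditioning data grows as nodes are simulated
    sim = np.empty(len(grid_x))
    for i in rng.permutation(len(grid_x)):           # random visiting path
        x = grid_x[i]
        K = cov(np.subtract.outer(kx, kx))           # data-to-data covariances
        k0 = cov(np.asarray(kx) - x)                 # data-to-node covariances
        w = np.linalg.solve(K + 1e-9 * np.eye(len(kx)), k0)
        mean = w @ np.asarray(kv)
        var = max(sill - w @ k0, 1e-12)
        sim[i] = rng.normal(mean, np.sqrt(var))      # draw from the local conditional distribution
        kx.append(x)
        kv.append(sim[i])                            # the simulated node becomes conditioning data
    return sim

values = sgs_1d(np.linspace(0, 50, 51), data_x=[0.2, 25.3, 49.7], data_v=[0.5, -1.0, 1.2])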
Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments
ERIC Educational Resources Information Center
Kermek, Dragutin; Novak, Matija
2016-01-01
In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…
40 CFR 51.50 - What definitions apply to this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... accuracy description (MAD) codes means a set of six codes used to define the accuracy of latitude/longitude data for point sources. The six codes and their definitions are: (1) Coordinate Data Source Code: The... physical piece of or a closely related set of equipment. The EPA's reporting format for a given inventory...
The Astrophysics Source Code Library by the numbers
NASA Astrophysics Data System (ADS)
Allen, Alice; Teuben, Peter; Berriman, G. Bruce; DuPrie, Kimberly; Mink, Jessica; Nemiroff, Robert; Ryan, PW; Schmidt, Judy; Shamir, Lior; Shortridge, Keith; Wallin, John; Warmels, Rein
2018-01-01
The Astrophysics Source Code Library (ASCL, ascl.net) was founded in 1999 by Robert Nemiroff and John Wallin. ASCL editors seek both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and add entries for the found codes to the library. Software authors can submit their codes to the ASCL as well. This ensures a comprehensive listing covering a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL is indexed by both NASA’s Astrophysics Data System (ADS) and Web of Science, making software used in research more discoverable. This presentation covers the growth in the ASCL’s number of entries, the number of citations to its entries, and in which journals those citations appear. It also discusses what changes have been made to the ASCL recently, and what its plans are for the future.
Astrophysics Source Code Library: Incite to Cite!
NASA Astrophysics Data System (ADS)
DuPrie, K.; Allen, A.; Berriman, B.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P.; Wallen, J. F.
2014-05-01
The Astrophysics Source Code Library (ASCL, http://ascl.net/) is an on-line registry of over 700 source codes that are of interest to astrophysicists, with more being added regularly. The ASCL actively seeks out codes as well as accepting submissions from the code authors, and all entries are citable and indexed by ADS. All codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. In addition to being the largest directory of scientist-written astrophysics programs available, the ASCL is also an active participant in the reproducible research movement with presentations at various conferences, numerous blog posts and a journal article. This poster provides a description of the ASCL and the changes that we are starting to see in the astrophysics community as a result of the work we are doing.
Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, A.; DuPrie, K.; Berriman, B.; Hanisch, R. J.; Mink, J.; Teuben, P. J.
2013-10-01
The Astrophysics Source Code Library (ASCL), founded in 1999, is a free on-line registry for source codes of interest to astronomers and astrophysicists. The library is housed on the discussion forum for Astronomy Picture of the Day (APOD) and can be accessed at http://ascl.net. The ASCL has a comprehensive listing that covers a significant number of the astrophysics source codes used to generate results published in or submitted to refereed journals and continues to grow. The ASCL currently has entries for over 500 codes; its records are citable and are indexed by ADS. The editors of the ASCL and members of its Advisory Committee were on hand at a demonstration table in the ADASS poster room to present the ASCL, accept code submissions, show how the ASCL is starting to be used by the astrophysics community, and take questions on and suggestions for improving the resource.
Generating code adapted for interlinking legacy scalar code and extended vector code
Gschwind, Michael K
2013-06-04
Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.
Londraville, R L; Cramer, T D; Franck, J P; Tullis, A; Block, B A
2000-10-01
Complete cDNAs for the fast-twitch Ca2+ -ATPase isoform (SERCA 1) were cloned and sequenced from blue marlin (Makaira nigricans) extraocular muscle (EOM). Complete cDNAs for SERCA 1 were also cloned from fast-twitch skeletal muscle of the same species. The two sequences are identical over the coding region except for the last five codons on the carboxyl end; EOM SERCA 1 cDNA codes for 996 amino acids and the fast-twitch cDNAs code for 991 aa. Phylogenetic analysis revealed that EOM SERCA 1 clusters with an isoform of Ca2+ -ATPase normally expressed in early development of mammals (SERCA 1B). This is the first report of SERCA 1B in an adult vertebrate. RNA hybridization assays indicate that 1B expression is limited to extraocular muscles. Because EOM gives rise to the thermogenic heater organ in marlin, we investigated whether SERCA 1B may play a role in heat generation, or if 1B expression is common in EOM among vertebrates. Chicken also expresses SERCA 1B in EOM, but rat expresses SERCA 1A; because SERCA 1B is not specific to heater tissue we conclude it is unlikely that it plays a specific role in intracellular heat production. Comparative sequence analysis does reveal, however, several sites that may be the source of functional differences between fish and mammalian SERCAs.
Optimal power allocation and joint source-channel coding for wireless DS-CDMA visual sensor networks
NASA Astrophysics Data System (ADS)
Pandremmenou, Katerina; Kondi, Lisimachos P.; Parsopoulos, Konstantinos E.
2011-01-01
In this paper, we propose a scheme for the optimal allocation of power, source coding rate, and channel coding rate for each of the nodes of a wireless Direct Sequence Code Division Multiple Access (DS-CDMA) visual sensor network. The optimization is quality-driven, i.e. the received quality of the video that is transmitted by the nodes is optimized. The scheme takes into account the fact that the sensor nodes may be imaging scenes with varying levels of motion. Nodes that image low-motion scenes will require a lower source coding rate, so they will be able to allocate a greater portion of the total available bit rate to channel coding. Stronger channel coding will mean that such nodes will be able to transmit at lower power. This will both increase battery life and reduce interference to other nodes. Two optimization criteria are considered. One that minimizes the average video distortion of the nodes and one that minimizes the maximum distortion among the nodes. The transmission powers are allowed to take continuous values, whereas the source and channel coding rates can assume only discrete values. Thus, the resulting optimization problem lies in the field of mixed-integer optimization tasks and is solved using Particle Swarm Optimization. Our experimental results show the importance of considering the characteristics of the video sequences when determining the transmission power, source coding rate and channel coding rate for the nodes of the visual sensor network.
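A bare-bones continuous particle swarm optimizer is sketched below to show the mechanics of the solver the paper employs; the actual problem is mixed-integer (discrete source and channel coding rates with continuous powers), so this toy box-constrained quadratic stands in only as an illustration, with all parameters invented.

import numpy as np

def pso(f, lo, hi, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    # Bare-bones particle swarm optimizer for a continuous, box-constrained minimization.
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))     # positions
    v = np.zeros_like(x)                                     # velocities
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])   # personal bests
    g = pbest[np.argmin(pbest_f)].copy()                     # global best
    for _ in range(iters):
        r1, r2 = rng.random((2,) + x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# Toy stand-in objective: a quadratic "distortion" over two normalized power levels.
best, val = pso(lambda p: (p[0] - 0.3) ** 2 + (p[1] - 0.7) ** 2,
                lo=np.array([0.0, 0.0]), hi=np.array([1.0, 1.0]))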
WEC-SIM Validation Testing Plan FY14 Q4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruehl, Kelley Michelle
2016-02-01
The WEC-Sim project is currently on track, having met both the SNL and NREL FY14 Milestones, as shown in Table 1 and Table 2. This is also reflected in the Gantt chart uploaded to the WEC-Sim SharePoint site in the FY14 Q4 Deliverables folder. The work completed in FY14 includes code verification through code-to-code comparison (FY14 Q1 and Q2), preliminary code validation through comparison to experimental data (FY14 Q2 and Q3), presentation and publication of the WEC-Sim project at OMAE 2014 [1], [2], [3] and GMREC/METS 2014 [4] (FY14 Q3), WEC-Sim code development and public open-source release (FY14 Q3), and development of a preliminary WEC-Sim validation test plan (FY14 Q4). This report presents the preliminary Validation Testing Plan developed in FY14 Q4. The validation test effort started in FY14 Q4 and will go on through FY15. Thus far the team has developed a device selection method, selected a device, placed a contract with the testing facility, established several collaborations including industry contacts, and developed working ideas on the testing details such as scaling, device design, and test conditions.
Development and application of the GIM code for the Cyber 203 computer
NASA Technical Reports Server (NTRS)
Stainaker, J. F.; Robinson, M. A.; Rawlinson, E. G.; Anderson, P. G.; Mayne, A. W.; Spradley, L. W.
1982-01-01
The GIM computer code for fluid dynamics research was developed. Enhancement of the computer code, implicit algorithm development, turbulence model implementation, chemistry model development, interactive input module coding and wing/body flowfield computation are described. The GIM quasi-parabolic code development was completed, and the code was used to compute a number of example cases. Turbulence models, both algebraic and differential-equation based, were added to the basic viscous code. An equilibrium reacting chemistry model and an implicit finite difference scheme were also added. Development was completed on the interactive module for generating the input data for GIM. Solutions for inviscid hypersonic flow over a wing/body configuration are also presented.
Data compression for satellite images
NASA Technical Reports Server (NTRS)
Chen, P. H.; Wintz, P. A.
1976-01-01
An efficient data compression system is presented for satellite pictures and two grey level pictures derived from satellite pictures. The compression techniques take advantage of the correlation between adjacent picture elements. Several source coding methods are investigated. Double delta coding is presented and shown to be the most efficient. Both the predictive differential quantizing technique and double delta coding can be significantly improved by applying a background skipping technique. An extension code is constructed. This code requires very little storage space and operates efficiently. Simulation results are presented for various coding schemes and source codes.
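As a rough illustration of the double delta idea mentioned above, the sketch below applies first and second differences to one scanline and inverts them exactly; the actual system additionally uses background skipping and entropy (extension) coding, which are not reproduced here.

# Minimal sketch of delta and double-delta (second-difference) coding of one image
# scanline; the coder in the paper adds background skipping and entropy coding.
import numpy as np

def delta_encode(line):
    line = np.asarray(line, dtype=np.int32)
    return np.concatenate(([line[0]], np.diff(line)))

def double_delta_encode(line):
    return delta_encode(delta_encode(line))

def double_delta_decode(code):
    return np.cumsum(np.cumsum(code))

scanline = [10, 12, 15, 19, 24, 30, 37, 45]
code = double_delta_encode(scanline)
assert np.array_equal(double_delta_decode(code), scanline)
print(code)  # mostly small values, cheaper to entropy-code than raw pixels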
Two field trials for deblending of simultaneous source surveys: Why we failed and why we succeeded?
NASA Astrophysics Data System (ADS)
Zu, Shaohuan; Zhou, Hui; Chen, Haolin; Zheng, Hao; Chen, Yangkang
2017-08-01
Currently, deblending is the main strategy for dealing with the intense interference problem of simultaneous source data. Most deblending methods are based on the property that useful signal is coherent while the interference is incoherent in some domains other than the common shot domain. In this paper, two simultaneous source field trials were studied in detail. In the first trial, the simultaneous source survey was not optimal, as the dithering code had strong coherency and the minimum distance between the two vessels was also small. The chosen marine shot scheduling and vessel deployment made it difficult to deblend the simultaneous source data, and the result was an unexpected failure. Next, we tested different parameters of the simultaneous source survey (the dithering code and the minimum distance between vessels) using simulated blended data and gained some useful insights. Then, we carried out the second field trial with a carefully designed survey that was much different from the first trial. The deblended results in the common receiver gather, the common shot gather and the final stacked profile were encouraging. We obtained a complete success in the second field trial, which gave us confidence for further tests (such as a full three dimensional acquisition test or a high-resolution acquisition test with denser spatial sampling). Because failures with simultaneous sourcing are seldom reported, our contribution in this paper is a detailed discussion of both our failed and our successful field experiments and the lessons we have learned from them, in the hope that the experience gained from this study will be useful to other researchers in the same field.
Distributed Joint Source-Channel Coding in Wireless Sensor Networks
Zhu, Xuqi; Liu, Yu; Zhang, Lin
2009-01-01
Considering the energy limitations of sensors and the wireless channel conditions in wireless sensor networks, there is an urgent need for a low-complexity coding method with a high compression ratio and noise resistance. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing approaches, from theory to practice, to distributed joint source-channel coding over independent channels, multiple access channels and broadcast channels are introduced. To this end, we also present a practical scheme for compressing multiple correlated sources over independent channels. The simulation results demonstrate the desired efficiency. PMID:22408560
Noniterative MAP reconstruction using sparse matrix representations.
Cao, Guangzhi; Bouman, Charles A; Webb, Kevin J
2009-09-01
We present a method for noniterative maximum a posteriori (MAP) tomographic reconstruction which is based on the use of sparse matrix representations. Our approach is to precompute and store the inverse matrix required for MAP reconstruction. This approach has generally not been used in the past because the inverse matrix is typically large and fully populated (i.e., not sparse). In order to overcome this problem, we introduce two new ideas. The first idea is a novel theory for the lossy source coding of matrix transformations which we refer to as matrix source coding. This theory is based on a distortion metric that reflects the distortions produced in the final matrix-vector product, rather than the distortions in the coded matrix itself. The resulting algorithms are shown to require orthonormal transformations of both the measurement data and the matrix rows and columns before quantization and coding. The second idea is a method for efficiently storing and computing the required orthonormal transformations, which we call a sparse-matrix transform (SMT). The SMT is a generalization of the classical FFT in that it uses butterflies to compute an orthonormal transform; but unlike an FFT, the SMT uses the butterflies in an irregular pattern, and is numerically designed to best approximate the desired transforms. We demonstrate the potential of the noniterative MAP reconstruction with examples from optical tomography. The method requires offline computation to encode the inverse transform. However, once these offline computations are completed, the noniterative MAP algorithm is shown to reduce both storage and computation by well over two orders of magnitude, compared to linear iterative reconstruction methods.
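The following sketch illustrates the overall idea of precomputing the MAP inverse matrix and source-coding it in a transform domain before applying it noniteratively. A separable DCT is used here purely as a stand-in for the sparse-matrix transform, and the matrix size, regularization, and quantization step are arbitrary illustrative choices.

# Illustrative sketch of the "precompute and compress the inverse matrix" idea:
# a regularized inverse is stored after orthonormal transforms and coarse
# quantization, then applied noniteratively. A DCT stands in for the SMT here;
# sizes and the quantization step are arbitrary choices for illustration.
import numpy as np
from scipy.fftpack import dct, idct

rng = np.random.default_rng(0)
A = rng.normal(size=(64, 64))                          # forward model (toy)
H = np.linalg.inv(A.T @ A + 0.1 * np.eye(64)) @ A.T    # regularized (MAP-style) inverse

# Transform rows and columns, then quantize (lossy source coding of the matrix).
T = dct(dct(H, axis=0, norm='ortho'), axis=1, norm='ortho')
step = 0.05
T_q = step * np.round(T / step)
H_q = idct(idct(T_q, axis=0, norm='ortho'), axis=1, norm='ortho')

y = A @ rng.normal(size=64)                            # measurements
x_exact = H @ y                                        # exact noniterative estimate
x_coded = H_q @ y                                      # estimate from the coded matrix
print(np.linalg.norm(x_exact - x_coded) / np.linalg.norm(x_exact))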
NASA Technical Reports Server (NTRS)
Clark, Kenneth; Watney, Garth; Murray, Alexander; Benowitz, Edward
2007-01-01
A computer program translates Unified Modeling Language (UML) representations of state charts into source code in the C, C++, and Python computing languages. ("State charts" here signifies graphical descriptions of states and state transitions of a spacecraft or other complex system.) The UML representations constituting the input to this program are generated by using a UML-compliant graphical design program to draw the state charts. The generated source code is consistent with the "quantum programming" approach, which is so named because it involves discrete states and state transitions that have features in common with states and state transitions in quantum mechanics. Quantum programming enables efficient implementation of state charts, suitable for real-time embedded flight software. In addition to source code, the autocoder program generates a graphical-user-interface (GUI) program that, in turn, generates a display of state transitions in response to events triggered by the user. The GUI program is wrapped around, and can be used to exercise the state-chart behavior of, the generated source code. Once the expected state-chart behavior is confirmed, the generated source code can be augmented with a software interface to the rest of the software with which the source code is required to interact.
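The sketch below is a hand-written illustration, not autocoder output, of the dispatch style that such generated state-chart code typically follows: each state is a handler function and events drive transitions. The charger example and its state and event names are hypothetical.

# Hand-written sketch (not autocoder output) of a state-chart dispatch style:
# states as handler methods, events driving transitions. Names are hypothetical.
class BatteryCharger:
    def __init__(self):
        self.state = self.idle

    def dispatch(self, event):
        # A handler returns the next state, or None to stay in the current state.
        self.state = self.state(event) or self.state

    def idle(self, event):
        if event == "PLUGGED_IN":
            print("entering CHARGING")
            return self.charging

    def charging(self, event):
        if event == "FULL":
            print("entering DONE")
            return self.done
        if event == "UNPLUGGED":
            return self.idle

    def done(self, event):
        if event == "UNPLUGGED":
            return self.idle

sm = BatteryCharger()
for ev in ["PLUGGED_IN", "FULL", "UNPLUGGED"]:
    sm.dispatch(ev)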
Practices in source code sharing in astrophysics
NASA Astrophysics Data System (ADS)
Shamir, Lior; Wallin, John F.; Allen, Alice; Berriman, Bruce; Teuben, Peter; Nemiroff, Robert J.; Mink, Jessica; Hanisch, Robert J.; DuPrie, Kimberly
2013-02-01
While software and algorithms have become increasingly important in astronomy, the majority of authors who publish computational astronomy research do not share the source code they develop, making it difficult to replicate and reuse the work. In this paper we discuss the importance of sharing scientific source code with the entire astrophysics community, and propose that journals require authors to make their code publicly available when a paper is published. That is, we suggest that a paper that involves a computer program not be accepted for publication unless the source code becomes publicly available. The adoption of such a policy by editors, editorial boards, and reviewers will improve the ability to replicate scientific results, and will also make computational astronomy methods more available to other researchers who wish to apply them to their data.
Flow of GE90 Turbofan Engine Simulated
NASA Technical Reports Server (NTRS)
Veres, Joseph P.
1999-01-01
The objective of this task was to create and validate a three-dimensional model of the GE90 turbofan engine (General Electric) using the APNASA (average passage) flow code. This was a joint effort between GE Aircraft Engines and the NASA Lewis Research Center. The goal was to perform an aerodynamic analysis of the engine primary flow path, in under 24 hours of CPU time, on a parallel distributed workstation system. Enhancements were made to the APNASA Navier-Stokes code to make it faster and more robust and to allow for the analysis of more arbitrary geometry. The resulting simulation exploited the use of parallel computations by using two levels of parallelism, with extremely high efficiency. The primary flow path of the GE90 turbofan consists of a nacelle and inlet, 49 blade rows of turbomachinery, and an exhaust nozzle. Secondary flows entering and exiting the primary flow path, such as bleed, purge, and cooling flows, were modeled macroscopically as source terms to accurately simulate the engine. The information on these source terms came from detailed descriptions of the cooling flow and from thermodynamic cycle system simulations. These provided boundary condition data to the three-dimensional analysis. A simplified combustor was used to feed boundary conditions to the turbomachinery. Flow simulations of the fan, high-pressure compressor, and high- and low-pressure turbines were completed with the APNASA code.
NASA Astrophysics Data System (ADS)
Parra, Pablo; da Silva, Antonio; Polo, Óscar R.; Sánchez, Sebastián
2018-02-01
In this day and age, successful embedded critical software needs agile and continuous development and testing procedures. This paper presents the overall testing and code coverage metrics obtained during the unit testing procedure carried out to verify the correctness of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on-board Solar Orbiter. The ICU boot software is a critical part of the project, so its verification should be addressed at an early development stage; any test case missed in this process may affect the quality of the overall on-board software. According to the European Cooperation for Space Standardization (ECSS) standards, testing of this kind of critical software must cover 100% of the source code statement and decision paths. This leads to the complete testing of fault tolerance and recovery mechanisms that have to resolve every possible memory corruption or communication error brought about by the space environment. The introduced procedure enables fault injection from the beginning of the development process and makes it possible to fulfill the demanding code coverage requirements on the boot software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmer, M.E.
1997-12-05
This V and V Report includes analysis of two revisions of the DMS [data management system] System Requirements Specification (SRS) and the Preliminary System Design Document (PSDD); the source code for the DMS Communication Module (DMSCOM) messages; the source code for selected DMS Screens; and the code for the BWAS Simulator. BDM Federal analysts used a series of matrices to: compare the requirements in the System Requirements Specification (SRS) to the specifications found in the System Design Document (SDD), to ensure the design supports the business functions; compare the discrete parts of the SDD with each other, to ensure that the design is consistent and cohesive; compare the source code of the DMS Communication Module with the specifications, to ensure that the resultant messages will support the design; compare the source code of selected screens to the specifications, to ensure that the resultant system screens will support the design; and compare the source code of the BWAS simulator with the requirements to interface with DMS messages and data transfers relating to the BWAS operations.
NASA Astrophysics Data System (ADS)
Horowitz, F. G.; Gaede, O.
2014-12-01
Wavelet multiscale edge analysis of potential fields (a.k.a. "worms") has been known since Moreau et al. (1997) and was independently derived by Hornby et al. (1999). The technique is useful for producing a scale-explicit overview of the structures beneath a gravity or magnetic survey, including establishing the location and estimating the attitude of surface features, as well as incorporating information about the geometric class (point, line, surface, volume, fractal) of the underlying sources — in a fashion much like traditional structural indices from Euler solutions albeit with better areal coverage. Hornby et al. (2002) show that worms form the locally highest concentration of horizontal edges of a given strike — which in conjunction with the results from Mallat and Zhong (1992) induces a (non-unique!) inversion where the worms are physically interpretable as lateral boundaries in a source distribution that produces a close approximation of the observed potential field. The technique has enjoyed widespread adoption and success in the Australian mineral exploration community — including "ground truth" via successfully drilling structures indicated by the worms. Unfortunately, to our knowledge, all implementations of the code to calculate the worms/multiscale edges (including Horowitz' original research code) are either part of commercial software packages, or have copyright restrictions that impede the use of the technique by the wider community. The technique is completely described mathematically in Hornby et al. (1999) along with some later publications. This enables us to re-implement from scratch the code required to calculate and visualize the worms. We are freely releasing the results under an (open source) BSD two-clause software license. A git repository is available at
Xu, Guoai; Li, Qi; Guo, Yanhui; Zhang, Miao
2017-01-01
Authorship attribution is to identify the most likely author of a given sample among a set of candidate known authors. It can not only be applied to discover the original author of plain text, such as novels, blogs, emails and posts, but can also be used to identify source code programmers. Authorship attribution of source code is required in diverse applications, ranging from malicious code tracking to solving authorship disputes or software plagiarism detection. This paper aims to propose a new method to identify the programmer of Java source code samples with a higher accuracy. To this end, it first introduces a back propagation (BP) neural network based on particle swarm optimization (PSO) into authorship attribution of source code. It begins by computing a set of defined feature metrics, including lexical and layout metrics and structure and syntax metrics, 19 dimensions in total. These metrics are then input to the neural network for supervised learning, the weights of which are obtained by the hybrid PSO and BP algorithm. The effectiveness of the proposed method is evaluated on a collected dataset with 3,022 Java files belonging to 40 authors. Experimental results show that the proposed method achieves 91.060% accuracy. A comparison with previous work on authorship attribution of Java source code illustrates that the proposed method outperforms the others overall, with an acceptable overhead. PMID:29095934
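To make the kind of feature metrics mentioned above concrete, the sketch below computes a handful of simple lexical and layout measurements from Java source text. The specific metrics and names are illustrative assumptions; the paper's full 19-dimensional feature set and the PSO-trained BP network are not reproduced here.

# Sketch of a few lexical/layout feature metrics of the kind used for source code
# authorship attribution. Metric names and choices here are illustrative only.
import re

def layout_lexical_features(java_source: str):
    lines = java_source.splitlines()
    n_lines = max(len(lines), 1)
    n_chars = max(len(java_source), 1)
    identifiers = re.findall(r"[A-Za-z_]\w*", java_source)
    return {
        "blank_line_ratio": sum(1 for l in lines if not l.strip()) / n_lines,
        "comment_line_ratio": sum(1 for l in lines if l.strip().startswith("//")) / n_lines,
        "avg_line_length": n_chars / n_lines,
        "brace_on_own_line_ratio": sum(1 for l in lines if l.strip() == "{") / n_lines,
        "tab_indent_ratio": sum(1 for l in lines if l.startswith("\t")) / n_lines,
        "identifier_avg_length": sum(map(len, identifiers)) / max(len(identifiers), 1),
    }

print(layout_lexical_features("public class A {\n\tint x = 0; // counter\n}\n"))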
Number of minimum-weight code words in a product code
NASA Technical Reports Server (NTRS)
Miller, R. L.
1978-01-01
Consideration is given to the number of minimum-weight code words in a product code. The code is considered as a tensor product of linear codes over a finite field. Complete theorems and proofs are presented.
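A commonly cited form of this result for binary linear codes is sketched below; it is stated here only as background and is not a reproduction of the report's theorems or proofs.

% Background statement (binary case), not a quotation of the report: if $C_1$ and
% $C_2$ are binary linear codes with minimum distances $d_1, d_2$ and $A_1, A_2$
% minimum-weight codewords respectively, then the product (tensor product) code
% $C_1 \otimes C_2$ has minimum distance $d_1 d_2$, its minimum-weight codewords
% are exactly the rank-one arrays $u\,v^{\mathsf{T}}$ with $u \in C_1$, $v \in C_2$
% of minimum weight, and therefore
\[
  A_{\min}(C_1 \otimes C_2) \;=\; A_1 \, A_2 .
\]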
Final Technical Report for GO17004 Regulatory Logic: Codes and Standards for the Hydrogen Economy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakarado, Gary L.
The objectives of this project are to: develop a robust supporting research and development program to provide critical hydrogen behavior data and a detailed understanding of hydrogen combustion and safety across a range of scenarios, needed to establish setback distances in building codes and minimize the overall data gaps in code development; support and facilitate the completion of technical specifications by the International Organization for Standardization (ISO) for gaseous hydrogen refueling (TS 20012) and standards for on-board liquid (ISO 13985) and gaseous or gaseous blend (ISO 15869) hydrogen storage by 2007; support and facilitate the effort, led by the NFPA, to complete the draft Hydrogen Technologies Code (NFPA 2) by 2008; with experimental data and input from Technology Validation Program element activities, support and facilitate the completion of standards for bulk hydrogen storage (e.g., NFPA 55) by 2008; facilitate the adoption of the most recently available model codes (e.g., from the International Code Council [ICC]) in key regions; complete preliminary research and development on hydrogen release scenarios to support the establishment of setback distances in building codes and provide a sound basis for model code development and adoption; support and facilitate the development of Global Technical Regulations (GTRs) by 2010 for hydrogen vehicle systems under the United Nations Economic Commission for Europe, World Forum for Harmonization of Vehicle Regulations and Working Party on Pollution and Energy Program (ECE-WP29/GRPE); and support and facilitate the completion by 2012 of necessary codes and standards needed for the early commercialization and market entry of hydrogen energy technologies.
The mathematical theory of signal processing and compression-designs
NASA Astrophysics Data System (ADS)
Feria, Erlan H.
2006-05-01
The mathematical theory of signal processing, named processor coding, will be shown to inherently arise as the computational time dual of Shannon's mathematical theory of communication, which is also known as source coding. Source coding is concerned with signal source memory space compression, while processor coding deals with signal processor computational time compression. Their combination is named compression-designs and is referred to as Conde for short. A compelling and pedagogically appealing diagram will be discussed highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.
Accurate Modeling of Ionospheric Electromagnetic Fields Generated by a Low-Altitude VLF Transmitter
2007-08-31
[List-of-figures fragment: fields at a given latitude for 3 different grid spacings; low- and high-altitude fields produced by 10-kHz and 20-kHz sources computed using the FD and TD codes, with excellent agreement between the codes, validating the new FD code.]
Surface hardening using cw CO2 laser: laser heat treatment, modelation, and experimental work
NASA Astrophysics Data System (ADS)
Muniz, German; Alum, Jorge
1996-02-01
The present work gives the results of applying laser surface hardening techniques to steel 65 G using a cw carbon dioxide laser as the energy source. The laser heat treatment results are presented theoretically and experimentally. Continuous-wave carbon dioxide lasers of 0.6, 0.3, and 0.4 kW were used. A physical model describing the thermophysical laser-metal interaction process is given, and a numerical algorithm is used to solve this problem by means of the LHT code. The results are compared with the corresponding experimental ones, and very good agreement is observed. The LHT code is able to predict transformation hardening by laser heating. These results will be complemented by results concerning laser alloying and cladding presented in a second paper.
Test Analysis Tools to Ensure Higher Quality of On-Board Real Time Software for Space Applications
NASA Astrophysics Data System (ADS)
Boudillet, O.; Mescam, J.-C.; Dalemagne, D.
2008-08-01
EADS Astrium Space Transportation, in its Les Mureaux premises, is responsible for the French M51 nuclear deterrent missile onboard SW. Over one million lines of code, mostly in Ada, were also developed there for the Automated Transfer Vehicle (ATV) onboard SW and for the flight control SW of the ARIANE 5 launcher that put the ATV into orbit. As part of the ATV SW, ASTRIUM ST has developed the first Category A SW ever qualified for a European space application. To ensure that all this embedded SW has been developed with the highest quality and reliability level, specific development tools have been designed to cover the steps of source code verification, automated validation testing, and complete target instruction coverage verification. Three such dedicated tools are presented here.
The Cadmium Zinc Telluride Imager on AstroSat
NASA Astrophysics Data System (ADS)
Bhalerao, V.; Bhattacharya, D.; Vibhute, A.; Pawar, P.; Rao, A. R.; Hingar, M. K.; Khanna, Rakesh; Kutty, A. P. K.; Malkar, J. P.; Patil, M. H.; Arora, Y. K.; Sinha, S.; Priya, P.; Samuel, Essy; Sreekumar, S.; Vinod, P.; Mithun, N. P. S.; Vadawale, S. V.; Vagshette, N.; Navalgund, K. H.; Sarma, K. S.; Pandiyan, R.; Seetha, S.; Subbarao, K.
2017-06-01
The Cadmium Zinc Telluride Imager (CZTI) is a high energy, wide-field imaging instrument on AstroSat. CZTI's namesake Cadmium Zinc Telluride detectors cover an energy range from 20 keV to >200 keV, with 11% energy resolution at 60 keV. The coded aperture mask attains an angular resolution of 17' over a 4.6° × 4.6° (FWHM) field-of-view. CZTI functions as an open detector above 100 keV, continuously sensitive to GRBs and other transients in about 30% of the sky. The pixellated detectors are sensitive to polarization above ˜ 100 keV, with exciting possibilities for polarization studies of transients and bright persistent sources. In this paper, we provide details of the complete CZTI instrument, detectors, coded aperture mask, mechanical and electronic configuration, as well as data and products.
CHEMICAL STORAGE: MYTHS VERSUS REALITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simmons, F
A large number of resources explaining proper chemical storage are available. These resources include books, databases/tables, and articles that explain various aspects of chemical storage including compatible chemical storage, signage, and regulatory requirements. Another source is the chemical manufacturer or distributor who provides storage information in the form of icons or color coding schemes on container labels. Despite the availability of these resources, chemical accidents stemming from improper storage, according to recent reports (1) (2), make up almost 25% of all chemical accidents. This relatively high percentage of chemical storage accidents suggests that these publications and color coding schemes, although helpful, still provide incomplete information that may not completely mitigate storage risks. This manuscript will explore some ways published storage information may be incomplete, examine the associated risks, and suggest methods to help further eliminate chemical storage risks.
Complete Coding Genome Sequence for Mogiana Tick Virus, a Jingmenvirus Isolated from Ticks in Brazil
2017-05-04
and capable of infecting a wide range of animal hosts (1–5). Here, we report the complete coding genome sequence (i.e., only missing portions of...segmented nature of the genome was not understood. Therefore, only the two genome segments with detectable sequence homologies to flaviviruses were...originally reported (2). We revisited the data set of Maruyama et al. (2) and assembled the complete coding sequences for all four genome segments. We
Complete genome sequence of Pedobacter heparinus type strain (HIM 762-3T)
Han, Cliff; Spring, Stefan; Lapidus, Alla; Del Rio, Tijana Glavina; Tice, Hope; Copeland, Alex; Cheng, Jan-Fang; Lucas, Susan; Chen, Feng; Nolan, Matt; Bruce, David; Goodwin, Lynne; Pitluck, Sam; Ivanova, Natalia; Mavromatis, Konstantinos; Mikhailova, Natalia; Pati, Amrita; Chen, Amy; Palaniappan, Krishna; Land, Miriam; Hauser, Loren; Chang, Yun-Juan; Jeffries, Cynthia C.; Saunders, Elizabeth; Chertkov, Olga; Brettin, Thomas; Göker, Markus; Rohde, Manfred; Bristow, Jim; Eisen, Jonathan A.; Markowitz, Victor; Hugenholtz, Philip; Kyrpides, Nikos C.; Klenk, Hans-Peter; Detter, John C.
2009-01-01
Pedobacter heparinus (Payza and Korn 1956) Steyn et al. 1998 comb. nov. is the type species of the rapidly growing genus Pedobacter within the family Sphingobacteriaceae of the phylum ‘Bacteroidetes’. P. heparinus is of interest, because it was the first isolated strain shown to grow with heparin as sole carbon and nitrogen source and because it produces several enzymes involved in the degradation of mucopolysaccharides. All available data about this species are based on a sole strain that was isolated from dry soil. Here we describe the features of this organism, together with the complete genome sequence, and annotation. This is the first report on a complete genome sequence of a member of the genus Pedobacter, and the 5,167,383 bp long single replicon genome with its 4287 protein-coding and 54 RNA genes is part of the Genomic Encyclopedia of Bacteria and Archaea project. PMID:21304637
Measuring homework completion in behavioral activation.
Busch, Andrew M; Uebelacker, Lisa A; Kalibatseva, Zornitsa; Miller, Ivan W
2010-07-01
The aim of this study was to develop and validate an observer-based coding system for the characterization and completion of homework assignments during Behavioral Activation (BA). Existing measures of homework completion are generally unsophisticated, and there is no current measure of homework completion designed to capture the particularities of BA. The tested scale sought to capture the type of assignment, realm of functioning targeted, extent of completion, and assignment difficulty. Homework assignments were drawn from 12 (mean age = 48, 83% female) clients in two trials of a 10-session BA manual targeting treatment-resistant depression in primary care. The two coders demonstrated acceptable or better reliability on most codes, and unreliable codes were dropped from the proposed scale. In addition, correlations between homework completion and outcome were strong, providing some support for construct validity. Ultimately, this line of research aims to develop a user-friendly, reliable measure of BA homework completion that can be completed by a therapist during session.
NASA Astrophysics Data System (ADS)
Wünderlich, D.; Mochalskyy, S.; Montellano, I. M.; Revel, A.
2018-05-01
Particle-in-cell (PIC) codes have been used since the early 1960s for calculating self-consistently the motion of charged particles in plasmas, taking into account external electric and magnetic fields as well as the fields created by the particles themselves. Due to the very small time steps used (on the order of the inverse plasma frequency) and the fine mesh size, the computational requirements can be very high, and they drastically increase with increasing plasma density and size of the calculation domain. Thus, usually small computational domains and/or reduced dimensionality are used. In recent years, the available central processing unit (CPU) power strongly increased. Together with a massive parallelization of the codes, it is now possible to describe in 3D the extraction of charged particles from a plasma, using calculation domains with an edge length of several centimeters, consisting of one extraction aperture, the plasma in direct vicinity of the aperture, and a part of the extraction system. Large negative hydrogen or deuterium ion sources are essential parts of the neutral beam injection (NBI) system in future fusion devices like the international fusion experiment ITER and the demonstration reactor (DEMO). For ITER NBI, RF driven sources with a source area of 0.9 × 1.9 m2 and 1280 extraction apertures will be used. The extraction of negative ions is accompanied by the co-extraction of electrons which are deflected onto an electron dump. Typically, the maximum negative extracted ion current is limited by the amount and the temporal instability of the co-extracted electrons, especially for operation in deuterium. Different PIC codes are available for the extraction region of large driven negative ion sources for fusion. Additionally, some effort is ongoing in developing codes that describe in a simplified manner (coarser mesh or reduced dimensionality) the plasma of the whole ion source. The presentation first gives a brief overview of the current status of the ion source development for ITER NBI and of the PIC method. Different PIC codes for the extraction region are introduced as well as the coupling to codes describing the whole source (PIC codes or fluid codes). Presented and discussed are different physical and numerical aspects of applying PIC codes to negative hydrogen ion sources for fusion as well as selected code results. The main focus of future calculations will be the meniscus formation and identifying measures for reducing the co-extracted electrons, in particular for deuterium operation. The recent results of the 3D PIC code ONIX (calculation domain: one extraction aperture and its vicinity) for the ITER prototype source (1/8 size of the ITER NBI source) are presented.
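As background to the PIC method discussed above, the following minimal one-dimensional electrostatic PIC sketch shows the basic cycle of charge deposition, field solve, and particle push. It is purely illustrative (periodic box, normalized units, nearest-grid-point weighting, no magnetic field, no collisions, no extraction geometry) and is not related to ONIX or the other ion-source codes mentioned.

# Minimal 1D electrostatic PIC sketch (periodic box, nearest-grid-point deposition,
# FFT Poisson solve, leapfrog-style push). Illustrative only; normalized units.
import numpy as np

ng, npart, L, dt, steps = 64, 10000, 2 * np.pi, 0.1, 200
dx = L / ng
rng = np.random.default_rng(1)
x = rng.uniform(0, L, npart)                 # particle positions
v = rng.normal(0, 0.1, npart)                # particle velocities
qm = -1.0                                    # charge-to-mass ratio (electrons)
weight = L / npart                           # so the mean electron density is 1

k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)     # angular wavenumbers
k[0] = 1.0                                   # avoid division by zero (mean field = 0)

for _ in range(steps):
    # 1) deposit charge density on the grid (NGP), with a neutralizing ion background
    cells = (x / dx).astype(int) % ng
    rho = 1.0 - np.bincount(cells, minlength=ng) * weight / dx
    # 2) solve Poisson's equation (phi'' = -rho) via FFT, then E = -dphi/dx
    rho_k = np.fft.fft(rho)
    phi_k = rho_k / k**2
    phi_k[0] = 0.0
    E = np.real(np.fft.ifft(-1j * k * phi_k))
    # 3) gather E at particle cells, advance velocities then positions
    v += qm * E[cells] * dt
    x = (x + v * dt) % L

print("final mean kinetic energy:", 0.5 * np.mean(v**2))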
2014-01-01
Background Fascioliasis is an important and neglected disease of humans and other mammals, caused by trematodes of the genus Fasciola. Fasciola hepatica and F. gigantica are valid species that infect humans and animals, but the specific status of Fasciola sp. (‘intermediate form’) is unclear. Methods Single specimens inferred to represent Fasciola sp. (‘intermediate form’; Heilongjiang) and F. gigantica (Guangxi) from China were genetically identified and characterized using PCR-based sequencing of the first and second internal transcribed spacer regions of nuclear ribosomal DNA. The complete mitochondrial (mt) genomes of these representative specimens were then sequenced. The relationships of these specimens with selected members of the Trematoda were assessed by phylogenetic analysis of concatenated amino acid sequence datasets by Bayesian inference (BI). Results The complete mt genomes of representatives of Fasciola sp. and F. gigantica were 14,453 bp and 14,478 bp in size, respectively. Both mt genomes contain 12 protein-coding genes, 22 transfer RNA genes and two ribosomal RNA genes, but lack an atp8 gene. All protein-coding genes are transcribed in the same direction, and the gene order in both mt genomes is the same as that published for F. hepatica. Phylogenetic analysis of the concatenated amino acid sequence data for all 12 protein-coding genes showed that the specimen of Fasciola sp. was more closely related to F. gigantica than to F. hepatica. Conclusions The mt genomes characterized here provide a rich source of markers, which can be used in combination with nuclear markers and imaging techniques, for future comparative studies of the biology of Fasciola sp. from China and other countries. PMID:24685294
Liu, Guo-Hua; Gasser, Robin B; Young, Neil D; Song, Hui-Qun; Ai, Lin; Zhu, Xing-Quan
2014-03-31
Fascioliasis is an important and neglected disease of humans and other mammals, caused by trematodes of the genus Fasciola. Fasciola hepatica and F. gigantica are valid species that infect humans and animals, but the specific status of Fasciola sp. ('intermediate form') is unclear. Single specimens inferred to represent Fasciola sp. ('intermediate form'; Heilongjiang) and F. gigantica (Guangxi) from China were genetically identified and characterized using PCR-based sequencing of the first and second internal transcribed spacer regions of nuclear ribosomal DNA. The complete mitochondrial (mt) genomes of these representative specimens were then sequenced. The relationships of these specimens with selected members of the Trematoda were assessed by phylogenetic analysis of concatenated amino acid sequence datasets by Bayesian inference (BI). The complete mt genomes of representatives of Fasciola sp. and F. gigantica were 14,453 bp and 14,478 bp in size, respectively. Both mt genomes contain 12 protein-coding genes, 22 transfer RNA genes and two ribosomal RNA genes, but lack an atp8 gene. All protein-coding genes are transcribed in the same direction, and the gene order in both mt genomes is the same as that published for F. hepatica. Phylogenetic analysis of the concatenated amino acid sequence data for all 12 protein-coding genes showed that the specimen of Fasciola sp. was more closely related to F. gigantica than to F. hepatica. The mt genomes characterized here provide a rich source of markers, which can be used in combination with nuclear markers and imaging techniques, for future comparative studies of the biology of Fasciola sp. from China and other countries.
Simulation of Laser Cooling and Trapping in Engineering Applications
NASA Technical Reports Server (NTRS)
Ramirez-Serrano, Jaime; Kohel, James; Thompson, Robert; Yu, Nan; Lunblad, Nathan
2005-01-01
An advanced computer code is undergoing development for numerically simulating laser cooling and trapping of large numbers of atoms. The code is expected to be useful in practical engineering applications and to contribute to understanding of the roles that light, atomic collisions, background pressure, and numbers of particles play in experiments using laser-cooled and -trapped atoms. The code is based on semiclassical theories of the forces exerted on atoms by magnetic and optical fields. Whereas computer codes developed previously for the same purpose account for only a few physical mechanisms, this code incorporates many more physical mechanisms (including atomic collisions, sub-Doppler cooling mechanisms, Stark and Zeeman energy shifts, gravitation, and evanescent-wave phenomena) that affect laser-matter interactions and the cooling of atoms to submillikelvin temperatures. Moreover, whereas the prior codes can simulate the interactions of at most a few atoms with a resonant light field, the number of atoms that can be included in a simulation by the present code is limited only by computer memory. Hence, the present code represents more nearly completely the complex physics involved when using laser-cooled and -trapped atoms in engineering applications. Another advantage of the code is the ability to analyze the interaction between cold atoms of different atomic number. Some properties of cold atoms of different atomic species, such as their cross sections and the particular excited states they can occupy when interacting with each other and with light fields, play important but not yet completely understood roles in the new experiments that are under way in laboratories worldwide to form ultracold molecules. Other research efforts use cold atoms as holders of quantum information, and more recent developments in cavity quantum electrodynamics also use ultracold atoms to explore and expand new information-technology ideas. These experiments hint at the wide range of applications and technology developments that can be tackled using cold atoms and light fields. From more precise atomic clocks and gravity sensors to the development of quantum computers, there will be a need to completely understand the whole ensemble of physical mechanisms that play a role in the development of such technologies. The code also permits the study of the dynamic and steady-state operations of technologies that use cold atoms. The physical characteristics of lasers and fields can be time-controlled to give a realistic simulation of the processes involved, such that the design process can determine the best control features to use. With the features incorporated into it, the code is expected to become a tool for the useful application of ultracold atoms in engineering. Currently, the software is being used for the analysis and understanding of simple experiments using cold atoms, and for the design of a modular compact source of cold atoms to be used in future research and development projects. The results so far indicate that the code is a useful design instrument that shows good agreement with experimental measurements (see figure), and a Windows-based user-friendly interface is also under development.
Wang, R; Li, X A
2001-02-01
The dose parameters for the beta-particle emitting 90Sr/90Y source for intravascular brachytherapy (IVBT) have been calculated by different investigators. At larger distances from the source, noticeable differences are seen in these parameters calculated using different Monte Carlo codes. The purpose of this work is to quantify as well as to understand these differences. We have compared a series of calculations using the EGS4, EGSnrc, and MCNP Monte Carlo codes. Data calculated and compared include the depth dose curve for a broad parallel beam of electrons, and radial dose distributions for point electron sources (monoenergetic or polyenergetic) and for a real 90Sr/90Y source. For the 90Sr/90Y source, the doses at the reference position (2 mm radial distance) calculated by the three codes agree within 2%. However, the doses calculated by the three codes can differ by over 20% in the radial distance range of interest in IVBT. The difference increases with radial distance from the source, reaching 30% at the tail of the dose curve. These differences may be partially attributed to the different multiple scattering theories and Monte Carlo models for electron transport adopted in these three codes. Doses calculated by the EGSnrc code are more accurate than those by the EGS4. The two calculations agree within 5% for radial distance <6 mm.
Development of an analytical-numerical model to predict radiant emission or absorption
NASA Technical Reports Server (NTRS)
Wallace, Tim L.
1994-01-01
The development of an analytical-numerical model to predict radiant emission or absorption is discussed. A Voigt profile is assumed to predict the spectral qualities of a singlet atomic transition line for atomic species of interest to the OPAD program. The present state of this model is described in each progress report required under contract. Model and code development is guided by experimental data where available. When completed, the model will be used to provide estimates of species erosion rates from spectral data collected from rocket exhaust plumes or other sources.
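For context, a Voigt profile is the convolution of Gaussian (Doppler) and Lorentzian broadening and can be evaluated through the Faddeeva function, as in the generic sketch below; the parameter values are arbitrary and this is not the OPAD model code.

# Sketch of a Voigt line profile evaluated via the Faddeeva function. Generic
# illustration only; the line center and widths below are arbitrary choices.
import numpy as np
from scipy.special import wofz

def voigt(x, x0, sigma, gamma):
    """Area-normalized Voigt profile centered at x0.
    sigma: Gaussian standard deviation; gamma: Lorentzian half-width (HWHM)."""
    z = ((x - x0) + 1j * gamma) / (sigma * np.sqrt(2.0))
    return np.real(wofz(z)) / (sigma * np.sqrt(2.0 * np.pi))

wavelengths = np.linspace(588.0, 590.0, 1001)           # nm, e.g. around a Na D line
profile = voigt(wavelengths, 589.0, sigma=0.02, gamma=0.01)
print(np.trapz(profile, wavelengths))                    # ~1, confirming normalization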
Test-Case Generation using an Explicit State Model Checker Final Report
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Gao, Jimin
2003-01-01
In the project 'Test-Case Generation using an Explicit State Model Checker' we have extended an existing tools infrastructure for formal modeling to export Java code so that we can use the NASA Ames tool Java Pathfinder (JPF) for test case generation. We have completed a translator from our source language RSML^-e to Java and conducted initial studies of how JPF can be used as a testing tool. In this final report, we provide a detailed description of the translation approach as implemented in our tools.
Kim, Daehee; Kim, Dongwan; An, Sunshin
2016-07-09
Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption.
Kim, Daehee; Kim, Dongwan; An, Sunshin
2016-01-01
Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption. PMID:27409616
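For background on lightweight source authentication in WSNs, the sketch below shows a generic one-way hash chain with delayed key disclosure in the spirit of μTESLA; it is not one of the three schemes proposed in the paper and ignores packet-size adaptation entirely.

# Generic hash-chain source-authentication sketch (muTESLA-style delayed key
# disclosure). Purely illustrative background, not the paper's proposed schemes.
import hashlib, hmac, os

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

# Base station: build a one-way key chain K_n -> ... -> K_0 and publish K_0.
n = 4
chain = [os.urandom(32)]
for _ in range(n):
    chain.append(H(chain[-1]))
chain.reverse()            # chain[0] = K_0 (commitment), chain[i] = interval-i key
commitment = chain[0]

# Interval i: authenticate a code-image packet with K_i, disclose K_i later.
i, packet = 2, b"code image fragment #17"
tag = hmac.new(chain[i], packet, hashlib.sha256).digest()

# Sensor node: on key disclosure, verify K_i against the commitment, then the MAC.
def verify(packet, tag, disclosed_key, interval, commitment):
    k = disclosed_key
    for _ in range(interval):
        k = H(k)
    return k == commitment and hmac.compare_digest(
        hmac.new(disclosed_key, packet, hashlib.sha256).digest(), tag)

print(verify(packet, tag, chain[i], i, commitment))   # True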
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santos-Villalobos, Hector J; Gregor, Jens; Bingham, Philip R
2014-01-01
At the present, neutron sources cannot be fabricated small and powerful enough in order to achieve high resolution radiography while maintaining an adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded-mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes and the flux is increased by increasing the size of the coded-mask and/or the number of holes. One limitation of such system is that the resolution of current state-of-the-art scintillator-based detectors caps around 50um. Tomore » overcome this challenge, the coded-mask and object are magnified by making the distance from the coded-mask to the object much smaller than the distance from object to detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of the modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.« less
Streamlined Genome Sequence Compression using Distributed Source Coding
Wang, Shuang; Jiang, Xiaoqian; Chen, Feng; Cui, Lijuan; Cheng, Samuel
2014-01-01
We aim at developing a streamlined genome sequence compression algorithm to support alternative miniaturized sequencing devices, which have limited communication, storage, and computation power. Existing techniques that require heavy client (encoder side) cannot be applied. To tackle this challenge, we carefully examined distributed source coding theory and developed a customized reference-based genome compression protocol to meet the low-complexity need at the client side. Based on the variation between source and reference, our protocol will pick adaptively either syndrome coding or hash coding to compress subsequences of changing code length. Our experimental results showed promising performance of the proposed method when compared with the state-of-the-art algorithm (GRS). PMID:25520552
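The syndrome-coding half of such a protocol follows the classical Slepian-Wolf construction; the minimal sketch below uses a Hamming(7,4) parity-check matrix to convey a 7-bit block as a 3-bit syndrome when the decoder's reference differs in at most one position. The block size, code, and variable names are illustrative; the actual protocol adaptively switches between syndrome and hash coding over subsequences of changing length.

# Minimal syndrome-coding (Slepian-Wolf) sketch with a Hamming(7,4) parity-check
# matrix: the encoder sends only the 3-bit syndrome; the decoder, holding a
# reference that differs in at most one bit, recovers the original block.
import numpy as np

H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])          # column j is j written in binary

def encode(x):                                  # 3-bit syndrome instead of 7 bits
    return H @ x % 2

def decode(syndrome, reference):
    s = (syndrome + H @ reference) % 2          # = H (x XOR reference)
    e = np.zeros(7, dtype=int)
    pos = int(s[0] + 2 * s[1] + 4 * s[2])       # position of the single difference
    if pos:
        e[pos - 1] = 1
    return (reference + e) % 2

x   = np.array([1, 0, 1, 1, 0, 0, 1])           # source block (e.g., read bits)
ref = np.array([1, 0, 1, 0, 0, 0, 1])           # reference block, one mismatch
assert np.array_equal(decode(encode(x), ref), x)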
The FORTRAN static source code analyzer program (SAP) system description
NASA Technical Reports Server (NTRS)
Decker, W.; Taylor, W.; Merwarth, P.; Oneill, M.; Goorevich, C.; Waligora, S.
1982-01-01
A source code analyzer program (SAP) designed to assist personnel in conducting studies of FORTRAN programs is described. The SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. The processing performed by SAP, as well as the routines, COMMON blocks, and files used by SAP, is described. The system generation procedure for SAP is also presented.
Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim
2016-01-01
Objective Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results When compared with the best individual prediction source, late data integration leads to improvements in predictive power (eg, overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Discussion Structured data provides complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. Conclusions We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458
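The late data integration approach described above can be sketched as stacking: one classifier per data source, with out-of-fold predicted probabilities fed to a meta-learner. The sketch below uses random data and logistic regression purely for illustration; it is not the paper's feature set, learners, or evaluation.

# Sketch of late data integration (stacking): per-source models plus a meta-learner
# combining their predicted probabilities. Data and model choices are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 500
X_struct = rng.normal(size=(n, 20))      # structured EHR features (e.g., lab codes)
X_text = rng.normal(size=(n, 100))       # unstructured features (e.g., tf-idf of notes)
y = rng.integers(0, 2, size=n)           # presence of one ICD-9-CM code

# Per-source models; out-of-fold predictions avoid leaking labels to the meta-learner.
p_struct = cross_val_predict(LogisticRegression(max_iter=1000), X_struct, y,
                             cv=5, method="predict_proba")[:, 1]
p_text = cross_val_predict(LogisticRegression(max_iter=1000), X_text, y,
                           cv=5, method="predict_proba")[:, 1]

meta_X = np.column_stack([p_struct, p_text])
meta = LogisticRegression().fit(meta_X, y)     # late-integration meta-learner
print(meta.coef_)                              # weight given to each source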
Implant for in-vivo parameter monitoring, processing and transmitting
Ericson, Milton N [Knoxville, TN]; McKnight, Timothy E [Greenback, TN]; Smith, Stephen F [London, TN]; Hylton, James O [Clinton, TN]
2009-11-24
The present invention relates to a completely implantable intracranial pressure monitor, which can couple to existing fluid shunting systems as well as other internal monitoring probes. The implant sensor produces an analog data signal which is then converted electronically to a digital pulse by generation of a spreading code signal and then transmitted to a location outside the patient by a radio-frequency transmitter to an external receiver. The implanted device can receive power from an internal source as well as an inductive external source. Remote control of the implant is also provided by a control receiver which passes commands from an external source to the implant system logic. Alarm parameters can be programmed into the device which are capable of producing an audible or visual alarm signal. The utility of the monitor can be greatly expanded by using multiple pressure sensors simultaneously or by combining sensors of various physiological types.
Implantable device for in-vivo intracranial and cerebrospinal fluid pressure monitoring
Ericson, Milton N.; McKnight, Timothy E.; Smith, Stephen F.; Hylton, James O.
2003-01-01
The present invention relates to a completely implantable intracranial pressure monitor, which can couple to existing fluid shunting systems as well as other internal monitoring probes. The implant sensor produces an analog data signal which is then converted electronically to a digital pulse by generation of a spreading code signal and then transmitted to a location outside the patient by a radio-frequency transmitter to an external receiver. The implanted device can receive power from an internal source as well as an inductive external source. Remote control of the implant is also provided by a control receiver which passes commands from an external source to the implant system logic. Alarm parameters can be programmed into the device which are capable of producing an audible or visual alarm signal. The utility of the monitor can be greatly expanded by using multiple pressure sensors simultaneously or by combining sensors of various physiological types.
Entropy-Based Bounds On Redundancies Of Huffman Codes
NASA Technical Reports Server (NTRS)
Smyth, Padhraic J.
1992-01-01
Report presents extension of theory of redundancy of binary prefix code of Huffman type which includes derivation of variety of bounds expressed in terms of entropy of source and size of alphabet. Recent developments yielded bounds on redundancy of Huffman code in terms of probabilities of various components in source alphabet. In practice, redundancies of optimal prefix codes often closer to 0 than to 1.
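To connect the quantities appearing in such bounds, the sketch below builds a Huffman code for a small source and computes its average length, the source entropy, and their difference (the redundancy); the example distribution is arbitrary.

# Sketch relating the quantities in the bounds above: Huffman code lengths, average
# length, entropy, and redundancy (average length minus entropy) for a toy source.
import heapq
import numpy as np

def huffman_lengths(probs):
    # Each heap entry: (probability, tie-breaker, list of symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    count = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:
            lengths[sym] += 1               # every merge adds one bit to these symbols
        heapq.heappush(heap, (p1 + p2, count, s1 + s2))
        count += 1
    return lengths

p = np.array([0.4, 0.3, 0.2, 0.1])
L = huffman_lengths(p)
avg_len = float(np.dot(p, L))
entropy = float(-np.sum(p * np.log2(p)))
print(L, avg_len, entropy, avg_len - entropy)   # redundancy lies between 0 and 1 bit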
40 CFR Appendix A to Subpart A of... - Tables
Code of Federal Regulations, 2010 CFR
2010-07-01
[Table fragments listing required data elements for reporting emissions from nonpoint sources and nonroad mobile sources, where required by 40 CFR: contact name, contact phone number, FIPS code, facility ID codes, unit ID code, process ID code, inventory start date, and inventory end date.]
SOURCELESS STARTUP. A MACHINE CODE FOR COMPUTING LOW-SOURCE REACTOR STARTUPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacMillan, D.B.
1960-06-01
A revision to the sourceless start-up code is presented. The code solves a system of differential equations encountered in computing the probability distribution of activity at an observed power level during reactor start-up from a very low source level. (J.R.D.)
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2011 CFR
2011-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2012 CFR
2012-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2014 CFR
2014-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2010 CFR
2010-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar data produced for...
LDPC-based iterative joint source-channel decoding for JPEG2000.
Pu, Lingling; Wu, Zhenyu; Bilgin, Ali; Marcellin, Michael W; Vasic, Bane
2007-02-01
A framework is proposed for iterative joint source-channel decoding of JPEG2000 codestreams. At the encoder, JPEG2000 is used to perform source coding with certain error-resilience (ER) modes, and LDPC codes are used to perform channel coding. During decoding, the source decoder uses the ER modes to identify corrupt sections of the codestream and provides this information to the channel decoder. Decoding is carried out jointly in an iterative fashion. Experimental results indicate that the proposed method requires fewer iterations and improves overall system performance.
Primer development to obtain complete coding sequence of HA and NA genes of influenza A/H3N2 virus.
Agustiningsih, Agustiningsih; Trimarsanto, Hidayat; Setiawaty, Vivi; Artika, I Made; Muljono, David Handojo
2016-08-30
Influenza is an acute respiratory illness and has become a serious public health problem worldwide. Studying the HA and NA genes of influenza A virus is essential since these genes frequently undergo mutations. This study describes the development of primer sets for RT-PCR to obtain the complete coding sequences of the hemagglutinin (HA) and neuraminidase (NA) genes of influenza A/H3N2 virus from Indonesia. The primers were developed based on worldwide influenza A/H3N2 sequences from the Global Initiative on Sharing All Influenza Data (GISAID) and further tested using archived Indonesian influenza A/H3N2 samples from influenza-like illness (ILI) surveillance from 2008 to 2009. Optimal RT-PCR conditions were obtained for all HA and NA fragments designed to cover the complete coding sequences of the HA and NA genes. Complete coding sequences of both the HA and NA genes were successfully obtained for 71 of the 145 influenza A/H3N2 samples tested. The developed primer sets were suitable for obtaining complete coding sequences of the HA and NA genes of Indonesian samples from 2008 to 2009.
Lamb, Mary K; Innes, Kerry; Saad, Patricia; Rust, Julie; Dimitropoulos, Vera; Cumerlato, Megan
The Performance Indicators for Coding Quality (PICQ) is a data quality assessment tool developed by Australia's National Centre for Classification in Health (NCCH). PICQ consists of a number of indicators covering all ICD-10-AM disease chapters, some procedure chapters from the Australian Classification of Health Interventions (ACHI) and some Australian Coding Standards (ACS). The indicators can be used to assess the coding quality of hospital morbidity data by monitoring compliance with coding conventions and the ACS; this enables the identification of particular records that may be incorrectly coded, thus providing a measure of data quality. There are 31 obstetric indicators available for the ICD-10-AM Fourth Edition. Twenty of these 31 indicators were classified as Fatal, nine as Warning and two as Relative. These indicators were used to examine the coding quality of obstetric records in the 2004-2005 financial year Australian national hospital morbidity dataset. Records with obstetric disease or procedure codes listed anywhere in the code string were extracted and exported from the SPSS source file. Data were then imported into a Microsoft Access database table as per PICQ instructions, and run against all Fatal, Warning and Relative (N=31) obstetric PICQ 2006 Fourth Edition Indicators v.5 for the ICD-10-AM Fourth Edition. There were 689,905 gynaecological and obstetric records in the 2004-2005 financial year, of which 1.14% were found to have triggered Fatal degree errors, 3.78% Warning degree errors and 8.35% Relative degree errors. The types of errors include completeness, redundancy, specificity and sequencing problems. It was found that PICQ is a useful initial screening tool for the assessment of ICD-10-AM/ACHI coding quality. Overall, the codes assigned to obstetric records in the 2004-2005 Australian national morbidity dataset are of fair quality.
Validity of data in the Danish Colorectal Cancer Screening Database.
Thomsen, Mette Kielsholm; Njor, Sisse Helle; Rasmussen, Morten; Linnemann, Dorte; Andersen, Berit; Baatrup, Gunnar; Friis-Hansen, Lennart Jan; Jørgensen, Jens Christian Riis; Mikkelsen, Ellen Margrethe
2017-01-01
In Denmark, a nationwide screening program for colorectal cancer was implemented in March 2014. Along with this, a clinical database for program monitoring and research purposes was established. The aim of this study was to estimate the agreement and validity of diagnosis and procedure codes in the Danish Colorectal Cancer Screening Database (DCCSD). All individuals with a positive immunochemical fecal occult blood test (iFOBT) result who were invited to screening in the first 3 months since program initiation were identified. From these, a sample of 150 individuals was selected using stratified random sampling by age, gender and region of residence. Data from the DCCSD were compared with data from hospital records, which were used as the reference. Agreement, sensitivity, specificity and positive and negative predictive values were estimated for categories of codes "clean colon", "colonoscopy performed", "overall completeness of colonoscopy", "incomplete colonoscopy", "polypectomy", "tumor tissue left behind", "number of polyps", "lost polyps", "risk group of polyps" and "colorectal cancer and polyps/benign tumor". Hospital records were available for 136 individuals. Agreement was highest for "colorectal cancer" (97.1%) and lowest for "lost polyps" (88.2%). Sensitivity varied between moderate and high, with 60.0% for "incomplete colonoscopy" and 98.5% for "colonoscopy performed". Specificity was 92.7% or above, except for the categories "colonoscopy performed" and "overall completeness of colonoscopy", where the specificity was low; however, the estimates were imprecise. A high level of agreement between categories of codes in DCCSD and hospital records indicates that DCCSD reflects the hospital records well. Further, the validity of the categories of codes varied from moderate to high. Thus, the DCCSD may be a valuable data source for future research on colorectal cancer screening.
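For readers reproducing this kind of registry validation, the measures reported above (agreement, sensitivity, specificity and predictive values) follow directly from a two-by-two comparison of registry codes against the hospital-record reference. The minimal sketch below is generic and assumes nothing about the DCCSD beyond that cross-tabulation; the counts in the usage comment are invented.

    # Validity measures from a 2x2 table: tp/fp/fn/tn are counts of registry codes
    # versus the hospital-record reference for one category of codes.
    def validity_measures(tp, fp, fn, tn):
        total = tp + fp + fn + tn
        return {
            "agreement":   (tp + tn) / total,
            "sensitivity": tp / (tp + fn) if (tp + fn) else None,
            "specificity": tn / (tn + fp) if (tn + fp) else None,
            "ppv":         tp / (tp + fp) if (tp + fp) else None,
            "npv":         tn / (tn + fn) if (tn + fn) else None,
        }

    # Example with invented counts: validity_measures(60, 4, 6, 66)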
Xu, Wen-Wen; Qiu, Jian-Hua; Liu, Guo-Hua; Zhang, Yan; Liu, Ze-Xuan; Duan, Hong; Yue, Dong-Mei; Chang, Qiao-Cheng; Wang, Chun-Ren; Zhao, Xing-Cun
2015-12-01
The roundworms of the genus Strongylus are common parasitic nematodes in the large intestine of equines, causing significant economic losses to the livestock industries. In spite of its importance, the genetic data and epidemiology of this parasite are not entirely understood. In the present study, the complete S. equinus mitochondrial (mt) genome was determined. The S. equinus mt genome DNA sequence is 14,545 bp in length, containing 36 genes, of which 12 code for proteins, 22 for transfer RNAs, and two for ribosomal RNAs, but it lacks the atp8 gene. All 36 genes are encoded in the same direction, which is consistent with all other Chromadorea nematode mtDNAs published to date. Phylogenetic analysis based on concatenated amino acid sequence data of all 12 protein-coding genes showed that there were two large branches in the Strongyloidea nematodes, and S. equinus is genetically closer to S. vulgaris than to Cylicocyclus insignis in Strongylidae. This new mt genome provides a source of genetic markers for the molecular phylogeny and population genetics of equine strongyles. Copyright © 2015 Elsevier Inc. All rights reserved.
EPA Office of Water (OW): 12-digit Hydrologic Unit Boundaries of the United States
The Watershed Boundary Dataset (WBD) is a complete digital hydrologic unit national boundary layer at the Subwatershed (12-digit) level. It is composed of the watershed boundaries delineated by state agencies at the 1:24,000 scale. Please refer to the individual state metadata as the primary reference source. To access state-specific metadata, go to the following link to view documentation created by agencies that performed the watershed delineation. This data set is a complete digital hydrologic unit boundary layer to the Subwatershed (12-digit) 6th level. This data set consists of geo-referenced digital data and associated attributes created in accordance with the FGDC Proposal, Version 1.0 - Federal Standards For Delineation of Hydrologic Unit Boundaries 3/01/02. Polygons are attributed with hydrologic unit codes for 4th level sub-basins, 5th level watersheds, 6th level subwatersheds, name, size, downstream hydrologic unit, type of watershed, non-contributing areas and flow modification. Arcs are attributed with the highest hydrologic unit code for each watershed, linesource and a metadata reference file. Please refer to the Metadata contact if you want access to the WBD national data set.
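Because 12-digit hydrologic unit codes are hierarchical, the coarser units carried in the attribute table (4th-level sub-basin, 5th-level watershed, 6th-level subwatershed) can be read off as prefixes of the 12-digit code. The sketch below illustrates that relationship; the dictionary keys and the example code are illustrative, not field names from the WBD schema.

    # Split a 12-digit hydrologic unit code into its hierarchical levels.
    def huc_hierarchy(huc12: str) -> dict:
        assert len(huc12) == 12 and huc12.isdigit()
        return {
            "subbasin_huc8":      huc12[:8],   # 4th level sub-basin
            "watershed_huc10":    huc12[:10],  # 5th level watershed
            "subwatershed_huc12": huc12,       # 6th level subwatershed
        }

    # huc_hierarchy("180201251003")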
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2013 CFR
2013-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... funds; (ii) Studies, analyses, test data, or similar data produced for this contract, when the study...
SolTrace | Concentrating Solar Power | NREL
SolTrace is available as an NREL packaged distribution or from source code at the SolTrace open source project website. The code uses Monte-Carlo ray-tracing methodology.
Complete Decoding and Reporting of Aviation Routine Weather Reports (METARs)
NASA Technical Reports Server (NTRS)
Lui, Man-Cheung Max
2014-01-01
Aviation Routine Weather Report (METAR) provides surface weather information at and around observation stations, including airport terminals. These weather observations are used by pilots for flight planning and by air traffic service providers for managing departure and arrival flights. The METARs are also an important source of weather data for Air Traffic Management (ATM) analysts and researchers at NASA and elsewhere. These researchers use METAR to correlate severe weather events with local or national air traffic actions that restrict air traffic, as one example. A METAR is made up of multiple groups of coded text, each with a specific standard coding format. These groups of coded text are located in two sections of a report: Body and Remarks. The coded text groups in a U.S. METAR are intended to follow the coding standards set by the National Oceanic and Atmospheric Administration (NOAA). However, manual data entry and edits made by a human report observer may result in coded text elements that do not follow the standards, especially in the Remarks section. And contrary to the standards, some significant weather observations are noted only in the Remarks section and not in the Body section of the reports. While human readers can infer the intended meaning of non-standard coding of weather conditions, doing so with a computer program is far more challenging. However, such programmatic pre-processing is necessary to enable efficient and fast database queries when researchers need to perform any significant historical weather analysis. Therefore, to support such analysis, a computer algorithm was developed to identify groups of coded text anywhere in a report and to perform subsequent decoding in software. The algorithm considers common deviations from the standards and data entry mistakes made by observers. The implemented software code was tested to decode 12 million reports and the decoding process was able to completely interpret 99.93% of the reports. This document presents the deviations from the standards and the decoding algorithm. Storing all decoded data in a database allows users to quickly query a large amount of data and to perform data mining on the data. Users can specify complex query criteria not only on date or airport but also on weather condition. This document also describes the design of a database schema for storing the decoded data, and a Data Warehouse web application that allows users to perform reporting and analysis on the decoded data. Finally, this document presents a case study correlating dust storms reported in METARs from the Phoenix International airport with Ground Stops issued by Air Route Traffic Control Centers (ATCSCC). Blowing widespread dust is one of the weather conditions reported when a dust storm occurs. By querying the database, 294 METARs were found to report blowing widespread dust at the Phoenix airport and 41 of them reported this condition only in the Remarks section of the reports. When METAR is a data source for ATM research, it is important to include weather conditions not only from the Body section but also from the Remarks section of METARs.
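As an illustration of the group-by-group pattern matching such a decoder performs, the sketch below handles only the wind group and tolerates one common deviation (a missing unit). It is a toy example, not the NASA algorithm or its actual patterns, and real reports require many more group-specific patterns plus separate handling of the Remarks section.

    import re

    # One illustrative METAR group pattern: wind direction/speed/gust, e.g. "24015G25KT".
    WIND = re.compile(r"^(?P<dir>\d{3}|VRB)(?P<spd>\d{2,3})(G(?P<gust>\d{2,3}))?(?P<unit>KT|MPS)?$")

    def decode_wind(group: str):
        m = WIND.match(group)
        if not m:
            return None                       # not a wind group; hand off to the next decoder
        return {
            "direction": m.group("dir"),
            "speed": int(m.group("spd")),
            "gust": int(m.group("gust")) if m.group("gust") else None,
            "unit": m.group("unit") or "KT",  # tolerate a missing unit, one common deviation
        }

    # decode_wind("24015G25KT") -> {'direction': '240', 'speed': 15, 'gust': 25, 'unit': 'KT'}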
Building a Snow Data Management System using Open Source Software (and IDL)
NASA Astrophysics Data System (ADS)
Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.
2012-12-01
At NASA's Jet Propulsion Laboratory free and open source software is used every day to support a wide range of projects, from planetary to climate to research and development. In this abstract I will discuss the key role that open source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open source and proprietary software. Main Points: - The Design of the Snow Data System (illustrate how the collection of sub-systems is combined to create a complete data processing pipeline) - Discuss the Challenges of moving from a single algorithm on a laptop, to running hundreds of parallel algorithms on a cluster of servers (lessons learned) - Code changes - Software license related challenges - Storage Requirements - System Evolution (from data archiving, to data processing, to data on a map, to near-real-time products and maps) - Road map for the next 6 months (including how easily we re-used the snowDS code base to support the Airborne Snow Observatory Mission) Software in Use and their Software Licenses: IDL - Used for pre- and post-processing of data. Licensed under a proprietary software license held by Excelis. Apache OODT - Used for data management and workflow processing. Licensed under the Apache License Version 2. GDAL - Geospatial Data processing library used for data re-projection currently. Licensed under the X/MIT license. GeoServer - WMS Server. Licensed under the General Public License Version 2.0. Leaflet.js - JavaScript web mapping library. Licensed under the Berkeley Software Distribution License. Python - Glue code and miscellaneous data processing support. Licensed under the Python Software Foundation License. Perl - Script wrapper for running the SCAG algorithm. Licensed under the General Public License Version 3. PHP - Front-end web application programming. Licensed under the PHP License Version 3.01
Fast GPU-based Monte Carlo simulations for LDR prostate brachytherapy.
Bonenfant, Éric; Magnoux, Vincent; Hissoiny, Sami; Ozell, Benoît; Beaulieu, Luc; Després, Philippe
2015-07-07
The aim of this study was to evaluate the potential of bGPUMCD, a Monte Carlo algorithm executed on Graphics Processing Units (GPUs), for fast dose calculations in permanent prostate implant dosimetry. It also aimed to validate a low dose rate brachytherapy source in terms of TG-43 metrics and to use this source to compute dose distributions for permanent prostate implant in very short times. The physics of bGPUMCD was reviewed and extended to include Rayleigh scattering and fluorescence from photoelectric interactions for all materials involved. The radial and anisotropy functions were obtained for the Nucletron SelectSeed in TG-43 conditions. These functions were compared to those found in the MD Anderson Imaging and Radiation Oncology Core brachytherapy source registry, which are considered the TG-43 reference values. After appropriate calibration of the source, permanent prostate implant dose distributions were calculated for four patients and compared to an already validated Geant4 algorithm. The radial function calculated from bGPUMCD showed excellent agreement (differences within 1.3%) with TG-43 accepted values. The anisotropy functions at r = 1 cm and r = 4 cm were within 2% of TG-43 values for angles over 17.5°. For permanent prostate implants, Monte Carlo-based dose distributions with a statistical uncertainty of 1% or less for the target volume were obtained in 30 s or less for 1 × 1 × 1 mm3 calculation grids. Dosimetric indices were very similar (within 2.7%) to those obtained with a validated, independent Monte Carlo code (Geant4) performing the calculations for the same cases in a much longer time (tens of minutes to more than an hour). bGPUMCD is a promising code that lets one envision the use of Monte Carlo techniques in a clinical environment, with sub-minute execution times on a standard workstation. Future work will explore the use of this code with an inverse planning method to provide a complete Monte Carlo-based planning solution.
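For orientation, the radial dose function gL(r) and anisotropy function F(r,θ) discussed above enter the standard AAPM TG-43 line-source dose-rate formalism, reproduced here from the general literature rather than from this paper:

    \dot{D}(r,\theta) = S_K \,\Lambda\, \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta),
    \qquad r_0 = 1\ \text{cm}, \quad \theta_0 = 90^\circ

where S_K is the source air-kerma strength, Λ the dose-rate constant, and G_L the line-source geometry function.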
Fast GPU-based Monte Carlo simulations for LDR prostate brachytherapy
NASA Astrophysics Data System (ADS)
Bonenfant, Éric; Magnoux, Vincent; Hissoiny, Sami; Ozell, Benoît; Beaulieu, Luc; Després, Philippe
2015-07-01
The aim of this study was to evaluate the potential of bGPUMCD, a Monte Carlo algorithm executed on Graphics Processing Units (GPUs), for fast dose calculations in permanent prostate implant dosimetry. It also aimed to validate a low dose rate brachytherapy source in terms of TG-43 metrics and to use this source to compute dose distributions for permanent prostate implant in very short times. The physics of bGPUMCD was reviewed and extended to include Rayleigh scattering and fluorescence from photoelectric interactions for all materials involved. The radial and anisotropy functions were obtained for the Nucletron SelectSeed in TG-43 conditions. These functions were compared to those found in the MD Anderson Imaging and Radiation Oncology Core brachytherapy source registry, which are considered the TG-43 reference values. After appropriate calibration of the source, permanent prostate implant dose distributions were calculated for four patients and compared to an already validated Geant4 algorithm. The radial function calculated from bGPUMCD showed excellent agreement (differences within 1.3%) with TG-43 accepted values. The anisotropy functions at r = 1 cm and r = 4 cm were within 2% of TG-43 values for angles over 17.5°. For permanent prostate implants, Monte Carlo-based dose distributions with a statistical uncertainty of 1% or less for the target volume were obtained in 30 s or less for 1 × 1 × 1 mm3 calculation grids. Dosimetric indices were very similar (within 2.7%) to those obtained with a validated, independent Monte Carlo code (Geant4) performing the calculations for the same cases in a much longer time (tens of minutes to more than an hour). bGPUMCD is a promising code that lets one envision the use of Monte Carlo techniques in a clinical environment, with sub-minute execution times on a standard workstation. Future work will explore the use of this code with an inverse planning method to provide a complete Monte Carlo-based planning solution.
Neutrons Flux Distributions of the Pu-Be Source and its Simulation by the MCNP-4B Code
NASA Astrophysics Data System (ADS)
Faghihi, F.; Mehdizadeh, S.; Hadad, K.
The neutron fluence rate of a low-intensity Pu-Be source is measured by Neutron Activation Analysis (NAA) of 197Au foils. The neutron fluence rate distribution versus energy is also calculated using the MCNP-4B code based on the ENDF/B-V library. Both the theoretical simulation and the experimental measurements are a new experience for Iranian researchers, establishing confidence in the code for further research. In the theoretical investigation, an isotropic Pu-Be source with a cylindrical volume distribution is simulated and the relative neutron fluence rate versus energy is calculated using the MCNP-4B code. The variations of the fast and thermal neutron fluence rates obtained by the NAA method and by the MCNP code are compared.
Standardizing clinical laboratory data for secondary use.
Abhyankar, Swapna; Demner-Fushman, Dina; McDonald, Clement J
2012-08-01
Clinical databases provide a rich source of data for answering clinical research questions. However, the variables recorded in clinical data systems are often identified by local, idiosyncratic, and sometimes redundant and/or ambiguous names (or codes) rather than unique, well-organized codes from standard code systems. This reality discourages research use of such databases, because researchers must invest considerable time in cleaning up the data before they can ask their first research question. Researchers at MIT developed MIMIC-II, a nearly complete collection of clinical data about intensive care patients. Because its data are drawn from existing clinical systems, it has many of the problems described above. In collaboration with the MIT researchers, we have begun a process of cleaning up the data and mapping the variable names and codes to LOINC codes. Our first step, which we describe here, was to map all of the laboratory test observations to LOINC codes. We were able to map 87% of the unique laboratory tests that cover 94% of the total number of laboratory test results. Of the 13% of tests that we could not map, nearly 60% were due to test names whose real meaning could not be discerned and 29% represented tests that were not yet included in the LOINC table. These results suggest that LOINC codes cover most of the laboratory tests used in critical care. We have delivered this work to the MIMIC-II researchers, who have included it in their standard MIMIC-II database release so that researchers who use this database in the future will not have to do this work. Published by Elsevier Inc.
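As a concrete illustration of the kind of cleanup involved, the sketch below normalizes an idiosyncratic local test name and looks it up in a manually curated table. Everything in it is a hypothetical placeholder: the abbreviation list, the local names, and the 'LOINC-A'/'LOINC-B' identifiers stand in for a real curated mapping and real LOINC codes.

    import re

    ABBREVIATIONS = {"hgb": "hemoglobin", "gluc": "glucose"}   # hypothetical expansions
    CURATED_MAP = {
        "hemoglobin blood": "LOINC-A",                         # placeholder, not a real LOINC code
        "glucose serum":    "LOINC-B",                         # placeholder, not a real LOINC code
    }

    def normalize(local_name: str) -> str:
        tokens = re.sub(r"[^a-z0-9 ]", " ", local_name.lower()).split()
        return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

    def map_to_loinc(local_name: str):
        return CURATED_MAP.get(normalize(local_name))          # None means unmapped, needs manual review

    # map_to_loinc("HGB (Blood)") -> 'LOINC-A'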
Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei
2009-03-01
Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to reduce the sum rate relative to independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence is generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.
Vibro-Acoustics Modal Testing at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Pappa, Richard S.; Pritchard, Jocelyn I.; Buehrle, Ralph D.
1999-01-01
This paper summarizes on-going modal testing activities at the NASA Langley Research Center for two aircraft fuselage structures: a generic "aluminum testbed cylinder" (ATC) and a Beechcraft Starship fuselage (BSF). Subsequent acoustic tests will measure the interior noise field created by exterior mechanical and acoustic sources. These test results will provide validation databases for interior noise prediction codes on realistic aircraft fuselage structures. The ATC is a 12-ft-long, all-aluminum, scale model assembly. The BSF is a 40-ft-long, all-composite, complete aircraft fuselage. To date, two of seven test configurations of the ATC and all three test configurations of the BSF have been completed. The paper briefly describes the various test configurations, testing procedure, and typical results for frequencies up to 250 Hz.
Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S
2016-03-08
Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes.
SinEx DB: a database for single exon coding sequences in mammalian genomes.
Jorquera, Roddy; Ortiz, Rodrigo; Ossandon, F; Cárdenas, Juan Pablo; Sepúlveda, Rene; González, Carolina; Holmes, David S
2016-01-01
Eukaryotic genes are typically interrupted by intragenic, noncoding sequences termed introns. However, some genes lack introns in their coding sequence (CDS) and are generally known as 'single exon genes' (SEGs). In this work, a SEG is defined as a nuclear, protein-coding gene that lacks introns in its CDS. Whereas many public databases of eukaryotic multi-exon genes are available, there are only two specialized databases for SEGs. The present work addresses the need for a more extensive and diverse database by creating SinEx DB, a publicly available, searchable database of predicted SEGs from 10 completely sequenced mammalian genomes including human. SinEx DB houses the DNA and protein sequence information of these SEGs and includes their functional predictions (KOG) and the relative distribution of these functions within species. The information is stored in a relational database built with MySQL Server 5.1.33, and the complete dataset of SEG sequences and their functional predictions is available for download. SinEx DB can be interrogated by: (i) a browsable phylogenetic schema, (ii) carrying out BLAST searches against the in-house SinEx DB of SEGs and (iii) via an advanced search mode in which the database can be searched by key words and any combination of searches by species and predicted functions. SinEx DB provides a rich source of information for advancing our understanding of the evolution and function of SEGs. Database URL: www.sinex.cl. © The Author(s) 2016. Published by Oxford University Press.
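The advanced search mode described above combines a keyword with species and predicted function. The snippet below sketches such a combined query; the SQLite file name, table, and column names are invented for illustration and are not the actual SinEx DB schema.

    import sqlite3

    def search_segs(db_path, keyword, species, kog_class):
        # Hypothetical schema: single_exon_genes(gene_id, species, kog_class, description)
        con = sqlite3.connect(db_path)
        rows = con.execute(
            """SELECT gene_id, species, kog_class, description
               FROM single_exon_genes
               WHERE description LIKE ? AND species = ? AND kog_class = ?""",
            (f"%{keyword}%", species, kog_class),
        ).fetchall()
        con.close()
        return rows

    # search_segs("sinex.sqlite", "histone", "Homo sapiens", "Chromatin structure and dynamics")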
A Theory of False Cognitive Expectancies in Airline Pilots
NASA Astrophysics Data System (ADS)
Cortes, Antonio I.
The Theory of False Cognitive Expectancies was developed by studying high reliability flight operations. Airline pilots depend extensively on cognitive expectancies to perceive, understand, and predict actions and events. Out of 1,363 incident reports submitted by airline pilots to the National Aeronautics and Space Administration Aviation Safety Reporting System over a year's time, 110 reports were found to contain evidence of 127 false cognitive expectancies in pilots. A comprehensive taxonomy was developed with six categories of interest. The dataset of 127 false expectancies was used to initially code tentative taxon values for each category. Intermediate coding through constant comparative analysis completed the taxonomy. The taxonomy was used for the advanced coding of chronological context-dependent visualizations of expectancy factors, known as strands, which depict the major factors in the creation and propagation of each expectancy. Strands were mapped into common networks to detect highly represented expectancy processes. Theoretical integration established 11 sources of false expectancies, the most common expectancy errors, and those conspicuous factors worthy of future study. The most prevalent source of false cognitive expectancies within the dataset was determined to be unconscious individual modeling based on past events. Integrative analyses also revealed relationships between expectancies and flight deck automation, unresolved discrepancies, and levels of situation awareness. Particularly noteworthy were the findings that false expectancies can combine in three possible permutations to diminish situation awareness and examples of how false expectancies can be unwittingly transmitted from one person to another. The theory resulting from this research can enhance the error coding process used during aircraft line oriented safety audits, lays the foundation for developing expectancy management training programs, and will allow researchers to proffer hypotheses for human testing using flight simulators.
NASA Astrophysics Data System (ADS)
Nyland, Kristina
2017-01-01
Although our knowledge of the physics of galaxy evolution has made great strides over the past few decades, we still lack a complete understanding of the formation and growth of galaxies at high redshift. The Spitzer Extragalactic Representative Volume Survey (SERVS) aims to address this issue through deep Spitzer observations at [3.6] and [4.5] microns of 4 million sources distributed over five well-studied “deep fields” with abundant ancillary data from ground-based near-infrared surveys. The large SERVS footprint covers 18 square degrees and will provide a census of the multiwavelength properties of massive galaxies in the redshift range z = 1-6. A critical aspect of the scientific success and legacy value of SERVS is the construction of a robust source catalog. While multiwavelength source catalogs of the SERVS fields have been generated using traditional techniques, the photometric accuracy of these catalogs is limited by their inability to correctly measure fluxes of individual sources that are blended and/or inherently faint in the IRAC bands. To improve upon this shortfall and maximize the scientific impact of SERVS, we are using The Tractor image modeling code to produce a more accurate and complete multiwavelength source catalog. The Tractor optimizes a likelihood for the source properties given an image cut-out, light profile model, and the PSF information. Thus, The Tractor uses the source properties at the fiducial, highest-resolution band as a prior to more accurately measure the source properties in the lower-resolution images at longer wavelengths. We provide an overview of our parallelized implementation of The Tractor, discuss the subsequent improvements to the SERVS photometry, and suggest future applications.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Classification of Instructional Program (CIP) code of that program; and (C) If the student completed a program during the award year— (1) The name and CIP code of that program, and the date the student completed the... program, by name and CIP code, offered by the institution under § 668.8(c)(3) or (d), the total number of...
Code of Federal Regulations, 2014 CFR
2014-07-01
... Classification of Instructional Program (CIP) code of that program; and (C) If the student completed a program during the award year— (1) The name and CIP code of that program, and the date the student completed the... program, by name and CIP code, offered by the institution under § 668.8(c)(3) or (d), the total number of...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Classification of Instructional Program (CIP) code of that program; and (C) If the student completed a program during the award year— (1) The name and CIP code of that program, and the date the student completed the... program, by name and CIP code, offered by the institution under § 668.8(c)(3) or (d), the total number of...
Code of Federal Regulations, 2012 CFR
2012-07-01
... Classification of Instructional Program (CIP) code of that program; and (C) If the student completed a program during the award year— (1) The name and CIP code of that program, and the date the student completed the... program, by name and CIP code, offered by the institution under § 668.8(c)(3) or (d), the total number of...
Remanent Activation in the Mini-SHINE Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Micklich, Bradley J.
2015-04-16
Argonne National Laboratory is assisting SHINE Medical Technologies in developing a domestic source of the medical isotope 99Mo through the fission of low-enrichment uranium in a uranyl sulfate solution. In Phase 2 of these experiments, electrons from a linear accelerator create neutrons by interacting in a depleted uranium target, and these neutrons are used to irradiate the solution. The resulting neutron and photon radiation activates the target, the solution vessels, and a shielded cell that surrounds the experimental apparatus. When the experimental campaign is complete, the target must be removed into a shielding cask, and the experimental components must be disassembled. The radiation transport code MCNPX and the transmutation code CINDER were used to calculate the radionuclide inventories of the solution, the target assembly, and the shielded cell, and to determine the dose rates and shielding requirements for selected removal scenarios for the target assembly and the solution vessels.
Development of Northeast Asia Nuclear Power Plant Accident Simulator.
Kim, Juyub; Kim, Juyoul; Po, Li-Chi Cliff
2017-06-15
A conclusion from the lessons learned after the March 2011 Fukushima Daiichi accident was that Korea needs a tool to estimate consequences from a major accident that could occur at a nuclear power plant located in a neighboring country. This paper describes a suite of computer-based codes to be used by Korea's nuclear emergency response staff for training and potentially for operational support in Korea's national emergency preparedness and response program. The system of codes, Northeast Asia Nuclear Accident Simulator (NANAS), consists of three modules: source-term estimation, atmospheric dispersion prediction and dose assessment. To quickly assess potential doses to the public in Korea, NANAS includes specific reactor data from the nuclear power plants in China, Japan and Taiwan. The completed simulator is demonstrated using data for a hypothetical release. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Using the Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, Alice; Teuben, P. J.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Wallin, J. F.
2013-01-01
The Astrophysics Source Code Library (ASCL) is a free on-line registry of source codes that are of interest to astrophysicists; with over 500 codes, it is the largest collection of scientist-written astrophysics programs in existence. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. An advisory committee formed in 2011 provides input and guides the development and expansion of the ASCL, and since January 2012, all accepted ASCL entries are indexed by ADS. Though software is increasingly important for the advancement of science in astrophysics, these methods are still often hidden from view or difficult to find. The ASCL (ascl.net/) seeks to improve the transparency and reproducibility of research by making these vital methods discoverable, and to provide recognition and incentive to those who write and release programs useful for astrophysics research. This poster provides a description of the ASCL, an update on recent additions, and the changes in the astrophysics community we are starting to see because of the ASCL.
McSKY: A hybrid Monte-Carlo line-beam code for shielded gamma skyshine calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shultis, J.K.; Faw, R.E.; Stedry, M.H.
1994-07-01
McSKY evaluates skyshine dose from an isotropic, monoenergetic, point photon source collimated into either a vertical cone or a vertical structure with an N-sided polygon cross section. The code assumes an overhead shield of two materials, though the user can specify zero shield thickness for an unshielded calculation. The code uses a Monte-Carlo algorithm to evaluate transport through source shields and the integral line source to describe photon transport through the atmosphere. The source energy must be between 0.02 and 100 MeV. For heavily shielded sources with energies above 20 MeV, McSKY results must be used cautiously, especially at detector locations near the source.
Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding
NASA Astrophysics Data System (ADS)
Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.
2016-03-01
In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, including both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines cooperation gain and channel coding gain and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than that of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
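As background to the girth-4 discussion above, a length-4 cycle exists in the Tanner graph of a binary parity-check matrix exactly when two of its rows have ones in two or more common columns. The generic check below illustrates that criterion on an expanded parity-check matrix; it is not the authors' joint construction, which works at the level of the exponent matrix.

    from itertools import combinations

    def has_girth_4(H):
        # H is a binary parity-check matrix given as a list of 0/1 rows.
        supports = [frozenset(j for j, v in enumerate(row) if v) for row in H]
        return any(len(a & b) >= 2 for a, b in combinations(supports, 2))

    # Rows 0 and 1 below share columns 0 and 2, so this matrix contains a 4-cycle.
    H = [[1, 0, 1, 0],
         [1, 1, 1, 0],
         [0, 1, 0, 1]]
    # has_girth_4(H) -> True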
Performance and Architecture Lab Modeling Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-06-19
Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model -- an executable program -- is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.
Suppression of Alfvénic modes with off-axis NBI
NASA Astrophysics Data System (ADS)
Fredrickson, Eric; Bell, R.; Diallo, A.; Leblanc, B.; Podesta, M.; Levinton, F.; Yuh, H.; Liu, D.
2016-10-01
GAE are seen on NSTX-U in the frequency range from 1 to 3 MHz with injection of the more perpendicular, NSTX neutral beam sources. A new result is that injection of any of the new, more tangential, neutral beam sources with tangency radii larger than the magnetic axis suppresses this GAE activity. Simulations of beam deposition and slowing down with the TRANSP code indicate that these new sources deposit fast ions with 0.9
Chromaticity calculations and code comparisons for x-ray lithography source XLS and SXLS rings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parsa, Z.
1988-06-16
This note presents the chromaticity calculations and code comparison results for the (x-ray lithography source) XLS (Chasman Green, XUV Cosy lattice) and (2 magnet 4T) SXLS lattices, with the standard beam optic codes, including programs SYNCH88.5, MAD6, PATRICIA88.4, PATPET88.2, DIMAD, BETA, and MARYLIE. This analysis is a part of our ongoing accelerator physics code studies. 4 figs., 10 tabs.
The Astrophysics Source Code Library: Where Do We Go from Here?
NASA Astrophysics Data System (ADS)
Allen, A.; Berriman, B.; DuPrie, K.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P.; Wallen, J.
2014-05-01
The Astrophysics Source Code Library, started in 1999, has in the past three years grown from a repository for 40 codes to a registry of over 700 codes that are now indexed by ADS. What comes next? We examine the future of the ASCL, the challenges facing it, the rationale behind its practices, and the need to balance what we might do with what we have the resources to accomplish.
Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.
Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile
2016-01-01
This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code the evaluation starts by extracting the words that make up its text and continues with building full-text search queries from the combinations of these words. The queries are then run against all the ICD-10 codes until the code in question is returned as the match with the highest relative score. This method identifies the minimum number of words that must be provided in order for the search engines to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems.
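The evaluation procedure lends itself to a compact sketch: enumerate word combinations of increasing size and stop once the target code is ranked first. The version below uses a toy relevance score (fraction of query words found in a code's text) in place of the scores returned by the engines that were actually evaluated.

    from itertools import combinations

    def score(query_words, text):
        words = set(text.lower().split())
        return sum(1 for w in query_words if w in words) / len(query_words)

    def min_words_for_match(target_code, catalogue):
        # catalogue: dict mapping ICD-10 code -> diagnostic text
        words = catalogue[target_code].lower().split()
        for k in range(1, len(words) + 1):
            for query in combinations(words, k):
                best = max(catalogue, key=lambda c: score(query, catalogue[c]))
                if best == target_code:
                    return k, query      # fewest words that rank the target code first
        return None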
PD5: a general purpose library for primer design software.
Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda
2013-01-01
Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third-party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third-party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source code and allow redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
Tam, Vivian; Edge, Jennifer S; Hoffman, Steven J
2016-10-12
Shortages of health workers in low-income countries are exacerbated by the international migration of health workers to more affluent countries. This problem is compounded by the active recruitment of health workers by destination countries, particularly Australia, Canada, UK and USA. The World Health Organization (WHO) adopted a voluntary Code of Practice in May 2010 to mitigate tensions between health workers' right to migrate and the shortage of health workers in source countries. The first empirical impact evaluation of this Code was conducted 11 months after its adoption and demonstrated a lack of impact on health workforce recruitment policy and practice in the short term. This second empirical impact evaluation was conducted 4 years post-adoption using the same methodology to determine whether there have been any changes in the perceived utility, applicability, and implementation of the Code in the medium term. Forty-four respondents representing government, civil society and the private sector from Australia, Canada, UK and USA completed an email-based survey evaluating their awareness of the Code, perceived impact, changes to policy or recruitment practices resulting from the Code, and the effectiveness of non-binding Codes generally. The same survey instrument from the original study was used to facilitate direct comparability of responses. Key lessons were identified through thematic analysis. The main findings are unchanged between the initial impact evaluation and the current one. Both sets of key informants reported no significant policy or regulatory changes to health worker recruitment in their countries as a direct result of the Code due to its lack of incentives, institutional mechanisms and interest mobilizers. Participants emphasized the existence of previous bilateral and regional Codes, the WHO Code's non-binding nature, and the primacy of competing domestic healthcare priorities in explaining this perceived lack of impact. The Code has probably still not produced the tangible improvements in health worker flows it aspired to achieve. Several actions, including a focus on developing bilateral codes, linking the Code to topical global priorities, and reframing the Code's purpose to emphasize health system sustainability, are proposed to improve the Code's uptake and impact.
Methods for Coding Tobacco-Related Twitter Data: A Systematic Review
Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai
2017-01-01
Background As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. Objective The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Methods Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. Results E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter’s Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Conclusions Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter’s databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. PMID:28363883
50 CFR Table 15 to Part 679 - Gear Codes, Descriptions, and Use
Code of Federal Regulations, 2012 CFR
2012-10-01
... following: Alpha gear code NMFS logbooks Electronic check-in/ check-out Use numeric code to complete the following: Numeric gear code IERS eLandings ADF&G COAR NMFS AND ADF&G GEAR CODES Hook-and-line HAL X X 61 X...
50 CFR Table 15 to Part 679 - Gear Codes, Descriptions, and Use
Code of Federal Regulations, 2014 CFR
2014-10-01
... following: Alpha gear code NMFS logbooks Electronic check-in/ check-out Use numeric code to complete the following: Numeric gear code IERS eLandings ADF&G COAR NMFS AND ADF&G GEAR CODES Hook-and-line HAL X X 61 X...
50 CFR Table 15 to Part 679 - Gear Codes, Descriptions, and Use
Code of Federal Regulations, 2013 CFR
2013-10-01
... following: Alpha gear code NMFS logbooks Electronic check-in/ check-out Use numeric code to complete the following: Numeric gear code IERS eLandings ADF&G COAR NMFS AND ADF&G GEAR CODES Hook-and-line HAL X X 61 X...
Paterson, Andrew H.; Wang, Xuelin; Xu, Yiqing; Wu, Dongyang; Qu, Yanshu; Jiang, Anna; Ye, Qiaolin
2016-01-01
Cotton is one of the most important economic crops and the primary source of natural fiber and is an important protein source for animal feed. The complete nuclear and chloroplast (cp) genome sequences of G. raimondii are already available, but the mitochondrial genome is not. Here, we assembled the complete mitochondrial (mt) DNA sequence of G. raimondii into a circular genome 676,078 bp in length and performed comparative analyses with other higher plants. The genome contains 39 protein-coding genes, 6 rRNA genes, and 25 tRNA genes. We also identified four larger repeats (63.9 kb, 10.6 kb, 9.1 kb, and 2.5 kb) in this mt genome, which may be active in intramolecular recombination in the evolution of cotton. Strikingly, nearly all of the G. raimondii mt genome has been transferred to the nucleus on Chr1, and the transfer event must be very recent. Phylogenetic analysis reveals that G. raimondii, as a member of Malvaceae, is much closer to another cotton (G. barbadense) than to other rosids, and the clade formed by the two Gossypium species is sister to Brassicales. The G. raimondii mt genome may provide a crucial foundation for evolutionary analysis, molecular biology, and cytoplasmic male sterility in cotton and other higher plants. PMID:27847816
Image authentication using distributed source coding.
Lin, Yao-Chung; Varodayan, David; Girod, Bernd
2012-01-01
We present a novel approach using distributed source coding for image authentication. The key idea is to provide a Slepian-Wolf encoded quantized image projection as authentication data. This version can be correctly decoded with the help of an authentic image as side information. Distributed source coding provides the desired robustness against legitimate variations while detecting illegitimate modification. The decoder incorporating expectation maximization algorithms can authenticate images which have undergone contrast, brightness, and affine warping adjustments. Our authentication system also offers tampering localization by using the sum-product algorithm.
1991-05-31
[Extraction fragments from a report on the Aquarius Prolog compiler: table-of-contents entries for an appendix listing the source code of the C and Prolog benchmarks and an appendix listing the source code of the Aquarius Prolog compiler, the opening of Chapter I (Introduction), and a garbled figure on transforming standard-form Prolog to kernel Prolog via transformation and symbolic execution.]
Video transmission on ATM networks. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chen, Yun-Chung
1993-01-01
The broadband integrated services digital network (B-ISDN) is expected to provide high-speed and flexible multimedia applications. Multimedia includes data, graphics, images, voice, and video. Asynchronous transfer mode (ATM) is the adopted transport technique for B-ISDN and has the potential for providing a more efficient and integrated environment for multimedia. It is believed that most broadband applications will make heavy use of visual information. The prospect of widespread use of image and video communication has led to interest in coding algorithms for reducing bandwidth requirements and improving image quality. The major results of a study on the bridging of network transmission performance and video coding are: Using two representative video sequences, several video source models are developed. The fitness of these models is validated through the use of statistical tests and network queuing performance. A dual leaky bucket algorithm is proposed as an effective network policing function. The concept of the dual leaky bucket algorithm can be applied to a prioritized coding approach to achieve transmission efficiency. A mapping of the performance/control parameters at the network level into equivalent parameters at the video coding level is developed. Based on that, a complete set of principles for the design of video codecs for network transmission is proposed.
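Since the dual leaky bucket is singled out above as the policing function, a generic sketch may help fix ideas: one bucket polices the peak rate with a shallow depth, the other the sustained rate with a depth set by the burst tolerance, and a cell conforms only if both buckets accept it. The classes below are illustrative only; the parameterization is not taken from the thesis.

    class LeakyBucket:
        def __init__(self, rate, depth):
            self.rate, self.depth = rate, depth
            self.level, self.last_t = 0.0, 0.0

        def would_conform(self, t, cell=1.0):
            drained = max(0.0, self.level - self.rate * (t - self.last_t))
            return drained + cell <= self.depth

        def commit(self, t, cell=1.0):
            self.level = max(0.0, self.level - self.rate * (t - self.last_t)) + cell
            self.last_t = t

    class DualLeakyBucket:
        def __init__(self, peak_rate, sustained_rate, burst_depth):
            # Shallow bucket polices the peak rate; deep bucket polices the sustained rate.
            self.buckets = [LeakyBucket(peak_rate, 2.0), LeakyBucket(sustained_rate, burst_depth)]

        def police(self, t):
            if all(b.would_conform(t) for b in self.buckets):
                for b in self.buckets:
                    b.commit(t)
                return True      # conforming cell is admitted
            return False         # violating cell is tagged or discarded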
NASA Astrophysics Data System (ADS)
Schumacher, Florian; Friederich, Wolfgang; Lamara, Samir; Gutt, Phillip; Paffrath, Marcel
2015-04-01
We present a seismic full waveform inversion concept for applications ranging from seismological to engineering contexts, based on sensitivity kernels for full waveforms. The kernels are derived from Born scattering theory as the Fréchet derivatives of linearized frequency-domain full waveform data functionals, quantifying the influence of elastic earth model parameters and density on the data values. For a specific source-receiver combination, the kernel is computed from the displacement and strain field spectrum originating from the source evaluated throughout the inversion domain, as well as the Green function spectrum and its strains originating from the receiver. By storing the wavefield spectra of specific sources/receivers, they can be re-used for kernel computation for different specific source-receiver combinations, optimizing the total number of required forward simulations. In the iterative inversion procedure, the solution of the forward problem, the computation of sensitivity kernels and the derivation of a model update are held completely separate. In particular, the model description for the forward problem and the description of the inverted model update are kept independent. Hence, the resolution of the inverted model as well as the complexity of solving the forward problem can be iteratively increased (with increasing frequency content of the inverted data subset). This may regularize the overall inverse problem and optimize the computational effort of both solving the forward problem and computing the model update. The required interconnection of arbitrary unstructured volume and point grids is realized by generalized high-order integration rules and 3D-unstructured interpolation methods. The model update is inferred solving a minimization problem in a least-squares sense, resulting in Gauss-Newton convergence of the overall inversion process. The inversion method was implemented in the modularized software package ASKI (Analysis of Sensitivity and Kernel Inversion), which provides a generalized interface to arbitrary external forward modelling codes. So far, the 3D spectral-element code SPECFEM3D (Tromp, Komatitsch and Liu, 2008) and the 1D semi-analytical code GEMINI (Friederich and Dalkolmo, 1995), in both Cartesian and spherical frameworks, are supported. The creation of interfaces to further forward codes is planned in the near future. ASKI is freely available under the terms of the GPL at www.rub.de/aski. Since the independent modules of ASKI must communicate via file output/input, large storage capacities need to be accessible conveniently. Storing the complete sensitivity matrix to file, however, permits the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full waveform inversion. In the presentation, we will show some aspects of the theory behind the full waveform inversion method and its practical realization by the software package ASKI, as well as synthetic and real-data applications from different scales and geometries.
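The model update described above (a least-squares minimization giving Gauss-Newton convergence) can be written compactly. The sketch below shows a damped normal-equations solve of that general form; it assumes real-valued kernels and residuals for simplicity and is not ASKI's actual solver.

    import numpy as np

    def gauss_newton_update(K, d_res, eps=1e-2):
        # K: sensitivity kernel matrix (n_data x n_model), d_res: data residuals.
        n = K.shape[1]
        lhs = K.T @ K + (eps ** 2) * np.eye(n)   # damping stabilizes the normal equations
        rhs = K.T @ d_res
        return np.linalg.solve(lhs, rhs)         # model update dm

    # m_new = m_old + gauss_newton_update(K, d_observed - d_synthetic)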
Brown, Andrew D; Tollin, Daniel J
2016-09-21
In mammals, localization of sound sources in azimuth depends on sensitivity to interaural differences in sound timing (ITD) and level (ILD). Paradoxically, while typical ILD-sensitive neurons of the auditory brainstem require millisecond synchrony of excitatory and inhibitory inputs for the encoding of ILDs, human and animal behavioral ILD sensitivity is robust to temporal stimulus degradations (e.g., interaural decorrelation due to reverberation), or, in humans, bilateral clinical device processing. Here we demonstrate that behavioral ILD sensitivity is only modestly degraded with even complete decorrelation of left- and right-ear signals, suggesting the existence of a highly integrative ILD-coding mechanism. Correspondingly, we find that a majority of auditory midbrain neurons in the central nucleus of the inferior colliculus (of chinchilla) effectively encode ILDs despite complete decorrelation of left- and right-ear signals. We show that such responses can be accounted for by relatively long windows of bilateral excitatory-inhibitory interaction, which we explicitly measure using trains of narrowband clicks. Neural and behavioral data are compared with the outputs of a simple model of ILD processing with a single free parameter, the duration of excitatory-inhibitory interaction. Behavioral, neural, and modeling data collectively suggest that ILD sensitivity depends on binaural integration of excitation and inhibition within a ≳3 ms temporal window, significantly longer than observed in lower brainstem neurons. This relatively slow integration potentiates a unique role for the ILD system in spatial hearing that may be of particular importance when informative ITD cues are unavailable. In mammalian hearing, interaural differences in the timing (ITD) and level (ILD) of impinging sounds carry critical information about source location. However, natural sounds are often decorrelated between the ears by reverberation and background noise, degrading the fidelity of both ITD and ILD cues. Here we demonstrate that behavioral ILD sensitivity (in humans) and neural ILD sensitivity (in single neurons of the chinchilla auditory midbrain) remain robust under stimulus conditions that render ITD cues undetectable. This result can be explained by "slow" temporal integration arising from several-millisecond-long windows of excitatory-inhibitory interaction evident in midbrain, but not brainstem, neurons. Such integrative coding can account for the preservation of ILD sensitivity despite even extreme temporal degradations in ecological acoustic stimuli. Copyright © 2016 the authors 0270-6474/16/369908-14$15.00/0.
2007-10-01
[List-of-figures fragment from the source document; only the figure titles are recoverable: Figure 2, Eclipse Java Model; Figure 3, Eclipse Java Model at the Source Code Level; Figure 9, Java Source Code.]
Modeling and Prediction of the Noise from Non-Axisymmetric Jets
NASA Technical Reports Server (NTRS)
Leib, Stewart J.
2014-01-01
The new source model was combined with the original sound propagation model developed for rectangular jets to produce a new version of the rectangular jet noise prediction code. This code was validated using a set of rectangular nozzles whose geometries were specified by NASA. Nozzles of aspect ratios two, four and eight were studied at jet exit Mach numbers of 0.5, 0.7 and 0.9, for a total of nine cases. Reynolds-averaged Navier-Stokes solutions for these jets were provided to the contractor for use as input to the code. Quantitative comparisons of the predicted azimuthal and polar directivity of the acoustic spectrum were made with experimental data provided by NASA. The results of these comparisons, along with a documentation of the propagation and source models, were reported in a journal article publication (Ref. 4). The complete set of computer codes and computational modules that make up the prediction scheme, along with a user's guide describing their use and example test cases, was provided to NASA as a deliverable of this task. The use of conformal mapping, along with simplified modeling of the mean flow field, for noise propagation modeling was explored for other nozzle geometries, to support the task milestone of developing methods which are applicable to other geometries and flow conditions of interest to NASA. A model to represent twin round jets using this approach was formulated and implemented. A general approach to solving the equations governing sound propagation in a locally parallel nonaxisymmetric jet was developed and implemented, in aid of the tasks and milestones charged with selecting more exact numerical methods for modeling sound propagation, and developing methods that have application to other nozzle geometries. The method is based on expansion of both the mean-flow-dependent coefficients in the governing equation and the Green's function in series of orthogonal functions. The method was coded and tested on two analytically prescribed mean flows which were meant to represent noise reduction concepts being considered by NASA. Testing (Ref. 5) showed that the method was feasible for the types of mean flows of interest in jet noise applications. Subsequently, this method was further developed to allow use of mean flow profiles obtained from a Reynolds-averaged Navier-Stokes (RANS) solution of the flow. Preliminary testing of the generalized code was among the last tasks completed under this contract. The stringent noise-reduction goals of NASA's Fundamental Aeronautics Program suggest that, in addition to potentially complex exhaust nozzle geometries, next generation aircraft will also involve tighter integration of the engine with the airframe. Therefore, noise generated and propagated by jet flows in the vicinity of solid surfaces is expected to be quite significant, and reduced-order noise prediction tools will be needed that can deal with such geometries. One important source of noise is that generated by the interaction of a turbulent jet with the edge of a solid surface (edge noise). Such noise is generated, for example, by the passing of the engine exhaust over a shielding surface, such as a wing. Work under this task supported an effort to develop a RANS-based prediction code for edge noise based on an extension of the classical Rapid Distortion Theory (RDT) to transversely sheared base flows (Refs. 6 and 7). The RDT-based theoretical analysis was applied to the generic problem of a turbulent jet interacting with the trailing edge of a flat plate.
A code was written to evaluate the formula derived for the spectrum of the noise produced by this interaction and results were compared with data taken at NASA Glenn for a variety of jet/plate configurations and flow conditions (Ref. 8). A longer-term goal of this task was to work toward the development of a high-fidelity model of sound propagation in spatially developing non-axisymmetric jets using direct numerical methods for solving the relevant equations. Working with NASA Glenn Acoustics Branch personnel, numerical methods and boundary conditions appropriate for use in a high-resolution calculation of the full equations governing sound propagation in a steady base flow were identified. Computer codes were then written (by NASA) and tested (by OAI) for an increasingly complex set of flow conditions to validate the methods. The NASA-supplied codes were ported to the High-End Computing resources of the NASA Advanced Supercomputing facility for testing and validation against analytical (where possible) and independent numerical solutions. The cases which were completed during the course of this contract were solutions of the two-dimensional linearized Euler equations with no mean flow, a uniform mean flow and a nonuniform mean flow representative of a parallel flow jet.
Scalable Video Transmission Over Multi-Rate Multiple Access Channels
2007-06-01
Rate-compatible punctured convolutional codes (RCPC codes) and their applications,” IEEE...source encoded using the MPEG-4 video codec. The source encoded bitstream is then channel encoded with Rate Compatible Punctured Convolutional (RCPC...Clark, and J. M. Geist, “Punctured convolutional codes of rate (n-1)/n and simplified maximum likelihood decoding,” IEEE Transactions on
Zaker, Neda; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S.
2016-01-01
Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross‐sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross‐sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code — MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low‐energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes. PACS number(s): 87.56.bg PMID:27074460
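The comparison metric implied above can be written down in a few lines. The snippet below only illustrates the percentage-discrepancy calculation between radial dose functions gL(r) obtained from two codes at matching distances; the numerical values are placeholders, not results from the study.

```python
# Percentage discrepancy between radial dose functions from two Monte Carlo
# codes; all values below are hypothetical placeholders.
import numpy as np

r_cm = np.array([1.0, 2.0, 4.0, 6.0, 10.0])
gL_code_a = np.array([1.00, 0.85, 0.55, 0.33, 0.12])   # hypothetical values
gL_code_b = np.array([1.00, 0.86, 0.57, 0.40, 0.15])   # hypothetical values

discrepancy_pct = 100.0 * np.abs(gL_code_a - gL_code_b) / gL_code_b
for r, d in zip(r_cm, discrepancy_pct):
    print(f"r = {r:4.1f} cm: {d:5.1f} %")
```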
12 CFR 1807.503 - Project completion.
Code of Federal Regulations, 2012 CFR
2012-01-01
... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...
12 CFR 1807.503 - Project completion.
Code of Federal Regulations, 2013 CFR
2013-01-01
... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...
12 CFR 1807.503 - Project completion.
Code of Federal Regulations, 2014 CFR
2014-01-01
... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...
12 CFR 1807.503 - Project completion.
Code of Federal Regulations, 2011 CFR
2011-01-01
... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...
Coding conventions and principles for a National Land-Change Modeling Framework
Donato, David I.
2017-07-14
This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.
Provisional Coding Practices: Are They Really a Waste of Time?
Krypuy, Matthew; McCormack, Lena
2006-11-01
In order to facilitate effective clinical coding and hence the precise financial reimbursement of acute services, in 2005 Western District Health Service (WDHS) (located in regional Victoria, Australia) undertook a provisional coding trial for inpatient medical episodes to determine the magnitude and accuracy of clinical documentation. Utilising clinical coding software installed on a laptop computer, provisional coding was undertaken for all current overnight inpatient episodes under each physician one day prior to attending their daily ward round. The provisionally coded episodes were re-coded upon the completion of the discharge summary and the final Diagnostic Related Group (DRG) allocation and weight were compared to the provisional DRG assignment. A total of 54 out of 220 inpatient medical episodes were provisionally coded. This represented approximately a 25% cross section of the population selected for observation. Approximately 67.6% of the provisionally allocated DRGs were accurate in contrast to 32.4% which were subject to change once the discharge summary was completed. The DRG changes were primarily due to: disease progression of a patient during their care episode which could not be identified by clinical coding staff due to discharge prior to the following scheduled ward round; the discharge destination of particular patients; and the accuracy of clinical documentation on the discharge summary. The information gathered from the provisional coding trial supported the hypothesis that clinical documentation standards were sufficient and adequate to support precise clinical coding and DRG assignment at WDHS. The trial further highlighted the importance of a complete and accurate discharge summary available during the coding process of acute inpatient episodes.
NASA Technical Reports Server (NTRS)
Davis, Kirsch; Bankieris, Derek
2016-01-01
As an intern at NASA Johnson Space Center (JSC), my job was to become familiar with and operate the Robot Operating System (ROS). The project converted existing software assets into ROS nodes, enabling a robotic Hexapod to communicate over ROS and be controlled by an existing PlayStation 3 (PS3) controller. When the internship started, the existing control algorithms and libraries in the Hexapod C++ source code had no ROS capabilities, but that changed over the course of my internship: converting the C++ code made it compatible with ROS, and the Hexapod is now controlled using the PS3 controller. My job was also to design ROS messages and scripts that enabled the assets to participate in the ROS ecosystem by subscribing and publishing messages. The source code is organized in directories and written in C++. Testing included compiling the code in a Linux environment from a terminal and running it from its directory. Several compilation problems occurred, so the code was modified until it compiled cleanly. Once the code compiled and ran, it was uploaded to the Hexapod, which was then controlled by the PS3 controller. The project outcome is a Hexapod that is fully functional, compatible with ROS, and operated with the PlayStation 3 controller. In addition, an Arduino board, programmed with its open-source IDE, will be integrated into the ecosystem by designing breadboard circuitry that adds behaviors with push buttons, potentiometers, and other simple electrical elements. Other Arduino projects will include a GPS module and a digital clock that runs off 22 satellites, using a GPS signal received through an internal patch antenna, to show accurate real time. This internship experience has also motivated me to learn to code more efficiently and effectively so that I can write, subscribe to, and publish my own source code in different programming languages; familiarity with software programming will enhance my skills in the electrical engineering field. My experience at JSC with the Simulation and Graphics Branch (ER7) has pushed me to become more proficient at coding, to increase my knowledge of software programming, and to strengthen my skills in ROS. I will take this knowledge back to my university for a school project on the PR2 robot, which is controlled by ROS software: my skills will be used to subscribe to and publish ROS messages to the PR2, and the robot will be controlled by an existing PS3 controller by adapting the C++ code to subscribe and publish ROS messages. Overall, the skills I obtained here will not be lost, but will continue to grow.
Comparison of procedure coding systems for level 1 and 2 hospitals in South Africa.
Montewa, Lebogang; Hanmer, Lyn; Reagon, Gavin
2013-01-01
The ability of three procedure coding systems to reflect the procedure concepts extracted from patient records from six hospitals was compared, in order to inform decision making about a procedure coding standard for South Africa. A convenience sample of 126 procedure concepts was extracted from patient records at three level 1 hospitals and three level 2 hospitals. Each procedure concept was coded using ICPC-2, ICD-9-CM, and CCSA-2001. The extent to which each code assigned actually reflected the procedure concept was evaluated (between 'no match' and 'complete match'). For the study sample, CCSA-2001 was found to reflect the procedure concepts most completely, followed by ICD-9-CM and then ICPC-2. In practice, decision making about procedure coding standards would depend on multiple factors in addition to coding accuracy.
Runtime Detection of C-Style Errors in UPC Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pirkelbauer, P; Liao, C; Panas, T
2011-09-29
Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions to each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, ROSE-Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite, and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.
Development of a new EMP code at LANL
NASA Astrophysics Data System (ADS)
Colman, J. J.; Roussel-Dupré, R. A.; Symbalisty, E. M.; Triplett, L. A.; Travis, B. J.
2006-05-01
A new code for modeling the generation of an electromagnetic pulse (EMP) by a nuclear explosion in the atmosphere is being developed. The source of the EMP is the Compton current produced by the prompt radiation (γ-rays, X-rays, and neutrons) of the detonation. As a first step in building a multi-dimensional EMP code we have written three kinetic codes, Plume, Swarm, and Rad. Plume models the transport of energetic electrons in air. The Plume code solves the relativistic Fokker-Planck equation over a specified energy range that can include ~ 3 keV to 50 MeV and computes the resulting electron distribution function at each cell in a two dimensional spatial grid. The energetic electrons are allowed to transport, scatter, and experience Coulombic drag. Swarm models the transport of lower energy electrons in air, spanning 0.005 eV to 30 keV. The swarm code performs a full 2-D solution to the Boltzmann equation for electrons in the presence of an applied electric field. Over this energy range the relevant processes to be tracked are elastic scattering, three body attachment, two body attachment, rotational excitation, vibrational excitation, electronic excitation, and ionization. All of these occur due to collisions between the electrons and neutral bodies in air. The Rad code solves the full radiation transfer equation in the energy range of 1 keV to 100 MeV. It includes effects of photo-absorption, Compton scattering, and pair-production. All of these codes employ a spherical coordinate system in momentum space and a cylindrical coordinate system in configuration space. The "z" axes of the momentum and configuration spaces are assumed to be parallel and we are currently also assuming complete spatial symmetry around the "z" axis. Benchmarking for each of these codes will be discussed as well as the way forward towards an integrated modern EMP code.
NASA Astrophysics Data System (ADS)
López-Coto, R.; Hahn, J.; BenZvi, S.; Dingus, B.; Hinton, J.; Nisa, M. U.; Parsons, R. D.; Greus, F. Salesa; Zhang, H.; Zhou, H.
2018-11-01
The positron excess measured by PAMELA and AMS can only be explained if one or several sources are injecting positrons. Moreover, at the highest energies, it requires the presence of nearby (∼hundreds of parsecs) and middle-aged (at most ∼hundreds of kyr) sources. Pulsars, as factories of electrons and positrons, are one of the proposed candidates to explain the origin of this excess. To calculate the contribution of these sources to the electron and positron flux at the Earth, we developed EDGE (Electron Diffusion and Gamma rays to the Earth), a code to treat the propagation of electrons and compute their diffusion from a central source with a flexible injection spectrum. Using this code, we can derive the source's gamma-ray spectrum, spatial extension, the all-electron density in space, the electron and positron flux reaching the Earth and the positron fraction measured at the Earth. We present in this paper the foundations of the code and study how different parameters affect the gamma-ray spectrum of a source and the electron flux measured at the Earth. We also studied the effect of several approximations usually performed in these studies. This code has been used to derive the results of the positron flux measured at the Earth in [1].
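For orientation, the standard burst-injection diffusion Green's function that codes of this kind generalize can be sketched as follows. This is not the EDGE code: it assumes a constant, energy-independent diffusion coefficient with no energy losses or boundaries, and all parameter values are arbitrary illustrative choices.

```python
# Density of electrons at radius r from a source that injected N0 particles in
# a single burst a time t ago, for a constant diffusion coefficient D.
import numpy as np

def burst_density(r_pc, t_kyr, n0=1.0, d_pc2_per_kyr=1.0e2):
    """Electron density (arbitrary units) at radius r_pc after t_kyr."""
    four_dt = 4.0 * d_pc2_per_kyr * t_kyr
    return n0 * np.exp(-r_pc**2 / four_dt) / (np.pi * four_dt) ** 1.5

# Toy usage: density profile from a ~100 kyr old source at a few distances.
for r in (50.0, 100.0, 200.0, 400.0):
    print(r, burst_density(r, 100.0))
```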
KEGGParser: parsing and editing KEGG pathway maps in Matlab.
Arakelyan, Arsen; Nersisyan, Lilit
2013-02-15
KEGG pathway database is a collection of manually drawn pathway maps accompanied by KGML format files intended for use in automatic analysis. KGML files, however, do not contain the required information for complete reproduction of all the events indicated in the static image of a pathway map. Several parsers and editors of KEGG pathways exist for processing KGML files. We introduce KEGGParser, a MATLAB-based tool for KEGG pathway parsing, semiautomatic fixing, editing, visualization and analysis in the MATLAB environment. It also works with Scilab. The source code is available at http://www.mathworks.com/matlabcentral/fileexchange/37561.
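For readers who want to inspect KGML content directly, a minimal sketch in Python (KEGGParser itself is a MATLAB/Scilab tool) of extracting the entries and relations that KGML does provide might look like the following; the file name is hypothetical.

```python
# Read the node (entry) and edge (relation) elements of a KGML pathway file.
# Attribute names follow the KGML specification.
import xml.etree.ElementTree as ET

def parse_kgml(path):
    root = ET.parse(path).getroot()          # the <pathway> element
    nodes = {e.get("id"): {"name": e.get("name"), "type": e.get("type")}
             for e in root.findall("entry")}
    edges = [(r.get("entry1"), r.get("entry2"), r.get("type"),
              [s.get("name") for s in r.findall("subtype")])
             for r in root.findall("relation")]
    return nodes, edges

# Usage (hypothetical file):
# nodes, edges = parse_kgml("hsa04010.xml")
# print(len(nodes), "entries,", len(edges), "relations")
```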
Practical adaptive quantum tomography
NASA Astrophysics Data System (ADS)
Granade, Christopher; Ferrie, Christopher; Flammia, Steven T.
2017-11-01
We introduce a fast and accurate heuristic for adaptive tomography that addresses many of the limitations of prior methods. Previous approaches were either too computationally intensive or tailored to handle special cases such as single qubits or pure states. By contrast, our approach combines the efficiency of online optimization with generally applicable and well-motivated data-processing techniques. We numerically demonstrate these advantages in several scenarios including mixed states, higher-dimensional systems, and restricted measurements. Complete data and source code for this work are available online [1] (http://cgranade.com) and can be previewed at https://goo.gl/koiWxR.
Evaluation of Healthcare Interventions and Big Data: Review of Associated Data Issues.
Asche, Carl V; Seal, Brian; Kahler, Kristijan H; Oehrlein, Elisabeth M; Baumgartner, Meredith Greer
2017-08-01
Although the analysis of 'big data' holds tremendous potential to improve patient care, there remain significant challenges before it can be realized. Accuracy and completeness of data, linkage of disparate data sources, and access to data are areas that require particular focus. This article discusses these areas and shares strategies to promote progress. Improvement in clinical coding, innovative matching methodologies, and investment in data standardization are potential solutions to data validation and linkage problems. Challenges to data access still require significant attention with data ownership, security needs, and costs representing significant barriers to access.
Final report for the Tera Computer TTI CRADA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, G.S.; Pavlakos, C.; Silva, C.
1997-01-01
Tera Computer and Sandia National Laboratories have completed a CRADA, which examined the Tera Multi-Threaded Architecture (MTA) for use with large codes of importance to industry and DOE. The MTA is an innovative architecture that uses parallelism to mask latency between memories and processors. The physical implementation is a parallel computer with high cross-section bandwidth and GaAs processors designed by Tera, which support many small computation threads and fast, lightweight context switches between them. When any thread blocks while waiting for memory accesses to complete, another thread immediately begins execution so that high CPU utilization is maintained. The Tera MTA parallel computer has a single, global address space, which is appealing when porting existing applications to a parallel computer. This ease of porting is further enabled by compiler technology that helps break computations into parallel threads. DOE and Sandia National Laboratories were interested in working with Tera to further develop this computing concept. While Tera Computer would continue the hardware development and compiler research, Sandia National Laboratories would work with Tera to ensure that their compilers worked well with important Sandia codes, most particularly CTH, a shock physics code used for weapon safety computations. In addition to that important code, Sandia National Laboratories would complete research on a robotic path planning code, SANDROS, which is important in manufacturing applications, and would evaluate the MTA performance on this code. Finally, Sandia would work directly with Tera to develop 3D visualization codes, which would be appropriate for use with the MTA. Each of these tasks has been completed to the extent possible, given that Tera has just completed the MTA hardware. All of the CRADA work had to be done on simulators.
Accuracy and Completeness of Clinical Coding Using ICD-10 for Ambulatory Visits
Horsky, Jan; Drucker, Elizabeth A.; Ramelson, Harley Z.
2017-01-01
This study describes a simulation of diagnostic coding using an EHR. Twenty-three ambulatory clinicians were asked to enter appropriate codes for six standardized scenarios with two different EHRs. Their interactions with the query interface were analyzed for patterns and variations in search strategies and the resulting sets of entered codes for accuracy and completeness. Just over a half of entered codes were appropriate for a given scenario and about a quarter were omitted. Crohn’s disease and diabetes scenarios had the highest rate of inappropriate coding and code variation. The omission rate was higher for secondary than for primary visit diagnoses. Codes for immunization, dialysis dependence and nicotine dependence were the most often omitted. We also found a high rate of variation in the search terms used to query the EHR for the same diagnoses. Changes to the training of clinicians and improved design of EHR query modules may lower the rate of inappropriate and omitted codes. PMID:29854158
NASA Technical Reports Server (NTRS)
Hinds, Erold W. (Principal Investigator)
1996-01-01
This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
Omeire, Destiny; Abdin, Shaunte; Brooks, Daniel M; Miranda, Hector C
2015-04-01
The Germain's Peacock-Pheasant Polyplectron germaini (Aves, Galliformes, Phasianidae) is classified as Near Threatened on the IUCN Red List. The complete mitochondrial genome of P. germaini is 16,699 bp, consisting of 13 protein-coding genes, 2 rRNA, 22 tRNA genes and 1 control region. All of the 13 protein-coding genes have ATG as start codon. Eight of the 13 protein-coding genes have TAA as stop codon.
Admiralty Inlet Advanced Turbulence Measurements: final data and code archive
Kilcher, Levi (ORCID:0000000183851131); Thomson, Jim (ORCID:0000000289290088); Harding, Samuel
2011-02-01
Data and code, not already available in a public location, that are used in Kilcher, Thomson, Harding, and Nylund (2017), "Turbulence Measurements from Compliant Moorings - Part II: Motion Correction", doi: 10.1175/JTECH-D-16-0213.1. The links point to the Python source code used in the publication; all other files are the source data used in the publication.
Numerical Electromagnetic Code (NEC)-Basic Scattering Code. Part 2. Code Manual
1979-09-01
Imaging of source axes for a magnetic source: the source coordinate axes VSOURC(1,1), VSOURC(1,2), VSOURC(1,3) are mapped to the image coordinate axes VIMAG(1,1), VIMAG(1,2), VIMAG(1,3). VNC: x, y, and z components of the end cap unit normal. OUTPUT VARIABLE VIMAG: x, y, and z components defining the source image coordinate system axes.
Java Source Code Analysis for API Migration to Embedded Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winter, Victor; McCoy, James A.; Guerrero, Jonathan
Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.
Lowe, Jeanne R; Raugi, Gregory J; Reiber, Gayle E; Whitney, Joanne D
2013-01-01
The purpose of this cohort study was to evaluate the effect of a 1-year intervention of an electronic medical record wound care template on the completeness of wound care documentation and medical coding compared to a similar time interval for the fiscal year preceding the intervention. From October 1, 2006, to September 30, 2007, a "good wound care" intervention was implemented at a rural Veterans Affairs facility to prevent amputations in veterans with diabetes and foot ulcers. The study protocol included a template with foot ulcer variables embedded in the electronic medical record to facilitate data collection, support clinical decision making, and improve ordering and medical coding. The intervention group showed significant differences in complete documentation of good wound care compared to the historic control group (χ = 15.99, P < .001), complete documentation of coding for diagnoses and procedures (χ = 30.23, P < .001), and complete documentation of both good wound care and coding for diagnoses and procedures (χ = 14.96, P < .001). An electronic wound care template improved documentation of evidence-based interventions and facilitated coding for wound complexity and procedures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Cliff; Spring, Stefan; Lapidus, Alla
Pedobacter heparinus (Payza and Korn 1956) Steyn et al. 1998 comb. nov. is the type species of the rapidly growing genus Pedobacter within the family Sphingobacteriaceae of the phylum 'Bacteroidetes'. P. heparinus is of interest, because it was the first isolated strain shown to grow with heparin as sole carbon and nitrogen source and because it produces several enzymes involved in the degradation of mucopolysaccharides. All available data about this species are based on a sole strain that was isolated from dry soil. Here we describe the features of this organism, together with the complete genome sequence, and annotation. This is the first report on a complete genome sequence of a member of the genus Pedobacter, and the 5,167,383 bp long single replicon genome with its 4287 protein-coding and 54 RNA genes is part of the Genomic Encyclopedia of Bacteria and Archaea project.
Lopez-Iturri, Peio; de Miguel-Bilbao, Silvia; Aguirre, Erik; Azpilicueta, Leire; Falcone, Francisco; Ramos, Victoria
2015-01-01
The electromagnetic field leakage levels of nonionizing radiation from a microwave oven have been estimated within a complex indoor scenario. By employing a hybrid simulation technique, based on coupling full wave simulation with an in-house developed deterministic 3D ray launching code, estimations of the observed electric field values can be obtained for the complete indoor scenario. The microwave oven can be modeled as a time- and frequency-dependent radiating source, in which leakage, basically from the microwave oven door, is propagated along the complete indoor scenario interacting with all of the elements present in it. This method can be of aid in order to assess the impact of such devices on expected exposure levels, allowing adequate minimization strategies such as optimal location to be applied. PMID:25705676
Brito, Luciana Fernandes; Bach, Evelise; Kalinowski, Jörn; Rückert, Christian; Wibberg, Daniel; Passaglia, Luciane M; Wendisch, Volker F
2015-08-10
Paenibacillus riograndensis is a Gram-positive rhizobacterium which exhibits plant growth promoting activities. It was isolated from the rhizosphere of wheat grown in the state of Rio Grande do Sul, Brazil. Here we announce the complete genome sequence of P. riograndensis strain SBR5(T). The genome of P. riograndensis SBR5(T) consists of a circular chromosome of 7,893,056 bp. The genome was finished and fully annotated, containing 6705 protein coding genes, 87 tRNAs and 27 rRNAs. The knowledge of the complete genome helped to explain why P. riograndensis SBR5(T) can grow with the carbon sources arabinose and mannitol, but not myo-inositol, and to explain physiological features such as biotin auxotrophy and antibiotic resistances. The genome sequence will be valuable for functional genomics and ecological studies as well as for application of P. riograndensis SBR5(T) as a plant growth-promoting rhizobacterium. Copyright © 2015 Elsevier B.V. All rights reserved.
Making your code citable with the Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.
2016-01-01
The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feeder for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.
Sorimachi, Kenji; Okayasu, Teiji
2015-01-01
The complete vertebrate mitochondrial genome consists of 13 coding genes. We used this genome to investigate the existence of natural selection in vertebrate evolution. From the complete mitochondrial genomes, we predicted nucleotide contents and then separated these values into coding and non-coding regions. When nucleotide contents of a coding or non-coding region were plotted against the nucleotide content of the complete mitochondrial genomes, we obtained linear regression lines only between homonucleotides and their analogs. On every plot using purine (G or A) content, G content in aquatic vertebrates was higher than that in terrestrial vertebrates, while A content in aquatic vertebrates was lower than that in terrestrial vertebrates. Based on these relationships, vertebrates were separated into two groups, terrestrial and aquatic. However, using pyrimidine (C or T) content, clear separation between these two groups was not obtained. The hagfish (Eptatretus burgeri) was further separated from both terrestrial and aquatic vertebrates. Based on these results, nucleotide content relationships predicted from the complete vertebrate mitochondrial genomes reveal the existence of natural selection based on evolutionary separation between terrestrial and aquatic vertebrate groups. In addition, we propose that separation of the two groups might be linked to ammonia detoxification based on high G and low A contents, which encode Glu-rich and Lys-poor proteins.
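A minimal sketch of the kind of content-versus-content regression described above is given below; the sequences are toy placeholders standing in for complete mitochondrial genomes and their concatenated coding regions, and the analysis is reduced to G content only.

```python
# Compute the G content of each whole genome and of its coding region, then fit
# a linear regression across species; all sequences here are placeholders.
import numpy as np

def g_content(seq):
    seq = seq.upper()
    return seq.count("G") / len(seq)

# (whole-genome sequence, concatenated coding-region sequence) per species
species = {
    "species_a": ("ATGGCGTACGGATCG" * 10, "ATGGCGTACG" * 10),
    "species_b": ("ATGACGTATGGATCA" * 10, "ATGACGTATG" * 10),
    "species_c": ("ATGGGGTACGGGTCG" * 10, "ATGGGGTACG" * 10),
}

x = np.array([g_content(whole) for whole, _ in species.values()])
y = np.array([g_content(coding) for _, coding in species.values()])
slope, intercept = np.polyfit(x, y, 1)
print(f"G(coding) ~= {slope:.2f} * G(genome) + {intercept:.2f}")
```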
50 CFR Table 15 to Part 679 - Gear Codes, Descriptions, and Use
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Gear Codes, Descriptions, and Use 15 Table... ALASKA Pt. 679, Table 15 Table 15 to Part 679—Gear Codes, Descriptions, and Use Gear Codes, Descriptions, and Use (X indicates where this code is used) Name of gear Use alphabetic code to complete the...
Issues and opportunities: beam simulations for heavy ion fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman, A
1999-07-15
UCRL-JC-134975 PREPRINT code offering 3-D, axisymmetric, and "transverse slice" (steady flow) geometries, with a hierarchy of models for the "lattice" of focusing, bending, and accelerating elements. Interactive and script-driven code steering is afforded through an interpreter interface. The code runs with good parallel scaling on the T3E. Detailed simulations of machine segments and of complete small experiments, as well as simplified full-system runs, have been carried out, partially benchmarking the code. A magnetoinductive model, with module impedance and multi-beam effects, is under study. experiments, including an injector scalable to multi-beam arrays, a high-current beam transport and acceleration experiment, and a scaled final-focusing experiment. These "phase I" projects are laying the groundwork for the next major step in HIF development, the Integrated Research Experiment (IRE). Simulations aimed directly at the IRE must enable us to: design a facility with maximum power on target at minimal cost; set requirements for hardware tolerances, beam steering, etc.; and evaluate proposed chamber propagation modes. Finally, simulations must enable us to study all issues which arise in the context of a fusion driver, and must facilitate the assessment of driver options. In all of this, maximum advantage must be taken of emerging terascale computer architectures, requiring an aggressive code development effort. An organizing principle should be pursuit of the goal of integrated and detailed source-to-target simulation. methods for analysis of the beam dynamics in the various machine concepts, using moment-based methods for purposes of design, waveform synthesis, steering algorithm synthesis, etc. Three classes of discrete-particle models should be coupled: (1) electrostatic/magnetoinductive PIC simulations should track the beams from the source through the final-focusing optics, passing details of the time-dependent distribution function to (2) electromagnetic or magnetoinductive PIC or hybrid PIC/fluid simulations in the fusion chamber (which would finally pass their particle trajectory information to the radiation-hydrodynamics codes used for target design); in parallel, (3) detailed PIC, delta-f, core/test-particle, and perhaps continuum Vlasov codes should be used to study individual sections of the driver and chamber very carefully; consistency may be assured by linking data from the PIC sequence, and knowledge gained may feed back into that sequence.
nRC: non-coding RNA Classifier based on structural features.
Fiannaca, Antonino; La Rosa, Massimo; La Paglia, Laura; Rizzo, Riccardo; Urso, Alfonso
2017-01-01
Non-coding RNAs (ncRNAs) are small non-coding sequences involved in gene expression regulation of many biological processes and diseases. The recent discovery of a large set of different ncRNAs with biologically relevant roles has opened the way to develop methods able to discriminate between the different ncRNA classes. Moreover, the lack of knowledge about the complete mechanisms in regulative processes, together with the development of high-throughput technologies, has required the help of bioinformatics tools to provide biologists and clinicians with a deeper comprehension of the functional roles of ncRNAs. In this work, we introduce a new ncRNA classification tool, nRC (non-coding RNA Classifier). Our approach is based on feature extraction from the ncRNA secondary structure together with a supervised classification algorithm implementing a deep learning architecture based on convolutional neural networks. We tested our approach for the classification of 13 different ncRNA classes. We obtained classification scores, using the most common statistical measures. In particular, we reach an accuracy and sensitivity score of about 74%. The proposed method outperforms other similar classification methods based on secondary structure features and machine learning algorithms, including the RNAcon tool that, to date, is the reference classifier. The nRC tool is freely available as a docker image at https://hub.docker.com/r/tblab/nrc/. The source code of the nRC tool is also available at https://github.com/IcarPA-TBlab/nrc.
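As a purely illustrative sketch (not the nRC feature set or network), the snippet below extracts a few simple descriptors from a dot-bracket secondary structure of the sort such classifiers start from; nRC itself derives richer structural features and feeds them to a convolutional neural network.

```python
# Toy structural descriptors from a dot-bracket RNA secondary structure string.
import re

def dot_bracket_features(structure):
    n = len(structure)
    paired = structure.count("(") + structure.count(")")
    dot_runs = [len(m) for m in re.findall(r"\.+", structure)]
    helix_runs = re.findall(r"\(+", structure)
    return {
        "length": n,
        "paired_fraction": paired / n,
        "n_helix_segments": len(helix_runs),
        "longest_loop": max(dot_runs) if dot_runs else 0,
    }

print(dot_bracket_features("((((....))))..((...))"))
```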
Hernando, Victoria; Sobrino-Vegas, Paz; Burriel, M Carmen; Berenguer, Juan; Navarro, Gemma; Santos, Ignacio; Reparaz, Jesús; Martínez, M Angeles; Antela, Antonio; Gutiérrez, Félix; del Amo, Julia
2012-09-10
To compare causes of death (CoDs) from two independent sources: National Basic Death File (NBDF) and deaths reported to the Spanish HIV Research cohort [Cohorte de adultos con infección por VIH de la Red de Investigación en SIDA (CoRIS)] and compare the two coding algorithms: International Classification of Diseases, 10th revision (ICD-10) and revised version of Coding Causes of Death in HIV (revised CoDe). Between 2004 and 2008, CoDs were obtained from the cohort records (free text, multiple causes) and also from NBDF (ICD-10). CoDs from CoRIS were coded according to ICD-10 and revised CoDe by a panel. Deaths were compared by 13 disease groups: HIV/AIDS, liver diseases, malignancies, infections, cardiovascular, blood disorders, pulmonary, central nervous system, drug use, external, suicide, other causes and ill defined. There were 160 deaths. Concordance for the 13 groups was observed in 111 (69%) cases for the two sources and in 115 (72%) cases for the two coding algorithms. According to revised CoDe, the commonest CoDs were HIV/AIDS (53%), non-AIDS malignancies (11%) and liver related (9%), these percentages were similar, 57, 10 and 8%, respectively, for NBDF (coded as ICD-10). When using ICD-10 to code deaths in CoRIS, wherein HIV infection was known in everyone, the proportion of non-AIDS malignancies was 13%, liver-related accounted for 3%, while HIV/AIDS reached 70% due to liver-related, infections and ill-defined causes being coded as HIV/AIDS. There is substantial variation in CoDs in HIV-infected persons according to sources and algorithms. ICD-10 in patients known to be HIV-positive overestimates HIV/AIDS-related deaths at the expense of underestimating liver-related diseases, infections and ill defined causes. CoDe seems the best option for cohort studies.
Alternate operating scenarios for NDCX-II
NASA Astrophysics Data System (ADS)
Sharp, W. M.; Friedman, A.; Grote, D. P.; Cohen, R. H.; Lund, S. M.; Vay, J.-L.; Waldron, W. L.
2014-01-01
NDCX-II is a newly completed accelerator facility at LBNL, built to study ion-heated warm dense matter, as well as aspects of ion-driven targets and intense-beam dynamics for inertial-fusion energy. The baseline design calls for using 12 induction cells to accelerate 30-50 nC of Li+ ions to 1.2 MeV. During commissioning, though, we plan to extend the source lifetime by extracting less total charge. Over time, we expect that NDCX-II will be upgraded to substantially higher energies, necessitating the use of heavier ions to keep a suitable deposition range in targets. For operational flexibility, the option of using a helium plasma source is also being investigated. Each of these options requires development of an alternate acceleration schedule. The schedules here are worked out with a fast-running 1-D particle-in-cell code ASP.
ERIC Educational Resources Information Center
Hickok, Gregory
2012-01-01
Speech recognition is an active process that involves some form of predictive coding. This statement is relatively uncontroversial. What is less clear is the source of the prediction. The dual-stream model of speech processing suggests that there are two possible sources of predictive coding in speech perception: the motor speech system and the…
NASA Astrophysics Data System (ADS)
Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi
2017-07-01
We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media based on the finite difference method at local-to-regional scales. This code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer for the absorbing boundary condition. A hybrid-style programming using OpenMP and the Message Passing Interface (MPI) is adopted for efficient parallel computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations such as single force, moment tensor and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC), are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documents in a public repository.
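A 1-D toy version of the staggered-grid velocity-stress finite-difference update that codes like OpenSWPC generalize to 3-D viscoelastic media is sketched below; the grid dimensions, material parameters and source wavelet are arbitrary illustrative choices, not OpenSWPC defaults.

```python
# 1-D SH velocity-stress finite-difference scheme with a Ricker-like source.
import numpy as np

nx, nt = 400, 800
dx, dt = 10.0, 1.0e-3                   # grid spacing (m), time step (s)
rho, mu = 2500.0, 2500.0 * 2000.0**2    # density, shear modulus (Vs = 2000 m/s)

v = np.zeros(nx)                        # particle velocity
s = np.zeros(nx)                        # shear stress
src = 150                               # source grid index

for it in range(nt):
    t = it * dt
    # velocity update from the stress gradient, plus the source term
    v[1:] += dt / rho * (s[1:] - s[:-1]) / dx
    arg = (np.pi * 25.0 * (t - 0.04)) ** 2
    v[src] += dt * (1.0 - 2.0 * arg) * np.exp(-arg)
    # stress update from the velocity gradient
    s[:-1] += dt * mu * (v[1:] - v[:-1]) / dx

print("peak |v| on the grid:", np.abs(v).max())
```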
Hypersonic simulations using open-source CFD and DSMC solvers
NASA Astrophysics Data System (ADS)
Casseau, V.; Scanlon, T. J.; John, B.; Emerson, D. R.; Brown, R. E.
2016-11-01
Hypersonic hybrid hydrodynamic-molecular gas flow solvers are required to satisfy the two essential requirements of any high-speed reacting code, these being physical accuracy and computational efficiency. The James Weir Fluids Laboratory at the University of Strathclyde is currently developing an open-source hybrid code which will eventually reconcile the direct simulation Monte-Carlo method, making use of the OpenFOAM application called dsmcFoam, and the newly coded open-source two-temperature computational fluid dynamics solver named hy2Foam. In conjunction with employing the CVDV chemistry-vibration model in hy2Foam, novel use is made of the QK rates in a CFD solver. In this paper, further testing is performed, in particular with the CFD solver, to ensure its efficacy before considering more advanced test cases. The hy2Foam and dsmcFoam codes have shown to compare reasonably well, thus providing a useful basis for other codes to compare against.
Parrish, Robert M; Burns, Lori A; Smith, Daniel G A; Simmonett, Andrew C; DePrince, A Eugene; Hohenstein, Edward G; Bozkaya, Uğur; Sokolov, Alexander Yu; Di Remigio, Roberto; Richard, Ryan M; Gonthier, Jérôme F; James, Andrew M; McAlexander, Harley R; Kumar, Ashutosh; Saitow, Masaaki; Wang, Xiao; Pritchard, Benjamin P; Verma, Prakash; Schaefer, Henry F; Patkowski, Konrad; King, Rollin A; Valeev, Edward F; Evangelista, Francesco A; Turney, Justin M; Crawford, T Daniel; Sherrill, C David
2017-07-11
Psi4 is an ab initio electronic structure program providing methods such as Hartree-Fock, density functional theory, configuration interaction, and coupled-cluster theory. The 1.1 release represents a major update meant to automate complex tasks, such as geometry optimization using complete-basis-set extrapolation or focal-point methods. Conversion of the top-level code to a Python module means that Psi4 can now be used in complex workflows alongside other Python tools. Several new features have been added with the aid of libraries providing easy access to techniques such as density fitting, Cholesky decomposition, and Laplace denominators. The build system has been completely rewritten to simplify interoperability with independent, reusable software components for quantum chemistry. Finally, a wide range of new theoretical methods and analyses have been added to the code base, including functional-group and open-shell symmetry adapted perturbation theory, density-fitted coupled cluster with frozen natural orbitals, orbital-optimized perturbation and coupled-cluster methods (e.g., OO-MP2 and OO-LCCD), density-fitted multiconfigurational self-consistent field, density cumulant functional theory, algebraic-diagrammatic construction excited states, improvements to the geometry optimizer, and the "X2C" approach to relativistic corrections, among many other improvements.
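A minimal sketch of the Python-module workflow mentioned above is shown below; the calls follow Psi4's documented pattern (geometry specification followed by an energy request), but exact option names and defaults should be checked against the Psi4 1.1 documentation, so treat the details as assumptions.

```python
# Using Psi4 as an ordinary Python module: define a molecule, then request
# single-point energies at two levels of theory.
import psi4

psi4.set_memory("500 MB")
h2o = psi4.geometry("""
O
H 1 0.96
H 1 0.96 2 104.5
""")

scf_e = psi4.energy("scf/cc-pvdz")        # Hartree-Fock energy
mp2_e = psi4.energy("mp2/cc-pvdz")        # MP2 energy (density fitting by default)
print(scf_e, mp2_e)
```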
Interface requirements for coupling a containment code to reactor system thermal hydraulic codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baratta, A.J.
1997-07-01
To perform a complete analysis of a reactor transient, not only the primary system response but the containment response must also be accounted for. Such transients and accidents as a loss of coolant accident in both pressurized water and boiling water reactors and inadvertent operation of safety relief valves all challenge the containment and may influence flows because of containment feedback. More recently, the advanced reactor designs put forth by General Electric and Westinghouse in the US and by Framatome and Siemens in Europe rely on the containment to act as the ultimate heat sink. Techniques used by analysts and engineers to analyze the interaction of the containment and the primary system were usually iterative in nature. Codes such as RELAP or RETRAN were used to analyze the primary system response and CONTAIN or CONTEMPT the containment response. The analysis was performed by first running the system code and representing the containment as a fixed pressure boundary condition. The flows were usually from the primary system to the containment initially and generally under choked conditions. Once the mass flows and timing are determined from the system codes, these conditions were input into the containment code. The resulting pressures and temperatures were then calculated and the containment performance analyzed. The disadvantage of this approach becomes evident when one performs an analysis of a rapid depressurization or a long term accident sequence in which feedback from the containment can occur. For example, in a BWR main steam line break transient, the containment heats up and becomes a source of energy for the primary system. Recent advances in programming and computer technology are available to provide an alternative approach. The author and other researchers have developed linkage codes capable of transferring data between codes at each time step allowing discrete codes to be coupled together.
NASA Technical Reports Server (NTRS)
Meyer, H. D.
1993-01-01
The Acoustic Radiation Code (ARC) is a finite element program used on the IBM mainframe to predict far-field acoustic radiation from a turbofan engine inlet. In this report, requirements for developers of internal aerodynamic codes regarding use of their program output as input for the ARC are discussed. More specifically, the particular input needed from the Bolt, Beranek and Newman/Pratt and Whitney (turbofan source noise generation) Code (BBN/PWC) is described. In a separate analysis, a method of coupling the source and radiation models, that recognizes waves crossing the interface in both directions, has been derived. A preliminary version of the coupled code has been developed and used for initial evaluation of coupling issues. Results thus far have shown that reflection from the inlet is sufficient to indicate that full coupling of the source and radiation fields is needed for accurate noise predictions. Also, for this contract, the ARC has been modified for use on the Sun and Silicon Graphics Iris UNIX workstations. Changes and additions involved in this effort are described in an appendix.
Methods for Coding Tobacco-Related Twitter Data: A Systematic Review.
Lienemann, Brianna A; Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai
2017-03-31
As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter's Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter's databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. ©Brianna A Lienemann, Jennifer B Unger, Tess Boley Cruz, Kar-Hai Chu. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 31.03.2017.
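For illustration only, a rule-based coding function of the general kind reviewed above (hand-coding rules rather than the machine-learning pipelines many studies used) could look like the sketch below; the keyword lists and labels are hypothetical.

```python
# Toy categorical coding of tweets for relevance and crude sentiment.
import re

TOBACCO_TERMS = {"e-cig", "ecig", "vape", "vaping", "cigarette", "hookah"}
POSITIVE = {"love", "great", "awesome"}
NEGATIVE = {"quit", "gross", "cancer"}

def code_tweet(text):
    lowered = text.lower()
    words = set(re.findall(r"[a-z\-]+", lowered))
    relevant = any(term in lowered for term in TOBACCO_TERMS)
    if not relevant:
        return {"relevant": False}
    tone = len(words & POSITIVE) - len(words & NEGATIVE)
    sentiment = "positive" if tone > 0 else "negative" if tone < 0 else "neutral"
    return {"relevant": True, "sentiment": sentiment}

print(code_tweet("I love my new vape setup"))             # relevant, positive
print(code_tweet("trying to quit cigarettes, so gross"))  # relevant, negative
```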
Astrophysics Source Code Library Enhancements
NASA Astrophysics Data System (ADS)
Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.
2015-09-01
The Astrophysics Source Code Library (ASCL)1 is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.
Modeling the Volcanic Source at Long Valley, CA, Using a Genetic Algorithm Technique
NASA Technical Reports Server (NTRS)
Tiampo, Kristy F.
1999-01-01
In this project, we attempted to model the deformation pattern due to the magmatic source at Long Valley caldera using a real-value coded genetic algorithm (GA) inversion similar to that found in Michalewicz, 1992. The project has been both successful and rewarding. The genetic algorithm, coded in the C programming language, performs stable inversions over repeated trials, with varying initial and boundary conditions. The original model used a GA in which the geophysical information was coded into the fitness function through the computation of surface displacements for a Mogi point source in an elastic half-space. The program was designed to invert for a spherical magmatic source - its depth, horizontal location and volume - using the known surface deformations. It also included the capability of inverting for multiple sources.
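To illustrate the approach described above, here is a minimal Python sketch (not the authors' C code) of a real-valued genetic algorithm fitting the depth, horizontal location, and volume change of a Mogi point source to synthetic vertical surface displacements; the Mogi formula, a Poisson's ratio of 0.25, and all GA settings are assumptions made for this example.

```python
# Illustrative sketch only: a real-valued GA inverting Mogi point-source
# parameters (x0, y0, depth, dV) from vertical surface displacements.
import numpy as np

rng = np.random.default_rng(42)
NU = 0.25  # Poisson's ratio (assumed)

def mogi_uz(params, x, y):
    """Vertical surface displacement of a Mogi point source in an elastic half-space."""
    x0, y0, depth, dV = params
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    return (1.0 - NU) / np.pi * dV * depth / (r2 + depth ** 2) ** 1.5

def misfit(params, x, y, uz_obs):
    return np.sum((mogi_uz(params, x, y) - uz_obs) ** 2)

def ga_invert(x, y, uz_obs, bounds, pop_size=100, generations=200):
    lo, hi = np.array(bounds, dtype=float).T
    pop = lo + rng.random((pop_size, len(lo))) * (hi - lo)
    best = pop[0]
    for _ in range(generations):
        fit = np.array([misfit(p, x, y, uz_obs) for p in pop])
        best = pop[np.argmin(fit)].copy()            # elitism: remember the best model
        # Tournament selection: the better of two random individuals survives.
        a, b = rng.integers(0, pop_size, (2, pop_size))
        parents = np.where((fit[a] < fit[b])[:, None], pop[a], pop[b])
        # Blend (arithmetic) crossover between shuffled parent pairs.
        mates = parents[rng.permutation(pop_size)]
        alpha = rng.random((pop_size, 1))
        children = alpha * parents + (1.0 - alpha) * mates
        # Gaussian mutation scaled to the parameter range, clipped to the bounds.
        children += rng.normal(0.0, 0.02, children.shape) * (hi - lo)
        pop = np.clip(children, lo, hi)
        pop[0] = best                                # re-insert the elite individual
    return best

# Synthetic test: recover a source at (1 km, -2 km), 5 km deep, dV = 1e6 m^3.
xg, yg = np.meshgrid(np.linspace(-10e3, 10e3, 15), np.linspace(-10e3, 10e3, 15))
truth = np.array([1e3, -2e3, 5e3, 1e6])
uz = mogi_uz(truth, xg, yg) + rng.normal(0.0, 1e-4, xg.shape)
bounds = [(-10e3, 10e3), (-10e3, 10e3), (1e3, 10e3), (1e4, 1e7)]
print(ga_invert(xg, yg, uz, bounds))
```

The fitness here is simply the sum of squared residuals; the published inversion additionally handled varying initial and boundary conditions and multiple sources.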
SimVascular: An Open Source Pipeline for Cardiovascular Simulation.
Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C
2017-03-01
Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.
Evaluation of assimilatory sulphur metabolism in Caldicellulosiruptor saccharolyticus.
Pawar, Sudhanshu S; van Niel, Ed W J
2014-10-01
Caldicellulosiruptor saccharolyticus has gained a reputation as being among the best microorganisms for producing H2 owing to several appropriate features. The quest to develop an inexpensive cultivation medium led to an investigation of sulphate as a possible replacement for the expensive component cysteine. C. saccharolyticus assimilated sulphate successfully in the absence of a reducing agent and without releasing hydrogen sulphide. A complete set of genes coding for enzymes required for sulphate assimilation was found in the majority of Caldicellulosiruptor species, including C. saccharolyticus. C. saccharolyticus displayed physiological behaviour indifferent to the sulphur source when grown under favourable conditions in continuous cultures. Increasing the usual concentration of sulphur in the feed medium increased substrate conversion. The choice of sulphur source did not affect the tolerance of C. saccharolyticus to high partial pressures of H2. Thus, sulphate can be a principal sulphur source in an economically viable and more sustainable biohydrogen process using C. saccharolyticus. Copyright © 2014 Elsevier Ltd. All rights reserved.
Dictionary Learning Algorithms for Sparse Representation
Kreutz-Delgado, Kenneth; Murray, Joseph F.; Rao, Bhaskar D.; Engan, Kjersti; Lee, Te-Won; Sejnowski, Terrence J.
2010-01-01
Algorithms for data-driven learning of domain-specific overcomplete dictionaries are developed to obtain maximum likelihood and maximum a posteriori dictionary estimates based on the use of Bayesian models with concave/Schur-concave (CSC) negative log priors. Such priors are appropriate for obtaining sparse representations of environmental signals within an appropriately chosen (environmentally matched) dictionary. The elements of the dictionary can be interpreted as concepts, features, or words capable of succinct expression of events encountered in the environment (the source of the measured signals). This is a generalization of vector quantization in that one is interested in a description involving a few dictionary entries (the proverbial “25 words or less”), but not necessarily as succinct as one entry. To learn an environmentally adapted dictionary capable of concise expression of signals generated by the environment, we develop algorithms that iterate between a representative set of sparse representations found by variants of FOCUSS and an update of the dictionary using these sparse representations. Experiments were performed using synthetic data and natural images. For complete dictionaries, we demonstrate that our algorithms have improved performance over other independent component analysis (ICA) methods, measured in terms of signal-to-noise ratios of separated sources. In the overcomplete case, we show that the true underlying dictionary and sparse sources can be accurately recovered. In tests with natural images, learned overcomplete dictionaries are shown to have higher coding efficiency than complete dictionaries; that is, images encoded with an over-complete dictionary have both higher compression (fewer bits per pixel) and higher accuracy (lower mean square error). PMID:12590811
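As a rough illustration of the iterate-sparse-code-then-update-the-dictionary scheme described above, the following toy sketch alternates greedy matching pursuit with a least-squares (MOD-style) dictionary update; it is not the FOCUSS-based algorithm of the paper, and all sizes are arbitrary.

```python
# Toy dictionary learning: alternate greedy sparse coding (matching pursuit)
# with a least-squares dictionary update. Only a sketch of the general scheme,
# not the FOCUSS-based method described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

def sparse_code(A, Y, k):
    """Pick k atoms per signal by greedy matching pursuit."""
    X = np.zeros((A.shape[1], Y.shape[1]))
    R = Y.copy()
    for _ in range(k):
        idx = np.argmax(np.abs(A.T @ R), axis=0)      # best atom for each signal
        for j, i in enumerate(idx):
            c = A[:, i] @ R[:, j]
            X[i, j] += c
            R[:, j] -= c * A[:, i]
    return X

def learn_dictionary(Y, n_atoms, k=3, iters=30):
    A = rng.standard_normal((Y.shape[0], n_atoms))
    A /= np.linalg.norm(A, axis=0)
    for _ in range(iters):
        X = sparse_code(A, Y, k)
        A = Y @ np.linalg.pinv(X)                     # least-squares dictionary update
        A /= np.linalg.norm(A, axis=0) + 1e-12        # renormalise the atoms
    return A

# Overcomplete case: 20-dimensional signals, 40 atoms, 3-sparse codes.
Y = rng.standard_normal((20, 500))
A = learn_dictionary(Y, n_atoms=40)
print(A.shape)  # (20, 40)
```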
Psynteract: A flexible, cross-platform, open framework for interactive experiments.
Henninger, Felix; Kieslich, Pascal J; Hilbig, Benjamin E
2017-10-01
We introduce a novel platform for interactive studies, that is, any form of study in which participants' experiences depend not only on their own responses, but also on those of other participants who complete the same study in parallel, for example a prisoner's dilemma or an ultimatum game. The software thus especially serves the rapidly growing field of strategic interaction research within psychology and behavioral economics. In contrast to all available software packages, our platform does not handle stimulus display and response collection itself. Instead, we provide a mechanism to extend existing experimental software to incorporate interactive functionality. This approach allows us to draw upon the capabilities already available, such as accuracy of temporal measurement, integration with auxiliary hardware such as eye-trackers or (neuro-)physiological apparatus, and recent advances in experimental software, for example capturing response dynamics through mouse-tracking. Through integration with OpenSesame, an open-source graphical experiment builder, studies can be assembled via a drag-and-drop interface requiring little or no further programming skills. In addition, by using the same communication mechanism across software packages, we also enable interoperability between systems. Our source code, which provides support for all major operating systems and several popular experimental packages, can be freely used and distributed under an open source license. The communication protocols underlying its functionality are also well documented and easily adapted to further platforms. Code and documentation are available at https://github.com/psynteract/ .
Welk, Blayne; Kwong, Justin
2017-01-01
Studies using routinely collected data (RCD) are common in the urological literature; however, there are important considerations in the creation and review of RCD discoveries. A recent reporting guideline (REporting of studies Conducted using Observational Routinely-collected health Data, RECORD) was developed to improve the reporting of these studies. This narrative review examines important considerations for RCD studies. To assess the current level of reporting in the urological literature, we reviewed all the original research articles published in Journal of Urology and European Urology in 2014, and determined the proportion of the RECORD checklist items that were reported for RCD studies. There were 56 RCD studies identified among the 608 articles. When the RECORD items were considered applicable to the specific study, they were reported in 52.5% of cases. Studies most consistently (>80% of them) reported the names of the data sources, the study time frame, the extent to which the authors could access the database source, the patient selection, and discussed missing data. Few studies (<25%) discussed validation of key coding elements, details on data-linkage, data-cleaning, the impact of changing eligibility over time, or provided the complete list of coding elements used to define key study variables. Reporting factors specifically relevant in RCD studies may serve to increase the quality of these studies in the urological literature. With increased technological integration in healthcare and the proliferation of electronic medical records, RCD will continue to be an important source for urological research.
Scalable video transmission over Rayleigh fading channels using LDPC codes
NASA Astrophysics Data System (ADS)
Bansal, Manu; Kondi, Lisimachos P.
2005-03-01
In this paper, we investigate an important problem of efficiently utilizing the available resources for video transmission over wireless channels while maintaining a good decoded video quality and resilience to channel impairments. Our system consists of the video codec based on 3-D set partitioning in hierarchical trees (3-D SPIHT) algorithm and employs two different schemes using low-density parity check (LDPC) codes for channel error protection. The first method uses the serial concatenation of the constant-rate LDPC code and rate-compatible punctured convolutional (RCPC) codes. Cyclic redundancy check (CRC) is used to detect transmission errors. In the other scheme, we use the product code structure consisting of a constant rate LDPC/CRC code across the rows of the `blocks' of source data and an erasure-correction systematic Reed-Solomon (RS) code as the column code. In both the schemes introduced here, we use fixed-length source packets protected with unequal forward error correction coding ensuring a strictly decreasing protection across the bitstream. A Rayleigh flat-fading channel with additive white Gaussian noise (AWGN) is modeled for the transmission. The rate-distortion optimization algorithm is developed and carried out for the selection of source coding and channel coding rates using Lagrangian optimization. The experimental results demonstrate the effectiveness of this system under different wireless channel conditions and both the proposed methods (LDPC+RCPC/CRC and RS+LDPC/CRC) outperform the more conventional schemes such as those employing RCPC/CRC.
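The following sketch illustrates the kind of Lagrangian rate allocation described above, choosing one (source rate, channel code rate) pair from discrete sets under a total-rate budget; the rate-distortion curve and residual-loss probabilities are invented placeholders, not values from the paper.

```python
# Hedged sketch of Lagrangian selection of a (source rate, channel code rate)
# operating point. The R-D curve and residual-loss table are made-up placeholders.
from itertools import product

source_rates = [0.25, 0.5, 1.0, 2.0]       # bits/pixel out of the source coder
channel_rates = [1/3, 1/2, 2/3, 4/5]       # RCPC-style code rates (lower = stronger protection)

def source_distortion(rs):                 # placeholder operational R-D curve
    return 1.0 / (1.0 + 4.0 * rs)

def residual_loss(rc):                     # placeholder post-decoding loss probability
    return {1/3: 1e-4, 1/2: 1e-3, 2/3: 1e-2, 4/5: 5e-2}[rc]

def expected_distortion(rs, rc, d_loss=1.0):
    p = residual_loss(rc)
    return (1 - p) * source_distortion(rs) + p * d_loss

def lagrangian_point(lmbda):
    """Operating point minimizing D + lambda * R over the discrete rate sets."""
    return min((expected_distortion(rs, rc) + lmbda * rs / rc, rs, rc)
               for rs, rc in product(source_rates, channel_rates))

def allocate(budget):
    """Bisect the multiplier until the total transmitted rate meets the budget."""
    lo, hi = 0.0, 10.0
    for _ in range(50):
        lmbda = 0.5 * (lo + hi)
        _, rs, rc = lagrangian_point(lmbda)
        lo, hi = (lmbda, hi) if rs / rc > budget else (lo, lmbda)
    return lagrangian_point(hi)[1:]

print(allocate(budget=1.5))                # -> chosen (source rate, channel rate)
```

A known property of this discrete Lagrangian search is that it only reaches operating points on the lower convex hull of the rate-distortion set.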
Methodology of decreasing software complexity using ontology
NASA Astrophysics Data System (ADS)
Dąbrowska-Kubik, Katarzyna
2015-09-01
In this paper a model of a web application's source code, based on the OSD ontology (Ontology for Software Development), is proposed. This model is applied to the implementation and maintenance phases of the software development process through the DevOntoCreator tool [5]. The aim of this solution is to decrease the software complexity of that source code using a range of maintenance techniques, such as creation of documentation and elimination of dead code, cloned code, or previously known bugs [1][2]. This approach should make savings on the software maintenance costs of web applications possible.
Bit-wise arithmetic coding for data compression
NASA Technical Reports Server (NTRS)
Kiely, A. B.
1994-01-01
This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
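A small numerical sketch of the core idea, treating the bits of fixed-length quantizer codewords as independent: the rate an ideal bit-wise arithmetic coder would need is the sum of the per-bit-position binary entropies, which can be compared with the empirical symbol entropy. The quantizer step and bit depth below are arbitrary choices for illustration.

```python
# Sketch (not the article's code): estimate the rate a bit-wise arithmetic coder
# would need when codeword bits are modelled as independent, versus the
# empirical symbol entropy of a uniformly quantized IID Gaussian source.
import numpy as np

rng = np.random.default_rng(1)

def binary_entropy(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

n_bits = 4
levels = 2 ** n_bits
x = rng.standard_normal(100_000)                               # IID Gaussian source
q = np.clip(np.round(x / 0.5) + levels // 2, 0, levels - 1).astype(int)  # uniform quantizer

# Per-bit-position probabilities of a '1' in the fixed-length codewords.
bits = (q[:, None] >> np.arange(n_bits)) & 1
rate_bitwise = binary_entropy(bits.mean(axis=0)).sum()

# Empirical symbol entropy (what an ideal symbol-wise entropy coder needs).
p_sym = np.bincount(q, minlength=levels) / q.size
rate_symbol = -(p_sym[p_sym > 0] * np.log2(p_sym[p_sym > 0])).sum()

print(f"bit-wise model: {rate_bitwise:.3f} bits/sample, symbol-wise: {rate_symbol:.3f}")
```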
Astrophysics Source Code Library -- Now even better!
NASA Astrophysics Data System (ADS)
Allen, Alice; Schmidt, Judy; Berriman, Bruce; DuPrie, Kimberly; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.
2015-01-01
The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. Indexed by ADS, it now contains nearly 1,000 codes and with recent major changes, is better than ever! The resource has a new infrastructure that offers greater flexibility and functionality for users, including an easier submission process, better browsing, one-click author search, and an RSS feeder for news. The new database structure is easier to maintain and offers new possibilities for collaboration. Come see what we've done!
Study of statistical coding for digital TV
NASA Technical Reports Server (NTRS)
Gardenhire, L. W.
1972-01-01
The results are presented for a detailed study to determine a pseudo-optimum statistical code to be installed in a digital TV demonstration test set. Studies of source encoding were undertaken, using redundancy removal techniques in which the picture is reproduced within a preset tolerance. A method of source encoding, which preliminary studies show to be encouraging, is statistical encoding. A pseudo-optimum code was defined and the associated performance of the code was determined. The format was fixed at 525 lines per frame, 30 frames per second, as per commercial standards.
NASA Technical Reports Server (NTRS)
Rost, Martin C.; Sayood, Khalid
1991-01-01
A method for efficiently coding natural images using a vector-quantized variable-blocksize transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. The selection of which coders are used to code any given image region is made through a threshold-driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incorporating extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
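The threshold-driven selection among DCT coders can be sketched as follows: a block is first coded with a coarse, low-rate DCT coder, and if the resulting distortion exceeds a threshold the block is split into quadrants that are coded with a finer coder. The block sizes, quantizer steps, threshold, and the crude bit-count proxy below are illustrative assumptions, not the paper's parameters.

```python
# Hedged sketch of the mixture-block-coding idea: coarse large-block DCT first,
# split into quadrants with a finer coder when the distortion is too high.
import numpy as np
from scipy.fft import dctn, idctn

def code_block(block, step):
    """DCT, uniform quantization, inverse DCT; returns reconstruction and a bit estimate."""
    coeffs = np.round(dctn(block, norm="ortho") / step)
    bits = np.count_nonzero(coeffs) * 8            # crude rate proxy
    recon = idctn(coeffs * step, norm="ortho")
    return recon, bits

def mixture_block_code(block, threshold=25.0, steps=(32.0, 8.0)):
    recon, bits = code_block(block, steps[0])      # coarse, low-rate coder first
    mse = np.mean((block - recon) ** 2)
    if mse <= threshold or len(steps) == 1 or block.shape[0] < 8:
        return recon, bits
    # Distortion too high: recurse on the four quadrants with the finer coder.
    h = block.shape[0] // 2
    out = np.empty_like(recon)
    total = 0
    for i in (0, h):
        for j in (0, h):
            sub, b = mixture_block_code(block[i:i+h, j:j+h], threshold, steps[1:])
            out[i:i+h, j:j+h] = sub
            total += b
    return out, total

image_block = np.random.default_rng(2).random((16, 16)) * 255
recon, bits = mixture_block_code(image_block)
print(bits, np.mean((image_block - recon) ** 2))
```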
Field Encapsulation Library The FEL 2.2 User Guide
NASA Technical Reports Server (NTRS)
Moran, Patrick J.; Henze, Chris; Ellsworth, David
1999-01-01
This document describes version 2.2 of the Field Encapsulation Library (FEL), a library of mesh and field classes. FEL is a library for programmers - it is a "building block" enabling the rapid development of applications by a user. Since FEL is a library intended for code development, it is essential that enough technical detail be provided so that one can make full use of the code. Providing such detail requires some assumptions with respect to the reader's familiarity with the library implementation language, C++, particularly C++ with templates. We have done our best to make the explanations accessible to those who may not be completely C++ literate. Nevertheless, familiarity with the language will certainly help one's understanding of how and why things work the way they do. One consolation is that the level of understanding essential for using the library is significantly less than the level that one should have in order to modify or extend the library. One more remark on C++ templates: Templates have been a source of both joy and frustration for us. The frustration stems from the lack of mature or complete implementations that one has to work with. Template problems rear their ugly head particularly when porting. When porting C code, successfully compiling to a set of object files typically means that one is almost done. With templated C++ and the current state of the compilers and linkers, generating the object files is often only the beginning of the fun. On the other hand, templates are quite powerful. Used judiciously, templates enable more succinct designs and more efficient code. Templates also help with code maintenance. Designers can avoid creating objects that are the same in many respects, but not exactly the same. For example, FEL fields are templated by node type, thus the code for scalar fields and vector fields is shared. Furthermore, node type templating allows the library user to instantiate fields with data types not provided by the FEL authors. This type of flexibility would be difficult to offer without the support of the language. For users who may be having template-related problems, we offer the consolation that support for C++ templates is destined to improve with time. Efforts such as the Standard Template Library (STL) will inevitably drive vendors to provide more thorough, optimized tools for template code development. Furthermore, the benefits will become harder to resist for those who currently subscribe to the least-common-denominator "code it all in C" strategy. May FEL bring you both increased productivity and aesthetic satisfaction.
The Search for Transients and Variables in the LSST Pathfinder Survey
NASA Astrophysics Data System (ADS)
Gorsuch, Mary Katherine; Kotulla, Ralf
2018-01-01
This research was completed during participation in the NSF-REU program at the University of Wisconsin-Madison. Two fields of a few square degrees, close to the galactic plane, were imaged on the WIYN 3.5 meter telescope during the commissioning of the One Degree Imager (ODI) focal plane. These images were taken with repeated, shorter exposures in order to model an LSST-like cadence. These data were taken in order to identify transient and variable light sources. This was done by using Source Extractor to generate a catalog of all sources in each exposure, and inserting this data into a larger photometry database composed of all exposures for each field. A Python code was developed to analyze the data and isolate sources of interest from a large data set. We found that there were some discrepancies in the data, which led to some interesting results that we are looking into further. Variable and transient sources, while relatively well understood, are not numerous in current cataloging systems. Cataloging them will be a major undertaking of the Large Synoptic Survey Telescope (LSST), to which this project is a precursor.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoekstra, Robert J.; Hammond, Simon David; Richards, David
2017-09-01
This milestone is a tri-lab deliverable supporting ongoing co-design efforts impacting applications in the Integrated Codes (IC) and Advanced Technology Development and Mitigation (ATDM) program elements. In FY14, the tri-labs looked at porting proxy applications to technologies of interest for ATS procurements. In FY15, a milestone was completed evaluating proxy applications in multiple programming models, and in FY16, a milestone was completed focusing on the migration of lessons learned back into production code development. This year, the co-design milestone focuses on feeding the knowledge gained and/or code revisions back into production applications.
Wenke, A; Gaber, A; Hertle, L; Roeder, N; Pühse, G
2012-07-01
Precise and complete coding of diagnoses and procedures is valuable for optimizing revenues within the German diagnosis-related groups (G-DRG) system. Implementing effective coding structures, however, is cost-intensive. The aim of this study was to determine whether these higher costs can be recouped through complete acquisition of comorbidities and complications. Calculations were based on DRG data of the Department of Urology, University Hospital of Münster, Germany, covering all patients treated in 2009. The data were regrouped and subjected to a process of simulation (increase and decrease of patient clinical complexity levels, PCCL) with the help of recently developed software. In urology, the PCCL, and hence the resulting revenue, was found to depend strongly on the quantity and quality of secondary-diagnosis coding. Departmental budgetary procedures can be optimized when coding is effective. The new simulation tool can be a valuable aid to improving the profits available for distribution. Nevertheless, the calculation of time use and financial needs by this procedure is subject to specific departmental terms and conditions. Completeness of coding of (secondary) diagnoses must be the ultimate administrative goal of patient case documentation in urology.
Acta Aeronautica et Astronautica Sinica.
1982-07-28
[Garbled translation cover page; recoverable content: English pages: 212; source: Acta Aeronautica et Astronautica Sinica, Vol. 2, Nr. 4, December 1981, pp. 1...; a standard disclaimer that statements and opinions are those of the source and do not necessarily reflect the position or opinion of the translation division or the Foreign Technology Division; and fragments of a nomenclature list (an "(axial) solution section code" and lower-corner symbols i, j, k for the sectional cylindrical coordinate system and the radial and peripheral codes of the solution).]
Stey, Anne M; Ko, Clifford Y; Hall, Bruce Lee; Louie, Rachel; Lawson, Elise H; Gibbons, Melinda M; Zingmond, David S; Russell, Marcia M
2014-08-01
Identifying iatrogenic injuries using existing data sources is important for improved transparency in the occurrence of intraoperative events. There is evidence that procedure codes are reliably recorded in claims data. The objective of this study was to assess whether concurrent splenic procedure codes in patients undergoing colectomy procedures are reliably coded in claims data as compared with clinical registry data. Patients who underwent colectomy procedures in the absence of neoplastic diagnosis codes were identified from American College of Surgeons (ACS) NSQIP data linked with Medicare inpatient claims data file (2005 to 2008). A κ statistic was used to assess coding concordance between ACS NSQIP and Medicare inpatient claims, with ACS NSQIP serving as the reference standard. A total of 11,367 colectomy patients were identified from 212 hospitals. There were 114 patients (1%) who had a concurrent splenic procedure code recorded in either ACS NSQIP or Medicare inpatient claims. There were 7 patients who had a splenic injury diagnosis code recorded in either data source. Agreement of splenic procedure codes between the data sources was substantial (κ statistic 0.72; 95% CI, 0.64-0.79). Medicare inpatient claims identified 81% of the splenic procedure codes recorded in ACS NSQIP, and 99% of the patients without a splenic procedure code. It is feasible to use Medicare claims data to identify splenic injuries occurring during colectomy procedures, as claims data have moderate sensitivity and excellent specificity for capturing concurrent splenic procedure codes compared with ACS NSQIP. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
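For readers unfamiliar with the agreement measures used above, the following sketch computes sensitivity, specificity, and the kappa statistic from paired presence/absence indicators of a concurrent splenic procedure code; the ten-record arrays are fabricated for illustration only.

```python
# Sketch of the agreement measures named in the abstract, on fabricated data.
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

registry = np.array([1, 1, 0, 0, 1, 0, 0, 0, 1, 0])   # reference (registry role)
claims   = np.array([1, 0, 0, 0, 1, 0, 0, 1, 1, 0])   # claims-based codes

tn, fp, fn, tp = confusion_matrix(registry, claims).ravel()
sensitivity = tp / (tp + fn)        # share of registry-coded procedures found in claims
specificity = tn / (tn + fp)        # share of procedure-free patients coded as such
kappa = cohen_kappa_score(registry, claims)
print(sensitivity, specificity, round(kappa, 2))
```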
NASA Astrophysics Data System (ADS)
Han, B. X.; Welton, R. F.; Stockli, M. P.; Luciano, N. P.; Carmichael, J. R.
2008-02-01
Beam simulation codes PBGUNS, SIMION, and LORENTZ-3D were evaluated by modeling the well-diagnosed SNS baseline ion source and low energy beam transport (LEBT) system. Then, an investigation was conducted using these codes to assist our ion source and LEBT development effort, which is directed at meeting both the SNS operational goals and the power-upgrade project goals. A high-efficiency H- extraction system as well as magnetic and electrostatic LEBT configurations capable of transporting up to 100 mA are studied using these simulation tools.
JADAMILU: a software code for computing selected eigenvalues of large sparse symmetric matrices
NASA Astrophysics Data System (ADS)
Bollhöfer, Matthias; Notay, Yvan
2007-12-01
A new software code for computing selected eigenvalues and associated eigenvectors of a real symmetric matrix is described. The eigenvalues are either the smallest or those closest to some specified target, which may be in the interior of the spectrum. The underlying algorithm combines the Jacobi-Davidson method with efficient multilevel incomplete LU (ILU) preconditioning. Key features are modest memory requirements and robust convergence to accurate solutions. Parameters needed for incomplete LU preconditioning are automatically computed and may be updated at run time depending on the convergence pattern. The software is easy to use by non-experts and its top level routines are written in FORTRAN 77. Its potentialities are demonstrated on a few applications taken from computational physics. Program summary: Program title: JADAMILU. Catalogue identifier: ADZT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 101 359. No. of bytes in distributed program, including test data, etc.: 7 493 144. Distribution format: tar.gz. Programming language: Fortran 77. Computer: Intel or AMD with g77 and pgf; Intel EM64T or Itanium with ifort; AMD Opteron with g77, pgf and ifort; Power (IBM) with xlf90. Operating system: Linux, AIX. RAM: problem dependent. Word size: real: 8; integer: 4 or 8, according to user's choice. Classification: 4.8. Nature of problem: Any physical problem requiring the computation of a few eigenvalues of a symmetric matrix. Solution method: Jacobi-Davidson combined with multilevel ILU preconditioning. Additional comments: We supply binaries rather than source code because JADAMILU uses the following external packages: MC64 (this software is copyrighted and not freely available; COPYRIGHT (c) 1999 Council for the Central Laboratory of the Research Councils); AMD (Copyright (c) 2004-2006 by Timothy A. Davis, Patrick R. Amestoy, and Iain S. Duff; source code is distributed by the authors under the GNU LGPL licence); BLAS (the reference BLAS is a freely-available software package, available from netlib via anonymous ftp and the World Wide Web); LAPACK (the complete LAPACK package or individual routines from LAPACK are freely available on netlib and can be obtained via the World Wide Web or anonymous ftp). For maximal benefit to the community, we added the sources we are proprietary of to the tar.gz file submitted for inclusion in the CPC library. However, as explained in the README file, users willing to compile the code instead of using binaries should first obtain the sources for the external packages mentioned above (email and/or web addresses are provided). Running time: Problem dependent; the test examples provided with the code only take a few seconds to run; timing results for large scale problems are given in Section 5.
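JADAMILU itself is distributed as Fortran 77 binaries, but the basic ingredients it combines, an iterative solver for a few extreme eigenvalues of a large sparse symmetric matrix plus an incomplete-LU preconditioner, can be sketched with SciPy (using LOBPCG rather than Jacobi-Davidson); the test matrix, drop tolerance, and block size below are arbitrary choices for illustration.

```python
# Not JADAMILU: a SciPy sketch of an ILU-preconditioned iterative eigensolver
# for the smallest eigenvalues of a sparse symmetric matrix.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, lobpcg, spilu

n = 2000
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")  # 1-D Laplacian test matrix
ilu = spilu(A, drop_tol=1e-3)                       # incomplete LU factorization
M = LinearOperator(A.shape, matvec=ilu.solve)       # apply the ILU solve as a preconditioner

rng = np.random.default_rng(8)
X = rng.standard_normal((n, 5))                     # starting block of 5 vectors
vals, vecs = lobpcg(A, X, M=M, largest=False, tol=1e-8, maxiter=500)
print(np.sort(vals))                                # the 5 smallest eigenvalues
```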
Particle-in-cell code library for numerical simulation of the ECR source plasma
NASA Astrophysics Data System (ADS)
Shirkov, G.; Alexandrov, V.; Preisendorf, V.; Shevtsov, V.; Filippov, A.; Komissarov, R.; Mironov, V.; Shirkova, E.; Strekalovsky, O.; Tokareva, N.; Tuzikov, A.; Vatulin, V.; Vasina, E.; Fomin, V.; Anisimov, A.; Veselov, R.; Golubev, A.; Grushin, S.; Povyshev, V.; Sadovoi, A.; Donskoi, E.; Nakagawa, T.; Yano, Y.
2003-05-01
The project "Numerical simulation and optimization of ion accumulation and production in multicharged ion sources" is funded by the International Science and Technology Center (ISTC). A summary of recent project development and the first version of a computer code library for simulation of electron-cyclotron resonance (ECR) source plasmas based on the particle-in-cell method are presented.
Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.
2012-01-01
An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.
A new software for deformation source optimization, the Bayesian Earthquake Analysis Tool (BEAT)
NASA Astrophysics Data System (ADS)
Vasyura-Bathke, H.; Dutta, R.; Jonsson, S.; Mai, P. M.
2017-12-01
Modern studies of crustal deformation and the related source estimation, including magmatic and tectonic sources, increasingly use non-linear optimization strategies to estimate geometric and/or kinematic source parameters and often consider geodetic and seismic data jointly. Bayesian inference is increasingly being used for estimating posterior distributions of deformation source model parameters, given measured/estimated/assumed data and model uncertainties. For instance, some studies consider uncertainties of a layered medium and propagate these into source parameter uncertainties, while others use informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed to efficiently explore the high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational burden of these methods is high and estimation codes are rarely made available along with the published results. Even if the codes are accessible, it is usually challenging to assemble them into a single optimization framework as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results have become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in deformation source estimations, we undertook the effort of developing BEAT, a python package that comprises all the above-mentioned features in one single programming environment. The package builds on the pyrocko seismological toolbox (www.pyrocko.org), and uses the pymc3 module for Bayesian statistical model fitting. BEAT is an open-source package (https://github.com/hvasbath/beat), and we encourage and solicit contributions to the project. Here, we present our strategy for developing BEAT and show application examples, especially the effect of including the model prediction uncertainty of the velocity model in subsequent source optimizations, with examples for a full moment tensor, a Mogi source, and a moderate strike-slip earthquake.
Complete Mitochondrial Genome of Echinostoma hortense (Digenea: Echinostomatidae).
Liu, Ze-Xuan; Zhang, Yan; Liu, Yu-Ting; Chang, Qiao-Cheng; Su, Xin; Fu, Xue; Yue, Dong-Mei; Gao, Yuan; Wang, Chun-Ren
2016-04-01
Echinostoma hortense (Digenea: Echinostomatidae) is one of the intestinal flukes of medical importance in humans. However, the mitochondrial (mt) genome of this fluke had remained unknown. The present study has determined the complete mt genome sequences of E. hortense and assessed the phylogenetic relationships with other digenean species for which the complete mt genome sequences are available in GenBank, using concatenated amino acid sequences inferred from 12 protein-coding genes. The mt genome of E. hortense contained 12 protein-coding genes, 22 transfer RNA genes, 2 ribosomal RNA genes, and 1 non-coding region. The length of the mt genome of E. hortense was 14,994 bp, which was somewhat smaller than those of other trematode species. Phylogenetic analyses based on concatenated nucleotide sequence datasets for all 12 protein-coding genes using the maximum parsimony (MP) method showed that E. hortense and Hypoderaeum conoideum clustered together, and that they were closer to each other than to Fasciolidae and other echinostomatid trematodes. The availability of the complete mt genome sequences of E. hortense provides important genetic markers for diagnostics, population genetics, and evolutionary studies of digeneans.
Complete Mitochondrial Genome of Echinostoma hortense (Digenea: Echinostomatidae)
Liu, Ze-Xuan; Zhang, Yan; Liu, Yu-Ting; Chang, Qiao-Cheng; Su, Xin; Fu, Xue; Yue, Dong-Mei; Gao, Yuan; Wang, Chun-Ren
2016-01-01
Echinostoma hortense (Digenea: Echinostomatidae) is one of the intestinal flukes of medical importance in humans. However, the mitochondrial (mt) genome of this fluke had remained unknown. The present study has determined the complete mt genome sequences of E. hortense and assessed the phylogenetic relationships with other digenean species for which the complete mt genome sequences are available in GenBank, using concatenated amino acid sequences inferred from 12 protein-coding genes. The mt genome of E. hortense contained 12 protein-coding genes, 22 transfer RNA genes, 2 ribosomal RNA genes, and 1 non-coding region. The length of the mt genome of E. hortense was 14,994 bp, which was somewhat smaller than those of other trematode species. Phylogenetic analyses based on concatenated nucleotide sequence datasets for all 12 protein-coding genes using the maximum parsimony (MP) method showed that E. hortense and Hypoderaeum conoideum clustered together, and that they were closer to each other than to Fasciolidae and other echinostomatid trematodes. The availability of the complete mt genome sequences of E. hortense provides important genetic markers for diagnostics, population genetics, and evolutionary studies of digeneans. PMID:27180575
NASA Astrophysics Data System (ADS)
Jubran, Mohammad K.; Bansal, Manu; Kondi, Lisimachos P.
2006-01-01
In this paper, we consider the problem of optimal bit allocation for wireless video transmission over fading channels. We use a newly developed hybrid scalable/multiple-description codec that combines the functionality of both scalable and multiple-description codecs. It produces a base layer and multiple-description enhancement layers. Any of the enhancement layers can be decoded (in a non-hierarchical manner) with the base layer to improve the reconstructed video quality. Two different channel coding schemes (Rate-Compatible Punctured Convolutional (RCPC)/Cyclic Redundancy Check (CRC) coding and, product code Reed Solomon (RS)+RCPC/CRC coding) are used for unequal error protection of the layered bitstream. Optimal allocation of the bitrate between source and channel coding is performed for discrete sets of source coding rates and channel coding rates. Experimental results are presented for a wide range of channel conditions. Also, comparisons with classical scalable coding show the effectiveness of using hybrid scalable/multiple-description coding for wireless transmission.
ERIC Educational Resources Information Center
Olsen, Florence
2003-01-01
Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)
Watterson, Dina; Cleland, Heather; Picton, Natalie; Simpson, Pam M; Gabbe, Belinda J
2011-03-01
The percentage of total body surface area burnt (%TBSA) is a critical measure of burn injury severity and a key predictor of burn injury outcome. This study evaluated the level of agreement between four sources of %TBSA using 120 cases identified through the Victorian State Trauma Registry. Expert clinician, ICD-10-AM, Abbreviated Injury Scale, and burns registry coding were compared using measures of agreement. There was near-perfect agreement (weighted Kappa statistic 0.81-1) between all sources of data, suggesting that ICD-10-AM is a valid source of %TBSA and use of ICD-10-AM codes could reduce the resource used by trauma and burns registries capturing this information.
High-resolution 3D simulations of NIF ignition targets performed on Sequoia with HYDRA
NASA Astrophysics Data System (ADS)
Marinak, M. M.; Clark, D. S.; Jones, O. S.; Kerbel, G. D.; Sepke, S.; Patel, M. V.; Koning, J. M.; Schroeder, C. R.
2015-11-01
Developments in the multiphysics ICF code HYDRA enable it to perform large-scale simulations on the Sequoia machine at LLNL. With an aggregate computing power of 20 Petaflops, Sequoia offers an unprecedented capability to resolve the physical processes in NIF ignition targets for a more complete, consistent treatment of the sources of asymmetry. We describe modifications to HYDRA that enable it to scale to over one million processes on Sequoia. These include new options for replicating parts of the mesh over a subset of the processes, to avoid strong scaling limits. We consider results from a 3D full ignition capsule-only simulation performed using over one billion zones run on 262,000 processors which resolves surface perturbations through modes l = 200. We also report progress towards a high-resolution 3D integrated hohlraum simulation performed using 262,000 processors which resolves surface perturbations on the ignition capsule through modes l = 70. These aim for the most complete calculations yet of the interactions and overall impact of the various sources of asymmetry for NIF ignition targets. This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344.
PARAVT: Parallel Voronoi tessellation code
NASA Astrophysics Data System (ADS)
González, R. E.
2016-10-01
In this study, we present a new open source code for massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbors are widely used. There are several serial Voronoi tessellation codes; however, no open-source parallel implementation is available to handle the large numbers of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI and the VT is computed using the Qhull library. Domain decomposition takes into account consistent boundary computation between tasks and includes periodic conditions. In addition, the code computes the neighbor list, Voronoi density, Voronoi cell volume, and density gradient for each particle, as well as densities on a regular grid. The code implementation and user guide are publicly available at https://github.com/regonzar/paravt.
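PARAVT itself is an MPI code, but the per-particle quantities it computes can be illustrated serially with scipy.spatial, which wraps the same Qhull library: the Voronoi density of a particle is the inverse volume of its bounded Voronoi cell. The point set below is random and open boundary cells are simply skipped.

```python
# Serial illustration only (not PARAVT): Voronoi cell volume and density per
# particle via scipy.spatial, which also uses Qhull.
import numpy as np
from scipy.spatial import ConvexHull, Voronoi

rng = np.random.default_rng(3)
points = rng.random((2000, 3))            # particle positions in a unit box
vor = Voronoi(points)

density = np.full(len(points), np.nan)
for i, region_index in enumerate(vor.point_region):
    region = vor.regions[region_index]
    if -1 in region or len(region) == 0:  # open cell at the boundary: skip
        continue
    volume = ConvexHull(vor.vertices[region]).volume
    density[i] = 1.0 / volume             # one particle per cell

print(np.nanmedian(density))
```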
egs_brachy: a versatile and fast Monte Carlo code for brachytherapy
NASA Astrophysics Data System (ADS)
Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.
2016-12-01
egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)3 voxels) and eye plaque (with (1 mm)3 voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.
egs_brachy: a versatile and fast Monte Carlo code for brachytherapy.
Chamberland, Marc J P; Taylor, Randle E P; Rogers, D W O; Thomson, Rowan M
2016-12-07
egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm) 3 voxels) and eye plaque (with (1 mm) 3 voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.
Validity of data in the Danish Colorectal Cancer Screening Database
Thomsen, Mette Kielsholm; Njor, Sisse Helle; Rasmussen, Morten; Linnemann, Dorte; Andersen, Berit; Baatrup, Gunnar; Friis-Hansen, Lennart Jan; Jørgensen, Jens Christian Riis; Mikkelsen, Ellen Margrethe
2017-01-01
Background In Denmark, a nationwide screening program for colorectal cancer was implemented in March 2014. Along with this, a clinical database for program monitoring and research purposes was established. Objective The aim of this study was to estimate the agreement and validity of diagnosis and procedure codes in the Danish Colorectal Cancer Screening Database (DCCSD). Methods All individuals with a positive immunochemical fecal occult blood test (iFOBT) result who were invited to screening in the first 3 months since program initiation were identified. From these, a sample of 150 individuals was selected using stratified random sampling by age, gender and region of residence. Data from the DCCSD were compared with data from hospital records, which were used as the reference. Agreement, sensitivity, specificity and positive and negative predictive values were estimated for categories of codes “clean colon”, “colonoscopy performed”, “overall completeness of colonoscopy”, “incomplete colonoscopy”, “polypectomy”, “tumor tissue left behind”, “number of polyps”, “lost polyps”, “risk group of polyps” and “colorectal cancer and polyps/benign tumor”. Results Hospital records were available for 136 individuals. Agreement was highest for “colorectal cancer” (97.1%) and lowest for “lost polyps” (88.2%). Sensitivity varied between moderate and high, with 60.0% for “incomplete colonoscopy” and 98.5% for “colonoscopy performed”. Specificity was 92.7% or above, except for the categories “colonoscopy performed” and “overall completeness of colonoscopy”, where the specificity was low; however, the estimates were imprecise. Conclusion A high level of agreement between categories of codes in DCCSD and hospital records indicates that DCCSD reflects the hospital records well. Further, the validity of the categories of codes varied from moderate to high. Thus, the DCCSD may be a valuable data source for future research on colorectal cancer screening. PMID:28255255
Comprehensive Areal Model of Earthquake-Induced Landslides: Technical Specification and User Guide
Miles, Scott B.; Keefer, David K.
2007-01-01
This report describes the complete design of a comprehensive areal model of earthquake-induced landslides (CAMEL). It presents the design process and technical specification of CAMEL, and provides a guide to using the CAMEL source code and a template ESRI ArcGIS map document file for applying CAMEL, both of which can be obtained by contacting the authors. CAMEL is a regional-scale model of earthquake-induced landslide hazard developed using fuzzy logic systems. CAMEL currently estimates the areal landslide concentration (number of landslides per square kilometer) of six aggregated types of earthquake-induced landslides - three types each for rock and soil.
A simple system for detection of EEG artifacts in polysomnographic recordings.
Durka, P J; Klekowicz, H; Blinowska, K J; Szelenberger, W; Niemcewicz, Sz
2003-04-01
We present an efficient parametric system for automatic detection of electroencephalogram (EEG) artifacts in polysomnographic recordings. For each of the selected types of artifacts, a relevant parameter was calculated for a given epoch. If any of these parameters exceeded a threshold, the epoch was marked as an artifact. Performance of the system, evaluated on 18 overnight polysomnographic recordings, revealed concordance with decisions of human experts close to the interexpert agreement and the repeatability of expert's decisions, assessed via a double-blind test. Complete software (Matlab source code) for the presented system is freely available from the Internet at http://brain.fuw.edu.pl/artifacts.
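The parametric thresholding scheme can be sketched as follows: one statistic per artifact type is computed for each epoch and compared with a threshold, and the epoch is marked as an artifact if any threshold is exceeded. The statistics, sampling rate, and threshold values below are illustrative assumptions, not the calibrated parameters of the published system.

```python
# Minimal sketch of per-epoch parametric artifact detection (illustrative values only).
import numpy as np

def _mains_fraction(x, fs, f_lo=49.0, f_hi=51.0):
    """Fraction of signal power in the 50 Hz mains band."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return power[band].sum() / power.sum()

def artifact_flags(epoch, fs=128.0):
    params = {
        "amplitude": np.max(np.abs(epoch)),        # gross movement / clipping
        "slope": np.max(np.abs(np.diff(epoch))),   # steep transients, electrode pops
        "mains": _mains_fraction(epoch, fs),       # 50 Hz interference
    }
    thresholds = {"amplitude": 200.0, "slope": 100.0, "mains": 0.2}  # assumed thresholds
    return {name: value > thresholds[name] for name, value in params.items()}

epoch = np.random.default_rng(4).normal(0.0, 10.0, int(20 * 128))   # 20 s epoch in microvolts
flags = artifact_flags(epoch)
print(flags, "-> artifact" if any(flags.values()) else "-> clean")
```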
Low-Latency and Energy-Efficient Data Preservation Mechanism in Low-Duty-Cycle Sensor Networks.
Jiang, Chan; Li, Tao-Shen; Liang, Jun-Bin; Wu, Heng
2017-05-06
As in traditional wireless sensor networks (WSN), nodes in low-duty-cycle sensor networks (LDC-WSN) have only limited memory and energy. Unlike in WSN, however, the nodes in LDC-WSN sleep most of the time to conserve energy. This sleeping behaviour causes serious data transmission delay. Yet each source node that has sensed data needs to disseminate it quickly to other nodes in the network for redundant storage; otherwise, the data could be lost if its source node is destroyed by external forces in a harsh environment. The quick-dissemination requirement thus conflicts with the sleeping delay in the network. How to quickly disseminate all the source data to all the memory-limited nodes in the network for effective preservation is a challenging issue. In this paper, a low-latency and energy-efficient data preservation mechanism for LDC-WSN is proposed. The mechanism is fully distributed. Data are disseminated through the network with low latency using a revised probabilistic broadcasting mechanism and then stored by the nodes with LT (Luby Transform) codes, a well-known class of rateless erasure codes. After the data dissemination and storage process completes, some nodes may die after being destroyed by external forces. If a mobile sink enters the network at any time and from any place to collect the data, it can recover all of the source data by visiting a small portion of the surviving nodes in the network. Theoretical analyses and simulation results show that our mechanism outperforms existing mechanisms in data dissemination delay and energy efficiency.
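The storage stage relies on LT codes; the following minimal sketch shows the principle with an encoder that XORs random subsets of source blocks and a peeling decoder that recovers them from degree-one packets. The degree distribution here is a crude placeholder rather than the robust soliton distribution used in practice.

```python
# Minimal LT-code sketch: encoded packets XOR random subsets of source blocks;
# a peeling decoder recovers the blocks. Degree distribution is a placeholder.
import random

def lt_encode(blocks, n_packets, seed=0):
    rng = random.Random(seed)
    k = len(blocks)
    packets = []
    for _ in range(n_packets):
        degree = rng.choice([1, 2, 2, 3, 3, 4])           # crude stand-in degree distribution
        idx = rng.sample(range(k), min(degree, k))
        payload = bytes(blocks[idx[0]])
        for i in idx[1:]:
            payload = bytes(a ^ b for a, b in zip(payload, blocks[i]))
        packets.append((set(idx), payload))
    return packets

def lt_decode(packets, k):
    recovered = {}
    work = [(set(ids), bytearray(p)) for ids, p in packets]
    progress = True
    while progress and len(recovered) < k:
        progress = False
        for ids, payload in work:
            for i in [j for j in ids if j in recovered]:  # strip already-known blocks
                for b, rb in enumerate(recovered[i]):
                    payload[b] ^= rb
                ids.discard(i)
            if len(ids) == 1:                             # degree-one packet reveals a block
                (i,) = ids
                if i not in recovered:
                    recovered[i] = bytes(payload)
                    progress = True
                ids.clear()
    return [recovered.get(i) for i in range(k)]

k = 8
blocks = [bytes([i] * 16) for i in range(k)]              # 8 source blocks of 16 bytes each
packets = lt_encode(blocks, n_packets=20)
print(lt_decode(packets, k) == blocks)                    # True when decoding succeeds
```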
STINGRAY: system for integrated genomic resources and analysis.
Wagner, Glauber; Jardim, Rodrigo; Tschoeke, Diogo A; Loureiro, Daniel R; Ocaña, Kary A C S; Ribeiro, Antonio C B; Emmel, Vanessa E; Probst, Christian M; Pitaluga, André N; Grisard, Edmundo C; Cavalcanti, Maria C; Campos, Maria L M; Mattoso, Marta; Dávila, Alberto M R
2014-03-07
The STINGRAY system has been conceived to ease the tasks of integrating, analyzing, annotating and presenting genomic and expression data from Sanger and Next Generation Sequencing (NGS) platforms. STINGRAY includes: (a) a complete and integrated workflow (more than 20 bioinformatics tools) ranging from functional annotation to phylogeny; (b) a MySQL database schema, suitable for data integration and user access control; and (c) a user-friendly graphical web-based interface that makes the system intuitive, facilitating the tasks of data analysis and annotation. STINGRAY proved to be an easy-to-use and complete system for analyzing sequencing data. While both Sanger and NGS platforms are supported, the system could be faster using Sanger data, since the large NGS datasets could potentially slow down the MySQL database usage. STINGRAY is available at http://stingray.biowebdb.org and the open source code at http://sourceforge.net/projects/stingray-biowebdb/.
STINGRAY: system for integrated genomic resources and analysis
2014-01-01
Background The STINGRAY system has been conceived to ease the tasks of integrating, analyzing, annotating and presenting genomic and expression data from Sanger and Next Generation Sequencing (NGS) platforms. Findings STINGRAY includes: (a) a complete and integrated workflow (more than 20 bioinformatics tools) ranging from functional annotation to phylogeny; (b) a MySQL database schema, suitable for data integration and user access control; and (c) a user-friendly graphical web-based interface that makes the system intuitive, facilitating the tasks of data analysis and annotation. Conclusion STINGRAY proved to be an easy-to-use and complete system for analyzing sequencing data. While both Sanger and NGS platforms are supported, the system could be faster using Sanger data, since the large NGS datasets could potentially slow down the MySQL database usage. STINGRAY is available at http://stingray.biowebdb.org and the open source code at http://sourceforge.net/projects/stingray-biowebdb/. PMID:24606808
Maximum aposteriori joint source/channel coding
NASA Technical Reports Server (NTRS)
Sayood, Khalid; Gibson, Jerry D.
1991-01-01
A maximum aposteriori probability (MAP) approach to joint source/channel coder design is presented in this paper. This method attempts to explore a technique for designing joint source/channel codes, rather than ways of distributing bits between source coders and channel coders. For a nonideal source coder, MAP arguments are used to design a decoder which takes advantage of redundancy in the source coder output to perform error correction. Once the decoder is obtained, it is analyzed with the purpose of obtaining 'desirable properties' of the channel input sequence for improving overall system performance. Finally, an encoder design which incorporates these properties is proposed.
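A toy version of the idea, not the paper's coder design: a binary Markov source is sent over a binary symmetric channel, and a Viterbi-style MAP sequence detector exploits the residual source redundancy (the transition probabilities) that a plain hard-decision receiver ignores. All probabilities below are assumed values.

```python
# Toy MAP detection exploiting residual source redundancy over a BSC.
import numpy as np

rng = np.random.default_rng(5)
p_stay, p_flip_channel, n = 0.95, 0.2, 10_000

# Binary Markov source and BSC observations.
src = np.zeros(n, dtype=int)
for t in range(1, n):
    src[t] = src[t - 1] if rng.random() < p_stay else 1 - src[t - 1]
obs = src ^ (rng.random(n) < p_flip_channel)

# Viterbi-style MAP sequence detection over the two source states.
log_trans = np.log([[p_stay, 1 - p_stay], [1 - p_stay, p_stay]])
log_emit = np.log([[1 - p_flip_channel, p_flip_channel],
                   [p_flip_channel, 1 - p_flip_channel]])
delta = log_emit[:, obs[0]] + np.log(0.5)
back = np.zeros((n, 2), dtype=int)
for t in range(1, n):
    cand = delta[:, None] + log_trans                 # previous state x next state
    back[t] = np.argmax(cand, axis=0)
    delta = np.max(cand, axis=0) + log_emit[:, obs[t]]
path = np.zeros(n, dtype=int)
path[-1] = np.argmax(delta)
for t in range(n - 2, -1, -1):
    path[t] = back[t + 1, path[t + 1]]

print("hard-decision errors:", np.mean(obs != src))
print("MAP detector errors: ", np.mean(path != src))
```

With these settings the MAP detector typically reduces the symbol error rate well below the 20% raw channel error rate, which is the kind of gain a decoder can extract from redundancy left by a non-ideal source coder.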
Ball, James W.; Nordstrom, D. Kirk; Jenne, Everett A.
1980-01-01
A computerized chemical model, WATEQ2, has resulted from extensive additions to and revision of the WATEQ model of Truesdell and Jones (Truesdell, A. H., and Jones, B. F., 1974, WATEQ, a computer program for calculating chemical equilibria of natural waters: J. Res. U.S. Geol. Survey, v. 2, p. 233-274). The model building effort has necessitated searching the literature and selecting thermochemical data pertinent to the reactions added to the model. This supplementary report makes available the details of the reactions added to the model together with the selected thermochemical data and their sources. Also listed are details of program operation and a brief description of the output of the model. Appendices contain a glossary of identifiers used in the PL/1 computer code, the complete PL/1 listing, and sample output from three water analyses used as test cases.
Neutron Capture gamma ENDF libraries for modeling and identification of neutron sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sleaford, B
2007-10-29
There are a number of inaccuracies and data omissions with respect to gammas from neutron capture in the ENDF libraries used as field reference information and by modeling codes used in JTOT. As the use of active neutron interrogation methods is expanded, these shortfalls become more acute. A new, more accurate and complete evaluated experimental database of gamma rays (over 35,000 lines for 262 isotopes up to U so far) from thermal neutron capture has recently become available from the IAEA. To my knowledge, none of this new data has been installed in ENDF libraries and disseminated. I propose to upgrade the libraries of ¹⁸⁴,¹⁸⁶W, ⁵⁶Fe, ²⁰⁴,²⁰⁶,²⁰⁷Pb, ¹⁰⁴Pd, and ¹⁹F in the first year. This will involve collaboration with Richard Firestone at LBL in evaluating the data and installing it in the libraries. I will test them with the transport code MCNP5.
From GCode to STL: Reconstruct Models from 3D Printing as a Service
NASA Astrophysics Data System (ADS)
Baumann, Felix W.; Schuermann, Martin; Odefey, Ulrich; Pfeil, Markus
2017-12-01
The authors present a method to reverse engineer 3D-printer-specific machine instructions (GCode) into a point cloud representation and then an STL (Stereolithography) file. GCode is a machine code used for 3D printing among other applications, such as CNC routers. Such code files contain instructions for the 3D printer to move and control its actuator; in the case of Fused Deposition Modeling (FDM), this is the printhead that extrudes semi-molten plastics. The reverse engineering method presented here is based on digital simulation of the extrusion process of FDM-type 3D printing. The reconstructed models and point clouds do not account for hollow structures, such as holes or cavities. The implementation is in Python and relies on open source software and libraries, such as Matplotlib and OpenCV. The reconstruction is performed on the model's extrusion boundary and considers mechanical imprecision. The complete reconstruction mechanism is available as a RESTful (Representational State Transfer) Web service.
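A hedged sketch of the first step only: walk FDM-style GCode, track the toolhead position, and keep the end points of extruding G1 moves as a point cloud. Real GCode needs more care (relative positioning, arcs, units, retraction), and the surface reconstruction to STL is not shown.

```python
# Sketch: collect end points of extruding G1 moves from GCode into a point cloud.
import numpy as np

def gcode_to_points(lines):
    pos = {"X": 0.0, "Y": 0.0, "Z": 0.0, "E": 0.0}
    points = []
    for raw in lines:
        line = raw.split(";")[0].strip()               # drop comments
        if not line:
            continue
        cmd, *words = line.split()
        if cmd not in ("G0", "G1"):
            continue
        moves = {w[0]: float(w[1:]) for w in words if w[0] in pos}
        prev_e = pos["E"]
        pos.update(moves)
        if cmd == "G1" and pos["E"] > prev_e:          # extruding move: keep its end point
            points.append((pos["X"], pos["Y"], pos["Z"]))
    return np.array(points)

sample = [
    "G1 Z0.2 F1200",
    "G1 X10 Y0 E1.0",
    "G1 X10 Y10 E2.0",
    "G0 X0 Y0            ; travel move, no extrusion",
]
print(gcode_to_points(sample))
```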
Line x-ray source for diffraction enhanced imaging in clinical and industrial applications
NASA Astrophysics Data System (ADS)
Wang, Xiaoqin
Mammography is an imaging modality that uses low-dose x-rays or other radiation sources for examination of the breast. It plays a central role in early detection of breast cancers. The material similarity of tumor cells and healthy cells, breast implant surgery, and other factors make breast cancers hard to visualize and detect. Diffraction enhanced imaging (DEI), first proposed and investigated by D. Chapman, is a new x-ray radiographic imaging modality using monochromatic x-rays from a synchrotron source, which has produced images of thick absorbing objects that are almost completely free of scatter. It shows dramatically improved contrast over standard imaging when applied to the same phantom. The contrast is based not only on attenuation but also on the refraction and diffraction properties of the sample. This imaging method may improve image quality in mammography, other medical applications, industrial radiography for non-destructive testing, and x-ray computed tomography. However, the size and cost of a synchrotron source limit the applicability of the new modality at clinical levels. This research investigates the feasibility of a purpose-designed line x-ray source producing intensity comparable to synchrotron sources. It is composed of a 2-cm-long tungsten filament, installed on a carbon steel filament cup (backing plate), as the cathode, while a stationary oxygen-free copper anode with a molybdenum coating on the front surface serves as the target. Characteristic properties of the line x-ray source were studied computationally and the prototype was investigated experimentally. The SIMION code was used to computationally study the electron trajectories emanating from the filament towards the molybdenum target. A Faraday cup on the proof-of-principle prototype device was used to measure the distribution of electrons on the target, which compares favorably with the computational results. The intensities of characteristic x-rays for molybdenum, tungsten and rhodium targets were investigated with different window materials for applied potentials of -30 kV to -100 kV. Heat loading and thermal management of the target were investigated computationally using the COMSOL code package, and experimental measurements of the target temperature rise were taken via thermocouples attached to the target. Temperature measurements in the low-voltage, low-current regime without active cooling were compared to computational results for code-experiment benchmarking. Two different phantoms were used in the simulation of DEI images, which showed that the designed x-ray source with a DEI setup could produce images with significantly improved contrast. The computational results, along with experimental measurements on the prototype setup, indicate the possibility of scaling up to a larger-area x-ray source adequate for DEI applications.
Training and support to improve ICD coding quality: A controlled before-and-after impact evaluation.
Dyers, Robin; Ward, Grant; Du Plooy, Shane; Fourie, Stephanus; Evans, Juliet; Mahomed, Hassan
2017-05-24
The proposed National Health Insurance policy for South Africa (SA) requires hospitals to maintain high-quality International Statistical Classification of Diseases (ICD) codes for patient records. While considerable strides had been made to improve ICD coding coverage by digitising the discharge process in the Western Cape Province, further intervention was required to improve data quality. The aim of this controlled before-and-after study was to evaluate the impact of a clinician training and support initiative to improve ICD coding quality. To compare ICD coding quality between two central hospitals in the Western Cape before and after the implementation of a training and support initiative for clinicians at one of the sites. The difference in differences in data quality between the intervention site and the control site was calculated. Multiple logistic regression was also used to determine the odds of data quality improvement after the intervention and to adjust for potential differences between the groups. The intervention had a positive impact of 38.0% on ICD coding completeness over and above changes that occurred at the control site. Relative to the baseline, patient records at the intervention site had a 6.6 (95% confidence interval 3.5 - 16.2) adjusted odds ratio of having a complete set of ICD codes for an admission episode after the introduction of the training and support package. The findings on impact on ICD coding accuracy were not significant. There is sufficient pragmatic evidence that a training and support package will have a considerable positive impact on ICD coding completeness in the SA setting.
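The headline impact figure is a difference in differences: the change at the intervention site minus the change at the control site. A minimal sketch with placeholder proportions (not the study's data):

```python
def difference_in_differences(int_before, int_after, ctrl_before, ctrl_after):
    """Impact = change at the intervention site minus change at the control site,
    expressed here as proportions of admissions with a complete ICD code set."""
    return (int_after - int_before) - (ctrl_after - ctrl_before)

# Placeholder proportions for illustration only:
impact = difference_in_differences(0.40, 0.85, 0.45, 0.52)
print(f"difference-in-differences impact: {impact:.1%}")
```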
15 CFR 740.7 - Computers (APP).
Code of Federal Regulations, 2010 CFR
2010-01-01
... 4A003. (2) Technology and software. License Exception APP authorizes exports of technology and software... programmability. (ii) Technology and source code. Technology and source code eligible for License Exception APP..., reexports and transfers (in-country) for nuclear, chemical, biological, or missile end-users and end-uses...
Spread Spectrum Visual Sensor Network Resource Management Using an End-to-End Cross-Layer Design
2011-02-01
In this work, we use rate-compatible punctured convolutional (RCPC) codes for channel coding [11]. Using RCPC codes allows us to utilize Viterbi's ... ([11] J. Hagenauer, "Rate-compatible punctured convolutional codes (RCPC codes) and their applications," IEEE Trans. Commun., vol. 36, no. 4, pp. 389...) ... a source coding rate, a channel coding rate, and a power level to all nodes in the ...
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu (Inventor)
1997-01-01
A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
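A sketch of one of the two variants described above (a cross-delta between the two data sets, followed by an adjacent-delta), together with the corresponding post-decoding step; the array handling details are assumptions, not the patented device:

```python
import numpy as np

def double_difference(set_a, set_b):
    """Cross-delta between two correlated M-member data sets, then adjacent-delta
    along the result; the output is what an entropy coder would then compress."""
    cross = np.asarray(set_b, dtype=np.int64) - np.asarray(set_a, dtype=np.int64)
    return cross[0], np.diff(cross)            # keep cross[0] so decoding is possible

def recover_second_set(set_a, first_cross, dd):
    """Post-decoding: rebuild the second data set from the first set,
    the initial cross-delta value and the double-difference data."""
    cross = np.concatenate(([first_cross], first_cross + np.cumsum(dd)))
    return np.asarray(set_a, dtype=np.int64) + cross

a = np.array([10, 12, 15, 20])
b = np.array([11, 14, 18, 24])
c0, dd = double_difference(a, b)
assert np.array_equal(recover_second_set(a, c0, dd), b)
```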
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu (Inventor)
1998-01-01
A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
NASA Technical Reports Server (NTRS)
Steyn, J. J.; Born, U.
1970-01-01
A FORTRAN code was developed for the Univac 1108 digital computer to unfold polyenergetic gamma-photon experimental distributions from lithium-drifted germanium semiconductor spectrometers. It was designed to analyze the combination continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.
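A generic illustration of the response-matrix idea (not the Univac FORTRAN code): given a measured pulse-height spectrum and a detector response matrix, iteratively refine an estimate of the incident photon spectrum. The multiplicative ML-EM-style update is an assumed stand-in for the report's discrete/iterative procedure:

```python
import numpy as np

def unfold_spectrum(measured, response, n_iter=100):
    """Estimate the incident spectrum s such that response @ s ~ measured.
    Columns of 'response' hold the pulse-height distribution of each incident
    energy bin; data are assumed non-negative."""
    s = np.full(response.shape[1], measured.sum() / response.shape[1], dtype=float)
    col_sum = np.maximum(response.sum(axis=0), 1e-12)
    for _ in range(n_iter):
        predicted = np.maximum(response @ s, 1e-12)
        s *= (response.T @ (measured / predicted)) / col_sum   # multiplicative update
    return s
```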
Power optimization of wireless media systems with space-time block codes.
Yousefi'zadeh, Homayoun; Jafarkhani, Hamid; Moshfeghi, Mehran
2004-07-01
We present analytical and numerical solutions to the problem of power control in wireless media systems with multiple antennas. We formulate a set of optimization problems aimed at minimizing total power consumption of wireless media systems subject to a given level of QoS and an available bit rate. Our formulation takes into consideration the power consumption related to source coding, channel coding, and transmission of multiple-transmit antennas. In our study, we consider Gauss-Markov and video source models, Rayleigh fading channels along with the Bernoulli/Gilbert-Elliott loss models, and space-time block codes.
Top ten reasons to register your code with the Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, Alice; DuPrie, Kimberly; Berriman, G. Bruce; Mink, Jessica D.; Nemiroff, Robert J.; Robitaille, Thomas; Schmidt, Judy; Shamir, Lior; Shortridge, Keith; Teuben, Peter J.; Wallin, John F.; Warmels, Rein
2017-01-01
With 1,400 codes, the Astrophysics Source Code Library (ASCL, ascl.net) is the largest indexed resource for codes used in astronomy research in existence. This free online registry was established in 1999, is indexed by Web of Science and ADS, and is citable, with citations to its entries tracked by ADS. Registering your code with the ASCL is easy with our online submissions system. Making your software available for examination shows confidence in your research and makes your research more transparent, reproducible, and falsifiable. ASCL registration allows your software to be cited on its own merits and provides a citation that is trackable and accepted by all astronomy journals and journals such as Science and Nature. Registration also allows others to find your code more easily. This presentation covers the benefits of registering astronomy research software with the ASCL.
Code of Federal Regulations, 2014 CFR
2014-01-01
...—American Indian or Alaska Native Code 2—Asian Code 3—Black or African American Code 4—Native Hawaiian or... secondary market entity within the same calendar year: Code 0—Loan was not originated or was not sold in...
Code of Federal Regulations, 2013 CFR
2013-01-01
...—American Indian or Alaska Native Code 2—Asian Code 3—Black or African American Code 4—Native Hawaiian or... secondary market entity within the same calendar year: Code 0—Loan was not originated or was not sold in...
Code of Federal Regulations, 2012 CFR
2012-01-01
...—American Indian or Alaska Native Code 2—Asian Code 3—Black or African American Code 4—Native Hawaiian or... secondary market entity within the same calendar year: Code 0—Loan was not originated or was not sold in...
ARES: automated response function code. User's manual [HPGAM and LSQVM]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maung, T.; Reynolds, G.M.
This ARES user's manual provides detailed instructions for a general understanding of the Automated Response Function Code and gives step-by-step instructions for using the complete code package on an HP-1000 system. The code is designed to calculate response functions of NaI gamma-ray detectors with cylindrical or rectangular geometries.
NASA Technical Reports Server (NTRS)
Rutishauser, David
2006-01-01
The motivation for this work comes from an observation that amidst the push for Massively Parallel (MP) solutions to high-end computing problems such as numerical physical simulations, large amounts of legacy code exist that are highly optimized for vector supercomputers. Because re-hosting legacy code often requires a complete re-write of the original code, which can be a very long and expensive effort, this work examines the potential to exploit reconfigurable computing machines in place of a vector supercomputer to implement an essentially unmodified legacy source code. Custom and reconfigurable computing resources could be used to emulate an original application's target platform to the extent required to achieve high performance. To arrive at an architecture that delivers the desired performance subject to limited resources involves solving a multi-variable optimization problem with constraints. Prior research in the area of reconfigurable computing has demonstrated that designing an optimum hardware implementation of a given application under hardware resource constraints is an NP-complete problem. The premise of the approach is that the general issue of applying reconfigurable computing resources to the implementation of an application, maximizing the performance of the computation subject to physical resource constraints, can be made a tractable problem by assuming a computational paradigm, such as vector processing. This research contributes a formulation of the problem and a methodology to design a reconfigurable vector processing implementation of a given application that satisfies a performance metric. A generic, parametric, architectural framework for vector processing implemented in reconfigurable logic is developed as a target for a scheduling/mapping algorithm that maps an input computation to a given instance of the architecture. This algorithm is integrated with an optimization framework to arrive at a specification of the architecture parameters that attempts to minimize execution time, while staying within resource constraints. The flexibility of using a custom reconfigurable implementation is exploited in a unique manner to leverage the lessons learned in vector supercomputer development. The vector processing framework is tailored to the application, with variable parameters that are fixed in traditional vector processing. Benchmark data that demonstrates the functionality and utility of the approach is presented. The benchmark data includes an identified bottleneck in a real case study example vector code, the NASA Langley Terminal Area Simulation System (TASS) application.
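The parameter-selection step can be illustrated with a toy grid search: among candidate architecture parameters (lane count and vector length here are invented stand-ins), keep the configuration with the lowest estimated execution time that fits the resource budget. Both cost models below are hypothetical:

```python
from itertools import product

def choose_architecture(estimate_time, resource_use, budget,
                        lanes=(1, 2, 4, 8), vector_lengths=(64, 128, 256)):
    """Exhaustive search over a small parameter grid, minimizing estimated
    execution time subject to a hardware resource constraint."""
    best = None
    for n_lanes, vlen in product(lanes, vector_lengths):
        if resource_use(n_lanes, vlen) > budget:
            continue                      # violates the resource constraint
        t = estimate_time(n_lanes, vlen)
        if best is None or t < best[0]:
            best = (t, n_lanes, vlen)
    return best

# Hypothetical models: time shrinks with parallelism, resource use grows with it.
print(choose_architecture(lambda l, v: 1e6 / (l * v),
                          lambda l, v: 50 * l * v, budget=20000))
```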
NASA Technical Reports Server (NTRS)
Cross, James H., II; Morrison, Kelly I.; May, Charles H., Jr.; Waddel, Kathryn C.
1989-01-01
The first phase of a three-phase effort to develop a new graphically oriented specification language which will facilitate the reverse engineering of Ada source code into graphical representations (GRs) as well as the automatic generation of Ada source code is described. A simplified view of the three phases of Graphical Representations for Algorithms, Structure, and Processes for Ada (GRASP/Ada) with respect to three basic classes of GRs is presented. Phase 1 concentrated on the derivation of an algorithmic diagram, the control structure diagram (CSD) (CRO88a) from Ada source code or Ada PDL. Phase 2 includes the generation of architectural and system level diagrams such as structure charts and data flow diagrams and should result in a requirements specification for a graphically oriented language able to support automatic code generation. Phase 3 will concentrate on the development of a prototype to demonstrate the feasibility of this new specification language.
ERIC Educational Resources Information Center
Schepp, Julie; Domagala, Anna
2009-01-01
This report provides information on degree and certificate programs offered and student program completions for fiscal year 2008-09 in North Dakota's public and private postsecondary educational institutions. Institutional programs are coded in accordance with the Classification of Instructional Programs (CIP Code) system provided by the National…
ERIC Educational Resources Information Center
North Dakota Univ. System, Bismarck.
This report provides information on degree and certificate programs offered and student program completions for fiscal year 2001-2002 in North Dakota's public and private postsecondary education institutions. Institutional programs are coded in accordance with the Classification of Instructional Programs (CIP Code) system provided by the National…
ERIC Educational Resources Information Center
North Dakota Univ. System, Bismarck.
This report provides information on degree and certificate programs offered and student program completions for fiscal year 2002-2003 in North Dakota's public and private postsecondary educational institutions. Institutional programs are coded in accordance with the Classification of Instructional Programs (CIP) code system and are organized in…
ERIC Educational Resources Information Center
North Dakota Univ. System, Bismarck.
This report provides information on degree and certificate programs offered and student program completions for fiscal year 2000-2001 in North Dakota's public and private postsecondary educational institutions. Institutional programs are coded in accordance with the Classification of Instructional Programs (CIP) code system of the National Center…
Some practical universal noiseless coding techniques
NASA Technical Reports Server (NTRS)
Rice, R. F.
1979-01-01
Some practical adaptive techniques for the efficient noiseless coding of a broad class of such data sources are developed and analyzed. Algorithms are designed for coding discrete memoryless sources which have a known symbol probability ordering but unknown probability values. A general applicability of these algorithms to solving practical problems is obtained because most real data sources can be simply transformed into this form by appropriate preprocessing. These algorithms have exhibited performance only slightly above all entropy values when applied to real data with stationary characteristics over the measurement span. Performance considerably under a measured average data entropy may be observed when data characteristics are changing over the measurement span.
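The adaptive idea can be shown with a toy block coder: encode each block with a few simple options (unary and fixed-width here stand in for the actual code options) and keep whichever costs the fewest bits, spending one extra bit to identify the choice. This illustrates the selection principle only, not the specific algorithms developed in the report:

```python
def unary_bits(block):
    """Unary (comma) code cost: a value n costs n + 1 bits; cheap at very low entropy."""
    return sum(v + 1 for v in block)

def fixed_bits(block, width=8):
    """Fallback cost: fixed-width code."""
    return width * len(block)

def choose_option(block):
    """Pick the cheapest option for this block; one (toy) ID bit signals the choice."""
    costs = {"unary": unary_bits(block), "fixed": fixed_bits(block)}
    best = min(costs, key=costs.get)
    return best, costs[best] + 1

print(choose_option([0, 1, 0, 2, 0, 0, 1, 0]))   # low-entropy block -> unary wins
```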
Outer satellite atmospheres: Their nature and planetary interactions
NASA Technical Reports Server (NTRS)
Smyth, W. H.; Combi, M. R.
1982-01-01
Significant progress is reported in early modeling analysis of observed sodium cloud images with our new model which includes the oscillating Io plasma torus ionization sink. Both the general w-D morphology of the region B cloud as well as the large spatial gradient seen between the region A and B clouds are found to be consistent with an isotropic flux of sodium atoms from Io. Model analysis of the spatially extended high velocity directional features provided substantial evidence for a magnetospheric wind driven gas escape mechanism from Io. In our efforts to define the source(s) of hydrogen atoms in the Saturn system, major steps were taken in order to understand the role of Titan. We have completed the comparison of the Voyager UVS data with previous Titan model results, as well as the update of the old model computer code to handle the spatially varying ionization sink for H atoms.
Comparative analysis of design codes for timber bridges in Canada, the United States, and Europe
James Wacker; James (Scott) Groenier
2010-01-01
The United States recently completed its transition from the allowable stress design code to the load and resistance factor design (LRFD) reliability-based code for the design of most highway bridges. For an international perspective on LRFD-based bridge codes, a comparative analysis is presented; the study addresses the national codes of the United States, Canada, and...
15 CFR 740.7 - Computers (APP).
Code of Federal Regulations, 2011 CFR
2011-01-01
... 4A003. (2) Technology and software. License Exception APP authorizes exports of technology and software... License Exception. (2) Access and release restrictions. (i)[Reserved] (ii) Technology and source code. Technology and source code eligible for License Exception APP may not be released to nationals of Cuba, Iran...
Codes of Discipline: Developments, Dimensions, Directions.
ERIC Educational Resources Information Center
Goldsmith, Arthur H.
1982-01-01
Well-drafted codes of discipline can help to eliminate the ambiguity and arbitrariness that often have been associated with school discipline. Discipline codes should be characterized by fairness, fact-finding provisions, completeness of information, frankness, flexibility, informality, firmness, concern with disciplinary suitability,…
Watkins, Sharon
2017-01-01
Objectives: The primary objective of this study was to identify patients with heat-related illness (HRI) using codes for heat-related injury diagnosis and external cause of injury in 3 administrative data sets: emergency department (ED) visit records, hospital discharge records, and death certificates. Methods: We obtained data on ED visits, hospitalizations, and deaths for Florida residents for May 1 through October 31, 2005-2012. To identify patients with HRI, we used codes from the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to search data on ED visits and hospitalizations and codes from the International Classification of Diseases, Tenth Revision (ICD-10) to search data on deaths. We stratified the results by data source and whether the HRI was work related. Results: We identified 23 981 ED visits, 4816 hospitalizations, and 140 deaths in patients with non–work-related HRI and 2979 ED visits, 415 hospitalizations, and 23 deaths in patients with work-related HRI. The most common diagnosis codes among patients were for severe HRI (heat exhaustion or heatstroke). The proportion of patients with a severe HRI diagnosis increased with data source severity. If ICD-9-CM code E900.1 and ICD-10 code W92 (excessive heat of man-made origin) were used as exclusion criteria for HRI, 5.0% of patients with non–work-related deaths, 3.0% of patients with work-related ED visits, and 1.7% of patients with work-related hospitalizations would have been removed. Conclusions: Using multiple data sources and all diagnosis fields may improve the sensitivity of HRI surveillance. Future studies should evaluate the impact of converting ICD-9-CM to ICD-10-CM codes on HRI surveillance of ED visits and hospitalizations. PMID:28379784
NASA Technical Reports Server (NTRS)
Gong, J.; Ozdemir, T.; Volakis, J; Nurnberger, M.
1995-01-01
Year 1 progress can be characterized with four major achievements which are crucial toward the development of robust, easy to use antenna analysis code on doubly conformal platforms. (1) A new FEM code was developed using prismatic meshes. This code is based on a new edge based distorted prism and is particularly attractive for growing meshes associated with printed slot and patch antennas on doubly conformal platforms. It is anticipated that this technology will lead to interactive, simple to use codes for a large class of antenna geometries. Moreover, the codes can be expanded to include modeling of the circuit characteristics. An attached report describes the theory and validation of the new prismatic code using reference calculations and measured data collected at the NASA Langley facilities. The agreement between the measured and calculated data is impressive even for the coated patch configuration. (2) A scheme was developed for improved feed modeling in the context of FEM. A new approach based on the voltage continuity condition was devised and successfully tested in modeling coax cables and aperture fed antennas. An important aspect of this new feed modeling approach is the ability to completely separate the feed and antenna mesh regions. In this manner, different elements can be used in each of the regions leading to substantially improved accuracy and meshing simplicity. (3) A most important development this year has been the introduction of the perfectly matched interface (PMI) layer for truncating finite element meshes. So far the robust boundary integral method has been used for truncating the finite element meshes. However, this approach is not suitable for antennas on nonplanar platforms. The PMI layer is a lossy anisotropic absorber with zero reflection at its interface. (4) We were able to interface our antenna code FEMA_CYL (for antennas on cylindrical platforms) with a standard high frequency code. This interface was achieved by first generating equivalent magnetic currents across the antenna aperture using the FEM code. These currents were employed as the sources in the high frequency code.
Lincoln, A; Sorock, G; Courtney, T; Wellman, H; Smith, G; Amoroso, P
2004-01-01
Objective: To determine whether narrative text in safety reports contains sufficient information regarding contributing factors and precipitating mechanisms to prioritize occupational back injury prevention strategies. Design, setting, subjects, and main outcome measures: Nine essential data elements were identified in narratives and coded sections of safety reports for each of 94 cases of back injuries to United States Army truck drivers reported to the United States Army Safety Center between 1987 and 1997. The essential elements of each case were used to reconstruct standardized event sequences. A taxonomy of the event sequences was then developed to identify common hazard scenarios and opportunities for primary interventions. Results: Coded data typically only identified five data elements (broad activity, task, event/exposure, nature of injury, and outcomes) while narratives provided additional elements (contributing factor, precipitating mechanism, primary source) essential for developing our taxonomy. Three hazard scenarios were associated with back injuries among Army truck drivers accounting for 83% of cases: struck by/against events during motor vehicle crashes; falls resulting from slips/trips or loss of balance; and overexertion from lifting activities. Conclusions: Coded data from safety investigations lacked sufficient information to thoroughly characterize the injury event. However, the combination of existing narrative text (similar to that collected by many injury surveillance systems) and coded data enabled us to develop a more complete taxonomy of injury event characteristics and identify common hazard scenarios. This study demonstrates that narrative text can provide the additional information on contributing factors and precipitating mechanisms needed to target prevention strategies. PMID:15314055
Enhancements to the MCNP6 background source
McMath, Garrett E.; McKinney, Gregg W.
2015-10-19
The particle transport code MCNP has been used to produce a background radiation data file on a worldwide grid that can easily be sampled as a source in the code. Location-dependent cosmic showers were modeled by Monte Carlo methods to produce the resulting neutron and photon background flux at 2054 locations around Earth. An improved galactic-cosmic-ray feature was used to model the source term as well as data from multiple sources to model the transport environment through atmosphere, soil, and seawater. A new elevation scaling feature was also added to the code to increase the accuracy of the cosmic neutron background for user locations with off-grid elevations. Furthermore, benchmarking has shown the neutron integral flux values to be within experimental error.
On the optimality of code options for a universal noiseless coder
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu; Rice, Robert F.; Miller, Warner
1991-01-01
A universal noiseless coding structure was developed that provides efficient performance over an extremely broad range of source entropy. This is accomplished by adaptively selecting the best of several easily implemented variable-length coding algorithms. Custom VLSI coder and decoder modules capable of processing over 20 million samples per second are currently under development. The first of the code options used in this module development is shown to be equivalent to a class of Huffman code under the Humblet condition; other options are shown to be equivalent to the Huffman codes of a modified Laplacian symbol set at specified symbol entropy values. Simulation results are obtained on actual aerial imagery, and they confirm the optimality of the scheme. On sources having Gaussian or Poisson distributions, coder performance is also projected through analysis and simulation.
Toward Intelligent Software Defect Detection
NASA Technical Reports Server (NTRS)
Benson, Markland J.
2011-01-01
Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.
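As an illustration of "learning defects by example" at line granularity (not the specific models studied in this work), a tiny classifier over token n-grams of individual source lines could look like this; the corpus and labels are fabricated for the example:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny illustrative corpus: individual source lines labelled 1 (defective) or 0 (clean).
lines = ["if (p = NULL)", "for (i = 0; i <= n; i++)", "free(buf); free(buf);",
         "if (p == NULL)", "for (i = 0; i < n; i++)", "free(buf);"]
labels = [1, 1, 1, 0, 0, 0]

vec = CountVectorizer(token_pattern=r"\S+", ngram_range=(1, 2))
clf = LogisticRegression(max_iter=1000).fit(vec.fit_transform(lines), labels)

suspect = clf.predict_proba(vec.transform(["if (x = 0)"]))[0, 1]
print(f"defect probability: {suspect:.2f}")
```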
NASA Astrophysics Data System (ADS)
Konnik, Mikhail V.; Welsh, James
2012-09-01
Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
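A simplified Python analogue of such a comment-translating filter (the framework described above uses a Perl script): it rewrites MATLAB `%` comment lines as C-style `///` lines so Doxygen can pick them up, and passes everything else through unchanged. It could be wired in, for example, through Doxygen's INPUT_FILTER setting.

```python
import re
import sys

def matlab_to_doxygen(src):
    """Turn MATLAB '%' comment lines into '///' lines for Doxygen; everything else is
    passed through unchanged. Simplified: end-of-line and block comments are ignored."""
    out = []
    for line in src.splitlines():
        m = re.match(r"(\s*)%+\s?(.*)", line)
        out.append(f"{m.group(1)}/// {m.group(2)}" if m else line)
    return "\n".join(out)

if __name__ == "__main__":
    sys.stdout.write(matlab_to_doxygen(sys.stdin.read()))
```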
Supporting Source Code Comprehension during Software Evolution and Maintenance
ERIC Educational Resources Information Center
Alhindawi, Nouh
2013-01-01
This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts along with categorizing changes within very large bodies of source code along with their versioned histories. More specifically, advanced Information…
Automating RPM Creation from a Source Code Repository
2012-02-01
.../apps/usr --with-libpq=/apps/postgres make rm -rf $RPM_BUILD_ROOT umask 0077 mkdir -p $RPM_BUILD_ROOT/usr/local/bin mkdir -p $RPM_BUILD_ROOT... from a source code repository. %pre %prep %setup %build ./autogen.sh ; ./configure --with-db=/apps/db --with-libpq=/apps/postgres make
ERIC Educational Resources Information Center
Moore, David G., Jr.
2013-01-01
Aspect-oriented software design (AOSD) enables better and more complete separation of concerns in software-intensive systems. By extracting aspect code and relegating crosscutting functionality to aspects, software engineers can improve the maintainability of their code by reducing code tangling and coupling of code concerns. Further, the number…
1992-11-24
Code I: Internal Reports ... Code M: Oral ... experiments. 13. S. M. Baumer: completed M.S. thesis in 1988 on light scattering. 14. C. E. Dean: completed Ph.D. dissertation in 1989 on light ... novel oscillation-induced flow instabilities. 18. J. M. Winey: awarded M.S. degree in 1990 with project on capillary wave experiments. He ...
Hu, Bo; Liu, Dong-Xing; Zhang, Yu-Qing; Song, Jian-Tao; Ji, Xian-Fei; Hou, Zhi-Qiang; Zhang, Zhen-Hai
2016-05-01
In this study, we report for the first time the complete mitochondrial genome sequence of a heart-failure model, the cardiomyopathic Syrian hamster (Mesocricetus auratus). The total length of the mitogenome was 16,267 bp. It harbored 13 protein-coding genes, 2 ribosomal RNA genes, 22 transfer RNA genes and 1 non-coding control region.
An adaptable binary entropy coder
NASA Technical Reports Server (NTRS)
Kiely, A.; Klimesh, M.
2001-01-01
We present a novel entropy coding technique which is based on recursive interleaving of variable-to-variable length binary source codes. We discuss code design and performance estimation methods, as well as practical encoding and decoding algorithms.
Plasma separation process. Betacell (BCELL) code, user's manual
NASA Astrophysics Data System (ADS)
Taherzadeh, M.
1987-11-01
The emergence of clearly defined applications for (small or large) amounts of long-life and reliable power sources has given the design and production of betavoltaic systems a new life. Moreover, because of the availability of the Plasma Separation Program (PSP) at TRW, it is now possible to separate the most desirable radioisotopes for betacell power-generating devices. A computer code, named BCELL, has been developed to model the betavoltaic concept by utilizing the available up-to-date source/cell parameters. In this program, attempts have been made to determine the betacell energy device maximum efficiency, degradation due to the emitting source radiation, and source/cell lifetime power reduction processes. Additionally, comparison is made between the Schottky and PN junction devices for betacell battery design purposes. Certain computer code runs have been made to determine the J-V distribution function and the upper limit of the betacell-generated power for specified energy sources. A Ni beta-emitting radioisotope was used for the energy source and certain semiconductors were used for the converter subsystem of the betacell system. Some results for a promethium source are also given here for comparison.
Coding Issues in Grounded Theory
ERIC Educational Resources Information Center
Moghaddam, Alireza
2006-01-01
This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…
European Code against Cancer 4th Edition: Ultraviolet radiation and cancer.
Greinert, Rüdiger; de Vries, Esther; Erdmann, Friederike; Espina, Carolina; Auvinen, Anssi; Kesminiene, Ausrele; Schüz, Joachim
2015-12-01
Ultraviolet radiation (UVR) is part of the electromagnetic spectrum emitted naturally from the sun or from artificial sources such as tanning devices. Acute skin reactions induced by UVR exposure are erythema (skin reddening), or sunburn, and the acquisition of a suntan triggered by UVR-induced DNA damage. UVR exposure is the main cause of skin cancer, including cutaneous malignant melanoma, basal-cell carcinoma, and squamous-cell carcinoma. Skin cancer is the most common cancer in fair-skinned populations, and its incidence has increased steeply over recent decades. According to estimates for 2012, about 100,000 new cases of cutaneous melanoma and about 22,000 deaths from it occurred in Europe. The main mechanisms by which UVR causes cancer are well understood. Exposure during childhood appears to be particularly harmful. Exposure to UVR is a risk factor modifiable by individuals' behaviour. Excessive exposure from natural sources can be avoided by seeking shade when the sun is strongest, by wearing appropriate clothing, and by appropriately applying sunscreens if direct sunlight is unavoidable. Exposure from artificial sources can be completely avoided by not using sunbeds. Beneficial effects of sun or UVR exposure, such as for vitamin D production, can be fully achieved while still avoiding too much sun exposure and the use of sunbeds. Taking all the scientific evidence together, the recommendation of the 4th edition of the European Code Against Cancer for ultraviolet radiation is: "Avoid too much sun, especially for children. Use sun protection. Do not use sunbeds." Copyright © 2015 International Agency for Research on Cancer. Published by Elsevier Ltd. All rights reserved.
Monitoring Hydraulic Fracturing Using Ground-Based Controlled Source Electromagnetics
NASA Astrophysics Data System (ADS)
Hickey, M. S.; Trevino, S., III; Everett, M. E.
2017-12-01
Hydraulic fracturing allows hydrocarbon production in low permeability formations. Imaging the distribution of fluid used to create a hydraulic fracture can aid in the characterization of fracture properties such as extent of plume penetration as well as fracture azimuth and symmetry. This could contribute to improving the efficiency of an operation, for example, in helping to determine ideal well spacing or the need to refracture a zone. A ground-based controlled-source electromagnetics (CSEM) technique is ideal for imaging the fluid due to the change in field caused by the difference in the conductive properties of the fluid when compared to the background. With advances in high signal to noise recording equipment, coupled with a high-power, broadband transmitter we can show hydraulic fracture extent and azimuth with minimal processing. A 3D finite element code is used to model the complete well casing along with the layered subsurface. This forward model is used to optimize the survey design and isolate the band of frequencies with the best response. In the field, the results of the modeling are also used to create a custom pseudorandom numeric (PRN) code to control the frequencies transmitted through a grounded dipole source. The receivers record the surface voltage across two grounded dipoles, one parallel and one perpendicular to the transmitter. The data are presented as the displays of amplitude ratios across several frequencies with the associated spatial information. In this presentation, we show multiple field results in multiple basins in the United States along with the CSEM theory used to create the survey designs.
User's manual for Axisymmetric Diffuser Duct (ADD) code. Volume 1: General ADD code description
NASA Technical Reports Server (NTRS)
Anderson, O. L.; Hankins, G. B., Jr.; Edwards, D. E.
1982-01-01
This User's Manual contains a complete description of the computer codes known as the AXISYMMETRIC DIFFUSER DUCT code or ADD code. It includes a list of references which describe the formulation of the ADD code and comparisons of calculation with experimental flows. The input/output and general use of the code is described in the first volume. The second volume contains a detailed description of the code including the global structure of the code, list of FORTRAN variables, and descriptions of the subroutines. The third volume contains a detailed description of the CODUCT code which generates coordinate systems for arbitrary axisymmetric ducts.
NASA Astrophysics Data System (ADS)
Havemann, Stephan; Wong, Gerald
2016-10-01
The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) represents transmittances, radiances and fluxes by principal components that cover the spectra at very high resolution, allowing fast, highly resolved pseudo line-by-line, hyperspectral and broadband simulations across the electromagnetic spectrum from the microwave to the ultraviolet for satellite-based, airborne and ground-based sensors. HT-FRTC models clear atmospheres and those containing clouds and aerosols, as well as any surface (land/sea/man-made). The HT-FRTC has been used operationally in the NEON Tactical Decision Aid (TDA) since 2008. The TDA combines the HT-FRTC with a thermal contrast model and an NWP model forecast data feed to predict the apparent thermal contrast between different surfaces and ground-based targets in the thermal and short-wave IR. The new objective here is to predict the optical contrast of airborne targets under realistic night-time scenarios in the Photopic and NVG parts of the spectrum. This requires the inclusion of all the relevant radiation sources, which include twilight, moonlight, starlight, airglow and cultural light. A completely new exact scattering code has been developed which allows the straightforward addition of any number of direct and diffuse sources anywhere in the atmosphere. The new code solves the radiative transfer equation iteratively and is faster than the previous solution. Simulations of scenarios with different light levels, from situations during a full moon to a moonless night with very low light levels and a situation with cultural light from a town, are presented. The impact of surface reflectance and target reflectance is investigated.
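A schematic illustration of the principal-component idea (not the HT-FRTC implementation): high-resolution spectra are projected onto a small set of components trained offline, so only the component scores need to be handled at run time. The training matrix below is purely synthetic placeholder data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "training set": spectra that are smooth mixtures of a few underlying shapes,
# standing in for line-by-line model output.
basis = rng.normal(size=(10, 5000))
training = rng.normal(size=(200, 10)) @ basis + 0.001 * rng.normal(size=(200, 5000))

mean = training.mean(axis=0)
_, _, vt = np.linalg.svd(training - mean, full_matrices=False)   # PCA via SVD
components = vt[:20]                                             # keep 20 components

def compress(spectrum):
    """Represent a high-resolution spectrum by its principal-component scores."""
    return components @ (spectrum - mean)

def reconstruct(scores):
    """Approximate the high-resolution spectrum from the scores."""
    return mean + components.T @ scores

spec = training[0]
print(np.abs(reconstruct(compress(spec)) - spec).max())   # small residual for this synthetic case
```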
Kim, Min Jee; Im, Hyun Hwak; Lee, Kwang Youll; Han, Yeon Soo; Kim, Iksoo
2014-06-01
The complete nucleotide sequence of the mitochondrial genome of the white-spotted flower chafer, Protaetia brevitarsis (Coleoptera: Scarabaeidae), was determined. The 20,319-bp long circular genome is the longest among completely sequenced Coleoptera. As is typical in animals, the P. brevitarsis genome consisted of two ribosomal RNAs, 22 transfer RNAs, 13 protein-coding genes and one A + T-rich region. Although the size of the coding genes was typical, the non-coding A + T-rich region was 5654 bp, which is the longest in insects. The extraordinary length of this region is accounted for by 28 copies of a 117-bp tandem repeat and 7 copies of an 82-bp tandem repeat. These repeat sequences were encompassed by three non-repeat sequences constituting 1804 bp (28 × 117 + 7 × 82 + 1804 = 5654 bp).
Coding For Compression Of Low-Entropy Data
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu
1994-01-01
Improved method of encoding digital data provides for efficient lossless compression of partially or even mostly redundant data from low-information-content source. Method of coding implemented in relatively simple, high-speed arithmetic and logic circuits. Also increases coding efficiency beyond that of established Huffman coding method in that average number of bits per code symbol can be less than 1, which is the lower bound for Huffman code.
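One way to see how an average below 1 bit per symbol is reachable for a low-entropy source (this run-length illustration is not the patented method): code the lengths of zero-runs in a sparse binary stream instead of the individual symbols. The parameter m and the cost model below are illustrative assumptions:

```python
import math

def golomb_run_length_bits(bits, m=8):
    """Cost (in bits) of coding a sparse binary sequence as zero-run lengths with a
    Golomb-style code (m a power of two): unary quotient plus log2(m) remainder bits."""
    cost, run = 0, 0
    for b in bits:
        if b == 0:
            run += 1
        else:
            cost += run // m + 1 + int(math.log2(m))
            run = 0
    cost += run // m + 1 + int(math.log2(m))   # flush the final run
    return cost

sparse = [0] * 97 + [1] + [0] * 150 + [1] + [0] * 51   # 300 symbols, only two ones
print(golomb_run_length_bits(sparse) / len(sparse), "bits/symbol")   # well below 1
```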
Ni, ZhouXian; Ye, YouJu; Bai, Tiandao; Xu, Meng; Xu, Li-An
2017-09-11
The chloroplast genome (CPG) of Pinus massoniana, belonging to the genus Pinus (Pinaceae) and a primary source of turpentine, was sequenced and analyzed in terms of gene rearrangements, ndh gene loss, and the contraction and expansion of short inverted repeats (IRs). The P. massoniana CPG has a typical quadripartite structure that includes a large single copy (LSC) region (65,563 bp), a small single copy (SSC) region (53,230 bp) and two IRs (IRa and IRb, 485 bp). A total of 108 unique genes were identified, including 73 protein-coding genes, 31 tRNAs, and 4 rRNAs. Most of the 81 simple sequence repeats (SSRs) identified in the CPG were mononucleotide motifs of A/T type located in non-coding regions. Comparisons with related species revealed an inversion (21,556 bp) in the LSC region; the P. massoniana CPG lacks all 11 intact ndh genes (four ndh genes are lost completely; five remain truncated as pseudogenes; and the other two remain as pseudogenes because of short insertions or deletions). A pair of short IRs was found instead of large IRs, and size variations among pine species were observed, which resulted from short insertions or deletions and non-synchronized variations between "IRa" and "IRb". The results of phylogenetic analyses based on whole CPG sequences of 16 conifers indicated that whole CPG sequences can be used as a powerful tool in phylogenetic analyses.
Random sampling of elementary flux modes in large-scale metabolic networks.
Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel
2012-09-15
The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.
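The filtering idea can be sketched generically (this is not the authors' canonical-basis code): at each iteration, whatever new candidate combinations are generated, only a fixed-size uniform random subset is carried forward, so the working set stays bounded. The surrounding combination step is only hinted at in comments:

```python
import random

def filter_candidates(new_candidates, sample_size, rng=random.Random(0)):
    """Keep an unbiased, fixed-size random subset of newly generated candidate modes."""
    candidates = list(new_candidates)
    if len(candidates) <= sample_size:
        return candidates
    return rng.sample(candidates, sample_size)   # every candidate equally likely

# Hypothetical usage inside an iterative combination scheme (combine_step is assumed):
# working_set = initial_modes
# for constraint in constraints_to_process:
#     new = combine_step(working_set, constraint)
#     working_set = filter_candidates(new, sample_size=1000)
```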
Study of an External Neutron Source for an Accelerator-Driven System using the PHITS Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sugawara, Takanori; Iwasaki, Tomohiko; Chiba, Takashi
A code system for the Accelerator Driven System (ADS) has been under development for analyzing dynamic behaviors of a subcritical core coupled with an accelerator. This code system named DSE (Dynamics calculation code system for a Subcritical system with an External neutron source) consists of an accelerator part and a reactor part. The accelerator part employs a database, which is calculated by using PHITS, for investigating the effect related to the accelerator such as the changes of beam energy, beam diameter, void generation, and target level. This analysis method using the database may introduce some errors into dynamics calculations since the neutron source data derived from the database has some errors in fitting or interpolating procedures. In this study, the effects of various events are investigated to confirm that the method based on the database is appropriate.
Correlation estimation and performance optimization for distributed image compression
NASA Astrophysics Data System (ADS)
He, Zhihai; Cao, Lei; Cheng, Hui
2006-01-01
Correlation estimation plays a critical role in resource allocation and rate control for distributed data compression. A Wyner-Ziv encoder for distributed image compression is often considered as a lossy source encoder followed by a lossless Slepian-Wolf encoder. The source encoder consists of spatial transform, quantization, and bit plane extraction. In this work, we find that Gray code, which has been extensively used in digital modulation, is able to significantly improve the correlation between the source data and its side information. Theoretically, we analyze the behavior of Gray code within the context of distributed image compression. Using this theoretical model, we are able to efficiently allocate the bit budget and determine the code rate of the Slepian-Wolf encoder. Our experimental results demonstrate that the Gray code, coupled with accurate correlation estimation and rate control, significantly improves the picture quality, by up to 4 dB, over the existing methods for distributed image compression.
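The bit-plane benefit of Gray code comes from neighbouring intensity values differing in only one bit. A small sketch of the binary-reflected mapping and its inverse for 16-bit data (the surrounding transform and quantization stages are omitted):

```python
import numpy as np

def to_gray(values):
    """Binary-reflected Gray code: adjacent intensities differ in a single bit,
    which raises per-bit-plane correlation with the side information."""
    v = np.asarray(values, dtype=np.uint16)
    return v ^ (v >> 1)

def from_gray(gray):
    """Invert the Gray mapping (enough doubling steps for 16-bit data)."""
    g = np.asarray(gray, dtype=np.uint16).copy()
    shift = 1
    while shift < 16:
        g ^= g >> shift
        shift *= 2
    return g

x = np.array([118, 119, 120, 121])
assert np.array_equal(from_gray(to_gray(x)), x)
```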
Table-top laser-driven ultrashort electron and X-ray source: the CIBER-X source project
NASA Astrophysics Data System (ADS)
Girardeau-Montaut, Jean-Pierre; Kiraly, Bélà; Girardeau-Montaut, Claire; Leboutet, Hubert
2000-09-01
We report on the development of a new laser-driven table-top ultrashort electron and X-ray source, also called the CIBER-X source (Source Compacte d'Impulsions Brèves d'Electrons et de Rayons X). X-ray pulses are produced by a three-step process which starts with photoelectron emission from a thin metallic photocathode illuminated by 16 ps duration laser pulses at 213 nm. The e-gun is a standard Pierce diode electrode type, in which electrons are accelerated by a cw electric field of ~11 MV/m up to a hole made in the anode. The photoinjector produces a train of 70-80 keV electron pulses of ~0.5 nC and 20 A peak current at a repetition rate of 10 Hz. The electrons are then transported outside the diode along a path of 20 cm length, and are focused onto a thulium target by magnetic fields produced by two electromagnetic coils. X-rays are then produced by the impact of the electrons on the target. Simulations of the geometrical, electromagnetic field and energetic characteristics of the complete source were performed previously with the assistance of the PIXEL code, also developed at the laboratory. Finally, the experimental electron and X-ray performance of the CIBER-X source, as well as its application to very-low-dose imagery, are presented and discussed.
A GIS-based atmospheric dispersion model for pollutants emitted by complex source areas.
Teggi, Sergio; Costanzini, Sofia; Ghermandi, Grazia; Malagoli, Carlotta; Vinceti, Marco
2018-01-01
Gaussian dispersion models are widely used to simulate the concentrations and deposition fluxes of pollutants emitted by source areas. Very often, the calculation time limits the number of sources and receptors, and the geometry of the sources must be simple and without holes. This paper presents CAREA, a new GIS-based Gaussian model for complex source areas. CAREA was coded in the Python language, and is largely based on a simplified formulation of the very popular and recognized AERMOD model. The model allows users to define in a GIS environment thousands of gridded or scattered receptors and thousands of complex sources with hundreds of vertices and holes. CAREA computes ground-level, or near-ground-level, concentrations and dry deposition fluxes of pollutants. The input/output and the runs of the model can be completely managed in a GIS environment (e.g. inside a GIS project). The paper presents the CAREA formulation and its applications to very complex test cases. The tests show that processing times are satisfactory and that the definition of sources and receptors and the output retrieval are quite easy in a GIS environment. CAREA and AERMOD are compared using simple and reproducible test cases. The comparison shows that CAREA satisfactorily reproduces AERMOD simulations and is considerably faster than AERMOD. Copyright © 2017 Elsevier B.V. All rights reserved.
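For orientation, the point-source kernel that this class of models integrates over polygonal source areas looks roughly as follows; the dispersion parameters are passed in directly here, whereas a real model derives them from downwind distance and stability class:

```python
import numpy as np

def ground_level_concentration(q, u, y, h, sigma_y, sigma_z):
    """Ground-level concentration (g/m^3) downwind of a continuous point source:
    q emission rate (g/s), u wind speed (m/s), y crosswind offset (m),
    h effective release height (m), sigma_y/sigma_z dispersion parameters (m)."""
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = 2.0 * np.exp(-h**2 / (2.0 * sigma_z**2))    # ground reflection term
    return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

print(ground_level_concentration(q=1.0, u=3.0, y=0.0, h=10.0, sigma_y=36.0, sigma_z=18.0))
```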
FEDEF: A High Level Architecture Federate Development Framework
2010-09-01
... require code changes for operability between HLA specifications. Configuration of federate requirements such as publications, subscriptions, time management, and management protocol should occur outside of federate source code, allowing for federate reusability without code modification and re...
The importance of KMR completion for dermatology income in secondary care in the UK.
Hague, J; Nichols, H; Klimmeck, J; Lanigan, S
2007-05-01
In the UK, a Korner Medical Record (KMR) document is completed for each inpatient discharged from hospital. The number and type of medical conditions entered onto this record are used to determine the income the department will receive for that individual patient. We set out to audit the accuracy of the KMR documentation of our dermatology ward, and to assess what impact improving these records would have on the department's income. The audit standard was that KMRs should be completed accurately and contain all of the relevant information. KMRs from May 2005, which had been completed initially by the ward clerk, and later by the junior medical staff, were reviewed. They were then completed by the main auditor, who had received training in KMR completion from the coding department. All three sets of KMRs were reviewed by the coding officer, and their respective income calculated. In total, 20 patients were discharged from the dermatology ward during April 2005. The main diagnosis given for two patients was initially incorrect, and in another eight cases it could have been more accurate than was originally documented. The total number of comorbid or 'secondary' diagnoses (for all 20 patients) reported by the ward clerk was 5. Junior staff added a further 36 secondary diagnoses. The main auditor identified an additional 35 secondary diagnoses. In total, an extra £9211 would have been paid to the department if KMRs had been completed by the main auditor rather than the ward clerk. On an annual basis, a potential £110,532 would remain unclaimed if KMR completion continued to be performed by the ward clerk. This audit shows that KMR completion is inadequate when performed by a nonmedical practitioner. Training of medical staff in KMR completion by the coding department also significantly increases the accuracy and completeness of documentation. Dermatologists of all grades need to be aware of the importance and process of KMR completion, and routine training of medical staff by their coding department in KMR completion is recommended.
Simulation study on ion extraction from electron cyclotron resonance ion sources
NASA Astrophysics Data System (ADS)
Fu, S.; Kitagawa, A.; Yamada, S.
1994-04-01
In order to study the beam optics of the NIRS-ECR ion source used in the HIMAC project, the EGUN code has been modified to make it capable of modeling ion extraction from a plasma. Two versions of the modified code were worked out with two different methods, in which 1D and 2D sheath theories are used, respectively. The convergence problem of the strongly nonlinear self-consistent equations is investigated. Simulations of the NIRS-ECR ion source and the HYPER-ECR ion source are presented in this paper, exhibiting agreement with the experimental results.
Ming-Xing, Lu; Zhi-Teng, Chen; Wei-Wei, Yu; Yu-Zhou, Du
2017-03-01
We report the complete mitochondrial genome (mitogenome) of the spiraling whitefly, Aleurodicus dispersus (Hemiptera: Aleyrodidae). The 16,170-bp genome consists of 13 protein-coding genes, 20 transfer RNAs, 2 ribosomal RNAs, and a control region. The A. dispersus mitogenome also includes a cytb-like non-coding region and shows several variations relative to the typical insect mitogenome. A phylogenetic tree was constructed using the 13 protein-coding genes of 12 related species from Hemiptera. Our results will contribute to further study of the phylogeny of Aleyrodidae and Hemiptera.
Extensions under development for the HEVC standard
NASA Astrophysics Data System (ADS)
Sullivan, Gary J.
2013-09-01
This paper discusses standardization activities for extending the capabilities of the High Efficiency Video Coding (HEVC) standard - the first edition of which was completed in early 2013. These near-term extensions are focused on three areas: range extensions (such as enhanced chroma formats, monochrome video, and increased bit depth), bitstream scalability extensions for spatial and fidelity scalability, and 3D video extensions (including stereoscopic/multi-view coding, and probably also depth map coding and combinations thereof). Standardization extensions on each of these topics will be completed by mid-2014, and further work beyond that timeframe is also discussed.
Plagiarism Detection Algorithm for Source Code in Computer Science Education
ERIC Educational Resources Information Center
Liu, Xin; Xu, Chan; Ouyang, Boyu
2015-01-01
Nowadays, computer programming is becoming more necessary in program design courses in college education. However, the trick of plagiarizing plus a little modification exists in some students' homework. It is not easy for teachers to judge whether source code has been plagiarized or not. Traditional detection algorithms cannot fit this…
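A common baseline for this kind of detector (not necessarily the algorithm proposed in the paper) compares normalised token n-grams of two submissions; lower-casing identifiers below is a crude stand-in for real normalisation:

```python
import re

def fingerprints(code, n=5):
    """Normalise a source file to a token stream and collect token n-grams."""
    tokens = re.findall(r"[A-Za-z_]\w*|\S", code.lower())
    return {tuple(tokens[i:i + n]) for i in range(max(len(tokens) - n + 1, 1))}

def similarity(code_a, code_b, n=5):
    """Jaccard overlap of n-gram sets: values near 1.0 suggest copy-plus-rename plagiarism."""
    a, b = fingerprints(code_a, n), fingerprints(code_b, n)
    return len(a & b) / len(a | b) if a | b else 0.0

print(similarity("int sum(int a,int b){return a+b;}",
                 "int add(int x,int y){return x+y;}"))
```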
Automated Source-Code-Based Testing of Object-Oriented Software
NASA Astrophysics Data System (ADS)
Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten
2014-08-01
With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges for testing and specifically automated testing arise. In this paper we discuss some of these challenges, consequences and solutions based on an experiment in automated source- code-based testing for C++.
E-Standards For Mass Properties Engineering
NASA Technical Reports Server (NTRS)
Cerro, Jeffrey A.
2008-01-01
A proposal is put forth to promote the concept of a Society of Allied Weight Engineers (SAWE)-developed voluntary consensus standard for mass properties engineering. This standard would be an e-standard, and would encompass data, data manipulation, and reporting functionality. The standard would be implemented via an open-source SAWE distribution site with full SAWE member body access. Engineering societies and global standards initiatives are progressing toward modern engineering standards, which become functioning deliverable data sets. These data sets, if properly standardized, will integrate easily between supplier and customer, enabling technically precise mass properties data exchange. The concepts of object-oriented programming support all of these requirements, and the use of a Java-based open-source development initiative is proposed. Results are reported for activity sponsored by the NASA Langley Research Center Innovation Institute to scope out requirements for developing a mass properties engineering e-standard. An initial software distribution is proposed. Upon completion, an open-source application programming interface will be available to SAWE members for the development of more specific programming requirements that are tailored to company and project requirements. A fully functioning application programming interface will permit code extension via company proprietary techniques, as well as through continued open-source initiatives.
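The kind of deliverable data set the proposal envisions can be sketched as a small object model with a roll-up calculation; this is illustrated in Python rather than the Java environment mentioned above, and the field set is an assumption, not the standard itself:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MassItem:
    name: str
    mass: float                      # kg
    cg: Tuple[float, float, float]   # centre of gravity (x, y, z), m

def composite(items: List[MassItem]):
    """Roll component mass items up into a total mass and a composite CG."""
    total = sum(i.mass for i in items)
    cg = tuple(sum(i.mass * i.cg[k] for i in items) / total for k in range(3))
    return total, cg

parts = [MassItem("structure", 120.0, (1.0, 0.0, 0.2)),
         MassItem("payload", 40.0, (2.5, 0.1, 0.0))]
print(composite(parts))
```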
NASA Astrophysics Data System (ADS)
Salvato, M.; Buchner, j.; Budavari, T.; Dwelly, T.; Merloni, A.; Brusa, M.; Rau, A.; Fotopoulou, S.; Nandra, K.
2017-10-01
At the end of the mission, the eROSITA all-sky X-ray survey will provide the community with about 4 million point-like sources, down to a limit of 10^{-14} erg/cm^2/s in the soft band and 2x10^{-13} erg/cm^2/s in the hard band. The brightest sources, however, have already been observed by ROSAT but have rarely been used, due to the large uncertainties in their positions, which make the identification of the correct multi-wavelength counterparts a demanding task with uncertain results. New all-sky optical and IR surveys like GAIA and WISE allow us, for the first time, to provide reliable counterparts to all ROSAT sources, thanks also to the development of a new algorithm, NWAY, based on Bayesian statistics and the adoption of color-magnitude priors. This paves the way for new programs of complete characterization of the bright X-ray sky, such as the SDSS-IV/SPIDERS survey started in 2014. In this talk I will briefly present the code and the multiwavelength properties of ROSAT and XMMSLEW counterparts.
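As a pointer to the kind of calculation involved, the sketch below shows one standard ingredient of probabilistic counterpart matching: a positional likelihood ratio comparing a true-counterpart hypothesis against chance alignment. It is not the NWAY algorithm itself (which also uses magnitude/colour priors and handles several catalogues at once); the positional error, separations and background density are invented numbers.

```python
# Toy positional likelihood-ratio matching -- an ingredient of Bayesian
# cross-identification, not the full NWAY method.
import numpy as np

def likelihood_ratio(sep_arcsec, sigma_arcsec, background_density_per_arcsec2):
    """Ratio of the probability of the true counterpart lying at this
    separation (Gaussian positional error) to that of a chance alignment."""
    sigma2 = sigma_arcsec ** 2
    match_term = np.exp(-0.5 * sep_arcsec ** 2 / sigma2) / (2.0 * np.pi * sigma2)
    return match_term / background_density_per_arcsec2

# Hypothetical X-ray source with a 15" positional error and three optical candidates
seps = np.array([4.0, 11.0, 20.0])                        # separations in arcsec
lr = likelihood_ratio(seps, sigma_arcsec=15.0,
                      background_density_per_arcsec2=2e-3)  # assumed sky density
print("likelihood ratios:", lr, "-> best candidate index:", int(np.argmax(lr)))
```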
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2006-01-01
A computer program automatically builds large, full-resolution mosaics of multispectral images of Earth landmasses from images acquired by Landsat 7, complete with matching of colors and blending between adjacent scenes. While the code has been used extensively for Landsat, it could also be used for other data sources. A single mosaic of as many as 8,000 scenes, represented by more than 5 terabytes of data and the largest set produced in this work, demonstrated what the code could do to provide global coverage. The program first statistically analyzes input images to determine areas of coverage and data-value distributions. It then transforms the input images from their original universal transverse Mercator coordinates to other geographical coordinates, with scaling. It applies a first-order polynomial brightness correction to each band in each scene. It uses a data-mask image for selecting data and blending of input scenes. Under control by a user, the program can be made to operate on small parts of the output image space, with check-point and restart capabilities. The program runs on SGI IRIX computers. It is capable of parallel processing using shared-memory code, large memories, and tens of central processing units. It can retrieve input data and store output data at locations remote from the processors on which it is executed.
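The following minimal sketch illustrates two of the steps described above on synthetic arrays: a first-order polynomial (gain/offset) brightness correction applied per band, and mask-weighted blending across an overlap. The scene data, gains and feathering weights are made up; the actual mosaicking code is far more elaborate.

```python
import numpy as np

def brightness_correct(scene, gain, offset):
    """scene: (bands, rows, cols); gain/offset: one value per band."""
    return scene * gain[:, None, None] + offset[:, None, None]

def blend(scene_a, scene_b, weight_a):
    """weight_a in [0, 1], e.g. derived from a data/distance mask."""
    return weight_a * scene_a + (1.0 - weight_a) * scene_b

rng = np.random.default_rng(0)
a = rng.uniform(0, 255, size=(3, 64, 64))        # hypothetical 3-band scenes
b = rng.uniform(0, 255, size=(3, 64, 64))
a_corr = brightness_correct(a, gain=np.array([1.02, 0.98, 1.00]),
                            offset=np.array([-3.0, 2.0, 0.0]))
w = np.tile(np.linspace(1.0, 0.0, 64), (64, 1))   # linear feather across the overlap
mosaic = blend(a_corr, b, w)
print(mosaic.shape)
```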
RNAiFold 2.0: a web server and software to design custom and Rfam-based RNA molecules.
Garcia-Martin, Juan Antonio; Dotu, Ivan; Clote, Peter
2015-07-01
Several algorithms for RNA inverse folding have been used to design synthetic riboswitches, ribozymes and thermoswitches, whose activity has been experimentally validated. The RNAiFold software is unique among approaches for inverse folding in that (exhaustive) constraint programming is used instead of heuristic methods. For that reason, RNAiFold can generate all sequences that fold into the target structure or determine that there is no solution. RNAiFold 2.0 is a complete overhaul of RNAiFold 1.0, rewritten from the now defunct COMET language to C++. The new code properly extends the capabilities of its predecessor by providing a user-friendly pipeline to design synthetic constructs having the functionality of given Rfam families. In addition, the new software supports amino acid constraints, even for proteins translated in different reading frames from overlapping coding sequences; moreover, structure compatibility/incompatibility constraints have been expanded. With these features, RNAiFold 2.0 allows the user to design single RNA molecules as well as hybridization complexes of two RNA molecules. The web server, source code and Linux binaries are publicly accessible at http://bioinformatics.bc.edu/clotelab/RNAiFold2.0. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Adaptive variable-length coding for efficient compression of spacecraft television data.
NASA Technical Reports Server (NTRS)
Rice, R. F.; Plaunt, J. R.
1971-01-01
An adaptive variable length coding system is presented. Although developed primarily for the proposed Grand Tour missions, many features of this system clearly indicate a much wider applicability. Using sample to sample prediction, the coding system produces output rates within 0.25 bit/picture element (pixel) of the one-dimensional difference entropy for entropy values ranging from 0 to 8 bit/pixel. This is accomplished without the necessity of storing any code words. Performance improvements of 0.5 bit/pixel can be simply achieved by utilizing previous line correlation. A Basic Compressor, using concatenated codes, adapts to rapid changes in source statistics by automatically selecting one of three codes to use for each block of 21 pixels. The system adapts to less frequent, but more dramatic, changes in source statistics by adjusting the mode in which the Basic Compressor operates on a line-to-line basis. Furthermore, the compression system is independent of the quantization requirements of the pulse-code modulation system.
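To make the per-block adaptation concrete, here is a small sketch in the spirit of the Basic Compressor: for each block of 21 non-negative samples the encoder tries a few Golomb-Rice code options, keeps the cheapest, and prepends a short option identifier. The option set (k = 0, 1, 2), the 2-bit identifier and the sample values are illustrative assumptions, not the flight algorithm.

```python
def rice_encode(value, k):
    """Golomb-Rice code word for a non-negative integer: unary quotient + k LSBs."""
    q = value >> k
    word = "1" * q + "0"
    if k:
        word += format(value & ((1 << k) - 1), "0{}b".format(k))
    return word

def encode_block(block, options=(0, 1, 2)):
    """Pick the cheapest of the allowed code options for this block."""
    _, k = min((sum(len(rice_encode(v, k)) for v in block), k) for k in options)
    payload = "".join(rice_encode(v, k) for v in block)
    return format(k, "02b") + payload, k      # 2-bit option id + payload

samples = [0, 1, 0, 2, 1, 0, 0, 3, 1, 0, 2, 0, 1, 1, 0, 4, 0, 1, 0, 2, 1]  # one 21-sample block
coded, chosen_k = encode_block(samples)
print(f"chose k={chosen_k}, {len(coded)} bits for {len(samples)} samples")
```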
A comparison of skyshine computational methods.
Hertel, Nolan E; Sweezy, Jeremy E; Shultis, J Kenneth; Warkentin, J Karl; Rose, Zachary J
2005-01-01
A variety of methods employing radiation transport and point-kernel codes have been used to model two skyshine problems. The first problem is a 1 MeV point source of photons on the surface of the earth inside a 2 m tall and 1 m radius silo having black walls. The skyshine radiation downfield from the point source was estimated with and without a 30-cm-thick concrete lid on the silo. The second benchmark problem is to estimate the skyshine radiation downfield from 12 cylindrical canisters emplaced in a low-level radioactive waste trench. The canisters are filled with ion-exchange resin with a representative radionuclide loading, largely 60Co, 134Cs and 137Cs. The solution methods include use of the MCNP code to solve the problem by directly employing variance reduction techniques, the single-scatter point kernel code GGG-GP, the QADMOD-GP point kernel code, the COHORT Monte Carlo code, the NAC International version of the SKYSHINE-III code, the KSU hybrid method and the associated KSU skyshine codes.
TALYS/TENDL verification and validation processes: Outcomes and recommendations
NASA Astrophysics Data System (ADS)
Fleming, Michael; Sublet, Jean-Christophe; Gilbert, Mark R.; Koning, Arjan; Rochman, Dimitri
2017-09-01
The TALYS-generated Evaluated Nuclear Data Libraries (TENDL) provide truly general-purpose nuclear data files assembled from the outputs of the T6 nuclear model code system for direct use in both basic physics and engineering applications. The most recent TENDL-2015 version is based on both default and adjusted parameters of the most recent TALYS, TAFIS, TANES, TARES, TEFAL and TASMAN codes, wrapped into a Total Monte Carlo loop for uncertainty quantification. TENDL-2015 contains complete neutron-incident evaluations for all target nuclides with Z ≤ 116 and half-life longer than 1 second (2809 isotopes with 544 isomeric states), up to 200 MeV, with covariances and all reaction daughter products including isomers with half-life greater than 100 milliseconds. With the added High Fidelity Resonance (HFR) approach, all resonances are unique, following statistical rules. The TENDL-2014/2015 libraries have been validated against standard, evaluated, microscopic and integral cross sections using a newly compiled UKAEA database of thermal, resonance integral, Maxwellian-average, 14 MeV and various accelerator-driven neutron source spectra. This database has been assembled using the most up-to-date, internationally recognised data sources including the Atlas of Resonances, CRC, evaluated EXFOR, activation databases, fusion, fission and MACS. Excellent agreement was found, with a small set of errors within the reference databases and TENDL-2014 predictions.
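For readers unfamiliar with one of the quantities mentioned, the sketch below computes a Maxwellian-averaged cross section from a pointwise sigma(E) table using the standard definition <sigma> = (2/sqrt(pi)) * integral of sigma(E) E exp(-E/kT) dE divided by the integral of E exp(-E/kT) dE. The 1/v cross-section shape and its magnitude are invented for illustration; this is a generic calculation, not part of the TALYS/TENDL pipeline.

```python
import numpy as np

def trapezoid(y, x):
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def maxwellian_average(energy_eV, sigma_barn, kT_eV):
    w = energy_eV * np.exp(-energy_eV / kT_eV)        # Maxwellian flux weight E*exp(-E/kT)
    return 2.0 / np.sqrt(np.pi) * trapezoid(sigma_barn * w, energy_eV) / trapezoid(w, energy_eV)

E = np.logspace(0, 6, 4000)                           # 1 eV .. 1 MeV grid
sigma = 5.0 * np.sqrt(2.53e-2 / E)                    # made-up 1/v cross section (barn)
print(round(maxwellian_average(E, sigma, kT_eV=30e3), 5), "barn (toy value)")
```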
ERIC Educational Resources Information Center
North Dakota University System, 2008
2008-01-01
This report provides information on degree and certificate programs offered and student program completions for fiscal year 2006-2007 in North Dakota's public and private postsecondary educational institutions. Institutional programs are coded in accordance with the Classification of Instructional Programs (CIP) code system and are organized in…
QUANTUM ESPRESSO: a modular and open-source software project for quantum simulations of materials.
Giannozzi, Paolo; Baroni, Stefano; Bonini, Nicola; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Chiarotti, Guido L; Cococcioni, Matteo; Dabo, Ismaila; Dal Corso, Andrea; de Gironcoli, Stefano; Fabris, Stefano; Fratesi, Guido; Gebauer, Ralph; Gerstmann, Uwe; Gougoussis, Christos; Kokalj, Anton; Lazzeri, Michele; Martin-Samos, Layla; Marzari, Nicola; Mauri, Francesco; Mazzarello, Riccardo; Paolini, Stefano; Pasquarello, Alfredo; Paulatto, Lorenzo; Sbraccia, Carlo; Scandolo, Sandro; Sclauzero, Gabriele; Seitsonen, Ari P; Smogunov, Alexander; Umari, Paolo; Wentzcovitch, Renata M
2009-09-30
QUANTUM ESPRESSO is an integrated suite of computer codes for electronic-structure calculations and materials modeling, based on density-functional theory, plane waves, and pseudopotentials (norm-conserving, ultrasoft, and projector-augmented wave). The acronym ESPRESSO stands for opEn Source Package for Research in Electronic Structure, Simulation, and Optimization. It is freely available to researchers around the world under the terms of the GNU General Public License. QUANTUM ESPRESSO builds upon newly-restructured electronic-structure codes that have been developed and tested by some of the original authors of novel electronic-structure algorithms and applied in the last twenty years by some of the leading materials modeling groups worldwide. Innovation and efficiency are still its main focus, with special attention paid to massively parallel architectures, and a great effort being devoted to user friendliness. QUANTUM ESPRESSO is evolving towards a distribution of independent and interoperable codes in the spirit of an open-source project, where researchers active in the field of electronic-structure calculations are encouraged to participate in the project by contributing their own codes or by implementing their own ideas into existing codes.
NASA Astrophysics Data System (ADS)
Blecic, Jasmina; Harrington, Joseph; Bowman, Matthew O.; Cubillos, Patricio E.; Stemm, Madison; Foster, Andrew
2014-11-01
We present a new, open-source, Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. TEA uses the Gibbs-free-energy minimization method with an iterative Lagrangian optimization scheme. It initializes the radiative-transfer calculation in our Bayesian Atmospheric Radiative Transfer (BART) code. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature-pressure pairs. The code is tested against the original method developed by White at al. (1958), the analytic method developed by Burrows and Sharp (1999), and the Newton-Raphson method implemented in the open-source Chemical Equilibrium with Applications (CEA) code. TEA is written in Python and is available to the community via the open-source development site GitHub.com. We also present BART applied to eclipse depths of WASP-43b exoplanet, constraining atmospheric thermal and chemical parameters. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
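The idea of Gibbs-free-energy minimization subject to elemental balance can be illustrated with a deliberately tiny example. The sketch below is not TEA (which uses its own iterative Lagrangian scheme); it assumes a hypothetical H/O gas at 1 bar with invented dimensionless standard potentials and lets a generic constrained optimizer do the work.

```python
# Toy Gibbs-energy minimization for gas-phase equilibrium (assumed P = 1 bar,
# invented g0 values); species amounts n_i minimize sum n_i*(g0_i + ln(n_i/n_tot))
# subject to conservation of H and O atoms.
import numpy as np
from scipy.optimize import minimize

species = ["H2", "O2", "H2O"]
g0 = np.array([-15.0, -20.0, -35.0])      # assumed dimensionless standard potentials
A = np.array([[2, 0, 2],                  # H atoms per molecule of each species
              [0, 2, 1]])                 # O atoms per molecule
b = np.array([4.0, 1.0])                  # total H and O atoms in the mixture

def gibbs(n):
    n = np.maximum(n, 1e-12)              # keep logarithms finite
    return float(np.sum(n * (g0 + np.log(n / n.sum()))))

res = minimize(gibbs, x0=np.array([1.0, 0.4, 0.5]),
               constraints=[{"type": "eq", "fun": lambda n: A @ n - b}],
               bounds=[(1e-12, None)] * 3, method="SLSQP")
print(dict(zip(species, np.round(res.x, 4))))
```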
NASA Technical Reports Server (NTRS)
Pratt, D. T.; Radhakrishnan, K.
1986-01-01
The design of a very fast, automatic black-box code for homogeneous, gas-phase chemical kinetics problems requires an understanding of the physical and numerical sources of computational inefficiency. Some major sources reviewed in this report are stiffness of the governing ordinary differential equations (ODE's) and its detection, choice of appropriate method (i.e., integration algorithm plus step-size control strategy), nonphysical initial conditions, and too frequent evaluation of thermochemical and kinetic properties. Specific techniques are recommended (and some advised against) for improving or overcoming the identified problem areas. It is argued that, because reactive species increase exponentially with time during induction, and all species exhibit asymptotic, exponential decay with time during equilibration, exponential-fitted integration algorithms are inherently more accurate for kinetics modeling than classical, polynomial-interpolant methods for the same computational work. But current codes using the exponential-fitted method lack the sophisticated stepsize-control logic of existing black-box ODE solver codes, such as EPISODE and LSODE. The ultimate chemical kinetics code does not exist yet, but the general characteristics of such a code are becoming apparent.
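The practical impact of stiffness on method choice can be seen with a generic example (not the code discussed in the report): the classic Robertson kinetics problem integrated once with an implicit BDF method and once with an explicit Runge-Kutta method at the same tolerances. The time span and tolerances are arbitrary choices for the demonstration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Classic three-species stiff kinetics test problem."""
    y1, y2, y3 = y
    r1, r2, r3 = 0.04 * y1, 1e4 * y2 * y3, 3e7 * y2 ** 2
    return [-r1 + r2, r1 - r2 - r3, r3]

y0, t_span = [1.0, 0.0, 0.0], (0.0, 10.0)
bdf = solve_ivp(robertson, t_span, y0, method="BDF", rtol=1e-6, atol=1e-8)
rk45 = solve_ivp(robertson, t_span, y0, method="RK45", rtol=1e-6, atol=1e-8)
print("BDF steps :", bdf.t.size)     # implicit method: typically on the order of 10^2
print("RK45 steps:", rk45.t.size)    # explicit method: orders of magnitude more steps
```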
The Astrophysics Source Code Library: Supporting software publication and citation
NASA Astrophysics Data System (ADS)
Allen, Alice; Teuben, Peter
2018-01-01
The Astrophysics Source Code Library (ASCL, ascl.net), established in 1999, is a free online registry for source codes used in research that has appeared in, or been submitted to, peer-reviewed publications. The ASCL is indexed by the SAO/NASA Astrophysics Data System (ADS) and Web of Science and is citable by using the unique ascl ID assigned to each code. In addition to registering codes, the ASCL can house archive files for download and assign them DOIs. The ASCL advocates for software citation on par with article citation, participates in multidisciplinary events such as Force11, OpenCon, and the annual Workshop on Sustainable Software for Science, works with journal publishers, and organizes Special Sessions and Birds of a Feather meetings at national and international conferences such as Astronomical Data Analysis Software and Systems (ADASS), European Week of Astronomy and Space Science, and AAS meetings. In this presentation, I will discuss some of the challenges of gathering credit for publishing software and ideas and efforts from other disciplines that may be useful to astronomy.
Self-consistent modeling of electron cyclotron resonance ion sources
NASA Astrophysics Data System (ADS)
Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lécot, C.
2004-05-01
In order to predict the performance of electron cyclotron resonance ion sources (ECRIS), it is necessary to perfectly model the different parts of these sources: (i) magnetic configuration; (ii) plasma characteristics; (iii) extraction system. The magnetic configuration is easily calculated via commercial codes; different codes also simulate the ion extraction, either in two dimensions or even in three dimensions (to take into account the shape of the plasma at the extraction, influenced by the hexapole). However, the characteristics of the plasma are not always mastered. This article describes the self-consistent modeling of ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether a biased probe is installed or not. These input parameters are used to feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 29 Labor 9 2011-07-01 2011-07-01 false Definitions. 4001.2 Section 4001.2 Labor Regulations Relating to Labor (Continued) PENSION BENEFIT GUARANTY CORPORATION GENERAL TERMINOLOGY § 4001.2 Definitions... section 401(a)(2) of the Code). Code means the Internal Revenue Code of 1986, as amended. Complete...
Some practical universal noiseless coding techniques, part 3, module PSl14,K+
NASA Technical Reports Server (NTRS)
Rice, Robert F.
1991-01-01
The algorithmic definitions, performance characterizations, and application notes for a high-performance adaptive noiseless coding module are provided. Subsets of these algorithms are currently under development in custom very large scale integration (VLSI) at three NASA centers. The generality of coding algorithms recently reported is extended. The module incorporates a powerful adaptive noiseless coder for Standard Data Sources (i.e., sources whose symbols can be represented by uncorrelated non-negative integers, where smaller integers are more likely than the larger ones). Coders can be specified to provide performance close to the data entropy over any desired dynamic range (of entropy) above 0.75 bit/sample. This is accomplished by adaptively choosing the best of many efficient variable-length coding options to use on each short block of data (e.g., 16 samples). All code options used for entropies above 1.5 bits/sample are 'Huffman Equivalent', but they require no table lookups to implement. The coding can be performed directly on data that have been preprocessed to exhibit the characteristics of a standard source. Alternatively, a built-in predictive preprocessor can be used where applicable. This built-in preprocessor includes the familiar 1-D predictor followed by a function that maps the prediction error sequences into the desired standard form. Additionally, an external prediction can be substituted if desired. A broad range of issues dealing with the interface between the coding module and the data systems it might serve are further addressed. These issues include: multidimensional prediction, archival access, sensor noise, rate control, code rate improvements outside the module, and the optimality of certain internal code options.
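A minimal sketch of the kind of predictive preprocessing described above: a 1-D "previous sample" predictor followed by interleaving of signed prediction errors onto non-negative integers so that small values are the most likely, which is the standard-source form an entropy coder expects. The real module's mapping also accounts for the bounded dynamic range of the data, so this is a simplification with invented sample values.

```python
def preprocess(samples):
    """1-D prediction (previous sample) + signed-to-unsigned interleaving."""
    out = []
    prev = 0                      # assumed initial predictor value
    for s in samples:
        delta = s - prev          # signed prediction error
        # interleave: 0, -1, +1, -2, +2, ...  ->  0, 1, 2, 3, 4, ...
        out.append(2 * delta if delta >= 0 else -2 * delta - 1)
        prev = s
    return out

print(preprocess([10, 10, 11, 9, 9, 14]))   # -> [20, 0, 2, 3, 0, 10]
```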
Marchetti, Luca; Manca, Vincenzo
2015-04-15
The MpTheory Java library is an open-source project collecting a set of objects and algorithms for modeling observed dynamics by means of the Metabolic P (MP) theory, that is, a mathematical theory introduced in 2004 for modeling biological dynamics. By means of the library, it is possible to model biological systems both in continuous and in discrete time. Moreover, the library comprises a set of regression algorithms for inferring MP models starting from time series of observations. To enhance the modeling experience, besides pure Java usage, the library can be directly used within the most popular computing environments, such as MATLAB, GNU Octave, Mathematica and R. The library is open-source and licensed under the GNU Lesser General Public License (LGPL) Version 3.0. Source code, binaries and complete documentation are available at http://mptheory.scienze.univr.it. luca.marchetti@univr.it, marchetti@cosbi.eu Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Binary Black Holes, Gravitational Waves, and Numerical Relativity
NASA Technical Reports Server (NTRS)
Centrella, Joan
2007-01-01
Massive black hole (MBH) binaries are found at the centers of most galaxies. MBH mergers trace galaxy mergers and are strong sources of gravitational waves. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of very strong gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute these waveforms using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Recently this situation has changed dramatically, with a series of amazing breakthroughs. This presentation shows how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. Focus is on the recent advances that reveal these waveforms, and the potential for discoveries that arises when these sources are observed by LIGO and LISA.
Universal Noiseless Coding Subroutines
NASA Technical Reports Server (NTRS)
Schlutsmeyer, A. P.; Rice, R. F.
1986-01-01
This software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. The purpose of this type of coding is to achieve data compression in the sense that the coded data represent the original data perfectly (noiselessly) while taking fewer bits to do so. The routines are universal because they apply to virtually any "real-world" data source.
myChEMBL: a virtual machine implementation of open data and cheminformatics tools.
Ochoa, Rodrigo; Davies, Mark; Papadatos, George; Atkinson, Francis; Overington, John P
2014-01-15
myChEMBL is a completely open platform, which combines public domain bioactivity data with open source database and cheminformatics technologies. myChEMBL consists of a Linux (Ubuntu) Virtual Machine featuring a PostgreSQL schema with the latest version of the ChEMBL database, as well as the latest RDKit cheminformatics libraries. In addition, a self-contained web interface is available, which can be modified and improved according to user specifications. The VM is available at: ftp://ftp.ebi.ac.uk/pub/databases/chembl/VM/myChEMBL/current. The web interface and web services code is available at: https://github.com/rochoa85/myChEMBL.
A Documentary Analysis of Abstracts Presented in European Congresses on Adapted Physical Activity.
Sklenarikova, Jana; Kudlacek, Martin; Baloun, Ladislav; Causgrove Dunn, Janice
2016-07-01
The purpose of the study was to identify trends in research abstracts published in the books of abstracts of the European Congress of Adapted Physical Activity from 2004 to 2012. A documentary analysis of the contents of 459 abstracts was completed. Data were coded based on subcategories used in a previous study by Zhang, deLisle, and Chen (2006) and by Porretta and Sherrill (2005): number of authors, data source, sample size, type of disability, data analyses, type of study, and focus of study. Descriptive statistics calculated for each subcategory revealed an overall picture of the state and trends of scientific inquiry in adapted physical activity research in Europe.
Safe, Multiphase Bounds Check Elimination in Java
2010-01-28
production of mobile code from source code, JIT compilation in the virtual machine, and application code execution. The code producer uses...invariants, and inequality constraint analysis) to identify and prove redundancy of bounds checks. During class-loading and JIT compilation, the virtual...unoptimized code if the speculated invariants do not hold. The combined effect of the multiple phases is to shift the effort associated with bounds
NASA Astrophysics Data System (ADS)
Pei, Yong; Modestino, James W.
2007-12-01
We describe a multilayered video transport scheme for wireless channels capable of adapting to channel conditions in order to maximize end-to-end quality of service (QoS). This scheme combines a scalable H.263+ video source coder with unequal error protection (UEP) across layers. The UEP is achieved by employing different channel codes together with a multiresolution modulation approach to transport the different priority layers. Adaptivity to channel conditions is provided through a joint source-channel coding (JSCC) approach which attempts to jointly optimize the source and channel coding rates together with the modulation parameters to obtain the maximum achievable end-to-end QoS for the prevailing channel conditions. In this work, we model the wireless links as slow-fading Rician channels where the channel conditions can be described in terms of the channel signal-to-noise ratio (SNR) and the ratio of specular-to-diffuse energy. The multiresolution modulation/coding scheme consists of binary rate-compatible punctured convolutional (RCPC) codes used together with nonuniform phase-shift keyed (PSK) signaling constellations. Results indicate that this adaptive JSCC scheme employing scalable video encoding together with a multiresolution modulation/coding approach leads to significant improvements in delivered video quality for specified channel conditions. In particular, the approach results in considerably improved graceful degradation properties for decreasing channel SNR.
An adaptive distributed data aggregation based on RCPC for wireless sensor networks
NASA Astrophysics Data System (ADS)
Hua, Guogang; Chen, Chang Wen
2006-05-01
One of the most important design issues in wireless sensor networks is energy efficiency. Data aggregation has a significant impact on the energy efficiency of wireless sensor networks. With massive deployment of sensor nodes and limited energy supply, data aggregation has been considered an essential paradigm for data collection in sensor networks. Recently, distributed source coding has been demonstrated to possess several advantages in data aggregation for wireless sensor networks. Distributed source coding is able to encode sensor data with a lower bit rate without direct communication among sensor nodes. To ensure reliable and high-throughput transmission with the aggregated data, we proposed in this research a progressive transmission and decoding scheme for Rate-Compatible Punctured Convolutional (RCPC) coded data aggregation with distributed source coding. Our proposed rate-1/2 RSC codes with the Viterbi algorithm for distributed source coding are able to guarantee that, even without any correlation between the data, the decoder can always decode the data correctly without wasting energy. The proposed approach achieves two aspects of adaptive data aggregation for wireless sensor networks. First, the RCPC coding facilitates adaptive compression corresponding to the correlation of the sensor data. When the data correlation is high, a higher compression ratio can be achieved; otherwise, a lower compression ratio is achieved. Second, the data aggregation is adaptively accumulated. There is no waste of energy in the transmission; even if there is no correlation among the data, the energy consumed is at the same level as for raw data collection. Experimental results have shown that the proposed distributed data aggregation based on RCPC is able to achieve high-throughput and low-energy-consumption data collection for wireless sensor networks.
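The rate-compatible puncturing idea itself is easy to demonstrate. The sketch below encodes a short bit sequence with a rate-1/2 convolutional code (the common generators 7 and 5 in octal, non-recursive here for brevity, whereas the paper uses recursive systematic codes) and then punctures the mother code with textbook patterns to obtain rate-2/3 and rate-3/4 streams; the data bits and patterns are illustrative assumptions.

```python
def conv_encode_half_rate(bits):
    """Rate-1/2 convolutional encoder, generators (7, 5) octal, constraint length 3."""
    s1 = s2 = 0
    out = []
    for b in bits:
        out.append((b ^ s1 ^ s2, b ^ s2))   # (g1 = 111, g2 = 101) output pair
        s1, s2 = b, s1
    return out                              # list of (c1, c2) pairs

def puncture(pairs, pattern):
    """pattern: two rows (for c1 and c2) of period P; a 0 drops the corresponding bit."""
    period = len(pattern[0])
    kept = []
    for i, (c1, c2) in enumerate(pairs):
        if pattern[0][i % period]:
            kept.append(c1)
        if pattern[1][i % period]:
            kept.append(c2)
    return kept

data = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
mother = conv_encode_half_rate(data)                 # rate 1/2: 24 coded bits
r23 = puncture(mother, ([1, 1], [1, 0]))             # rate 2/3: 18 coded bits
r34 = puncture(mother, ([1, 1, 0], [1, 0, 1]))       # rate 3/4: 16 coded bits
print(len(data), 2 * len(mother), len(r23), len(r34))
```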
VizieR Online Data Catalog: ynogkm: code for calculating time-like geodesics (Yang+, 2014)
NASA Astrophysics Data System (ADS)
Yang, X.-L.; Wang, J.-C.
2013-11-01
Here we present the source file for a new public code named ynogkm, aimed at calculating time-like geodesics in a Kerr-Newman spacetime quickly. In the code the four Boyer-Lindquist coordinates and the proper time are expressed as functions of a parameter p semi-analytically, i.e., r(p), μ(p), φ(p), t(p), and σ(p), by using Weierstrass' and Jacobi's elliptic functions and integrals. All of the elliptic integrals are computed by Carlson's elliptic integral method, which guarantees the fast speed of the code. The source Fortran file ynogkm.f90 contains the modules constants, rootfind, ellfunction, and blcoordinates. (3 data files).
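As a pointer to the kind of routine the code relies on, the sketch below implements Carlson's duplication algorithm for the symmetric elliptic integral R_F, following the standard published recipe. It is a generic Python illustration, not the ynogkm Fortran source; tolerance and test arguments are arbitrary.

```python
import math

def carlson_rf(x, y, z, tol=1e-10):
    """Carlson's R_F(x, y, z) by the duplication theorem plus a 5th-order series."""
    while True:
        sx, sy, sz = math.sqrt(x), math.sqrt(y), math.sqrt(z)
        lam = sx * sy + sy * sz + sz * sx
        x, y, z = 0.25 * (x + lam), 0.25 * (y + lam), 0.25 * (z + lam)
        mu = (x + y + z) / 3.0
        if max(abs(x - mu), abs(y - mu), abs(z - mu)) < tol * mu:
            break
    dx, dy, dz = (mu - x) / mu, (mu - y) / mu, (mu - z) / mu
    e2 = dx * dy - dz * dz
    e3 = dx * dy * dz
    return (1.0 + (e2 / 24.0 - 0.1 - 3.0 * e3 / 44.0) * e2 + e3 / 14.0) / math.sqrt(mu)

# sanity checks: R_F(x, x, x) = 1/sqrt(x); R_F(0, 1, 2) is about 1.31103
print(carlson_rf(4.0, 4.0, 4.0), carlson_rf(0.0, 1.0, 2.0))
```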
Pseudo color ghost coding imaging with pseudo thermal light
NASA Astrophysics Data System (ADS)
Duan, De-yang; Xia, Yun-jie
2018-04-01
We present a new pseudo color imaging scheme, named pseudo color ghost coding imaging, based on ghost imaging but with a multiwavelength source modulated by a spatial light modulator. Compared with conventional pseudo color imaging, where there are no nondegenerate-wavelength spatial correlations yielding extra monochromatic images, the degenerate-wavelength and nondegenerate-wavelength spatial correlations between the idler beam and the signal beam can be obtained simultaneously. This scheme can obtain a more colorful image with higher quality than conventional pseudo color coding techniques. More importantly, a significant advantage of the scheme compared to conventional pseudo color coding imaging techniques is that images with different colors can be obtained without changing the light source and spatial filter.
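For orientation, the sketch below shows the basic intensity-correlation reconstruction that ghost imaging builds on, for a single wavelength and a hypothetical object and speckle patterns; the authors' multi-wavelength pseudo-colour scheme is more elaborate than this.

```python
# Minimal computational-ghost-imaging style reconstruction via second-order
# intensity correlation: G = <I*B> - <I><B>.
import numpy as np

rng = np.random.default_rng(1)
obj = np.zeros((32, 32))
obj[8:24, 14:18] = 1.0                                     # hypothetical bar-shaped object

n_patterns = 4000
patterns = rng.random((n_patterns, 32, 32))                # modulator speckle patterns
bucket = np.tensordot(patterns, obj, axes=([1, 2], [0, 1]))  # single-pixel (bucket) signals

ghost = (np.tensordot(bucket, patterns, axes=(0, 0)) / n_patterns
         - bucket.mean() * patterns.mean(axis=0))
# the correlation should typically be stronger at pixels inside the object
print("peak inside object:", bool(ghost[16, 16] > ghost[4, 4]))
```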
Distributed single source coding with side information
NASA Astrophysics Data System (ADS)
Vila-Forcen, Jose E.; Koval, Oleksiy; Voloshynovskiy, Sviatoslav V.
2004-01-01
In this paper we advocate an image compression technique within the distributed source coding framework. The novelty of the proposed approach is twofold: classical image compression is considered from the perspective of source coding with side information and, contrary to existing scenarios where side information is given explicitly, side information is created based on a deterministic approximation of local image features. We consider an image in the transform domain as a realization of a source with a bounded codebook of symbols, where each symbol represents a particular edge shape. The codebook is image independent and plays the role of an auxiliary source. Due to the partial availability of side information at both encoder and decoder, we treat our problem as a modification of the Berger-Flynn-Gray problem and investigate a possible gain over the solutions when side information is either unavailable or available only at the decoder. Finally, we present a practical compression algorithm for passport photo images based on our concept that demonstrates superior performance in the very-low-bit-rate regime.
Processing of Visual--Action Codes by Deaf and Hearing Children: Coding Orientation or "M"-Capacity?
ERIC Educational Resources Information Center
Todman, John; Cowdy, Natascha
1993-01-01
Results from a study in which 25 deaf children and 25 hearing children completed a vocabulary test and a compound stimulus visual information task support the hypothesis that performance on cognitive tasks is dependent on compatibility of task demands with a coding orientation. (SLD)
Xu, Qin; Xiong, Guanjun; Li, Pengbo; He, Fei; Huang, Yi; Wang, Kunbo; Li, Zhaohu; Hua, Jinping
2012-01-01
Background Cotton (Gossypium spp.) is a model system for the analysis of polyploidization. Although ascertaining the donor species of allotetraploid cotton has been intensively studied, sequence comparison of Gossypium chloroplast genomes is still of interest for understanding the mechanisms underlying the evolution of Gossypium allotetraploids, although it is generally accepted that the parents were A- and D-genome-containing species. Here we performed a comparative analysis of 13 Gossypium chloroplast genomes, twelve of which are presented here for the first time. Methodology/Principal Findings The size of the 12 chloroplast genomes under study varied from 159,959 bp to 160,433 bp. The chromosomes were highly similar, having >98% sequence identity. They encoded the same set of 112 unique genes, which occurred in a uniform order with only slightly different boundary junctions. Divergence due to indels as well as substitutions was examined separately for genome, coding and noncoding sequences. The genome divergence was estimated as 0.374% to 0.583% between allotetraploid species and the A-genome, and 0.159% to 0.454% within allotetraploids. Forty protein-coding genes were completely identical at the protein level, and 20 intergenic sequences were completely conserved. The 9 allotetraploids shared 5 insertions and 9 deletions in the whole genome, and 7-bp substitutions in protein-coding genes. The phylogenetic tree confirmed a close relationship between allotetraploids and the ancestor of the A-genome, and the allotetraploids were divided into four separate groups. Progenitor allotetraploid cotton originated 0.43–0.68 million years ago (MYA). Conclusion Despite the high degree of conservation between the Gossypium chloroplast genomes, sequence variations among species could still be detected. Gossypium chloroplast genomes show a preference for 5-bp indels, and 1–3-bp indels are mainly attributed to SSR polymorphisms. This study supports that the common ancestor of diploid A-genome species in Gossypium is the maternal source of extant allotetraploid species and that allotetraploids have a monophyletic origin. G. hirsutum AD1 lineages have experienced more sequence variations than other allotetraploids in intergenic regions. The available complete nucleotide sequences of 12 Gossypium chloroplast genomes should facilitate studies to uncover the molecular mechanisms of compartmental co-evolution and speciation of Gossypium allotetraploids. PMID:22876273
1983-09-01
GENERAL ELECTROMAGNETIC MODEL FOR THE ANALYSIS OF COMPLEX SYSTEMS (GEMACS) Computer Code Documentation (Version 3). The BDM Corporation. Final Technical Report, February 1981 - July 1983. [Only OCR fragments of the report text are available, e.g.: "... the t1 and t2 directions on the source patch. METHOD: The electric field at a segment observation point due to the source patch j is given by ..."]
Simulations of the plasma dynamics in high-current ion diodes
NASA Astrophysics Data System (ADS)
Boine-Frankenheim, O.; Pointon, T. D.; Mehlhorn, T. A.
Our time-implicit fluid/Particle-In-Cell (PIC) code DYNAID [1] is applied to problems relevant to applied-B ion diode operation. We present simulations of the laser ion source, which will soon be employed on the SABRE accelerator at SNL, and of the dynamics of the anode source plasma in the applied electric and magnetic fields. DYNAID is still a test-bed for a higher-dimensional simulation code. Nevertheless, the code can already give new theoretical insight into the dynamics of plasmas in pulsed power devices.
Numerical Electromagnetic Code (NEC)-Basic Scattering Code. Part I. User’s Manual.
1979-09-01
[Only OCR fragments of the user's manual are available: a command index (RT, PG, GP, CG, SG, AM, PR, NP, ...), text reading "... these points and confirm the validity of the solution ..." and "The source presently considered in the computer code is an electric ...", and command descriptions such as "RT: Translate and/or Rotate Coordinates" and "SG: Source Geometry Input".]
NASA Technical Reports Server (NTRS)
Desautel, Richard
1993-01-01
The objectives of this research include supporting the Aerothermodynamics Branch's research by developing graphical visualization tools for both the branch's adaptive grid code and flow field ray tracing code. The completed research for the reporting period includes development of a graphical user interface (GUI) and its implementation into the NAS Flowfield Analysis Software Tool kit (FAST), for both the adaptive grid code (SAGE) and the flow field ray tracing code (CISS).
NASA Astrophysics Data System (ADS)
Yahampath, Pradeepa
2017-12-01
Consider communicating a correlated Gaussian source over a Rayleigh fading channel with no knowledge of the channel signal-to-noise ratio (CSNR) at the transmitter. In this case, a digital system cannot be optimal for a range of CSNRs. Analog transmission however is optimal at all CSNRs, if the source and channel are memoryless and bandwidth matched. This paper presents new hybrid digital-analog (HDA) systems for sources with memory and channels with bandwidth expansion, which outperform both digital-only and analog-only systems over a wide range of CSNRs. The digital part is either a predictive quantizer or a transform code, used to achieve a coding gain. Analog part uses linear encoding to transmit the quantization error which improves the performance under CSNR variations. The hybrid encoder is optimized to achieve the minimum AMMSE (average minimum mean square error) over the CSNR distribution. To this end, analytical expressions are derived for the AMMSE of asymptotically optimal systems. It is shown that the outage CSNR of the channel code and the analog-digital power allocation must be jointly optimized to achieve the minimum AMMSE. In the case of HDA predictive quantization, a simple algorithm is presented to solve the optimization problem. Experimental results are presented for both Gauss-Markov sources and speech signals.
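The "average MMSE over the CSNR distribution" criterion can be illustrated numerically for the simplest reference case: a memoryless unit-variance Gaussian source sent uncoded (analog) over a bandwidth-matched channel, where the distortion at instantaneous SNR s is 1/(1+s), and Rayleigh fading makes the instantaneous SNR exponentially distributed. This is only a baseline sketch with assumed parameters; the paper's HDA optimization over outage CSNR and power allocation is considerably more involved.

```python
import numpy as np

def ammse_analog(mean_csnr_db, n_samples=200_000, seed=0):
    """Monte Carlo average of the analog-transmission distortion 1/(1+SNR)
    over exponentially distributed instantaneous SNR (Rayleigh fading)."""
    rng = np.random.default_rng(seed)
    mean_snr = 10 ** (mean_csnr_db / 10)
    snr = rng.exponential(mean_snr, n_samples)
    return float(np.mean(1.0 / (1.0 + snr)))

for db in (0, 10, 20):
    print(f"mean CSNR {db:2d} dB -> AMMSE ~ {ammse_analog(db):.4f}")
```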
Relay selection in energy harvesting cooperative networks with rateless codes
NASA Astrophysics Data System (ADS)
Zhu, Kaiyan; Wang, Fei
2018-04-01
This paper investigates relay selection in energy harvesting cooperative networks, where the relays harvest energy from the radio frequency (RF) signals transmitted by a source, and the optimal relay is selected and uses the harvested energy to assist the information transmission from the source to its destination. Both the source and the selected relay transmit information using a rateless code, which allows the destination to recover the original information after collecting coded bits that marginally surpass the entropy of the original information. In order to improve transmission performance and efficiently utilize the harvested power, the optimal relay is selected. The optimization problem is formulated to maximize the achievable information rate of the system. Simulation results demonstrate that our proposed relay selection scheme outperforms other strategies.
Test Generator for MATLAB Simulations
NASA Technical Reports Server (NTRS)
Henry, Joel
2011-01-01
MATLAB Automated Test Tool, version 3.0 (MATT 3.0) is a software package that provides automated tools that reduce the time needed for extensive testing of simulation models that have been constructed in the MATLAB programming language by use of the Simulink and Real-Time Workshop programs. MATT 3.0 runs on top of the MATLAB engine application-program interface to communicate with the Simulink engine. MATT 3.0 automatically generates source code from the models, generates custom input data for testing both the models and the source code, and generates graphs and other presentations that facilitate comparison of the outputs of the models and the source code for the same input data. Context-sensitive and fully searchable help is provided in HyperText Markup Language (HTML) format.
The FORTRAN static source code analyzer program (SAP) user's guide, revision 1
NASA Technical Reports Server (NTRS)
Decker, W.; Taylor, W.; Eslinger, S.
1982-01-01
The FORTRAN Static Source Code Analyzer Program (SAP) User's Guide (Revision 1) is presented. SAP is a software tool designed to assist Software Engineering Laboratory (SEL) personnel in conducting studies of FORTRAN programs. SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. This document is a revision of the previous SAP user's guide, Computer Sciences Corporation document CSC/TM-78/6045. SAP Revision 1 is the result of program modifications to provide several new reports, additional complexity analysis, and recognition of all statements described in the FORTRAN 77 standard. This document provides instructions for operating SAP and contains information useful in interpreting SAP output.
The Need for Vendor Source Code at NAS. Revised
NASA Technical Reports Server (NTRS)
Carter, Russell; Acheson, Steve; Blaylock, Bruce; Brock, David; Cardo, Nick; Ciotti, Bob; Poston, Alan; Wong, Parkson; Chancellor, Marisa K. (Technical Monitor)
1997-01-01
The Numerical Aerodynamic Simulation (NAS) Facility has a long standing practice of maintaining buildable source code for installed hardware. There are two reasons for this: NAS's designated pathfinding role, and the need to maintain a smoothly running operational capacity given the widely diversified nature of the vendor installations. NAS has a need to maintain support capabilities when vendors are not able; diagnose and remedy hardware or software problems where applicable; and to support ongoing system software development activities whether or not the relevant vendors feel support is justified. This note provides an informal history of these activities at NAS, and brings together the general principles that drive the requirement that systems integrated into the NAS environment run binaries built from source code, onsite.
Power Balance and Impurity Studies in TCS
NASA Astrophysics Data System (ADS)
Grossnickle, J. A.; Pietrzyk, Z. A.; Vlases, G. C.
2003-10-01
A "zero-dimension" power balance model was developed based on measurements of absorbed power, radiated power, absolute D_α, temperature, and density for the TCS device. Radiation was determined to be the dominant source of power loss for medium to high density plasmas. The total radiated power was strongly correlated with the Oxygen line radiation. This suggests Oxygen is the dominant radiating species, which was confirmed by doping studies. These also extrapolate to a Carbon content below 1.5%. Determining the source of the impurities is an important question that must be answered for the TCS upgrade. Preliminary indications are that the primary sources of Oxygen are the stainless steel end cones. A Ti gettering system is being installed to reduce this Oxygen source. A field line code has been developed for use in tracking where open field lines terminate on the walls. Output from this code is also used to generate grids for an impurity tracking code.
Status report on the development of a tubular electron beam ion source
NASA Astrophysics Data System (ADS)
Donets, E. D.; Donets, E. E.; Becker, R.; Liljeby, L.; Rensfelt, K.-G.; Beebe, E. N.; Pikin, A. I.
2004-05-01
The theoretical estimations and numerical simulations of tubular electron beams in both the beam and reflex modes of source operation, as well as the off-axis ion extraction from a tubular electron beam ion source (TEBIS), are presented. Numerical simulations have been done with the use of the IGUN and OPERA-3D codes. Numerical simulations with the IGUN code show that the effective electron current can reach more than 100 A with a beam current density of about 300-400 A/cm2 and an electron energy in the region of several keV, with a corresponding increase of the ion output. Off-axis ion extraction from the TEBIS, being a non-axially symmetric problem, was simulated with the OPERA-3D (SCALA) code. The conceptual design and main parameters of new tubular sources which are under consideration at JINR, MSL, and BNL are based on these simulations.
NASA Astrophysics Data System (ADS)
Mense, Mario; Schindelhauer, Christian
We introduce the Read-Write-Coding-System (RWC) - a very flexible class of linear block codes that generate efficient and flexible erasure codes for storage networks. In particular, given a message x of k symbols and a codeword y of n symbols, an RW code defines additional parameters k ≤ r, w ≤ n that offer enhanced possibilities to adjust the fault-tolerance capability of the code. More precisely, an RWC provides linear (n, k, d)-codes that have (a) minimum distance d = n - r + 1 for any two codewords, and (b) for each codeword there exists a codeword for each other message with distance of at most w. Furthermore, depending on the values r, w and the code alphabet, different block codes such as parity codes (e.g. RAID 4/5) or Reed-Solomon (RS) codes (if r = k and thus w = n) can be generated. In storage networks in which I/O accesses are very costly and redundancy is crucial, this flexibility has considerable advantages, as r and w can optimally be adapted to read- or write-intensive applications; only w symbols must be updated if the message x changes completely, which is different from other codes that always need to rewrite y completely when x changes. In this paper, we first state a tight lower bound and basic conditions for all RW codes. Furthermore, we introduce special RW codes in which all mentioned parameters are adjustable even online, that is, those RW codes are adaptive to changing demands. Finally, we point out some useful properties regarding safety and security of the stored data.
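One of the degenerate cases mentioned above, a single-parity (RAID-4/5-style) erasure code, is small enough to sketch directly; this is only that special case, not an RW code implementation, and the block contents are arbitrary examples.

```python
def encode(data_blocks):
    """Append one XOR parity block to equally sized data blocks."""
    parity = bytes(len(data_blocks[0]))
    for blk in data_blocks:
        parity = bytes(a ^ b for a, b in zip(parity, blk))
    return list(data_blocks) + [parity]

def rebuild(blocks, lost_index):
    """Recover any single lost block (data or parity) by XOR-ing the survivors."""
    survivors = [b for i, b in enumerate(blocks) if i != lost_index]
    out = bytes(len(survivors[0]))
    for blk in survivors:
        out = bytes(a ^ b for a, b in zip(out, blk))
    return out

stripe = encode([b"abcd", b"efgh", b"ijkl"])      # 3 data blocks + 1 parity block
assert rebuild(stripe, 1) == b"efgh"              # a lost data block is rebuilt exactly
print("recovered:", rebuild(stripe, 1))
```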
Plasma Separation Process: Betacell (BCELL) code: User's manual. [Bipolar barrier junction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taherzadeh, M.
1987-11-13
The emergence of clearly defined applications for (small or large) amounts of long-life and reliable power sources has given the design and production of betavoltaic systems a new life. Moreover, because of the availability of the plasma separation program (PSP) at TRW, it is now possible to separate the most desirable radioisotopes for betacell power generating devices. A computer code, named BCELL, has been developed to model the betavoltaic concept by utilizing the available up-to-date source/cell parameters. In this program, attempts have been made to determine the betacell energy device maximum efficiency, degradation due to the emitting source radiation, and source/cell lifetime power reduction processes. Additionally, comparison is made between the Schottky and PN junction devices for betacell battery design purposes. Certain computer code runs have been made to determine the JV distribution function and the upper limit of the betacell generated power for specified energy sources. A Ni beta-emitting radioisotope was used for the energy source and certain semiconductors were used for the converter subsystem of the betacell system. Some results for a Promethium source are also given here for comparison. 16 refs.
Villalobos Gámez, Juan Luis; González Pérez, Cristina; García-Almeida, José Manuel; Martínez Reina, Alfonso; Del Río Mata, José; Márquez Fernández, Efrén; Rioja Vázquez, Rosalía; Barranco Pérez, Joaquín; Enguix Armada, Alfredo; Rodríguez García, Luis Miguel; Bernal Losada, Olga; Osorio Fernández, Diego; Mínguez Mañanes, Alfredo; Lara Ramos, Carlos; Dani, Laila; Vallejo Báez, Antonio; Martínez Martín, Jesús; Fernández Ovies, José Manuel; Tinahones Madueño, Francisco Javier; Fernández-Crehuet Navajas, Joaquín
2014-06-01
The high prevalence of disease-related hospital malnutrition justifies the need for screening tools and early detection in patients at risk for malnutrition, followed by an assessment targeted towards diagnosis and treatment. At the same time there is clear undercoding of malnutrition diagnoses and the procedures to correct it. Objectives: To describe the INFORNUT program/process and its development as an information system. To quantify performance in its different phases. To cite other tools used as a coding source. To calculate the coding rates for malnutrition diagnoses and related procedures. To show the relationship to Mean Stay, Mortality Rate and Urgent Readmission, as well as to quantify its impact on the hospital Complexity Index and its effect on the justification of Hospitalization Costs. The INFORNUT® process is based on an automated screening program of systematic detection and early identification of malnourished patients on hospital admission, as well as their assessment, diagnosis, documentation and reporting. Of total admissions with stays longer than three days incurred in 2008 and 2010, we recorded patients who underwent analytical screening with an alert for a medium or high risk of malnutrition, as well as the subgroup of patients in whom we were able to administer the complete INFORNUT® process, generating a report for each. Other documentary coding sources are cited. From the Minimum Basic Data Set, codes defined in the SEDOM-SENPE consensus were analyzed. The data were processed with the Alcor-DRG program. Rates in ‰ of discharges for 2009 and 2010 of diagnoses of malnutrition, procedures and procedure-related diagnoses were calculated. These rates were compared with the mean rates in Andalusia. The contribution of these codes to the Complexity Index was estimated and, from the cost accounting data, the fraction of the hospitalization cost seen as justified by this activity was estimated. Results are summarized for both study years. With respect to process performance, more than 3,600 patients per year (30% of admissions with a stay > 3 days) underwent analytical screening. Half of these patients were at medium or high risk, and a nutritional assessment using INFORNUT® was completed for 55% of them, generating approximately 1,000 reports/year. Our coding rates exceeded the mean rates in Andalusia, being 3.5 times higher for diagnoses (35‰), 2.5 times higher for procedures (50‰) and five times the rate of procedure-related diagnoses in the same patient (25‰). The Mean Stay of patients coded with malnutrition at discharge was 31.7 days, compared to 9.5 days for the overall hospital stay. The Mortality Rate for the same patients (21.8%) was almost five times higher than the mean, and Urgent Readmissions (5.5%) were 1.9 times higher. The impact of this coding on the hospital Complexity Index was four hundredths (from 2.08 to 2.12 in 2009 and from 2.15 to 2.19 in 2010). This translates into a hospitalization cost justification of 2,000,000; five to six times the cost of artificial nutrition. The process facilitated access to the diagnosis of malnutrition and to understanding the risk of developing it, as well as to the prescription of procedures and/or supplements to correct it. The interdisciplinary team coordination, the participatory process and the tools used improved coding rates to give results far above the Andalusian mean. These results help to upwardly adjust the hospital Complexity Index or Case Mix, as well as to explain hospitalization costs.
Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
Coding Instead of Splitting - Algebraic Combinations in Time and Space
2016-06-09
... source's message. For certain classes of two-unicast-Z networks, we show that the rate-tuple (N, 1) is achievable as long as the individual source...destination cuts for the two source-destination pairs are respectively at least as large as N and 1, and the generalized network sharing cut - a bound...previously defined by Kamath et al. - is at least as large as N + 1. We show this through a novel achievable scheme which is based on random linear coding at
Complete mitochondrial genome of Cynopterus sphinx (Pteropodidae: Cynopterus).
Li, Linmiao; Li, Min; Wu, Zhengjun; Chen, Jinping
2015-01-01
We have characterized the complete mitochondrial genome of Cynopterus sphinx (Pteropodidae: Cynopterus) and described its organization in this study. The total length of the C. sphinx complete mitochondrial genome was 16,895 bp, with a base composition of 32.54% A, 14.05% G, 25.82% T and 27.59% C. The complete mitochondrial genome included 13 protein-coding genes, 22 tRNA genes, 2 rRNA genes (12S rRNA and 16S rRNA) and 1 control region (D-loop). The control region was 1435 bp long, with the sequence CATACG repeated 64 times. Three protein-coding genes (ND1, COI and ND4) ended with an incomplete stop codon (TA or T).
A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics
NASA Technical Reports Server (NTRS)
Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela
2015-01-01
Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
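To give a feel for what an MC/DC obligation is, the brute-force sketch below finds, for each condition in a decision, a pair of test vectors that differ only in that condition and flip the decision outcome. It illustrates the metric, not the paper's preprocessing technique, and the example decision is hypothetical.

```python
from itertools import product

def mcdc_pairs(decision, n_conditions):
    """For each condition index, return one independence pair (if any exists)."""
    pairs = {}
    vectors = list(product([False, True], repeat=n_conditions))
    for c in range(n_conditions):
        for v in vectors:
            w = tuple(not x if i == c else x for i, x in enumerate(v))
            if decision(*v) != decision(*w):
                pairs[c] = (v, w)
                break
    return pairs

# hypothetical decision with three conditions: (a and b) or c
dec = lambda a, b, c: (a and b) or c
for cond, (v, w) in mcdc_pairs(dec, 3).items():
    print(f"condition {cond}: {v} vs {w}")
```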
The complete mitochondrial genome of Rapana venosa (Gastropoda, Muricidae).
Sun, Xiujun; Yang, Aiguo
2016-01-01
The complete mitochondrial (mt) genome of the veined rapa whelk, Rapana venosa, was determined using genome walking techniques in this study. The total length of the mt genome sequence of R. venosa was 15,271 bp, which is comparable to the reported Muricidae mitogenomes to date. It contained 13 protein-coding genes, 21 transfer RNA genes, and two ribosomal RNA genes. A bias towards a higher representation of nucleotides A and T (69%) was detected in the mt genome of R. venosa. A small number of non-coding nucleotides (302 bp) was detected, and the largest non-coding region was 74 bp in length.
Towards a complete map of the human long non-coding RNA transcriptome.
Uszczynska-Ratajczak, Barbara; Lagarde, Julien; Frankish, Adam; Guigó, Roderic; Johnson, Rory
2018-05-23
Gene maps, or annotations, enable us to navigate the functional landscape of our genome. They are a resource upon which virtually all studies depend, from single-gene to genome-wide scales and from basic molecular biology to medical genetics. Yet present-day annotations suffer from trade-offs between quality and size, with serious but often unappreciated consequences for downstream studies. This is particularly true for long non-coding RNAs (lncRNAs), which are poorly characterized compared to protein-coding genes. Long-read sequencing technologies promise to improve current annotations, paving the way towards a complete annotation of lncRNAs expressed throughout a human lifetime.
Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander
2008-04-01
We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adopted to virtually every computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit as well as the open source JOELib package. As an application, for this set of compounds, the agreement of log P and TPSA between the packages was compared. Outliers were found to be mostly non-druglike compounds and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its "plug-in architecture". The complete source codes as well as calculated properties along with links to PubChem resources are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.
Recent progress in plasma modelling at INFN-LNS
NASA Astrophysics Data System (ADS)
Neri, L.; Castro, G.; Torrisi, G.; Galatà, A.; Mascali, D.; Celona, L.; Gammino, S.
2016-02-01
At Istituto Nazionale di Fisica Nucleare - Laboratori Nazionali del Sud (INFN-LNS), the development of intense ion and proton sources has been supported by a great deal of work on the modelling of microwave generated plasmas for many years. First, a stationary version of the particle-in-cell code was developed for plasma modelling, starting from an iterative strategy adopted for space-charge-dominated beam transport simulations. Electromagnetic properties of the plasma and full-wave simulations are now affordable for non-homogeneous and non-isotropic magnetized plasma via a "cold" approximation. The effect of Coulomb collisions on plasma particle dynamics was implemented with the Langevin formalism, instead of simply applying the Spitzer 90° collisions through a Monte Carlo technique. A wide database of different cross sections related to reactions occurring in a hydrogen plasma was implemented. The next step consists of merging such a variety of approaches for retrieving an "as-a-whole" picture of plasma dynamics in ion sources. The preliminary results will be summarized in the paper for a microwave discharge ion source designed for the production of intense, high-quality proton beams: the proton source for the European Spallation Source project. Even if the realization of predictive software including the complete processes involved in plasma formation is still rather far off, a better comprehension of the source behavior is possible, and so the simulations may support the optimization phase.
Recent progress in plasma modelling at INFN-LNS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neri, L., E-mail: neri@lns.infn.it; Castro, G.; Mascali, D.
2016-02-15
At Istituto Nazionale di Fisica Nucleare - Laboratori Nazionali del Sud (INFN-LNS), the development of intense ion and proton sources has been supported by a great deal of work on the modelling of microwave generated plasmas for many years. First, a stationary version of the particle-in-cell code was developed for plasma modelling, starting from an iterative strategy adopted for space-charge-dominated beam transport simulations. Electromagnetic properties of the plasma and full-wave simulations are now affordable for non-homogeneous and non-isotropic magnetized plasmas via the “cold” approximation. The effect of Coulomb collisions on plasma particle dynamics was implemented with the Langevin formalism, instead of simply applying Spitzer 90° collisions through a Monte Carlo technique. A wide database of different cross sections related to reactions occurring in a hydrogen plasma was implemented. The next step consists of merging this variety of approaches to retrieve an “as-a-whole” picture of plasma dynamics in ion sources. The preliminary results are summarized in the paper for a microwave discharge ion source designed for intense, high-quality proton beam production: the proton source for the European Spallation Source project. Even if the realization of predictive software including the complete set of processes involved in plasma formation is still rather far off, a better comprehension of the source behavior is possible, and so the simulations may support the optimization phase.
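As a minimal sketch of the Langevin treatment of Coulomb collisions mentioned in both records above, small-angle scattering can be written as a stochastic differential equation with a drag and a diffusion term; the coefficients here are generic placeholders, not those actually implemented in the INFN-LNS code:

$$ d\mathbf{v} = -\nu_d\,\mathbf{v}\,dt + \sqrt{2 D_v}\; d\mathbf{W}(t), $$

where $\nu_d$ is the slowing-down (friction) frequency, $D_v$ the velocity-space diffusion coefficient, and $d\mathbf{W}$ a Wiener increment. Integrating this equation accumulates many small-angle deflections per time step, in contrast to sampling discrete Spitzer 90° scattering events with a Monte Carlo test.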
75 FR 14331 - Disaster Assistance Loan Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-25
... meet current building code requirements. If your business is a major source of employment, SBA may..., granting tax exemption under sections 510(c), (d), or (e) of the Internal Revenue Code of 1954, or (2...; 8:45 am] BILLING CODE 8025-01-P ...
VizieR Online Data Catalog: SKY2000 Master Catalog, Version 5 (Myers+ 2006)
NASA Astrophysics Data System (ADS)
Myers, J. R.; Sande, C. B.; Miller, A. C.; Warren, W. H., Jr.; Tracewell, D. A.
2015-02-01
The SKYMAP Star Catalog System consists of a Master Catalog stellar database and a collection of utility software designed to create and maintain the database and to generate derivative mission star catalogs (run catalogs). It contains an extensive compilation of information on almost 300,000 stars brighter than 8.0 mag. The original SKYMAP Master Catalog was generated in the early 1970s. Incremental updates and corrections were made over the following years but the first complete revision of the source data occurred with Version 4.0. This revision also produced a unique, consolidated source of astrometric information which can be used by the astronomical community. The derived quantities were removed and wideband and photometric data in the R (red) and I (infrared) systems were added. Version 4 of the SKY2000 Master Catalog was completed in April 2002; it marks the global replacement of the variability identifier and variability data fields. More details can be found in the description file sky2kv4.pdf. The SKY2000 Version 5 Revision 4 Master Catalog differs from Revision 3 in that MK and HD spectral types have been added from the Catalogue of Stellar Spectral Classifications (B. A. Skiff of Lowell Observatory, 2005), which has been assigned source code 50 in this process. 9622 entries now have MK types from this source, while 3976 entries have HD types from this source. SKY2000 V5 R4 also differs globally from preceding MC versions in that the Galactic coordinate computations performed by UPDATE have been increased in accuracy, so that differences from the same quantities from other sources are now typically in the last decimal places carried in the MC. This version supersedes the previous versions 1(V/95), 2(V/102), 3(V/105) and 4(V/109). (6 data files).
NASA Astrophysics Data System (ADS)
Smith, J. A.; Peter, D. B.; Tromp, J.; Komatitsch, D.; Lefebvre, M. P.
2015-12-01
We present the SPECFEM3D_Cartesian and SPECFEM3D_GLOBE open-source codes, high-performance numerical wave solvers that simulate seismic wave propagation for local-, regional-, and global-scale applications. These codes are suitable for both forward propagation in complex media and tomographic imaging. Both solvers compute highly accurate seismic wave fields using the continuous Galerkin spectral-element method on unstructured meshes. Lateral variations in compressional- and shear-wave speeds and density, as well as 3D attenuation Q models, topography and fluid-solid coupling, are all readily included in both codes. For global simulations, effects due to rotation, ellipticity, the oceans, 3D crustal models, and self-gravitation are additionally included. Both packages provide forward and adjoint functionality suitable for adjoint tomography on high-performance computing architectures. We highlight the most recent release of the global version, which includes improved performance, simultaneous MPI runs, OpenCL and CUDA support via an automatic source-to-source transformation library (BOAST), parallel I/O readers and writers for databases using ADIOS, and seismograms using the recently developed Adaptable Seismic Data Format (ASDF) with built-in provenance. This makes our spectral-element solvers current state-of-the-art, open-source community codes for high-performance seismic wave propagation on arbitrarily complex 3D models. Together with these solvers, we provide full-waveform inversion tools to image the Earth's interior at unprecedented resolution.
Large liquid rocket engine transient performance simulation system
NASA Technical Reports Server (NTRS)
Mason, J. R.; Southwick, R. D.
1989-01-01
Phase 1 of the Rocket Engine Transient Simulation (ROCETS) program consists of seven technical tasks: architecture; system requirements; component and submodel requirements; submodel implementation; component implementation; submodel testing and verification; and subsystem testing and verification. These tasks were completed. Phase 2 of ROCETS consists of two technical tasks: Technology Test Bed Engine (TTBE) model data generation; and system testing and verification. During this period specific coding of the system processors was begun and the engineering representations of Phase 1 were expanded to produce a simple model of the TTBE. As the code was completed, some minor modifications to the system architecture, centering on the global variable common, GLOBVAR, were necessary to increase processor efficiency. The engineering modules completed during Phase 2 are listed: INJTOO - main injector; MCHBOO - main chamber; NOZLOO - nozzle thrust calculations; PBRNOO - preburner; PIPE02 - compressible flow without inertia; PUMPOO - polytropic pump; ROTROO - rotor torque balance/speed derivative; and TURBOO - turbine. Detailed documentation of these modules is in the Appendix. In addition to the engineering modules, several submodules were also completed. These submodules include combustion properties, component performance characteristics (maps), and specific utilities. Specific coding was begun on the system configuration processor. All functions necessary for multiple module operation were completed, but the SOLVER implementation is still under development. This system, the Verification Checkout Facility (VCF), allows interactive comparison of module results to stored data and provides an intermediate checkout of the processor code. After validation using the VCF, the engineering modules and submodules were used to build a simple TTBE model.
AN OPEN-SOURCE NEUTRINO RADIATION HYDRODYNAMICS CODE FOR CORE-COLLAPSE SUPERNOVAE
DOE Office of Scientific and Technical Information (OSTI.GOV)
O’Connor, Evan, E-mail: evanoconnor@ncsu.edu; CITA, Canadian Institute for Theoretical Astrophysics, Toronto, M5S 3H8
2015-08-15
We present an open-source update to the spherically symmetric, general-relativistic hydrodynamics, core-collapse supernova (CCSN) code GR1D. The source code is available at http://www.GR1Dcode.org. We extend its capabilities to include a general-relativistic treatment of neutrino transport based on the moment formalisms of Shibata et al. and Cardall et al. We pay special attention to implementing and testing numerical methods and approximations that lessen the computational demand of the transport scheme by removing the need to invert large matrices. This is especially important for the implementation and development of moment-like transport methods in two and three dimensions. A critical component of neutrino transport calculations is the neutrino–matter interaction coefficients that describe the production, absorption, scattering, and annihilation of neutrinos. In this article we also describe our open-source neutrino interaction library NuLib (available at http://www.nulib.org). We believe that an open-source approach to describing these interactions is one of the major steps needed to progress toward robust models of CCSNe and robust predictions of the neutrino signal. We show, via comparisons to full Boltzmann neutrino-transport simulations of CCSNe, that our neutrino transport code performs remarkably well. Furthermore, we show that the methods and approximations we employ to increase efficiency do not decrease the fidelity of our results. We also test the ability of our general-relativistic transport code to model failed CCSNe by evolving a 40-solar-mass progenitor to the onset of collapse to a black hole.
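As a rough orientation to the moment approach mentioned above, a schematic, energy-integrated, flat-spacetime two-moment system can be written as follows; this is only a sketch of the general idea, not the general-relativistic, energy-dependent equations actually solved in GR1D:

$$ \partial_t E + \partial_r F_r = C^{(0)}, \qquad \partial_t F_r + \partial_r P_{rr} = C^{(1)}, \qquad P_{rr} = \chi\!\left(\tfrac{F_r}{E}\right) E, $$

where $E$ and $F_r$ are the neutrino energy density and flux, $C^{(0)}$ and $C^{(1)}$ collect emission, absorption, and scattering sources, and the algebraic closure $\chi$ interpolates between $1/3$ (optically thick, isotropic radiation) and $1$ (free streaming). Because the closure is local and algebraic, no large matrix inversion is required, which is the efficiency gain the abstract refers to.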
1 CFR 11.3 - Code of Federal Regulations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 1 General Provisions 1 2010-01-01 2010-01-01 false Code of Federal Regulations. 11.3 Section 11.3 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER AVAILABILITY OF OFFICE OF THE FEDERAL... complete set of the Code of Federal Regulations is $1,019 per year for the bound, paper edition, or $247...
1 CFR 11.3 - Code of Federal Regulations.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 1 General Provisions 1 2011-01-01 2011-01-01 false Code of Federal Regulations. 11.3 Section 11.3 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER AVAILABILITY OF OFFICE OF THE FEDERAL... complete set of the Code of Federal Regulations is $1,019 per year for the bound, paper edition, or $247...
NASA Technical Reports Server (NTRS)
Dash, S. M.; Pergament, H. S.
1978-01-01
The basic code structure is discussed, including the overall program flow and a brief description of all subroutines. Instructions on the preparation of input data, definitions of key FORTRAN variables, sample input and output, and a complete listing of the code are presented.
Shock Spectrum Calculation from Acceleration Time Histories
1980-09-01
1 CFR 11.3 - Code of Federal Regulations.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 1 General Provisions 1 2012-01-01 2012-01-01 false Code of Federal Regulations. 11.3 Section 11.3 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER AVAILABILITY OF OFFICE OF THE FEDERAL... complete set of the Code of Federal Regulations is $1,019 per year for the bound, paper edition, or $247...
Towards Holography via Quantum Source-Channel Codes.
Pastawski, Fernando; Eisert, Jens; Wilming, Henrik
2017-07-14
While originally motivated by quantum computation, quantum error correction (QEC) is currently providing valuable insights into many-body quantum physics, such as topological phases of matter. Furthermore, mounting evidence originating from holography research (AdS/CFT) indicates that QEC should also be pertinent for conformal field theories. With this motivation in mind, we introduce quantum source-channel codes, which combine features of lossy compression and approximate quantum error correction, both of which are predicted in holography. Through a recent construction for approximate recovery maps, we derive guarantees on its erasure decoding performance from calculations of an entropic quantity called conditional mutual information. As an example, we consider Gibbs states of the transverse field Ising model at criticality and provide evidence that they exhibit nontrivial protection from local erasure. This gives rise to the first concrete interpretation of a bona fide conformal field theory as a quantum error correcting code. We argue that quantum source-channel codes are of independent interest beyond holography.
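For reference, the entropic quantity used above is the conditional mutual information, and the recovery guarantee invoked is of the Fawzi–Renner type; the exact constants and formulation in the paper may differ, so the following is only the standard statement:

$$ I(A{:}C\,|\,B) \;=\; S(AB) + S(BC) - S(B) - S(ABC), $$

and there exists a recovery map $\mathcal{R}_{B\to BC}$ acting on $B$ alone such that

$$ F\!\left(\rho_{ABC},\, \mathcal{R}_{B\to BC}(\rho_{AB})\right) \;\ge\; 2^{-\frac{1}{2} I(A:C|B)}, $$

so a small conditional mutual information certifies that the erased subsystem $C$ can be approximately reconstructed from $B$, which is the sense in which the Gibbs states above are protected against local erasure.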
Towards Holography via Quantum Source-Channel Codes
NASA Astrophysics Data System (ADS)
Pastawski, Fernando; Eisert, Jens; Wilming, Henrik
2017-07-01
While originally motivated by quantum computation, quantum error correction (QEC) is currently providing valuable insights into many-body quantum physics, such as topological phases of matter. Furthermore, mounting evidence originating from holography research (AdS/CFT) indicates that QEC should also be pertinent for conformal field theories. With this motivation in mind, we introduce quantum source-channel codes, which combine features of lossy compression and approximate quantum error correction, both of which are predicted in holography. Through a recent construction for approximate recovery maps, we derive guarantees on its erasure decoding performance from calculations of an entropic quantity called conditional mutual information. As an example, we consider Gibbs states of the transverse field Ising model at criticality and provide evidence that they exhibit nontrivial protection from local erasure. This gives rise to the first concrete interpretation of a bona fide conformal field theory as a quantum error correcting code. We argue that quantum source-channel codes are of independent interest beyond holography.
1978-01-01
... complex, applications of the code. NASCAP CODE DESCRIPTION The NASCAP code is a finite-element spacecraft-charging simulation that is written in FORTRAN ... transport code POEM (ref. 1), is applicable to arbitrary dielectrics, source spectra, and current time histories. The code calculations are illustrated by ...
Technology Infusion of CodeSonar into the Space Network Ground Segment
NASA Technical Reports Server (NTRS)
Benson, Markland J.
2009-01-01
This slide presentation reviews the applicability of CodeSonar to the Space Network software. CodeSonar is a commercial off-the-shelf system that analyzes programs written in C, C++ or Ada for defects in the code. Software engineers use CodeSonar results as an input to the existing source code inspection process. The study focuses on large-scale software developed using formal processes. The systems studied are mission-critical in nature but some use commodity computer systems.
Experimental benchmark of the NINJA code for application to the Linac4 H- ion source plasma
NASA Astrophysics Data System (ADS)
Briefi, S.; Mattei, S.; Rauner, D.; Lettry, J.; Tran, M. Q.; Fantz, U.
2017-10-01
For a dedicated performance optimization of negative hydrogen ion sources applied at particle accelerators, a detailed assessment of the plasma processes is required. Due to the compact design of these sources, diagnostic access is typically limited to optical emission spectroscopy yielding only line-of-sight integrated results. In order to allow for a spatially resolved investigation, the electromagnetic particle-in-cell Monte Carlo collision code NINJA has been developed for the Linac4 ion source at CERN. This code considers the RF field generated by the ICP coil as well as the external static magnetic fields and calculates self-consistently the resulting discharge properties. NINJA is benchmarked at the diagnostically well accessible lab experiment CHARLIE (Concept studies for Helicon Assisted RF Low pressure Ion sourcEs) at varying RF power and gas pressure. A good general agreement is observed between experiment and simulation although the simulated electron density trends for varying pressure and power as well as the absolute electron temperature values deviate slightly from the measured ones. This can be explained by the assumption of strong inductive coupling in NINJA, whereas the CHARLIE discharges show the characteristics of loosely coupled plasmas. For the Linac4 plasma, this assumption is valid. Accordingly, both the absolute values of the accessible plasma parameters and their trends for varying RF power agree well in measurement and simulation. At varying RF power, the H- current extracted from the Linac4 source peaks at 40 kW. For volume operation, this is perfectly reflected by assessing the processes in front of the extraction aperture based on the simulation results where the highest H- density is obtained for the same power level. In surface operation, the production of negative hydrogen ions at the converter surface can only be considered by specialized beam formation codes, which require plasma parameters as input. It has been demonstrated that this input can be provided reliably by the NINJA code.
Christie, Andrew E.; Sommer, Stephanie A.; Cieslak, Matthew C.; Hartline, Daniel K.; Lenz, Petra H.
2017-01-01
Coral reef ecosystems of many sub-tropical and tropical marine coastal environments have suffered significant degradation from anthropogenic sources. Research to inform management strategies that mitigate stressors and promote a healthy ecosystem has focused on the ecology and physiology of coral reefs and associated organisms. Few studies focus on the surrounding pelagic communities, which are equally important to ecosystem function. Zooplankton, often dominated by small crustaceans such as copepods, is an important food source for invertebrates and fishes, especially larval fishes. The reef-associated zooplankton includes a sub-neustonic copepod family that could serve as an indicator species for the community. Here, we describe the generation of a de novo transcriptome for one such copepod, Labidocera madurae, a pontellid from an intensively-studied coral reef ecosystem, Kāne‘ohe Bay, Oahu, Hawai‘i. The transcriptome was assembled using high-throughput sequence data obtained from whole organisms. It comprised 211,002 unique transcripts, including 72,391 with coding regions. It was assessed for quality and completeness using multiple workflows. Benchmarking Universal Single-Copy Orthologs (BUSCO) analysis identified transcripts for 88% of expected eukaryotic core proteins. Targeted gene-discovery analyses included searches for transcripts coding full-length “giant” proteins (>4,000 amino acids), proteins and splice variants of voltage-gated sodium channels, and proteins involved in the circadian signaling pathway. Four different reference transcriptomes were generated and compared for the detection of differential gene expression between copepodites and adult females; 6,229 genes were consistently identified as differentially expressed between the two regardless of reference. Automated bioinformatics analyses and targeted manual gene curation suggest that the de novo assembled L. madurae transcriptome is of high quality and completeness. This transcriptome provides a new resource for assessing the global physiological status of a planktonic species inhabiting a coral reef ecosystem that is subjected to multiple anthropogenic stressors. The workflows provide a template for generating and assessing transcriptomes in other non-model species. PMID:29065152
Roncalli, Vittoria; Christie, Andrew E; Sommer, Stephanie A; Cieslak, Matthew C; Hartline, Daniel K; Lenz, Petra H
2017-01-01
Coral reef ecosystems of many sub-tropical and tropical marine coastal environments have suffered significant degradation from anthropogenic sources. Research to inform management strategies that mitigate stressors and promote a healthy ecosystem has focused on the ecology and physiology of coral reefs and associated organisms. Few studies focus on the surrounding pelagic communities, which are equally important to ecosystem function. Zooplankton, often dominated by small crustaceans such as copepods, is an important food source for invertebrates and fishes, especially larval fishes. The reef-associated zooplankton includes a sub-neustonic copepod family that could serve as an indicator species for the community. Here, we describe the generation of a de novo transcriptome for one such copepod, Labidocera madurae, a pontellid from an intensively-studied coral reef ecosystem, Kāne'ohe Bay, Oahu, Hawai'i. The transcriptome was assembled using high-throughput sequence data obtained from whole organisms. It comprised 211,002 unique transcripts, including 72,391 with coding regions. It was assessed for quality and completeness using multiple workflows. Benchmarking Universal Single-Copy Orthologs (BUSCO) analysis identified transcripts for 88% of expected eukaryotic core proteins. Targeted gene-discovery analyses included searches for transcripts coding full-length "giant" proteins (>4,000 amino acids), proteins and splice variants of voltage-gated sodium channels, and proteins involved in the circadian signaling pathway. Four different reference transcriptomes were generated and compared for the detection of differential gene expression between copepodites and adult females; 6,229 genes were consistently identified as differentially expressed between the two regardless of reference. Automated bioinformatics analyses and targeted manual gene curation suggest that the de novo assembled L. madurae transcriptome is of high quality and completeness. This transcriptome provides a new resource for assessing the global physiological status of a planktonic species inhabiting a coral reef ecosystem that is subjected to multiple anthropogenic stressors. The workflows provide a template for generating and assessing transcriptomes in other non-model species.
NASA Technical Reports Server (NTRS)
Meyer, Harold D.
1999-01-01
This second volume of Acoustic Scattering by Three-Dimensional Stators and Rotors Using the SOURCE3D Code provides the scattering plots referenced by Volume 1. There are 648 plots. Half are for the 8750 rpm "high speed" operating condition and the other half are for the 7031 rpm "mid speed" operating condition.
Multispectral data compression through transform coding and block quantization
NASA Technical Reports Server (NTRS)
Ready, P. J.; Wintz, P. A.
1972-01-01
Transform coding and block quantization techniques are applied to multispectral aircraft scanner data and digitized satellite imagery. The multispectral source is defined and an appropriate mathematical model is proposed. The Karhunen-Loeve, Fourier, and Hadamard encoders are considered and are compared to the rate distortion function for the equivalent Gaussian source and to the performance of the single-sample PCM encoder.
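The sketch below illustrates the basic Karhunen-Loeve (principal-component) transform-coding step with uniform scalar quantization of the retained coefficients. It is a generic, minimal example on an assumed array of vectorized blocks, not a reconstruction of the encoders evaluated in the paper.

```python
# Minimal sketch of Karhunen-Loeve transform coding with per-coefficient
# quantization, on assumed toy data; not the encoders evaluated in the paper.
import numpy as np

def klt_encode(blocks, n_keep, n_bits=8):
    """blocks: (N, d) array of vectorized image/spectral blocks.
    Returns quantized coefficients plus the data needed to decode."""
    mean = blocks.mean(axis=0)
    centered = blocks - mean
    cov = np.cov(centered, rowvar=False)
    # Eigenvectors of the covariance matrix form the KL basis;
    # keep the n_keep most energetic ones (largest eigenvalues).
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_keep]
    basis = eigvecs[:, order]                      # (d, n_keep)
    coeffs = centered @ basis                      # (N, n_keep)
    # Uniform scalar quantization of each retained coefficient.
    scale = np.abs(coeffs).max(axis=0) + 1e-12
    q = np.round(coeffs / scale * (2 ** (n_bits - 1) - 1)).astype(np.int16)
    return q, scale, basis, mean

def klt_decode(q, scale, basis, mean, n_bits=8):
    coeffs = q.astype(float) / (2 ** (n_bits - 1) - 1) * scale
    return coeffs @ basis.T + mean
```

A Fourier or Hadamard encoder differs only in replacing the data-derived eigenvector basis with a fixed transform matrix, trading some decorrelation efficiency for not having to transmit the basis.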
D-DSC: Decoding Delay-based Distributed Source Coding for Internet of Sensing Things
Akan, Ozgur B.
2018-01-01
Spatial correlation between densely deployed sensor nodes in a wireless sensor network (WSN) can be exploited to reduce the power consumption through a proper source coding mechanism such as distributed source coding (DSC). In this paper, we propose the Decoding Delay-based Distributed Source Coding (D-DSC) to improve the energy efficiency of the classical DSC by employing the decoding delay concept, which enables the use of the maximum correlated portion of sensor samples during the event estimation. In D-DSC, the network is partitioned into clusters, where the clusterheads communicate their uncompressed samples carrying the side information, and the cluster members send their compressed samples. The sink performs joint decoding of the compressed and uncompressed samples and then reconstructs the event signal using the decoded sensor readings. Based on the observed degree of correlation among sensor samples, the sink dynamically updates and broadcasts the varying compression rates back to the sensor nodes. Simulation results for the performance evaluation reveal that D-DSC can achieve reliable and energy-efficient event communication and estimation for practical signal detection/estimation applications having a massive number of sensors, towards the realization of the Internet of Sensing Things (IoST). PMID:29538405
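The rate-feedback step can be pictured with the following schematic sketch. It is not the authors' algorithm: the correlation measure, the linear rate mapping, and the parameter names are all hypothetical stand-ins for whatever D-DSC actually uses.

```python
# Schematic sketch (not the authors' algorithm): the sink estimates inter-node
# correlation from decoded samples and broadcasts lower rates to members whose
# samples are highly predictable from their clusterhead's uncompressed samples.
import numpy as np

def update_rates(head_samples, member_samples, r_min=2, r_max=8):
    """head_samples: (T,) uncompressed clusterhead readings.
    member_samples: dict node_id -> (T,) reconstructed member readings.
    Returns node_id -> bits per sample for the next round."""
    rates = {}
    for node, x in member_samples.items():
        rho = abs(np.corrcoef(head_samples, x)[0, 1])
        # Higher correlation -> more of the signal is already carried by the
        # side information, so fewer bits are requested from this member.
        rates[node] = int(round(r_max - rho * (r_max - r_min)))
    return rates
```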
NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML
NASA Technical Reports Server (NTRS)
Stueber, Thomas J.; Paxson, Daniel E.
2014-01-01
The work presented in this paper is intended to promote research leading to a closed-loop control system that actively suppresses thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation has been written using MATLAB software tools. This MATLAB-based process is similar to a precursor one-dimensional combustor simulation that was formatted as FORTRAN 77 source code. The previous simulation process requires modifying the FORTRAN 77 source code, compiling, and linking whenever a new combustor simulation executable file is created. The MATLAB-based simulation does not require making changes to the source code, recompiling, or linking. Furthermore, the MATLAB-based simulation can be run from script files within the MATLAB environment or with a compiled copy of the executable file running in the Command Prompt window without requiring a licensed copy of MATLAB. This report presents a general simulation overview. Details regarding how to set up and initiate a simulation are also presented. Finally, the post-processing section describes the two types of files created while running the simulation and includes results for a default simulation included with the source code.
D-DSC: Decoding Delay-based Distributed Source Coding for Internet of Sensing Things.
Aktas, Metin; Kuscu, Murat; Dinc, Ergin; Akan, Ozgur B
2018-01-01
Spatial correlation between densely deployed sensor nodes in a wireless sensor network (WSN) can be exploited to reduce the power consumption through a proper source coding mechanism such as distributed source coding (DSC). In this paper, we propose the Decoding Delay-based Distributed Source Coding (D-DSC) to improve the energy efficiency of the classical DSC by employing the decoding delay concept which enables the use of the maximum correlated portion of sensor samples during the event estimation. In D-DSC, network is partitioned into clusters, where the clusterheads communicate their uncompressed samples carrying the side information, and the cluster members send their compressed samples. Sink performs joint decoding of the compressed and uncompressed samples and then reconstructs the event signal using the decoded sensor readings. Based on the observed degree of the correlation among sensor samples, the sink dynamically updates and broadcasts the varying compression rates back to the sensor nodes. Simulation results for the performance evaluation reveal that D-DSC can achieve reliable and energy-efficient event communication and estimation for practical signal detection/estimation applications having massive number of sensors towards the realization of Internet of Sensing Things (IoST).
Code Analysis and Refactoring with Clang Tools, Version 0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelley, Timothy M.
2016-12-23
Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.
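The record does not reproduce the example code itself, so the snippet below is purely illustrative of the kind of analysis described ("analyzing where variables are used"), written against the libclang Python bindings rather than the C++ LibTooling interface the examples may actually use.

```python
# Illustrative only: not the OSTI record's example code. This sketch uses the
# libclang Python bindings (clang.cindex) to list where each variable is
# declared and referenced in a C/C++ translation unit.
import clang.cindex
from clang.cindex import CursorKind

def variable_uses(path, clang_args=("-std=c11",)):
    index = clang.cindex.Index.create()
    tu = index.parse(path, args=list(clang_args))
    uses = {}
    for cur in tu.cursor.walk_preorder():
        if cur.kind == CursorKind.VAR_DECL:
            uses.setdefault(cur.spelling, []).append(("decl", cur.location.line))
        elif cur.kind == CursorKind.DECL_REF_EXPR:
            uses.setdefault(cur.spelling, []).append(("use", cur.location.line))
    return uses
```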
CACTI: free, open-source software for the sequential coding of behavioral interactions.
Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B
2012-01-01
The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.
TEA: A Code Calculating Thermochemical Equilibrium Abundances
NASA Astrophysics Data System (ADS)
Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver
2016-07-01
We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature-pressure pairs. We tested the code against the method of Burrows & Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows & Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
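In schematic form, the iterative Lagrangian scheme referenced above (following White et al.) minimizes the dimensionless ideal-gas Gibbs free energy subject to elemental conservation; the notation here is generic rather than taken from the TEA paper:

$$ \min_{\{n_i\}} \; \frac{G}{RT} \;=\; \sum_{i} n_i \left[\frac{\mu_i^{\circ}(T)}{RT} + \ln\frac{P}{P^{\circ}} + \ln\frac{n_i}{\sum_k n_k}\right] \quad \text{subject to} \quad \sum_{i} a_{ij}\, n_i = b_j \;\; \forall j, $$

where $n_i$ are the molar abundances, $\mu_i^{\circ}$ the standard chemical potentials, $a_{ij}$ the number of atoms of element $j$ in species $i$, and $b_j$ the elemental abundances; the constraints are enforced with one Lagrange multiplier per element and the resulting stationary conditions are solved iteratively.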
TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver, E-mail: jasmina@physics.ucf.edu
2016-07-01
We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
(I Can't Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research.
van Rijnsoever, Frank J
2017-01-01
I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: "random chance," which is based on probability sampling, "minimal information," which yields at least one new code per sampling step, and "maximum information," which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimum information scenario.
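A minimal sketch of the "random chance" scenario: a hypothetical population of information sources, each holding a random subset of the codes, is sampled without replacement until every code has been observed at least once. All numbers below are illustrative, not the simulation settings used in the paper.

```python
# Minimal sketch of the "random chance" scenario with a hypothetical population:
# each information source holds a random subset of codes, and sampling continues
# until every code has been observed once (theoretical saturation).
import random

def sample_until_saturation(population, all_codes, rng=random.Random(0)):
    """population: list of sets of codes (one set per information source).
    Returns the number of sources drawn before saturation, or None."""
    remaining = set(all_codes)
    draws = 0
    sources = population[:]
    rng.shuffle(sources)           # probability sampling without replacement
    for source in sources:
        draws += 1
        remaining -= source
        if not remaining:
            return draws
    return None                    # saturation never reached in this sample

# Example hypothetical population: 200 sources, 30 codes, each source holding
# each code with probability 0.1.
codes = range(30)
pop = [{c for c in codes if random.random() < 0.1} for _ in range(200)]
print(sample_until_saturation(pop, codes))
```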
JGrass-NewAge hydrological system: an open-source platform for the replicability of science.
NASA Astrophysics Data System (ADS)
Bancheri, Marialaura; Serafin, Francesco; Formetta, Giuseppe; Rigon, Riccardo; David, Olaf
2017-04-01
JGrass-NewAge is an open source semi-distributed hydrological modelling system. It is based on the object modelling framework (OMS version 3), on the JGrasstools and on the Geotools. OMS3 allows the creation of independent software packages which can be connected at run-time into a working modelling solution. These components are available as a library/dependency or as a repository to fork in order to add further features. Different tools are adopted to ease the integration, interoperability and use of each package. Most of the components are Gradle-integrated, since Gradle represents the state of the art of build systems, especially for Java projects. Continuous integration is a further layer between the local source code (client-side) and the remote repository (server-side) and ensures the building and testing of the source code at each commit. Finally, the use of Zenodo makes the code hosted on GitHub unique, citable and traceable, with a defined DOI. Following the previous standards, each part of the hydrological cycle is implemented in JGrass-NewAge as a component that can be selected, adopted, and connected to obtain a user "customized" hydrological model. A variety of modelling solutions are possible, allowing a complete hydrological analysis. Moreover, thanks to the JGrasstools and the Geotools, the visualization of the data and of the results in a selected GIS is possible. After the geomorphological analysis of the watershed, the spatial interpolation of the meteorological inputs can be performed using both deterministic (IDW) and geostatistical (Kriging) algorithms. For the radiation balance, the shortwave and longwave radiation can be estimated, which are, in turn, inputs for the simulation of evapotranspiration according to the Priestley-Taylor and Penman-Monteith formulas. Three degree-day models are implemented for snow melting and SWE. The runoff production can be simulated using two different components, "Adige" and "Embedded Reservoirs". The travel time theory has recently been integrated for a coupled analysis of solute transport. Finally, each component can be connected to different calibration tools such as LUCA and PSO. Further information about the actual implementation can be found at (https://github.com/geoframecomponents), while the OMS projects with the examples, data and results are available at (https://github.com/GEOframeOMSProjects).
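For orientation, inverse distance weighting, the deterministic interpolator named above, follows the textbook formula sketched here; the function name and the power parameter are illustrative, not the JGrass-NewAge component's API.

```python
# Hedged sketch of inverse-distance-weighted (IDW) interpolation of station
# observations onto target points; textbook formula, not the JGrass-NewAge code.
import numpy as np

def idw(xy_stations, values, xy_targets, power=2.0, eps=1e-9):
    """xy_stations: (S, 2) station coordinates; values: (S,) observed values;
    xy_targets: (T, 2) points to estimate. Returns (T,) interpolated values."""
    xy_stations = np.asarray(xy_stations, float)
    xy_targets = np.asarray(xy_targets, float)
    values = np.asarray(values, float)
    d = np.linalg.norm(xy_targets[:, None, :] - xy_stations[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, eps) ** power     # weights fall off with distance
    w /= w.sum(axis=1, keepdims=True)
    return w @ values
```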
Almansa, Julio F; Guerrero, Rafael; Torres, Javier; Lallena, Antonio M
⁶⁰Co sources have been commercialized as an alternative to ¹⁹²Ir sources for high-dose-rate (HDR) brachytherapy. One of them is the Flexisource Co-60 HDR source manufactured by Elekta. The only available dosimetric characterization of this source is that of Vijande et al. [J Contemp Brachytherapy 2012; 4:34-44], whose results were not included in the AAPM/ESTRO consensus document. In that work, the dosimetric quantities were calculated as averages of the results obtained with the Geant4 and PENELOPE Monte Carlo (MC) codes, though for other sources, significant differences have been quoted between the values obtained with these two codes. The aim of this work is to perform the dosimetric characterization of the Flexisource Co-60 HDR source using PENELOPE. The MC simulation code PENELOPE (v. 2014) has been used. Following the recommendations of the AAPM/ESTRO report, the radial dose function, the anisotropy function, the air-kerma strength, the dose rate constant, and the absorbed dose rate in water have been calculated. The results we have obtained exceed those of Vijande et al. In particular, the absorbed dose rate constant is ∼0.85% larger. A similar difference is also found in the other dosimetric quantities. The effect of the electrons emitted in the decay of ⁶⁰Co, usually neglected in this kind of simulation, is significant up to distances of 0.25 cm from the source. The systematic and significant differences we have found between the PENELOPE results and the average values found by Vijande et al. point out that the dosimetric characterizations carried out with the various MC codes should be provided independently. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
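For context, the quantities listed above enter the standard AAPM TG-43 dose-rate formalism, sketched here in its line-source form (reference point $r_0 = 1$ cm, $\theta_0 = 90°$); this is the general formalism, not a value reported by the study:

$$ \dot{D}(r,\theta) \;=\; S_K\,\Lambda\,\frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta), $$

where $S_K$ is the air-kerma strength, $\Lambda$ the dose rate constant, $G_L$ the line-source geometry function, $g_L$ the radial dose function, and $F$ the 2D anisotropy function.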
Complete mitochondrial genome of the agarophyte red alga Gelidium vagum (Gelidiales).
Yang, Eun Chan; Kim, Kyeong Mi; Boo, Ga Hun; Lee, Jung-Hyun; Boo, Sung Min; Yoon, Hwan Su
2014-08-01
We describe the first complete mitochondrial genome of Gelidium vagum (Gelidiales) (24,901 bp, 30.4% GC content), an agar-producing red alga. The circular mitochondrial genome contains 43 genes, including 23 protein-coding, 18 tRNA and 2 rRNA genes. All the protein-coding genes have a typical ATG start codon. No introns were found. Two genes, secY and rps12, were overlapped by 41 bp.
Peng, Rui; Zeng, Bo; Meng, Xiuxiang; Yue, Bisong; Zhang, Zhihe; Zou, Fangdong
2007-08-01
The complete mitochondrial genome sequence of the giant panda, Ailuropoda melanoleuca, was determined by long and accurate polymerase chain reaction (LA-PCR) with conserved primers and primer-walking sequencing methods. The complete mitochondrial DNA is 16,805 nucleotides in length and contains two ribosomal RNA genes, 13 protein-coding genes, 22 transfer RNA genes and one control region. The total length of the 13 protein-coding genes is longer than that of the American black bear, brown bear and polar bear by 3 amino acids at the end of the ND5 gene. The codon usage also followed the typical vertebrate pattern except for an unusual ATT start codon, which initiates the NADH dehydrogenase subunit 5 (ND5) gene. The molecular phylogenetic analysis was performed on the sequences of 12 concatenated heavy-strand-encoded protein-coding genes, and suggested that the giant panda is most closely related to bears.
NASA Astrophysics Data System (ADS)
Ferland, G. J.; Chatzikos, M.; Guzmán, F.; Lykins, M. L.; van Hoof, P. A. M.; Williams, R. J. R.; Abel, N. P.; Badnell, N. R.; Keenan, F. P.; Porter, R. L.; Stancil, P. C.
2017-10-01
We describe the 2017 release of the spectral synthesis code Cloudy, summarizing the many improvements to the scope and accuracy of the physics which have been made since the previous release. Exporting the atomic data into external data files has enabled many new large datasets to be incorporated into the code. The use of the complete datasets is not realistic for most calculations, so we describe the limited subset of data used by default, which predicts significantly more lines than the previous release of Cloudy. This version is nevertheless faster than the previous release, as a result of code optimizations. We give examples of the accuracy limits using small models, and the performance requirements of large complete models. We summarize several advances in the H- and He-like iso-electronic sequences and use our complete collisional-radiative models to establish the densities where the coronal and local thermodynamic equilibrium approximations work.
The Final Kepler Planet Candidate Catalog (DR25)
NASA Astrophysics Data System (ADS)
Coughlin, Jeffrey; Thompson, Susan E.; Kepler Team
2017-06-01
We present Kepler's final planet candidate catalog, which is based on the Q1–Q17 DR25 data release and was created to allow for accurate calculations of planetary occurrence rates. We discuss improvements made to our fully automated candidate vetting procedure, which yields specific categories of false positives and a disposition score value to indicate decision confidence. We present the use of light curve inversion and scrambling, in addition to our continued use of pixel-level transit injection, to produce artificial planet candidates and false positives. Since these simulated data sets were subjected to the same automated vetting procedure as the real data set, we are able to measure both the completeness and reliability of the catalog. The DR25 catalog, source code, and a multitude of completeness and reliability data products are available at the Exoplanet Archive (http://exoplanetarchive.ipac.caltech.edu). The DR25 light curves and pixel-level data are available at MAST (http://archive.stsci.edu/kepler).
Polymer electrolyte fuel cells
NASA Astrophysics Data System (ADS)
Gottesfeld, S.
The recent increase in attention to polymer electrolyte fuel cells (PEFCs) is the result of significant technical advances in this technology and the initiation of projects to demonstrate complete PEFC-based power systems in a bus or in a passenger car. A PEFC-powered vehicle has the potential for zero emissions, high energy conversion efficiency and extended range compared to present-day battery-powered EVs. This paper describes recent achievements in R&D on PEFCs. The major thrust areas have been: (1) demonstration of membrane/electrode assemblies with stable high performance in life tests lasting 4000 hours, employing ultra-low Pt loadings corresponding to only 1/2 oz of Pt for the complete power source of a passenger car; (2) effective remedies for the high sensitivity of the Pt electrocatalyst to impurities in the fuel feed stream; and (3) comprehensive evaluation of the physicochemical properties of membranes and electrodes in the PEFC, clarifying the water management issues and enabling effective codes and diagnostics for this fuel cell.
A forensic identification case and DPid - can it be a useful tool?
Queiroz, Cristhiane Leão de; Bostock, Ellen Marie; Santos, Carlos Ferreira; Guimarães, Marco Aurélio; Silva, Ricardo Henrique Alves da
2017-01-01
The aim of this study was to present DPid as a potentially useful tool for solving cases involving dental prostheses, such as the forensic case reported here, in which a skull, a denture and dental records were received for analysis. Human identification is still challenging in various circumstances, and Dental Prosthetics Identification (DPid) stores the patient's name and prosthesis information and provides access through a code embedded in the dental prosthesis or an identification card. All of this information is digitally stored on servers accessible only by dentists, laboratory technicians and patients, each with their own level of secure access. DPid provides a complete single-source list of all dental prosthesis features (materials and components) under complete and secure documentation used for clinical follow-up and for human identification. Had DPid been available in this forensic case, it could have been solved without requiring the DNA examination that ultimately confirmed the dental comparison of antemortem and postmortem records and concluded the case as a positive identification.
Zipf's Law in Short-Time Timbral Codings of Speech, Music, and Environmental Sound Signals
Haro, Martín; Serrà, Joan; Herrera, Perfecto; Corral, Álvaro
2012-01-01
Timbre is a key perceptual feature that allows discrimination between different sounds. Timbral sensations are highly dependent on the temporal evolution of the power spectrum of an audio signal. In order to quantitatively characterize such sensations, the shape of the power spectrum has to be encoded in a way that preserves certain physical and perceptual properties. Therefore, it is common practice to encode short-time power spectra using psychoacoustical frequency scales. In this paper, we study and characterize the statistical properties of such encodings, here called timbral code-words. In particular, we report on rank-frequency distributions of timbral code-words extracted from 740 hours of audio coming from disparate sources such as speech, music, and environmental sounds. Analogously to text corpora, we find a heavy-tailed Zipfian distribution with exponent close to one. Importantly, this distribution is found independently of different encoding decisions and regardless of the audio source. Further analysis of the intrinsic characteristics of the most and least frequent code-words reveals that the most frequent code-words tend to have a more homogeneous structure. We also find that the speech and music databases have specific, distinctive code-words while, in the case of the environmental sounds, these database-specific code-words are not present. Finally, we find that a Yule-Simon process with memory provides a reasonable quantitative approximation for our data, suggesting the existence of a common simple generative mechanism for all considered sound sources. PMID:22479497
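A minimal sketch of the rank-frequency analysis: count how often each code-word occurs, rank the counts, and estimate the Zipf exponent by a least-squares fit in log-log space (a simple estimator chosen for brevity; the paper's fitting procedure may well differ).

```python
# Minimal sketch, assuming a hypothetical list of timbral code-words: build the
# rank-frequency distribution and estimate the Zipf exponent by a log-log fit.
from collections import Counter
import numpy as np

def zipf_exponent(codewords):
    """codewords: iterable of hashable code-word identifiers."""
    counts = np.array(sorted(Counter(codewords).values(), reverse=True), float)
    ranks = np.arange(1, len(counts) + 1, dtype=float)
    # Zipf's law: frequency ~ rank ** (-alpha); fit alpha in log-log space.
    slope, _ = np.polyfit(np.log(ranks), np.log(counts), 1)
    return -slope
```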
The HYPE Open Source Community
NASA Astrophysics Data System (ADS)
Strömbäck, L.; Pers, C.; Isberg, K.; Nyström, K.; Arheimer, B.
2013-12-01
The Hydrological Predictions for the Environment (HYPE) model is a dynamic, semi-distributed, process-based, integrated catchment model. It uses well-known hydrological and nutrient transport concepts and can be applied for both small and large scale assessments of water resources and status. In the model, the landscape is divided into classes according to soil type, vegetation and altitude. The soil representation is stratified and can be divided in up to three layers. Water and substances are routed through the same flow paths and storages (snow, soil, groundwater, streams, rivers, lakes) considering turn-over and transformation on the way towards the sea. HYPE has been successfully used in many hydrological applications at SMHI. For Europe, we currently have three different models: the S-HYPE model for Sweden, the BALT-HYPE model for the Baltic Sea, and the E-HYPE model for the whole of Europe. These models simulate hydrological conditions and nutrients for their respective areas and are used for characterization, forecasts, and scenario analyses. Model data can be downloaded from hypeweb.smhi.se. In addition, we provide models for the Arctic region, the Arab (Middle East and Northern Africa) region, India, the Niger River basin, and the La Plata Basin. This demonstrates the applicability of the HYPE model for large scale modeling in different regions of the world. An important goal of our work is to make our data and tools available as open data and services. For this aim we created the HYPE Open Source Community (OSC), which makes the source code of HYPE available to anyone interested in further development of HYPE. The HYPE OSC (hype.sourceforge.net) is an open source initiative under the GNU Lesser General Public License taken by SMHI to strengthen international collaboration in hydrological modeling and hydrological data production. The hypothesis is that more brains and more testing will result in better models and better code. The code is transparent and can be changed and learnt from. New versions of the main code are delivered frequently. HYPE OSC is open to everyone interested in hydrology, hydrological modeling and code development - e.g. scientists, authorities, and consultancies. By joining the HYPE OSC you get access to a state-of-the-art operational hydrological model. The HYPE source code is designed to efficiently handle large scale modeling for forecast, hindcast and climate applications. The code is under constant development to improve the hydrological processes, efficiency and readability. In the beginning of 2013 we released a version with new and better modularization based on hydrological processes. This makes the code easier for a new user to understand and further develop. An important challenge in this process is to produce code that is easy for anyone to understand and work with, but that still maintains the properties that make the code efficient enough for large scale applications. Input from the HYPE Open Source Community is an important source of future improvements to the HYPE model. Therefore, by joining the community you become an active part of the development, get access to the latest features and can influence future versions of the model.
The language parallel Pascal and other aspects of the massively parallel processor
NASA Technical Reports Server (NTRS)
Reeves, A. P.; Bruner, J. D.
1982-01-01
A high level language for the Massively Parallel Processor (MPP) was designed. This language, called Parallel Pascal, is described in detail. A description of the language design, a description of the intermediate language, Parallel P-Code, and details for the MPP implementation are included. Formal descriptions of Parallel Pascal and Parallel P-Code are given. A compiler was developed which converts programs in Parallel Pascal into the intermediate Parallel P-Code language. The code generator to complete the compiler for the MPP is being developed independently. A Parallel Pascal to Pascal translator was also developed. The architecture design for a VLSI version of the MPP was completed with a description of fault tolerant interconnection networks. The memory arrangement aspects of the MPP are discussed and a survey of other high level languages is given.
The complete mitochondrial genome of Hydra vulgaris (Hydroida: Hydridae).
Pan, Hong-Chun; Fang, Hong-Yan; Li, Shi-Wei; Liu, Jun-Hong; Wang, Ying; Wang, An-Tai
2014-12-01
The complete mitochondrial genome of Hydra vulgaris (Hydroida: Hydridae) is composed of two linear DNA molecules. The mitochondrial DNA (mtDNA) molecule 1 is 8010 bp long and contains six protein-coding genes, the large subunit rRNA, methionine and tryptophan tRNAs, two pseudogenes consisting of partial copies of COI, and terminal sequences at the two ends of the linear mtDNA, while the mtDNA molecule 2 is 7576 bp long and contains seven protein-coding genes, the small subunit rRNA, a methionine tRNA, a pseudogene consisting of a partial copy of COI, and terminal sequences at the two ends of the linear mtDNA. The COI gene begins with GTG as its start codon, whereas the other 12 protein-coding genes start with the typical ATG initiation codon. In addition, all protein-coding genes are terminated with TAA as the stop codon.
40 CFR Appendix A to Subpart A of... - Tables
Code of Federal Regulations, 2011 CFR
2011-07-01
... precursor of PM2.5. Table 2a to Appendix A of Subpart A—Data Elements for Reporting on Emissions From Point Sources, Where Required by 40 CFR 51.30 Data elements Every-yearreporting Three-yearreporting (1... phone number ✓ ✓ (6) FIPS code ✓ ✓ (7) Facility ID codes ✓ ✓ (8) Unit ID code ✓ ✓ (9) Process ID code...