Is Ontario Moving to Provincial Negotiation of Teaching Contracts?
ERIC Educational Resources Information Center
Jefferson, Anne L.
2008-01-01
In Canada, the statutes governing public school teachers' collective bargaining are a combination of the provincial Labour Relations Act or Code and the respective provincial Education/School/Public Schools Act. As education is within the provincial, not federal, domain of legal responsibility, the specifics of each act or code can vary.…
Wright, Brandy; Semaan, Salaam
2013-01-01
Objectives. We assessed expected ethics competencies of public health professionals in codes and competencies, reviewed ethics instruction at schools of public health, and recommended ways to bridge the gap between them. Methods. We reviewed the code of ethics and 3 sets of competencies, separating ethics-related competencies into 3 domains: professional, research, and public health. We reviewed ethics course requirements in 2010–2011 on the Internet sites of 46 graduate schools of public health and categorized courses as required, not required, or undetermined. Results. Half of schools (n = 23) required an ethics course for graduation (master’s or doctoral level), 21 did not, and 2 had no information. Sixteen of 23 required courses were 3-credit courses. Course content varied from 1 ethics topic to many topics addressing multiple ethics domains. Conclusions. Consistent ethics education and competency evaluation can be accomplished through a combination of a required course addressing the 3 domains, integration of ethics topics in other courses, and “booster” trainings. Enhancing ethics competence of public health professionals is important to address the ethical questions that arise in public health research, surveillance, practice, and policy. PMID:22994177
Lessons Learned through the Development and Publication of AstroImageJ
NASA Astrophysics Data System (ADS)
Collins, Karen
2018-01-01
As lead author of the scientific image processing software package AstroImageJ (AIJ), I will discuss the reasoning behind why we decided to release AIJ to the public, and the lessons we learned related to the development, publication, distribution, and support of AIJ. I will also summarize the AIJ code language selection, code documentation and testing approaches, code distribution, update, and support facilities used, and the code citation and licensing decisions. Since AIJ was initially developed as part of my graduate research and was my first scientific open source software publication, many of my experiences and difficulties encountered may parallel those of others new to scientific software publication. Finally, I will discuss the benefits and disadvantages of releasing scientific software that I now recognize after having AIJ in the public domain for more than five years.
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.
1988-01-01
A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized, including both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).
A Multiphysics and Multiscale Software Environment for Modeling Astrophysical Systems
NASA Astrophysics Data System (ADS)
Portegies Zwart, Simon; McMillan, Steve; O'Nualláin, Breanndán; Heggie, Douglas; Lombardi, James; Hut, Piet; Banerjee, Sambaran; Belkus, Houria; Fragos, Tassos; Fregeau, John; Fuji, Michiko; Gaburov, Evghenii; Glebbeek, Evert; Groen, Derek; Harfst, Stefan; Izzard, Rob; Jurić, Mario; Justham, Stephen; Teuben, Peter; van Bever, Joris; Yaron, Ofer; Zemp, Marcel
We present MUSE, a software framework for tying together existing computational tools for different astrophysical domains into a single multiphysics, multiscale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly-coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. MUSE has now reached a "Noah's Ark" milestone, with two available numerical solvers for each domain. MUSE can treat small stellar associations, galaxies and everything in between, including planetary systems, dense stellar clusters and galactic nuclei. Here we demonstrate an example calculated with MUSE: the merger of two galaxies. In addition, we demonstrate MUSE running on a distributed computer. The current MUSE code base is publicly available as open source at http://muse.li.
Weather Research and Forecasting Model with Vertical Nesting Capability
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-08-01
The Weather Research and Forecasting (WRF) model with vertical nesting capability is an extension of the WRF model, which is available in the public domain from www.wrf-model.org. The new code modifies the nesting procedure, which passes lateral boundary conditions between computational domains in the WRF model. Previously, the same vertical grid was required on all domains, while the new code allows different vertical grids to be used on concurrently run domains. This new functionality improves WRF's ability to produce high-resolution simulations of the atmosphere by allowing a wider range of scales to be efficiently resolved and more accurate lateral boundary conditions to be provided through the nesting procedure.
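The vertical-grid change described above amounts to regridding each lateral-boundary column from the parent domain's levels onto the nest's levels. A minimal illustration of that step, using plain linear interpolation (the function and variable names are invented; WRF's actual nesting interpolation is considerably more involved):

```python
import numpy as np

def regrid_column(parent_levels, parent_values, nest_levels):
    """Interpolate one lateral-boundary column from the parent domain's
    vertical levels onto a nest's (possibly different) vertical levels.
    Purely illustrative, not WRF code."""
    # np.interp expects the sample coordinates (parent_levels) to be increasing.
    return np.interp(nest_levels, parent_levels, parent_values)
```

For example, a temperature profile given on parent levels 0 m and 1000 m can be sampled onto an intermediate nest level at 500 m.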
Bernheim, Ruth Gaare; Stefanak, Matthew; Brandenburg, Terry; Pannone, Aaron; Melnick, Alan
2013-01-01
As public health departments around the country undergo accreditation using the Public Health Accreditation Board standards, the process provides a new opportunity to integrate ethics metrics into day-to-day public health practice. While the accreditation standards do not explicitly address ethics, ethical tools and considerations can enrich the accreditation process by helping health departments and their communities understand what ethical principles underlie the accreditation standards and how to use metrics based on these ethical principles to support decision making in public health practice. We provide a crosswalk between a public health essential service, Public Health Accreditation Board community engagement domain standards, and the relevant ethical principles in the Public Health Code of Ethics (Code). A case study illustrates how the accreditation standards and the ethical principles in the Code together can enhance the practice of engaging the community in decision making in the local health department.
Kim Jong IL and North Korea: The Leader and the System
2006-03-01
defined in Title 17, United States Code, section 101. As such, it is in the public domain, and under the provisions of Title 17, United States Code…diplomat.35 Kim probably also had a major hand in designing the massive Nuremberg-style rally at the stadium that Albright witnessed the day before
NASA Astrophysics Data System (ADS)
Nelson, J. M.; Shimizu, Y.; McDonald, R.; Takebayashi, H.
2009-12-01
The International River Interface Cooperative is an informal organization made up of academic faculty and government scientists with the goal of developing, distributing and providing education for a public-domain software interface for modeling river flow and morphodynamics. Formed in late 2007, the group released the first version of this interface (iRIC) in late 2009. iRIC includes models for two- and three-dimensional flow, sediment transport, bed evolution, groundwater-surface water interaction, topographic data processing, and habitat assessment, as well as comprehensive data and model output visualization, mapping, and editing tools. All the tools in iRIC are specifically designed for use in river reaches and utilize common river data sets. The models are couched within a single graphical user interface so that a broad spectrum of models is available to users without learning new pre- and post-processing tools. The first version of iRIC was developed by combining the USGS public-domain Multi-Dimensional Surface Water Modeling System (MD_SWMS), developed at the USGS Geomorphology and Sediment Transport Laboratory in Golden, Colorado, with the public-domain river modeling code NAYS developed by the Universities of Hokkaido and Kyoto, Mizuho Corporation, and the Foundation of the River Disaster Prevention Research Institute in Sapporo, Japan. Since this initial effort, other universities and agencies have joined the group, and the interface has been expanded to allow users to integrate their own modeling code using Extensible Markup Language (XML), which provides easy access and expandability to the iRIC software interface. In this presentation, the current components of iRIC are described and results from several practical modeling applications are presented to illustrate the capabilities and flexibility of the software.
In addition, some future extensions to iRIC are demonstrated, including software for Lagrangian particle tracking and the prediction of bedform development and response to time-varying flows. Education and supporting documentation for iRIC, including detailed tutorials, are available at www.i-ric.org. The iRIC model codes, interface, and all supporting documentation are in the public domain.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-31
... governance. This notice of inquiry (NOI) seeks to meet that goal by requesting public comment on current... delegated manager facilitates and manages domain name registrations using this locality name such as tourism... the USG policy supporting the multistakeholder model of Internet governance. Input regarding the value...
A multiphysics and multiscale software environment for modeling astrophysical systems
NASA Astrophysics Data System (ADS)
Portegies Zwart, Simon; McMillan, Steve; Harfst, Stefan; Groen, Derek; Fujii, Michiko; Nualláin, Breanndán Ó.; Glebbeek, Evert; Heggie, Douglas; Lombardi, James; Hut, Piet; Angelou, Vangelis; Banerjee, Sambaran; Belkus, Houria; Fragos, Tassos; Fregeau, John; Gaburov, Evghenii; Izzard, Rob; Jurić, Mario; Justham, Stephen; Sottoriva, Andrea; Teuben, Peter; van Bever, Joris; Yaron, Ofer; Zemp, Marcel
2009-05-01
We present MUSE, a software framework for combining existing computational tools for different astrophysical domains into a single multiphysics, multiscale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a "Noah's Ark" milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multiscale and multiphysics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe three examples calculated using MUSE: the merger of two galaxies, the merger of two evolving stars, and a hybrid N-body simulation. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.
NASA Astrophysics Data System (ADS)
Harfst, S.; Portegies Zwart, S.; McMillan, S.
2008-12-01
We present MUSE, a software framework for combining existing computational tools from different astrophysical domains into a single multi-physics, multi-scale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly-coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a "Noah's Ark" milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multi-scale and multi-physics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe two examples calculated using MUSE: the merger of two galaxies and an N-body simulation with live stellar evolution. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.
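The per-module interface these MUSE papers describe can be pictured as a small contract that every domain solver implements, so the framework can schedule heterogeneous codes uniformly. The following sketch is hypothetical (the real MUSE bindings and method names differ) and uses a trivial stand-in solver to exercise the contract:

```python
from abc import ABC, abstractmethod

class DomainModule(ABC):
    """Hypothetical uniform interface between a domain solver and the
    framework scheduler, in the spirit of (but not identical to) MUSE."""

    @abstractmethod
    def initialize(self, particles):
        """Load an initial particle/star state into the module."""

    @abstractmethod
    def evolve(self, t_end: float) -> float:
        """Advance the module's model to time t_end; return the time reached."""

    @abstractmethod
    def get_state(self):
        """Return the current state so other modules can be synchronized."""

class ConstantVelocityDynamics(DomainModule):
    """Trivial 'stellar dynamics' stand-in: particles drift at fixed velocity."""

    def initialize(self, particles):
        self.t = 0.0
        self.particles = [dict(p) for p in particles]

    def evolve(self, t_end):
        dt = t_end - self.t
        for p in self.particles:
            p["x"] += p["v"] * dt
        self.t = t_end
        return self.t

    def get_state(self):
        return self.particles
```

The point of such an interface is exactly the balance the abstract mentions: modules keep their internal data layouts and languages, and only state exchange and time-stepping cross the boundary.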
Cloud Computing for Complex Performance Codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin
This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.
Automated UMLS-Based Comparison of Medical Forms
Dugas, Martin; Fritz, Fleur; Krumm, Rainer; Breil, Bernhard
2013-01-01
Medical forms are very heterogeneous: on a European scale there are thousands of data items in several hundred different systems. To enable data exchange for clinical care and research purposes there is a need to develop interoperable documentation systems with harmonized forms for data capture. A prerequisite in this harmonization process is comparison of forms. So far – to our knowledge – an automated method for comparison of medical forms is not available. A form contains a list of data items with corresponding medical concepts. An automatic comparison needs data types, item names and, in particular, items annotated with unique concept codes from medical terminologies. The scope of the proposed method is a comparison of these items by comparing their concept codes (coded in UMLS). Each data item is represented by item name, concept code and value domain. Two items are called identical if item name, concept code and value domain are the same. Two items are called matching if only concept code and value domain are the same. Two items are called similar if their concept codes are the same, but the value domains are different. Based on these definitions an open-source implementation for automated comparison of medical forms in ODM format with UMLS-based semantic annotations was developed. It is available as package compareODM from http://cran.r-project.org. To evaluate this method, it was applied to a set of 7 real medical forms with 285 data items from a large public ODM repository with forms for different medical purposes (research, quality management, routine care). Comparison results were visualized with grid images and dendrograms. Automated comparison of semantically annotated medical forms is feasible. Dendrograms allow a view on clustered similar forms. The approach is scalable for a large set of real medical forms. PMID:23861827
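The identical/matching/similar classification defined in that abstract reduces to a three-way comparison on (name, concept code, value domain). A minimal sketch of the rule, with an invented data model (the real compareODM package operates on ODM XML files, not these objects):

```python
from dataclasses import dataclass

@dataclass
class Item:
    name: str          # human-readable item name
    concept: str       # UMLS concept code, e.g. "C0871470" (illustrative)
    value_domain: str  # e.g. "integer", "boolean", "string"

def compare(a: Item, b: Item) -> str:
    """Classify an item pair per the definitions above:
    identical  - name, concept code and value domain all agree
    matching   - concept code and value domain agree, names differ
    similar    - concept codes agree, value domains differ
    different  - concept codes differ"""
    if a.concept != b.concept:
        return "different"
    if a.value_domain != b.value_domain:
        return "similar"
    if a.name == b.name:
        return "identical"
    return "matching"
```

Running this pairwise over two forms' item lists gives exactly the counts a dendrogram or grid image would be built from.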
A Note on NCOM Temperature Forecast Error Calibration Using the Ensemble Transform
2009-01-01
…problem, local unbiased (correlation) and persistent errors (bias) of the Navy Coastal Ocean Modeling (NCOM) System nested in global ocean domains, are…system were made available in real-time without performing local data assimilation, though remote sensing and global data was assimilated on the
Validating administrative records in post-traumatic stress disorder.
Abrams, Thad E; Vaughan-Sarrazin, Mary; Keane, Terence M; Richardson, Kelly
2016-03-01
There are insufficient data on the accuracy of administrative coding data (ACD) for post-traumatic stress disorder (PTSD). Medical records were reviewed for (1) a diagnosis of PTSD and (2) treatment for PTSD. The records were compared against Veterans Health Administration (VHA) data in order to determine the positive predictive value (PPV) and negative predictive value (NPV) of three commonly used approaches. The PPV and NPV varied according to the ACD approach. Relative to a medical records review, the ACD approach of one or two PTSD-coded outpatient encounters had a PPV of 78% and an NPV of 91%, whereas the PPV was 97% and the NPV was 98% for three or more PTSD codes. For pharmacotherapy, the ACD approach with one or two codes for PTSD had a PPV of 33% (NPV = 93%), whereas three or more PTSD-coded encounters improved the PPV to 85% (NPV = 100%). When using VHA data, we recommend tailoring the identification strategy according to the research aims. An ACD approach identifying one or more PTSD outpatient encounters should be considered sufficient for a diagnosis of PTSD. Assessments for PTSD-associated pharmacotherapy require an ACD approach that identifies veterans with ≥ 3 outpatient PTSD encounters. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
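For reference, PPV and NPV follow directly from a two-by-two comparison of the coding approach against the chart review. A small sketch (the counts in the usage example are illustrative, not the study's raw data):

```python
def ppv_npv(tp: int, fp: int, tn: int, fn: int) -> tuple:
    """Positive and negative predictive value from validation counts,
    treating the medical-record review as the reference standard."""
    ppv = tp / (tp + fp)   # fraction of code-positive cases truly positive
    npv = tn / (tn + fn)   # fraction of code-negative cases truly negative
    return ppv, npv
```

With 78 true positives out of 100 code-positive charts and 91 true negatives out of 100 code-negative charts, this returns PPV = 0.78 and NPV = 0.91, matching how such percentages are reported.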
Public domain optical character recognition
NASA Astrophysics Data System (ADS)
Garris, Michael D.; Blue, James L.; Candela, Gerald T.; Dimmick, Darrin L.; Geist, Jon C.; Grother, Patrick J.; Janet, Stanley A.; Wilson, Charles L.
1995-03-01
A public domain document processing system has been developed by the National Institute of Standards and Technology (NIST). The system is a standard reference form-based handprint recognition system for evaluating optical character recognition (OCR), and it is intended to provide a baseline of performance on an open application. The system's source code, training data, performance assessment tools, and type of forms processed are all publicly available. The system recognizes the handprint entered on handwriting sample forms like the ones distributed with NIST Special Database 1. From these forms, the system reads hand-printed numeric fields, upper and lowercase alphabetic fields, and unconstrained text paragraphs composed of words from a limited-size dictionary. The modular design of the system makes it useful for component evaluation and comparison, training and testing set validation, and multiple system voting schemes. The system contains a number of significant contributions to OCR technology, including an optimized probabilistic neural network (PNN) classifier that operates a factor of 20 faster than traditional software implementations of the algorithm. The source code for the recognition system is written in C and is organized into 11 libraries. In all, there are approximately 19,000 lines of code supporting more than 550 subroutines. Source code is provided for form registration, form removal, field isolation, field segmentation, character normalization, feature extraction, character classification, and dictionary-based postprocessing. The recognition system has been successfully compiled and tested on a host of UNIX workstations. This paper gives an overview of the recognition system's software architecture, including descriptions of the various system components along with timing and accuracy statistics.
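A PNN of the kind mentioned above is essentially a per-class Parzen-window density estimator: each class's density at a query point is a Gaussian kernel averaged over that class's training patterns, and the densest class wins. A minimal sketch of that decision rule (in Python rather than the system's C, with invented toy data; the NIST optimizations are not reproduced):

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=1.0):
    """Probabilistic neural network classification rule: estimate each
    class's density at x with a Gaussian kernel of width sigma averaged
    over that class's training patterns, then pick the densest class."""
    X = np.asarray(train_X, dtype=float)
    y = np.asarray(train_y)
    x = np.asarray(x, dtype=float)
    scores = {}
    for cls in np.unique(y):
        d2 = np.sum((X[y == cls] - x) ** 2, axis=1)  # squared distances
        scores[cls] = np.exp(-d2 / (2.0 * sigma ** 2)).mean()
    return max(scores, key=scores.get)
```

The factor-of-20 speedup claimed for the NIST implementation comes from optimizing exactly this kernel-summation loop, which dominates PNN runtime.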
NASA Astrophysics Data System (ADS)
Wang, Cheng; Wang, Hongxiang; Ji, Yuefeng
2018-01-01
In this paper, a multi-bit wavelength-coding phase-shift-keying (PSK) optical steganography method is proposed based on amplified spontaneous emission noise and a wavelength selective switch. In this scheme, the assignment codes and the delay length differences provide a large two-dimensional key space. A 2-bit wavelength-coding PSK system is simulated to show the efficiency of the proposed method. The simulated results demonstrate that the stealth signal, after being encoded and modulated, is well hidden in both the time and spectral domains beneath the public channel and the noise existing in the system. Moreover, even if the principle of this scheme and the existence of the stealth channel are known to the eavesdropper, the probability of recovering the stealth data is less than 0.02 if the key is unknown. Thus the scheme protects the security of the stealth channel more effectively. Furthermore, the stealth channel results in a 0.48 dB power penalty to the public channel at a 1 × 10^-9 bit error rate, and the public channel has no influence on the reception of the stealth channel.
AN OVERVIEW OF EPANET VERSION 3.0
EPANET is a widely used public domain software package for modeling the hydraulic and water quality behavior of water distribution systems over an extended period of time. The last major update to the code was version 2.0 released in 2000 (Rossman, 2000). Since that time there ha...
PARAVT: Parallel Voronoi tessellation code
NASA Astrophysics Data System (ADS)
González, R. E.
2016-10-01
In this study, we present a new open source code for massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbor lists are widely used. Several serial Voronoi tessellation codes exist; however, no open-source, parallel implementation was available to handle the large numbers of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented with MPI, and the VT computation uses the Qhull library. Domain decomposition takes into account consistent boundary computation between tasks and includes periodic conditions. In addition, the code computes the neighbor list, Voronoi density, Voronoi cell volume, and density gradient for each particle, as well as densities on a regular grid. Code implementation and a user guide are publicly available at https://github.com/regonzar/paravt.
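The neighbor lists such a code derives from the tessellation can be sketched serially with SciPy's Qhull wrapper. This is a toy stand-in for the MPI-parallel implementation, ignoring domain decomposition and periodic boundaries:

```python
import numpy as np
from scipy.spatial import Voronoi

def voronoi_neighbors(points):
    """Neighbor list from a Voronoi tessellation: two particles are
    neighbors when their Voronoi cells share a ridge. Serial sketch only;
    PARAVT distributes this over MPI tasks with boundary handling."""
    vor = Voronoi(np.asarray(points, dtype=float))
    neighbors = {i: set() for i in range(len(points))}
    # ridge_points lists, for each cell boundary, the pair of input
    # points whose cells meet there.
    for p, q in vor.ridge_points:
        neighbors[p].add(q)
        neighbors[q].add(p)
    return neighbors
```

For a point centered among four corners of a square, the center's neighbor set is exactly the four corners; a Voronoi density estimate then follows as the inverse of each (bounded) cell's volume.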
Several advances in the analytic element method have been made to enhance its performance and facilitate three-dimensional ground-water flow modeling in a regional aquifer setting. First, a new public domain modular code (ModAEM) has been developed for modeling ground-water flow ...
Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code
NASA Astrophysics Data System (ADS)
Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.
2015-12-01
WEC-Sim is an open source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation and, as a result, are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable power take-off system which can be used to generate or absorb wave energy.
Numerical simulations of the experiments using WEC-Sim will be presented. These simulations highlight the code features included in the latest release of WEC-Sim (v1.2), including: wave directionality, nonlinear hydrostatics and hydrodynamics, user-defined wave elevation time series, state-space radiation, and WEC-Sim compatibility with BEMIO (an open source AQWA/WAMIT/NEMOH coefficient parser).
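For reference, the Cummins impulse-response formulation mentioned above can be written, for a single degree of freedom (the 6-DOF form used by WEC-Sim replaces each scalar with the corresponding matrix or vector):

```latex
(m + A_\infty)\,\ddot{x}(t)
  + \int_0^t K(t-\tau)\,\dot{x}(\tau)\,\mathrm{d}\tau
  + C\,x(t) = F_{\mathrm{exc}}(t)
```

Here m is the body mass, A_∞ the infinite-frequency added mass, K(t) the radiation impulse-response kernel, C the hydrostatic restoring coefficient, and F_exc(t) the wave excitation force; the state-space radiation feature listed above replaces the convolution term with an equivalent linear state-space model.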
Object-oriented controlled-vocabulary translator using TRANSOFT + HyperPAD.
Moore, G W; Berman, J J
1991-01-01
Automated coding of surgical pathology reports is demonstrated. This public-domain translation software operates on surgical pathology files, extracting diagnoses and assigning codes in a controlled medical vocabulary, such as SNOMED. Context-sensitive translation algorithms are employed, and syntactically correct diagnostic items are produced that are matched with controlled vocabulary. English-language surgical pathology reports, accessioned over one year at the Baltimore Veterans Affairs Medical Center, were translated. With an interface to a larger hospital information system, all natural language pathology reports are automatically rendered as topography and morphology codes. This translator frees the pathologist from the time-intensive task of personally coding each report, and may be used to flag certain diagnostic categories that require specific quality assurance actions.
McParland, Joanna L; Williams, Lynn; Gozdzielewska, Lucyna; Young, Mairi; Smith, Fraser; MacDonald, Jennifer; Langdridge, Darren; Davis, Mark; Price, Lesley; Flowers, Paul
2018-05-27
Changing public awareness of antimicrobial resistance (AMR) represents a global public health priority. A systematic review of interventions that targeted public AMR awareness and associated behaviour was previously conducted. Here, we focus on identifying the active content of these interventions and explore potential mechanisms of action. The project took a novel approach to intervention mapping utilizing the following steps: (1) an exploration of explicit and tacit theory and theoretical constructs within the interventions using the Theoretical Domains Framework (TDFv2), (2) retrospective coding of behaviour change techniques (BCTs) using the BCT Taxonomy v1, and (3) an investigation of coherent links between the TDF domains and BCTs across the interventions. Of 20 studies included, only four reported an explicit theoretical basis to their intervention. However, TDF analysis revealed that nine of the 14 TDF domains were utilized, most commonly 'Knowledge' and 'Environmental context and resources'. The BCT analysis showed that all interventions contained at least one BCT, and 14 of 93 (15%) BCTs were coded, most commonly 'Information about health consequences', 'Credible source', and 'Instruction on how to perform the behaviour'. We identified nine relevant TDF domains and 14 BCTs used in these interventions. Only 15% of BCTs have been applied in AMR interventions thus providing a clear opportunity for the development of novel interventions in this context. This methodological approach provides a useful way of retrospectively mapping theoretical constructs and BCTs when reviewing studies that provide limited information on theory and intervention content. Statement of contribution What is already known on this subject? Evidence of the effectiveness of interventions that target the public to engage them with AMR is mixed; the public continue to show poor knowledge and misperceptions of AMR. 
Little is known about the common, active ingredients of AMR interventions targeting the public and information on explicit theoretical content is sparse. Information on the components of AMR public health interventions is urgently needed to enable the design of effective interventions to engage the public with AMR stewardship behaviour. What does this study add? The analysis shows very few studies reported any explicit theoretical basis to the interventions they described. Many interventions share common components, including core mechanisms of action and behaviour change techniques. The analysis suggests components of future interventions to engage the public with AMR. © 2018 The Authors. British Journal of Health Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.
Resources for comparing the speed and performance of medical autocoders.
Berman, Jules J
2004-06-15
Concept indexing is a popular method for characterizing medical text and is one of the most important early steps in many data mining efforts. Concept indexing differs from simple word or phrase indexing because concepts are typically represented by a nomenclature code that binds a medical concept to all of its equivalent representations. A concept search on the term renal cell carcinoma would be expected to find occurrences of hypernephroma and renal carcinoma (concept equivalents). The purpose of this study is to provide freely available resources for comparing speed and performance among different autocoders. These tools consist of: 1) a public domain autocoder written in Perl (a free and open source programming language that installs on any operating system); 2) a nomenclature database derived from the unencumbered subset of the publicly available Unified Medical Language System; 3) a large corpus of autocoded output derived from a publicly available medical text. A simple lexical autocoder was written that parses plain text into a listing of all 1-, 2-, 3-, and 4-word strings contained in the text, assigning a nomenclature code to text strings that match terms in the nomenclature. The nomenclature used is the unencumbered subset of the 2003 Unified Medical Language System (UMLS). The unencumbered subset of UMLS was reduced to exclude homonymous one-word terms and proper names, resulting in a term/code data dictionary containing about half a million medical terms. The Online Mendelian Inheritance in Man (OMIM), a 92+ Megabyte publicly available medical opus, was used as sample medical text for the autocoder. The autocoding Perl script is remarkably short, consisting of just 38 command lines. The 92+ Megabyte OMIM file was completely autocoded in 869 seconds on a 2.4 GHz processor (less than 10 seconds per Megabyte of text). The autocoded output file (9,540,442 bytes) contains 367,963 coded terms from OMIM and is distributed with this manuscript.
A public domain Perl script is provided that can parse through plain-text files of any length, matching concepts against an external nomenclature. The script and associated files can be used freely to compare the speed and performance of autocoding software.
UTM: Universal Transit Modeller
NASA Astrophysics Data System (ADS)
Deeg, Hans J.
2014-12-01
The Universal Transit Modeller (UTM) is a light-curve simulator for all kinds of transiting or eclipsing configurations between arbitrary numbers of several types of objects, which may be stars, planets, planetary moons, and planetary rings. A separate fitting program, UFIT (Universal Fitter) is part of the UTM distribution and may be used to derive best fits to light-curves for any set of continuously variable parameters. UTM/UFIT is written in IDL code and its source is released in the public domain under the GNU General Public License.
ERIC Educational Resources Information Center
Enfinger, Kerry Wayne
2016-01-01
The number of malicious files present in the public domain continues to rise at a substantial rate. Current anti-malware software utilizes a signature-based method to detect the presence of malicious software. Generating these pattern signatures is time consuming due to malicious code complexity and the need for expert analysis, however, by making…
Genomics-Based Security Protocols: From Plaintext to Cipherprotein
NASA Technical Reports Server (NTRS)
Shaw, Harry; Hussein, Sayed; Helgert, Hermann
2011-01-01
The evolving nature of the internet will require continual advances in authentication and confidentiality protocols. Nature provides some clues as to how this can be accomplished in a distributed manner through molecular biology. Cryptography and molecular biology share certain aspects and operations that allow for a set of unified principles to be applied to problems in either venue. A concept for developing security protocols that can be instantiated at the genomics level is presented. A DNA (Deoxyribonucleic acid) inspired hash code system is presented that utilizes concepts from molecular biology. It is a keyed-Hash Message Authentication Code (HMAC) capable of being used in secure mobile Ad hoc networks. It is targeted for applications without an available public key infrastructure. Mechanics of creating the HMAC are presented as well as a prototype HMAC protocol architecture. Security concepts related to the implementation differences between electronic domain security and genomics domain security are discussed.
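For comparison with the DNA-inspired scheme described above, a conventional keyed-Hash Message Authentication Code can be produced with a standard library; the abstract's protocol replaces the underlying hash construction with genomics-inspired operations, which are not shown here. The key and message below are placeholders.

```python
import hashlib
import hmac

# Conventional HMAC-SHA256: sender and receiver share a secret key and
# verify the tag with a constant-time comparison. The paper's scheme
# substitutes a DNA-inspired keyed hash for SHA-256 (not shown).
key = b"shared-secret"
msg = b"ad hoc network frame"
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
ok = hmac.compare_digest(tag, hmac.new(key, msg, hashlib.sha256).hexdigest())
print(ok)
```

The appeal of any HMAC construction for mobile ad hoc networks is exactly what the abstract notes: authentication requires only a shared key, not a public key infrastructure.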
Consumer behavior: a quadrennium.
Jacoby, J; Johar, G V; Morrin, M
1998-01-01
Consumer behavior continued to attract additional researchers and publication outlets from 1993 through 1996. Both general interest and domain-specific scholarly contributions are discussed, along with limitations and suggested areas for future research. A concluding section observes that the integrity of consumer research is unnecessarily compromised by the failure of the major scholarly association in the field to develop and adopt a code of researcher ethics.
Gonzalo, Jed D; Dekhtyar, Michael; Starr, Stephanie R; Borkan, Jeffrey; Brunett, Patrick; Fancher, Tonya; Green, Jennifer; Grethlein, Sara Jo; Lai, Cindy; Lawson, Luan; Monrad, Seetha; O'Sullivan, Patricia; Schwartz, Mark D; Skochelak, Susan
2017-01-01
The authors performed a review of 30 Accelerating Change in Medical Education full grant submissions and an analysis of the health systems science (HSS)-related curricula at the 11 grant recipient schools to develop a potential comprehensive HSS curricular framework with domains and subcategories. In phase 1, to identify domains, grant submissions were analyzed and coded using constant comparative analysis. In phase 2, a detailed review of all existing and planned syllabi and curriculum documents at the grantee schools was performed, and content in the core curricular domains was coded into subcategories. The lead investigators reviewed and discussed drafts of the categorization scheme, collapsed and combined domains and subcategories, and resolved disagreements via group discussion. Analysis yielded three types of domains: core, cross-cutting, and linking. Core domains included health care structures and processes; health care policy, economics, and management; clinical informatics and health information technology; population and public health; value-based care; and health system improvement. Cross-cutting domains included leadership and change agency; teamwork and interprofessional education; evidence-based medicine and practice; professionalism and ethics; and scholarship. One linking domain was identified: systems thinking. This broad framework aims to build on the traditional definition of systems-based practice and highlight the need for medical and other health professions schools to better align education programs with the anticipated needs of the systems in which students will practice. HSS will require a critical investigation into existing curricula to determine the most efficient methods for integration with the basic and clinical sciences.
Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.
2013-12-01
Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. 
For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion, while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the Community Land Model (CLM), part of the open-source Community Earth System Model (CESM) for climate. In this presentation, the advantages and disadvantages of open-source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open source (i.e., it is transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.
Ridgely, M Susan; Giard, Julienne; Shern, David; Mulkern, Virginia; Burnam, M Audrey
2002-01-01
Objective To develop an instrument to characterize public sector managed behavioral health care arrangements to capture key differences between managed and “unmanaged” care and among managed care arrangements. Study Design The instrument was developed by a multi-institutional group of collaborators with participation of an expert panel. Included are six domains predicted to have an impact on access, service utilization, costs, and quality. The domains are: characteristics of the managed care plan, enrolled population, benefit design, payment and risk arrangements, composition of provider networks, and accountability. Data are collected at three levels: managed care organization, subcontractor, and network of service providers. Data Collection Methods Data are collected through contract abstraction and key informant interviews. A multilevel coding scheme is used to organize the data into a matrix along key domains, which is then reviewed and verified by the key informants. Principal Findings This instrument can usefully differentiate between and among Medicaid fee-for-service programs and Medicaid managed care plans along key domains of interest. Beyond documenting basic features of the plans and providing contextual information, these data will support the refinement and testing of hypotheses about the impact of public sector managed care on access, quality, costs, and outcomes of care. Conclusions If managed behavioral health care research is to advance beyond simple case study comparisons, a well-conceptualized set of instruments is necessary. PMID:12236386
Construction of an environmental quality index for public health research
2014-01-01
Background A more comprehensive estimate of environmental quality would improve our understanding of the relationship between environmental conditions and human health. An environmental quality index (EQI) for all counties in the U.S. was developed. Methods The EQI was developed in four parts: domain identification; data source acquisition; variable construction; and data reduction. Five environmental domains (air, water, land, built and sociodemographic) were recognized. Within each domain, data sources were identified; each was temporally (years 2000–2005) and geographically (county) restricted. Variables were constructed for each domain and assessed for missingness, collinearity, and normality. Domain-specific data reduction was accomplished using principal components analysis (PCA), resulting in domain-specific indices. Domain-specific indices were then combined into an overall EQI using PCA. In each PCA procedure, the first principal component was retained. Both domain-specific indices and overall EQI were stratified by four rural–urban continuum codes (RUCC). Higher values for each index were set to correspond to areas with poorer environmental quality. Results Concentrations of included variables differed across rural–urban strata, as did within-domain variable loadings, and domain index loadings for the EQI. In general, higher values of the air and sociodemographic indices were found in the more metropolitan areas and the most thinly populated areas have the lowest values of each of the domain indices. The less-urbanized counties (RUCC 3) demonstrated the greatest heterogeneity and range of EQI scores (−4.76, 3.57) while the thinly populated strata (RUCC 4) contained counties with the most positive scores (EQI score ranges from −5.86, 2.52). Conclusion The EQI holds promise for improving our characterization of the overall environment for public health. 
The EQI describes the non-residential ambient county-level conditions to which residents are exposed, and domain-specific EQI loadings indicate which of the environmental domains account for the largest portion of the variability in the EQI. The EQI was constructed for all counties in the United States, incorporating a variety of data to provide a broad picture of environmental conditions. We undertook a reproducible approach that primarily utilized publicly available data sources. PMID:24886426
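The data-reduction step described above, retaining the first principal component within each domain and again across domains, can be sketched with a plain SVD. The data below are synthetic stand-ins for the county-by-variable matrices; this is not the authors' pipeline, which also handles missingness, collinearity, and rural-urban stratification.

```python
import numpy as np

def first_pc_scores(X):
    """Score each row of X by its projection onto the first principal
    component (the EQI-style PCA reduction, first component retained)."""
    Xc = X - X.mean(axis=0)                 # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[0]                       # projection on first axis

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))                # 10 counties, 4 domain variables
scores = first_pc_scores(X)
print(scores.shape)
```

In the full EQI construction, this reduction is run once per domain to form domain indices, and once more on the stacked domain indices to form the overall index, with signs oriented so that higher values mean poorer environmental quality.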
Aggarwal, Neil Krishan; Cedeno, Kryst; John, Dolly; Lewis-Fernandez, Roberto
2017-08-01
This study reports the extent to which states have adopted the national culturally and linguistically appropriate services (CLAS) standards. Officials from public mental health agencies in the 50 states, Washington, D.C., and Puerto Rico were contacted between January and June 2016 to obtain information about adoption of CLAS standards in current policies. Each policy was coded through thematic analysis to determine its correspondence with any of the 14 national CLAS standards, which are grouped into three domains. Officials from 47 states and territories (90%) responded. Eight states (17%) reported adopting all national CLAS standards. Ten (23%) had adopted no CLAS policies, five (12%) had adopted policies under one domain, three (7%) under two domains, and 25 (58%) under all three domains. Most states do not have policies that meet all CLAS standards, raising questions about how CLAS standards should be adopted.
NASA Technical Reports Server (NTRS)
Rochon, Gilbert L.
1989-01-01
A user requirements analysis (URA) was undertaken to determine an appropriate public domain Geographic Information System (GIS) software package for potential integration with NASA's LAS (Land Analysis System) 5.0 image processing system. The necessity for a public domain system was underscored by the perceived need for source code access and flexibility in tailoring the GIS to the needs of a heterogeneous group of end users, and by specific constraints imposed by LAS and its user interface, the Transportable Applications Executive (TAE). Subsequently, a review was conducted of a variety of public domain GIS candidates, including GRASS 3.0, MOSS, IEMIS, and two university-based packages, IDRISI and KBGIS. The review method was a modified version of the GIS evaluation process developed by the Federal Interagency Coordinating Committee on Digital Cartography. One IEMIS-derivative product, the ALBE (AirLand Battlefield Environment) GIS, emerged as the most promising candidate for integration with LAS. IEMIS (Integrated Emergency Management Information System) was developed by the Federal Emergency Management Agency (FEMA). ALBE GIS is currently under development at the Pacific Northwest Laboratory under contract with the U.S. Army Corps of Engineers' Engineering Topographic Laboratory (ETL). Accordingly, recommendations are offered with respect to a potential LAS/ALBE GIS linkage and to further system enhancements, including coordination with the development of the Spatial Analysis and Modeling System (SAMS) GIS and with the IDM (Intelligent Data Management) work at Goddard's National Space Science Data Center.
Frequency- and Time-Domain Methods in Soil-Structure Interaction Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolisetti, Chandrakanth; Whittaker, Andrew S.; Coleman, Justin L.
2015-06-01
Soil-structure interaction (SSI) analysis in the nuclear industry is currently performed using linear codes that function in the frequency domain. There is a consensus that these frequency-domain codes give reasonably accurate results for low-intensity ground motions that result in almost linear response. For higher-intensity ground motions, which may result in nonlinear response in the soil, the structure, or the vicinity of the foundation, the adequacy of frequency-domain codes is unproven. Nonlinear analysis, which is only possible in the time domain, is theoretically more appropriate in such cases. These methods are available but are rarely used due to the large computational requirements and a lack of experience among analysts and regulators. This paper presents an assessment of the linear frequency-domain code SASSI, which is widely used in the nuclear industry, and the time-domain commercial finite-element code LS-DYNA for SSI analysis. The assessment involves benchmarking the SSI analysis procedure in LS-DYNA against SASSI for linearly elastic models. After affirming that SASSI and LS-DYNA produce almost identical responses for these models, they are used to perform nonlinear SSI analyses of two structures founded on soft soil. An examination of the results shows that, in spite of using identical material properties, the predictions of frequency- and time-domain codes differ significantly in the presence of nonlinear behavior such as gapping and sliding of the foundation.
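The benchmarking logic for the linear case can be illustrated on a single-degree-of-freedom oscillator: the frequency-domain steady-state amplitude from the transfer function should match the amplitude recovered by direct time integration. This is a toy analogue of the SASSI/LS-DYNA comparison, not either code's formulation; all parameter values are illustrative.

```python
import numpy as np

# Linear SDOF oscillator under sinusoidal load: frequency-domain
# amplitude |H(w)| vs. central-difference time integration.
m, k, c = 1.0, (2 * np.pi) ** 2, 0.628    # mass, stiffness, ~5% damping
w = np.pi                                  # drive frequency (rad/s)
H = 1.0 / np.sqrt((k - m * w**2) ** 2 + (c * w) ** 2)  # freq-domain result

dt, T = 1e-3, 30.0                         # time-domain integration
t = np.arange(0.0, T, dt)
u = np.zeros_like(t)
a0 = m / dt**2 + c / (2 * dt)
a1 = m / dt**2 - c / (2 * dt)
for n in range(1, len(t) - 1):
    F = np.sin(w * t[n])
    u[n + 1] = (F - (k - 2 * m / dt**2) * u[n] - a1 * u[n - 1]) / a0

amp = np.abs(u[t > T - 2.0]).max()         # steady-state amplitude
print(abs(amp - H) / H < 0.05)             # the two domains agree
```

For a linear system the two routes agree once the transient decays, which is exactly the affirmation step reported above; it is the nonlinear effects (gapping, sliding) that have no frequency-domain counterpart.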
Isosurface Display of 3-D Scalar Fields from a Meteorological Model on Google Earth
2013-07-01
…facets to four, we have chosen to adopt and implement a revised method discussed and made available by Bourke (1994), which can accommodate up to…five facets for a given grid cube. While the published code from Bourke (1994) is in the public domain, it was originally implemented in the C…and atmospheric temperatures. Reference: Bourke, P. Polygonising a Scalar Field. http://paulbourke.net/geometry/polygonise
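The kernel of Bourke's polygonise routine is the linear interpolation of the point where the isosurface crosses a cube edge between two corner samples. A sketch of that step (not the published C code) is:

```python
# Linear interpolation of the isosurface crossing along a cube edge,
# as in marching-cubes-style polygonisation. p1, p2 are corner
# coordinates; v1, v2 are the scalar field values at those corners.
def vertex_interp(iso, p1, p2, v1, v2):
    if abs(v2 - v1) < 1e-12:   # degenerate edge: values equal
        return p1
    t = (iso - v1) / (v2 - v1)
    return tuple(a + t * (b - a) for a, b in zip(p1, p2))

print(vertex_interp(0.5, (0, 0, 0), (1, 0, 0), 0.0, 1.0))  # (0.5, 0.0, 0.0)
```

The full routine classifies each grid cube by which corners exceed the iso-value and emits the corresponding triangular facets, up to five per cube in the revised method cited above.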
A Repository of Codes of Ethics and Technical Standards in Health Informatics
Zaïane, Osmar R.
2014-01-01
We present a searchable repository of codes of ethics and standards in health informatics. It is built using state-of-the-art search algorithms and technologies. The repository will be potentially beneficial for public health practitioners, researchers, and software developers in finding and comparing ethics topics of interest. Public health clinics, clinicians, and researchers can use the repository platform as a one-stop reference for various ethics codes and standards. In addition, the repository interface is built for easy navigation, fast search, and side-by-side comparative reading of documents. Our selection criteria for codes and standards are twofold: first, to respect intellectual property rights, we index only codes and standards freely available on the internet; second, major international, regional, and national health informatics bodies across the globe are surveyed with the aim of understanding the landscape in this domain. We also look at prevalent technical standards in health informatics from major bodies such as the International Standards Organization (ISO) and the U.S. Food and Drug Administration (FDA). Our repository contains codes of ethics from the International Medical Informatics Association (IMIA), the iHealth Coalition (iHC), the American Health Information Management Association (AHIMA), the Australasian College of Health Informatics (ACHI), the British Computer Society (BCS), and the UK Council for Health Informatics Professions (UKCHIP), with room for adding more in the future. Our major contribution is enhancing the findability of codes and standards related to health informatics ethics by compilation and unified access through the health informatics ethics repository. PMID:25422725
Patient complaints in healthcare systems: a systematic review and coding taxonomy
Reader, Tom W; Gillespie, Alex; Roberts, Jane
2014-01-01
Background Patient complaints have been identified as a valuable resource for monitoring and improving patient safety. This article critically reviews the literature on patient complaints, and synthesises the research findings to develop a coding taxonomy for analysing patient complaints. Methods The PubMed, Science Direct and Medline databases were systematically investigated to identify patient complaint research studies. Publications were included if they reported primary quantitative data on the content of patient-initiated complaints. Data were extracted and synthesised on (1) basic study characteristics; (2) methodological details; and (3) the issues patients complained about. Results 59 studies, reporting 88 069 patient complaints, were included. Patient complaint coding methodologies varied considerably (eg, in attributing single or multiple causes to complaints). In total, 113 551 issues were found to underlie the patient complaints. These were analysed using 205 different analytical codes which when combined represented 29 subcategories of complaint issue. The most common issues complained about were ‘treatment’ (15.6%) and ‘communication’ (13.7%). To develop a patient complaint coding taxonomy, the subcategories were thematically grouped into seven categories, and then three conceptually distinct domains. The first domain related to complaints on the safety and quality of clinical care (representing 33.7% of complaint issues), the second to the management of healthcare organisations (35.1%) and the third to problems in healthcare staff–patient relationships (29.1%). Conclusions Rigorous analyses of patient complaints will help to identify problems in patient safety. To achieve this, it is necessary to standardise how patient complaints are analysed and interpreted. Through synthesising data from 59 patient complaint studies, we propose a coding taxonomy for supporting future research and practice in the analysis of patient complaint data. PMID:24876289
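The taxonomy's quantitative step, rolling coded issues up to subcategories and domains and reporting each domain's share, reduces to a straightforward tally. The toy issue list and subcategory-to-domain mapping below are illustrative stand-ins, not the paper's 29-subcategory scheme.

```python
from collections import Counter

# Hypothetical coded complaint issues and a toy taxonomy mapping
issues = ["treatment", "communication", "treatment", "billing", "respect"]
domain_of = {                      # subcategory -> domain (illustrative)
    "treatment": "clinical",
    "communication": "relationships",
    "respect": "relationships",
    "billing": "management",
}
counts = Counter(domain_of[i] for i in issues)
total = sum(counts.values())
shares = {d: round(100 * n / total, 1) for d, n in counts.items()}
print(shares)
```

Standardizing the mapping table is the substantive contribution of a coding taxonomy; once it is fixed, shares like the 33.7%/35.1%/29.1% domain split reported above become comparable across studies.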
Sargent, Lucy; McCullough, Amanda; Del Mar, Chris; Lowe, John
2017-02-13
Delayed antibiotic prescribing reduces antibiotic use for acute respiratory infections in trials in general practice, but the uptake in clinical practice is low. The aim of the study was to identify facilitators and barriers to general practitioners' (GPs') use of delayed prescribing and to gain pharmacists' and the public's views about delayed prescribing in Australia. This study used the Theoretical Domains Framework and the Behaviour Change Wheel to explore facilitators and barriers to delayed prescribing in Australia. Forty-three semi-structured, face-to-face interviews with general practitioners, pharmacists and patients were conducted. Responses were coded into domains of the Theoretical Domains Framework, and specific criteria from the Behaviour Change Wheel were used to identify which domains were relevant to increasing the use of delayed prescribing by GPs. The interviews revealed nine key domains that influence GPs' use of delayed prescribing: knowledge; cognitive and interpersonal skills; memory, attention and decision-making processes; optimism; beliefs about consequences; intentions; goals; emotion; and social influences: GPs knew about delayed prescribing; however, they did not use it consistently, preferring to bring patients back for review and only using it with patients in a highly selective way. Pharmacists would support GPs and the public in delayed prescribing but would fill the prescription if people insisted. The public said they would delay taking their antibiotics if asked by their GP and given the right information on managing symptoms and when to take antibiotics. Using a theory-driven approach, we identified nine key domains that influence GPs' willingness to provide a delayed prescription to patients with an acute respiratory infection presenting to general practice. These data can be used to develop a structured intervention to change this behaviour and thus reduce antibiotic use for acute respiratory infections in general practice.
Optimum Vessel Performance in Evolving Nonlinear Wave Fields
2012-11-01
…TEMPEST, the new, nonlinear, time-domain ship motion code being developed by the Navy. …time-domain ship motion code TEMPEST. The radiation and diffraction forces in the level 3.0 version of TEMPEST will be computed by the body-exact strip theory…nonlinear responses of a ship to a seaway are being incorporated into version 3 of TEMPEST, the new, nonlinear, time-domain ship motion code that
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tominaga, Nozomu; Shibata, Sanshiro; Blinnikov, Sergei I., E-mail: tominaga@konan-u.ac.jp, E-mail: sshibata@post.kek.jp, E-mail: Sergei.Blinnikov@itep.ru
We develop a time-dependent, multi-group, multi-dimensional relativistic radiative transfer code, which is required to numerically investigate radiation from relativistic fluids that are involved in, e.g., gamma-ray bursts and active galactic nuclei. The code is based on the spherical harmonic discrete ordinate method (SHDOM), which evaluates a source function including anisotropic scattering in spherical harmonics and implicitly solves the static radiative transfer equation with ray tracing in discrete ordinates. We implement treatments of time dependence, multi-frequency bins, Lorentz transformation, and elastic Thomson and inelastic Compton scattering in the publicly available SHDOM code. Our code adopts a mixed-frame approach; the source function is evaluated in the comoving frame, whereas the radiative transfer equation is solved in the laboratory frame. This implementation is validated using various test problems and comparisons with the results from a relativistic Monte Carlo code. These validations confirm that the code correctly calculates the intensity and its evolution in the computational domain. The code enables us to obtain an Eddington tensor that relates the first and third moments of intensity (energy density and radiation pressure) and is frequently used as a closure relation in radiation hydrodynamics calculations.
Conversion of the agent-oriented domain-specific language ALAS into JavaScript
NASA Astrophysics Data System (ADS)
Sredojević, Dejan; Vidaković, Milan; Okanović, Dušan; Mitrović, Dejan; Ivanović, Mirjana
2016-06-01
This paper shows the generation of JavaScript code from code written in ALAS, an agent-oriented domain-specific language for writing software agents that are executed within the XJAF middleware. Since the agents can be executed on various platforms, they must be converted into a language of the target platform. We also try to utilize existing tools and technologies to make the whole conversion process as simple, fast, and efficient as possible. We use the Xtext framework, which is compatible with Java, to implement the ALAS infrastructure: an editor and a code generator. Since Xtext supports Java, generation of Java code from ALAS code is straightforward. To generate JavaScript code that will be executed within the target JavaScript XJAF implementation, the Google Web Toolkit (GWT) is used.
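The essence of such a DSL-to-JavaScript generator is a mapping from parsed DSL declarations to target-language text. The one-rule translator below is a hypothetical illustration in Python (the actual toolchain uses Xtext/GWT, and the DSL syntax and emitted class shape here are invented):

```python
import re

# Hypothetical one-rule translator: an "agent" declaration in a toy
# DSL is emitted as a JavaScript class (a stand-in for the real
# Xtext-based ALAS code generator).
def generate_js(dsl_source):
    m = re.match(r"agent\s+(\w+)", dsl_source.strip())
    if not m:
        raise ValueError("not an agent declaration")
    name = m.group(1)
    return f"class {name} {{ handleMessage(msg) {{ /* ... */ }} }}"

print(generate_js("agent Pinger"))
```

A production generator walks the full abstract syntax tree produced by the grammar rather than matching strings, but the input-model-to-output-text shape is the same.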
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Finite Difference Time Domain Electromagnetic Scattering Code Version A is a three-dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain (FDTD) technique. This manual provides a description of the code and corresponding results for the default scattering problem. In addition to that description, the manual covers the operation, resource requirements, and capabilities of the version A code, describes each subroutine, and briefly discusses the radar cross section computations and the scattering results.
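The leapfrog update at the heart of any FDTD code can be shown in one dimension. This sketch uses normalized units with the 1-D "magic" time step (c dt = dx) and a soft Gaussian source; it illustrates the E/H update pattern only, not the three-dimensional scattering code documented above.

```python
import numpy as np

# Minimal 1-D FDTD loop in normalized units (Courant number = 1):
# H lives on half-integer nodes, E on integer nodes, updated in leapfrog.
nz, nt = 200, 150
ez = np.zeros(nz)
hy = np.zeros(nz - 1)
for n in range(nt):
    hy += ez[1:] - ez[:-1]                       # H update
    ez[1:-1] += hy[1:] - hy[:-1]                 # E update
    ez[100] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source
print(float(np.max(np.abs(ez))))
```

A full scattering code adds the third dimension, material coefficients, absorbing boundaries, and a near-to-far-field transform for radar cross section, but every one of those features is layered on this same time-stepping core.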
User interfaces for computational science: A domain specific language for OOMMF embedded in Python
NASA Astrophysics Data System (ADS)
Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans
2017-05-01
Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
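Approach (iv), embedding the simulation specification in a host language, amounts to an object model whose instances serialize themselves into the backend's configuration format. The sketch below is hypothetical (the class names and API are not the published package's interface); it only illustrates how Python objects can emit OOMMF-style MIF text.

```python
# Hypothetical embedded-DSL sketch: simulation terms are Python objects
# that serialize themselves into a MIF-like backend configuration.
class Exchange:
    def __init__(self, A):
        self.A = A          # exchange constant (J/m)

    def to_mif(self):
        return f"Specify Oxs_UniformExchange {{ A {self.A} }}"

class Simulation:
    def __init__(self, terms):
        self.terms = terms

    def to_mif(self):
        return "\n".join(t.to_mif() for t in self.terms)

sim = Simulation([Exchange(A=1.3e-11)])
print(sim.to_mif())
```

Compared with hand-written configuration files, this style gives users loops, functions, and parameter sweeps from the host language for free, which is the reproducibility argument the paper makes.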
NASA Technical Reports Server (NTRS)
Saltsman, James F.
1992-01-01
This manual presents computer programs for characterizing and predicting fatigue and creep-fatigue resistance of metallic materials in the high-temperature, long-life regime for isothermal and nonisothermal fatigue. The programs use the total strain version of Strainrange Partitioning (TS-SRP). An extensive database has also been developed in a parallel effort. This database is probably the largest source of high-temperature, creep-fatigue test data available in the public domain and can be used with other life prediction methods as well. This users manual, software, and database are all in the public domain and are available through COSMIC (382 East Broad Street, Athens, GA 30602; (404) 542-3265, FAX (404) 542-4807). Two disks accompany this manual. The first disk contains the source code, executable files, and sample output from these programs. The second disk contains the creep-fatigue data in a format compatible with these programs.
Self-Taught Low-Rank Coding for Visual Learning.
Li, Sheng; Li, Kang; Fu, Yun
2018-03-01
The lack of labeled data presents a common challenge in many computer vision and machine learning tasks. Semisupervised learning and transfer learning methods have been developed to tackle this challenge by utilizing auxiliary samples from the same domain or from a different domain, respectively. Self-taught learning, which is a special type of transfer learning, has fewer restrictions on the choice of auxiliary data. It has shown promising performance in visual learning. However, existing self-taught learning methods usually ignore the structure information in data. In this paper, we focus on building a self-taught coding framework, which can effectively utilize the rich low-level pattern information abstracted from the auxiliary domain, in order to characterize the high-level structural information in the target domain. By leveraging a high quality dictionary learned across auxiliary and target domains, the proposed approach learns expressive codings for the samples in the target domain. Since many types of visual data have been proven to contain subspace structures, a low-rank constraint is introduced into the coding objective to better characterize the structure of the given target set. The proposed representation learning framework is called self-taught low-rank (S-Low) coding, which can be formulated as a nonconvex rank-minimization and dictionary learning problem. We devise an efficient majorization-minimization augmented Lagrange multiplier algorithm to solve it. Based on the proposed S-Low coding mechanism, both unsupervised and supervised visual learning algorithms are derived. Extensive experiments on five benchmark data sets demonstrate the effectiveness of our approach.
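A core primitive inside low-rank coding solvers of this kind is singular value thresholding, the proximal operator of the nuclear norm used in each iteration of augmented Lagrange multiplier schemes. The sketch below shows only that primitive, not the full S-Low coding algorithm; the matrix is synthetic.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: shrink each singular value by tau,
    the nuclear-norm proximal step used in ALM-style low-rank solvers."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)        # soft-threshold the spectrum
    return (U * s) @ Vt

rng = np.random.default_rng(1)
M = rng.normal(size=(6, 5))
Z = svt(M, tau=1.0)
print(Z.shape)
```

Repeatedly applying this step drives small singular values to zero, which is how the low-rank constraint on the target-domain codings is enforced in practice.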
Hashemi, Seirana; Nowzari Dalini, Abbas; Jalali, Adrin; Banaei-Moghaddam, Ali Mohammad; Razaghi-Moghadam, Zahra
2017-08-16
Discriminating driver mutations from those that play no role in cancer is a severe bottleneck in elucidating the molecular mechanisms underlying cancer development. Since protein domains are representatives of functional regions within proteins, mutations in them may disturb protein functionality. Therefore, studying mutations at the domain level may point researchers to a more accurate assessment of the functional impact of mutations. This article presents a comprehensive study mapping mutations from 29 cancer types to both sequence- and structure-based domains. Statistical analysis was performed to identify candidate domains in which mutations occur with high statistical significance. For each cancer type, the corresponding type-specific domains were distinguished among all candidate domains. Subsequently, cancer type-specific domains facilitated the identification of specific proteins for each cancer type. In addition, interactome analysis of the specific proteins of each cancer type showed high levels of interconnectivity among them, which implies a functional relationship. To evaluate the role of mitochondrial genes, stem cell-specific genes, and DNA repair genes in cancer development, their mutation frequency was determined via further analysis. This study has provided researchers with a publicly available data repository for studying both CATH and Pfam domain regions on protein-coding genes. Moreover, the associations between different groups of genes/domains and various cancer types have been clarified. The work is available at http://www.cancerouspdomains.ir.
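A minimal sketch of the kind of statistical test used to flag candidate domains: under a null model in which mutations land uniformly along the protein, the count falling inside a domain is binomial, and an exact upper-tail p-value measures enrichment. The uniform null and the interface below are assumptions for illustration, not the paper's exact procedure.

```python
from math import comb

def binom_sf(k, n, p):
    """Exact upper tail P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def domain_enrichment_pvalue(domain_mutations, total_mutations,
                             domain_len, protein_len):
    """p-value for observing >= domain_mutations of total_mutations inside
    a domain covering domain_len of protein_len residues, under a
    uniform-placement null."""
    p = domain_len / protein_len
    return binom_sf(domain_mutations, total_mutations, p)
```

Small p-values across many proteins would then be corrected for multiple testing before a domain is called significant.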
Optical image encryption using QR code and multilevel fingerprints in gyrator transform domains
NASA Astrophysics Data System (ADS)
Wei, Yang; Yan, Aimin; Dong, Jiabin; Hu, Zhijuan; Zhang, Jingtao
2017-11-01
A new concept of a GT-based encryption scheme is proposed in this paper. We present a novel optical image encryption method using a quick response (QR) code and multilevel fingerprint keys in gyrator transform (GT) domains. In this method, an original image is first transformed into a QR code, which is placed in the input plane of cascaded GTs. Subsequently, the QR code is encrypted into the cipher-text using multilevel fingerprint keys. The original image can be recovered easily by reading the high-quality retrieved QR code with hand-held devices. The main parameters used as private keys are the GTs' rotation angles and the multilevel fingerprints. Biometrics and cryptography are integrated with each other to improve data security. Numerical simulations are performed to demonstrate the validity and feasibility of the proposed encryption scheme. The method of applying QR codes and fingerprints in GT domains holds much potential for future information-security applications.
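To convey the flavor of phase-mask encryption in cascaded transform domains, the sketch below implements classic double random phase encoding, with ordinary Fourier transforms standing in for gyrator transforms (whose kernels depend on the rotation-angle keys) and random phases standing in for the fingerprint-derived masks. This is a simplified stand-in, not the authors' GT scheme.

```python
import numpy as np

rng = np.random.default_rng(7)

def random_phase(shape):
    """Unit-modulus random phase mask (stand-in for a fingerprint key)."""
    return np.exp(2j * np.pi * rng.random(shape))

def encrypt(img, m1, m2):
    """Mask in the input plane, transform, mask again, transform."""
    return np.fft.fft2(np.fft.fft2(img * m1) * m2)

def decrypt(cipher, m1, m2):
    """Invert each stage: inverse transform, conjugate mask, repeat."""
    return np.fft.ifft2(np.fft.ifft2(cipher) * np.conj(m2)) * np.conj(m1)
```

Without both masks (the keys), the cipher-text is stationary white noise, which is the security argument behind this family of schemes.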
Encounter Detection Using Visual Analytics to Improve Maritime Domain Awareness
2015-06-01
Records are assigned to a record set consisting of all the records within a one-degree-of-latitude by one-degree-of-longitude box; a refined prototype reduces this to one degree of latitude by a tenth of a degree of longitude, further reducing the processing load. Approved for public release; distribution is unlimited. A visual analytics process
Advanced propeller noise prediction in the time domain
NASA Technical Reports Server (NTRS)
Farassat, F.; Dunn, M. H.; Spence, P. L.
1992-01-01
The time domain code ASSPIN gives acousticians a powerful tool for advanced propeller noise prediction. Except for nonlinear effects, the code uses exact solutions of the Ffowcs Williams-Hawkings equation with exact blade geometry and kinematics. The inclusion of nonaxial inflow, periodic loading noise, and adaptive time steps to accelerate computer execution completes the development of this code.
NASA Technical Reports Server (NTRS)
Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael
1992-01-01
Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.
Reproducibility and Transparency in Ocean-Climate Modeling
NASA Astrophysics Data System (ADS)
Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.
2015-12-01
Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project: the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but also for the entire experiment working directory, including configuration (run-time parameters, component versions), input data, and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets, we provide tools for generating these from the sources rather than providing raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests using a memory debugger, multiple compilers, and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model, we provide (version controlled) digital notebooks that illustrate and record analysis of output. These have the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
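The checksum-on-output idea can be sketched with a small manifest utility: record a hash per output file under version control, then verify later runs against it. File names and the manifest format here are illustrative, not the project's actual tooling.

```python
import hashlib
import json
from pathlib import Path

def checksum(path, algo="sha256"):
    """Hex digest of a file's contents."""
    h = hashlib.new(algo)
    h.update(Path(path).read_bytes())
    return h.hexdigest()

def write_manifest(paths, manifest="manifest.json"):
    """Record one checksum per experiment output file."""
    Path(manifest).write_text(
        json.dumps({str(p): checksum(p) for p in paths}, indent=2))

def verify_manifest(manifest="manifest.json"):
    """Return {path: True/False} comparing current files to the manifest."""
    recorded = json.loads(Path(manifest).read_text())
    return {p: checksum(p) == h for p, h in recorded.items()}
```

Committing the manifest alongside code and configuration makes any answer-changing commit visible in the version history.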
Pang, Erli; Wu, Xiaomei; Lin, Kui
2016-06-01
Protein evolution plays an important role in the evolution of each genome. Because of their functional nature, in general, most of their parts or sites are differently constrained selectively, particularly by purifying selection. Most previous studies on protein evolution considered individual proteins in their entirety or compared protein-coding sequences with non-coding sequences. Less attention has been paid to the evolution of different parts within each protein of a given genome. To this end, based on PfamA annotation of all human proteins, each protein sequence can be split into two parts: domains or unassigned regions. Using this rationale, single nucleotide polymorphisms (SNPs) in protein-coding sequences from the 1000 Genomes Project were mapped according to two classifications: SNPs occurring within protein domains and those within unassigned regions. With these classifications, we found: the density of synonymous SNPs within domains is significantly greater than that of synonymous SNPs within unassigned regions; however, the density of non-synonymous SNPs shows the opposite pattern. We also found there are signatures of purifying selection on both the domain and unassigned regions. Furthermore, the selective strength on domains is significantly greater than that on unassigned regions. In addition, among all of the human protein sequences, there are 117 PfamA domains in which no SNPs are found. Our results highlight an important aspect of protein domains and may contribute to our understanding of protein evolution.
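The density comparison described above can be sketched as follows: given SNP coordinates and domain intervals on a coding sequence, count SNPs per base pair inside domains versus in unassigned regions. The coordinates and intervals below are toy values, not data from the 1000 Genomes mapping.

```python
def classify_snps(snp_positions, domain_intervals, cds_len):
    """Split SNPs into domain vs. unassigned classes and return the
    per-base-pair density of each class. Intervals are half-open [s, e)."""
    dom_len = sum(e - s for s, e in domain_intervals)
    in_domain = sum(any(s <= p < e for s, e in domain_intervals)
                    for p in snp_positions)
    outside = len(snp_positions) - in_domain
    return {"domain_density": in_domain / dom_len,
            "unassigned_density": outside / (cds_len - dom_len)}
```

Done separately for synonymous and non-synonymous SNPs, these densities give the two contrasts reported in the abstract.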
Modeling Guidelines for Code Generation in the Railway Signaling Context
NASA Technical Reports Server (NTRS)
Ferrari, Alessio; Bacherini, Stefano; Fantechi, Alessandro; Zingoni, Niccolo
2009-01-01
Modeling guidelines constitute one of the fundamental cornerstones for Model Based Development. Their relevance is essential when dealing with code generation in the safety-critical domain. This article presents the experience of a railway signaling systems manufacturer on this issue. Introduction of Model-Based Development (MBD) and code generation in the industrial safety-critical sector created a crucial paradigm shift in the development process of dependable systems. While traditional software development focuses on the code, with MBD practices the focus shifts to model abstractions. The change has fundamental implications for safety-critical systems, which still need to guarantee a high degree of confidence also at code level. Usage of the Simulink/Stateflow platform for modeling, which is a de facto standard in control software development, does not ensure by itself production of high-quality dependable code. This issue has been addressed by companies through the definition of modeling rules imposing restrictions on the usage of design tool components, in order to enable production of qualified code. The MAAB Control Algorithm Modeling Guidelines (MathWorks Automotive Advisory Board) [3] are a well-established set of publicly available rules for modeling with Simulink/Stateflow. This set of recommendations has been developed by a group of OEMs and suppliers of the automotive sector with the objective of enforcing and easing the usage of the MathWorks tools within the automotive industry. The guidelines were first published in 2001 and afterwards revised in 2007 in order to integrate additional rules developed by the Japanese division of MAAB [5]. The scope of the current edition of the guidelines ranges from model maintainability and readability to code generation issues. The rules are conceived as a reference baseline and therefore they need to be tailored to comply with the characteristics of each industrial context.
Customization of these recommendations has been performed for the automotive control systems domain in order to support code generation [7]. The MAAB guidelines have also proven profitable in the aerospace/avionics sector [1], where they have been adopted by the MathWorks Aerospace Leadership Council (MALC). General Electric Transportation Systems (GETS) is a well-known railway signaling systems manufacturer leading in Automatic Train Protection (ATP) systems technology. As part of an effort to adopt formal methods within its own development process, GETS decided to introduce system modeling by means of the MathWorks tools [2], and in 2008 chose to move to code generation. This article reports GETS's experience in developing its own modeling standard by customizing the MAAB rules for the railway signaling domain, and presents the result of this experience through a successful product development story.
Zhang, Yinsheng; Zhang, Guoming
2018-01-01
A terminology (or coding system) is a formal set of controlled vocabulary in a specific domain. With a well-defined terminology, each concept in the target domain is assigned a unique code, which can be identified and processed across different medical systems in an unambiguous way. Though there are many well-known biomedical terminologies, there is currently no domain-specific terminology for ROP (retinopathy of prematurity). Based on a collection of historical ROP patients' data in the electronic medical record system, we extracted the most frequent terms in the domain and organized them into a hierarchical coding system, the ROP Minimal Standard Terminology, which contains 62 core concepts in 4 categories. This terminology has been successfully used to provide highly structured and semantic-rich clinical data in several ROP-related applications.
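A hierarchical terminology of this kind can be represented as a coded concept tree with unambiguous lookup from code to concept. The codes and labels below are hypothetical illustrations, not content from the actual 62-concept ROP Minimal Standard Terminology.

```python
# Hypothetical codes and labels, illustrative only.
ROP_TERMS = {
    "ROP-1": {"label": "Zone",
              "children": {"ROP-1.1": "Zone I", "ROP-1.2": "Zone II"}},
    "ROP-2": {"label": "Stage",
              "children": {"ROP-2.1": "Stage 1", "ROP-2.3": "Stage 3"}},
}

def resolve(code):
    """Map a terminology code to its concept label; the code itself
    encodes the category/concept hierarchy."""
    if code in ROP_TERMS:
        return ROP_TERMS[code]["label"]
    parent = code.rsplit(".", 1)[0]
    return ROP_TERMS[parent]["children"][code]
```

Because every concept has exactly one code, records annotated this way can be exchanged between systems without free-text ambiguity.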
Natural Language Interface for Safety Certification of Safety-Critical Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2011-01-01
Model-based design and automated code generation are being used increasingly at NASA. The trend is to move beyond simulation and prototyping to actual flight code, particularly in the guidance, navigation, and control domain. However, there are substantial obstacles to more widespread adoption of code generators in such safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. The AutoCert generator plug-in supports the certification of automatically generated code by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews.
Ethical issues in using Twitter for population-level depression monitoring: a qualitative study.
Mikal, Jude; Hurst, Samantha; Conway, Mike
2016-04-14
Recently, significant research effort has focused on using Twitter (and other social media) to investigate mental health at the population level. While there has been influential work in developing ethical guidelines for Internet discussion forum-based research in public health, there is currently limited work focused on addressing ethical problems in Twitter-based public health research, and less still that considers these issues from users' own perspectives. In this work, we aim to investigate public attitudes towards utilizing public domain Twitter data for population-level mental health monitoring using a qualitative methodology. The study explores user perspectives in a series of five two-hour focus group interviews. Following a semi-structured protocol, 26 Twitter users with and without a diagnosed history of depression discussed general Twitter use, along with privacy expectations and ethical issues in using social media for health monitoring, with a particular focus on mental health monitoring. Interviews were then transcribed, redacted, and coded using a constant comparative approach. While participants expressed a wide range of opinions, there was an overall trend towards a relatively positive view of using public domain Twitter data as a resource for population-level mental health monitoring, provided that results are appropriately aggregated. Results are divided into five sections: (1) a profile of respondents' Twitter use patterns and use variability; (2) users' privacy expectations, including expectations regarding data reach and permanence; (3) attitudes towards social media based population-level health monitoring in general, and attitudes towards mental health monitoring in particular; (4) attitudes towards individual versus population-level health monitoring; and (5) users' own recommendations for the appropriate regulation of population-level mental health monitoring.
Focus group data reveal a wide range of attitudes towards the use of public-domain social media "big data" in population health research, from enthusiasm, through acceptance, to opposition. Study results highlight new perspectives in the discussion of ethical use of public data, particularly with respect to consent, privacy, and oversight.
Specific and Modular Binding Code for Cytosine Recognition in Pumilio/FBF (PUF) RNA-binding Domains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, Shuyun; Wang, Yang; Cassidy-Amstutz, Caleb
2011-10-28
Pumilio/fem-3 mRNA-binding factor (PUF) proteins possess a recognition code for bases A, U, and G, allowing designed RNA sequence specificity of their modular Pumilio (PUM) repeats. However, recognition side chains in a PUM repeat for cytosine are unknown. Here we report identification of a cytosine-recognition code by screening random amino acid combinations at conserved RNA recognition positions using a yeast three-hybrid system. This C-recognition code is specific and modular as specificity can be transferred to different positions in the RNA recognition sequence. A crystal structure of a modified PUF domain reveals specific contacts between an arginine side chain and the cytosine base. We applied the C-recognition code to design PUF domains that recognize targets with multiple cytosines and to generate engineered splicing factors that modulate alternative splicing. Finally, we identified a divergent yeast PUF protein, Nop9p, that may recognize natural target RNAs with cytosine. This work deepens our understanding of natural PUF protein target recognition and expands the ability to engineer PUF domains to recognize any RNA sequence.
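Because the code is modular, a repeat-by-repeat design table can be read off a target RNA. In the sketch below, the A/U/G side-chain pairs follow the commonly cited PUF recognition code, and the cytosine pair reflects the arginine contact reported here; treat the exact residue pairs as assumptions for illustration.

```python
# Side-chain pairs at the two RNA-recognition positions of each PUM repeat.
# A/U/G pairs per the commonly cited PUF code; the C pair is an assumption
# based on the arginine contact described in the abstract.
PUF_CODE = {"A": ("Cys", "Gln"),
            "U": ("Asn", "Gln"),
            "G": ("Ser", "Glu"),
            "C": ("Ser", "Arg")}

def design_repeats(target_rna):
    """PUF repeats bind RNA antiparallel, so repeat 1 reads the 3' base:
    return the recognition-residue pair for each repeat, N- to C-terminal."""
    return [PUF_CODE[base] for base in reversed(target_rna)]
```

With a complete four-base code, any short RNA element, including cytosine-rich targets, maps to a concrete repeat design.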
Bhattacharya, D; Steinkötter, J; Melkonian, M
1993-12-01
Centrin (= caltractin) is a ubiquitous, cytoskeletal protein which is a member of the EF-hand superfamily of calcium-binding proteins. A centrin-coding cDNA was isolated and characterized from the prasinophyte green alga Scherffelia dubia. Centrin PCR amplification primers were used to isolate partial, homologous cDNA sequences from the green algae Tetraselmis striata and Spermatozopsis similis. Annealing analyses suggested that centrin is a single-copy-coding region in T. striata and S. similis and other green algae studied. Centrin-coding regions from S. dubia, S. similis and T. striata encode four colinear EF-hand domains which putatively bind calcium. Phylogenetic analyses, including homologous sequences from Chlamydomonas reinhardtii and the land plant Atriplex nummularia, demonstrate that the domains of centrins are congruent and arose from the two-fold duplication of an ancestral EF hand with Domains 1+3 and Domains 2+4 clustering. The domains of centrins are also congruent with those of calmodulins demonstrating that, like calmodulin, centrin is an ancient protein which arose within the ancestor of all eukaryotes via gene duplication. Phylogenetic relationships inferred from centrin-coding region comparisons mirror results of small subunit ribosomal RNA sequence analyses suggesting that centrin-coding regions are useful evolutionary markers within the green algae.
Research Prototype: Automated Analysis of Scientific and Engineering Semantics
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.; Follen, Greg (Technical Monitor)
2001-01-01
Physical and mathematical formulae and concepts are fundamental elements of scientific and engineering software. These classical equations and methods are time tested, universally accepted, and relatively unambiguous. The existence of this classical ontology suggests an ideal problem for automated comprehension. This problem is further motivated by the pervasive use of scientific code and high code development costs. To investigate code comprehension in this classical knowledge domain, a research prototype has been developed. The prototype incorporates scientific domain knowledge to recognize code properties (including units, physical, and mathematical quantity). Also, the procedure implements programming language semantics to propagate these properties through the code. This prototype's ability to elucidate code and detect errors will be demonstrated with state of the art scientific codes.
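Propagating physical-quantity properties through code, as the prototype does, can be illustrated with a tiny unit-tracking type: multiplication combines base-unit exponents, and addition of mismatched units is flagged as an error. This is a toy model of the idea, not the prototype's implementation.

```python
class Quantity:
    """A value carrying SI base-unit exponents, e.g. {'m': 1, 's': -1}."""

    def __init__(self, value, units):
        self.value, self.units = value, dict(units)

    def _reduced(self):
        return {k: v for k, v in self.units.items() if v}

    def __mul__(self, other):
        # Units multiply by adding exponents.
        u = {k: self.units.get(k, 0) + other.units.get(k, 0)
             for k in set(self.units) | set(other.units)}
        return Quantity(self.value * other.value, u)

    def __add__(self, other):
        # Adding quantities with different dimensions is a semantic error.
        if self._reduced() != other._reduced():
            raise TypeError("unit mismatch: cannot add")
        return Quantity(self.value + other.value, self.units)
```

Threading such properties through assignments and expressions is exactly the kind of language-semantics propagation the prototype uses to detect unit errors in scientific code.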
Simulating Coupling Complexity in Space Plasmas: First Results from a new code
NASA Astrophysics Data System (ADS)
Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.
2005-12-01
The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial necessary development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing 3 simulation technologies: 1) Computational fluid dynamics (hydrodynamics or magneto-hydrodynamics-MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas, and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present, this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will advance our understanding of the physics of neutral and charged gases enormously. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) To develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle.
2) To develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions. 3) To develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time supports and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes had a transforming effect on space and astrophysics. We expect that our new generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.
Fast-Running Aeroelastic Code Based on Unsteady Linearized Aerodynamic Solver Developed
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Bakhle, Milind A.; Keith, T., Jr.
2003-01-01
The NASA Glenn Research Center has been developing aeroelastic analyses for turbomachines for use by NASA and industry. An aeroelastic analysis consists of a structural dynamic model, an unsteady aerodynamic model, and a procedure to couple the two models. The structural models are well developed. Hence, most of the development for the aeroelastic analysis of turbomachines has involved adapting and using unsteady aerodynamic models. Two methods are used in developing unsteady aerodynamic analysis procedures for the flutter and forced response of turbomachines: (1) the time domain method and (2) the frequency domain method. Codes based on time domain methods require considerable computational time and, hence, cannot be used during the design process. Frequency domain methods eliminate the time dependence by assuming harmonic motion and, hence, require less computational time. Early frequency domain analysis methods neglected, for simplicity, the important physics of steady loading. A fast-running unsteady aerodynamic code, LINFLUX, which includes steady loading and is based on the frequency domain method, has been modified for flutter and response calculations. LINFLUX solves unsteady linearized Euler equations for calculating the unsteady aerodynamic forces on the blades, starting from a steady nonlinear aerodynamic solution. First, we obtained a steady aerodynamic solution for a given flow condition using the nonlinear unsteady aerodynamic code TURBO. A blade vibration analysis was done to determine the frequencies and mode shapes of the vibrating blades, and an interface code was used to convert the steady aerodynamic solution to a form required by LINFLUX. A preprocessor was used to interpolate the mode shapes from the structural dynamic mesh onto the computational dynamics mesh. Then, we used LINFLUX to calculate the unsteady aerodynamic forces for a given mode, frequency, and phase angle.
A postprocessor read these unsteady pressures and calculated the generalized aerodynamic forces, eigenvalues, and response amplitudes. The eigenvalues determine the flutter frequency and damping. As a test case, the flutter of a helical fan was calculated with LINFLUX and compared with calculations from TURBO-AE, a nonlinear time domain code, and from ASTROP2, a code based on linear unsteady aerodynamics.
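The step from eigenvalues to flutter assessment can be sketched directly: for a complex aeroelastic eigenvalue s = sigma + i*omega, the modal frequency is |omega|/(2*pi) and the damping ratio is -sigma/|s|, with non-positive damping indicating flutter. The sign convention here is an assumption for illustration, not necessarily the postprocessor's convention.

```python
import math

def modal_properties(eigenvalue):
    """From a complex eigenvalue s = sigma + i*omega of the aeroelastic
    system, return (frequency in Hz, damping ratio, flutter flag).
    Negative real part => positive damping => stable."""
    sigma, omega = eigenvalue.real, eigenvalue.imag
    freq_hz = abs(omega) / (2 * math.pi)
    zeta = -sigma / abs(eigenvalue)
    return freq_hz, zeta, zeta <= 0.0
```

Sweeping this over the eigenvalues at each operating condition locates the flutter boundary, which is what the TURBO-AE and ASTROP2 comparisons check.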
Decay-ratio calculation in the frequency domain with the LAPUR code using 1D-kinetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munoz-Cobo, J. L.; Escriva, A.; Garcia, C.
This paper deals with the problem of computing the decay ratio in frequency-domain codes such as the LAPUR code. First, it is explained how to calculate the feedback reactivity in the frequency domain using slab geometry, i.e., 1D kinetics; we also show how to couple the 1D kinetics with the thermal-hydraulic part of the LAPUR code in order to obtain the reactivity feedback coefficients for the different channels. In addition, we show how to obtain the reactivity variation in the complex domain by solving the eigenvalue equation in the frequency domain, and we compare this result with the reactivity variation obtained in first-order perturbation theory using the 1D neutron fluxes of the base case. Because LAPUR works in the linear regime, it is assumed that in general the perturbations are small. There is also a section devoted to the reactivity weighting factors used to couple the reactivity contribution from the different channels to the reactivity of the entire reactor core in point kinetics and 1D kinetics. Finally, we analyze the effects of the different approaches on the DR value. (authors)
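For context, the decay ratio itself is commonly defined as the ratio of consecutive peaks of the system's impulse response. A minimal time-domain sketch of that definition (illustrative only; LAPUR evaluates it in the frequency domain):

```python
def decay_ratio(signal):
    """Decay ratio: ratio of the second to the first local peak of a
    sampled impulse response. Assumes a decaying oscillatory signal."""
    peaks = [signal[i] for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] >= signal[i + 1]]
    if len(peaks) < 2:
        raise ValueError("need at least two oscillation peaks")
    return peaks[1] / peaks[0]
```

For a linear second-order response exp(-lambda*t)*cos(omega*t), successive peaks are separated by one period, so the decay ratio equals exp(-2*pi*lambda/omega).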
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain Electromagnetic Code Version B is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version B code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file, a discussion of radar cross section computations, a discussion of some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
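The core of any FDTD code is the leapfrog Yee update, in which electric and magnetic fields are staggered in space and time. A minimal 1D free-space sketch with a soft Gaussian source and PEC boundaries follows (grid size, Courant number, and source parameters are illustrative; the Penn State code is three dimensional):

```python
import numpy as np

def fdtd_1d(nz=200, nt=250, source_pos=100):
    """1D free-space FDTD in normalized units. E and H live on staggered
    grids and are updated alternately (leapfrog); Courant number 0.5
    keeps the scheme stable. Ends of the grid act as PEC walls."""
    ez = np.zeros(nz)
    hy = np.zeros(nz - 1)
    c = 0.5  # Courant number
    for n in range(nt):
        hy += c * np.diff(ez)          # H update from curl of E
        ez[1:-1] += c * np.diff(hy)    # E update from curl of H
        ez[source_pos] += np.exp(-((n - 30) / 10.0) ** 2)  # soft source
    return ez
```

The 3D version adds the remaining field components and a scattered-field formulation, but the staggered leapfrog structure is the same.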
Bilingual First Language Acquisition: Exploring the Limits of the Language Faculty.
ERIC Educational Resources Information Center
Genesee, Fred
2001-01-01
Reviews current research in three domains of bilingual acquisition: pragmatic features of bilingual code mixing, grammatical constraints on child bilingual code mixing, and bilingual syntactic development. Examines implications from these domains for the understanding of the limits of the mental faculty to acquire language. (Author/VWL)
Bishop, Mark D; Coronado, Rogelio A; Hill, Alexandra D; Alappattu, Meryl J
2017-05-01
The objective of this study was to examine the type and content of Journal of Women's Health Physical Therapy (JWHPT) publications over the last decade. Content and bibliometric analysis of published literature. Component sections, such as the Section on Women's Health (SoWH) of the American Physical Therapy Association, provide content expertise to areas of specialty physical therapist practice, thereby supporting the dissemination of evidence for physical therapists to use. Closely aligned with the mission of the SoWH, JWHPT provides evidence reflecting this practice area. The purpose of our analysis was to examine publications within JWHPT to determine how closely JWHPT is meeting the mission and focus of section members. We used established bibliographic methodology to code and review manuscripts published online between 2005 and 2015 in JWHPT using established domains (article type, participant type, research design, study purpose, and area of focus). Total publications and proportion of publications based on domain were described. Impact by citation and author was examined using bibliometric software. Eighteen percent of the items published in JWHPT were original research papers submitted for the first time. Of those papers, the primary study design was cross-sectional experimental research, most commonly studying interventions. The primary practice area reported was management of incontinence. We suggest a continued need for increased efforts toward the submission and publication of a greater proportion of randomized controlled trials and metric articles.
A generic interface between COSMIC/NASTRAN and PATRAN (R)
NASA Technical Reports Server (NTRS)
Roschke, Paul N.; Premthamkorn, Prakit; Maxwell, James C.
1990-01-01
Despite its powerful analytical capabilities, COSMIC/NASTRAN lacks adequate post-processing adroitness. PATRAN, on the other hand is widely accepted for its graphical capabilities. A nonproprietary, public domain code mnemonically titled CPI (for COSMIC/NASTRAN-PATRAN Interface) is designed to manipulate a large number of files rapidly and efficiently between the two parent codes. In addition to PATRAN's results file preparation, CPI also prepares PATRAN's P/PLOT data files for xy plotting. The user is prompted for necessary information during an interactive session. Current implementation supports NASTRAN's displacement approach including the following rigid formats: (1) static analysis, (2) normal modal analysis, (3) direct transient response, and (4) modal transient response. A wide variety of data blocks are also supported. Error trapping is given special consideration. A sample session with CPI illustrates its simplicity and ease of use.
Devailly, Guillaume; Mantsoki, Anna; Joshi, Anagha
2016-11-01
Better protocols and decreasing costs have made high-throughput sequencing experiments now accessible even to small experimental laboratories. However, comparing one or a few experiments generated by an individual lab to the vast amount of relevant data freely available in the public domain might be limited due to lack of bioinformatics expertise. Though several tools, including genome browsers, allow such comparison at a single gene level, they do not provide a genome-wide view. We developed Heat*seq, a web-tool that allows genome-scale comparison of high-throughput experiments (chromatin immunoprecipitation followed by sequencing, RNA-sequencing, and Cap Analysis of Gene Expression) provided by a user to the data in the public domain. Heat*seq currently contains over 12 000 experiments across diverse tissues and cell types in human, mouse and drosophila. Heat*seq displays interactive correlation heatmaps, with an ability to dynamically subset datasets to contextualize user experiments. High quality figures and tables are produced and can be downloaded in multiple formats. Web application: http://www.heatstarseq.roslin.ed.ac.uk/ Source code: https://github.com/gdevailly CONTACT: Guillaume.Devailly@roslin.ed.ac.uk or Anagha.Joshi@roslin.ed.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
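The heatmap machinery reduces to computing a pairwise correlation matrix over genome-wide signal vectors. A minimal sketch with a hypothetical input format (one equal-length vector per experiment, e.g. per-region read counts):

```python
import numpy as np

def correlation_heatmap(experiments):
    """Pearson correlation matrix between equal-length signal vectors.
    `experiments` maps experiment name -> genome-wide signal vector."""
    names = list(experiments)
    mat = np.corrcoef(np.array([experiments[n] for n in names], dtype=float))
    return names, mat
```

A user experiment is simply appended as one more row before computing the matrix, which is how it gets contextualized against the public-domain datasets.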
DOE Office of Scientific and Technical Information (OSTI.GOV)
Earl, Christopher; Might, Matthew; Bagusetty, Abhishek
This study presents Nebo, a declarative domain-specific language embedded in C++ for discretizing partial differential equations for transport phenomena on multiple architectures. Application programmers use Nebo to write code that appears sequential but can be run in parallel, without editing the code. Currently Nebo supports single-thread execution, multi-thread execution, and many-core (GPU-based) execution. With single-thread execution, Nebo performs on par with code written by domain experts. With multi-thread execution, Nebo can linearly scale (with roughly 90% efficiency) up to 12 cores, compared to its single-thread execution. Moreover, Nebo’s many-core execution can be over 140x faster than its single-thread execution.
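Nebo's "sequential-looking but parallelizable" field expressions can be loosely mimicked with NumPy slicing, where a whole-field stencil is written as one statement and the runtime applies it element-wise. This is an illustrative Python analogy under assumed normalized units, not Nebo's C++ API:

```python
import numpy as np

def laplacian(field, dx):
    """Reads like one sequential statement, but is applied to the whole
    interior of the grid at once in vectorized form."""
    return (field[:-2, 1:-1] + field[2:, 1:-1] +
            field[1:-1, :-2] + field[1:-1, 2:] -
            4.0 * field[1:-1, 1:-1]) / dx**2

def step(u, dt, dx, alpha):
    """Explicit diffusion step: u_new = u + dt * alpha * laplacian(u)."""
    out = u.copy()
    out[1:-1, 1:-1] += dt * alpha * laplacian(u, dx)
    return out

u = np.zeros((64, 64))
u[32, 32] = 1.0                     # point source
u2 = step(u, dt=0.1, dx=1.0, alpha=1.0)
print(u2[32, 31])                   # heat has spread to the four neighbours
```

The point of the DSL approach is that the same expression can be retargeted to threads or a GPU without the application programmer editing it.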
Open Rotor Noise Prediction Methods at NASA Langley- A Technology Review
NASA Technical Reports Server (NTRS)
Farassat, F.; Dunn, Mark H.; Tinetti, Ana F.; Nark, Douglas M.
2009-01-01
Open rotors are once again under consideration for propulsion of future airliners because of their high efficiency. The noise generated by these propulsion systems must meet today's stringent noise standards to reduce community impact. In this paper we review the open rotor noise prediction methods available at NASA Langley. We discuss three codes called ASSPIN (Advanced Subsonic-Supersonic Propeller Induced Noise), FW-Hpds (Ffowcs Williams-Hawkings with penetrable data surface) and the FSC (Fast Scattering Code). The first two are time domain codes and the third is a frequency domain code. The capabilities of these codes, their input data requirements, and their output data are presented. Plans for further improvements of these codes are discussed. In particular, a method based on equivalent sources is outlined to eliminate spurious signals in the FW-Hpds code.
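The time-domain versus frequency-domain distinction can be illustrated by moving a tonal signal between the two representations with an FFT. This is a generic sketch, not any of the codes above; the tone frequencies are invented stand-ins for blade-passing harmonics:

```python
import numpy as np

# A pressure signal containing two tones (hypothetical harmonics).
fs = 1000.0                          # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
p = np.sin(2 * np.pi * 120 * t) + 0.5 * np.sin(2 * np.pi * 240 * t)

# Frequency-domain view: the FFT exposes the tonal content that a
# time-domain code produces sample by sample.
spectrum = np.abs(np.fft.rfft(p)) / (len(p) / 2)   # normalized amplitudes
freqs = np.fft.rfftfreq(len(p), d=1.0 / fs)

peaks = freqs[spectrum > 0.25]
print(peaks)  # the two tones, 120 Hz and 240 Hz
```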
ESAS Deliverable PS 1.1.2.3: Customer Survey on Code Generations in Safety-Critical Applications
NASA Technical Reports Server (NTRS)
Schumann, Johann; Denney, Ewen
2006-01-01
Automated code generators (ACG) are tools that convert a (higher-level) model of a software (sub-)system into executable code without the necessity for a developer to actually implement the code. Although both commercially supported and in-house tools have been used in many industrial applications, little data exists on how these tools are used in safety-critical domains (e.g., spacecraft, aircraft, automotive, nuclear). The aims of the survey, therefore, were threefold: 1) to determine whether code generation is primarily used as a tool for prototyping, including design exploration and simulation, or for flight/production code; 2) to determine the verification issues with code generators relating, in particular, to qualification and certification in safety-critical domains; and 3) to determine perceived gaps in functionality of existing tools.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional numerical electromagnetic scattering codes based upon the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied codes are two versions of our current two dimensional FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.
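The FDTD technique underlying these codes leapfrogs electric and magnetic field updates on a staggered (Yee) grid. Below is a minimal 1-D sketch in Python with normalized units and a unit Courant number; it illustrates the update pattern only and is not the Penn State Fortran code:

```python
import numpy as np

nz, nt = 200, 250
ez = np.zeros(nz)        # electric field on integer grid points
hy = np.zeros(nz - 1)    # magnetic field, staggered half a cell

for n in range(nt):
    hy += ez[1:] - ez[:-1]                       # H update (leapfrog half step)
    ez[1:-1] += hy[1:] - hy[:-1]                 # E update
    ez[100] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source

print(float(np.max(np.abs(ez))))
```

The 2-D TE/TM codes apply the same leapfrog idea with the appropriate field components, plus absorbing boundaries and near-to-far-field transforms for scattering width.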
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version D is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version D code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMOND.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied version of the code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three-dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain (FDTD) technique. The supplied version of the code is one version of our current three-dimensional FDTD code set. The manual given here provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing radar cross section computations, a section discussing some scattering results, a new problem checklist, references, and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version B is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version B code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONB.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
Jin, Qijiang; Hu, Xin; Li, Xin; Wang, Bei; Wang, Yanjie; Jiang, Hongwei; Mattson, Neil; Xu, Yingchun
2016-01-01
Trehalose-6-phosphate synthase (TPS) plays a key role in plant carbohydrate metabolism and the perception of carbohydrate availability. In the present work, the publicly available Nelumbo nucifera (lotus) genome sequence database was analyzed, leading to the identification of nine lotus TPS genes (NnTPS). At least two introns are included in the coding sequences of the NnTPS genes. Analysis of motif compositions showed that the NnTPS genes generally share similar motifs, implying that they have similar functions. The dN/dS ratios were always less than 1 for the different domains and for regions outside domains, suggesting purifying selection on the lotus TPS gene family. The regions outside the TPS domain evolved relatively faster than the domains themselves. A phylogenetic tree was constructed using all predicted coding sequences of lotus TPS genes, together with those from Arabidopsis, poplar, soybean, and rice. The result indicated that these TPS genes can be clearly divided into two main subfamilies (I-II), where each subfamily can be further divided into 2 (I) and 5 (II) subgroups. Analyses of divergence and adaptive evolution show that purifying selection may have been the main force driving the evolution of plant TPS genes. Some of the critical sites that contributed to divergence may have been under positive selection. Transcriptome data analysis revealed that most NnTPS genes were predominantly expressed in sink tissues. Expression patterns of NnTPS genes under copper and submergence stress indicated that NNU_014679 and NNU_022788 might play important roles in lotus energy metabolism and participate in the stress response. Our results can facilitate further functional studies of TPS genes in lotus. PMID:27746792
NASA Astrophysics Data System (ADS)
Palma, V.; Carli, M.; Neri, A.
2011-02-01
In this paper a Multi-view Distributed Video Coding scheme for mobile applications is presented. Specifically, a new fusion technique between temporal and spatial side information in the Zernike moments domain is proposed. Distributed video coding introduces a flexible architecture that enables the design of very low complexity video encoders compared to their traditional counterparts. The main goal of our work is to generate at the decoder the side information that optimally blends temporal and inter-view data. Multi-view distributed coding performance strongly depends on the quality of the side information built at the decoder. To improve this quality, a spatial view compensation/prediction in the Zernike moments domain is applied. Spatial and temporal motion activity have been fused together to obtain the overall side information. The proposed method has been evaluated by rate-distortion performance under different inter-view and temporal estimation quality conditions.
Three-dimensional time domain model of lightning including corona effects
NASA Technical Reports Server (NTRS)
Podgorski, Andrew S.
1991-01-01
A new 3-D lightning model that incorporates the effect of corona is described for the first time. The new model is based on a Thin Wire Time Domain Lightning (TWTDL) Code developed previously. The TWTDL Code was verified during the 1985 and 1986 lightning seasons by the measurements conducted at the 553 m CN Tower in Toronto, Ontario. The inclusion of corona in the TWTDL code allowed study of the corona effects on the lightning current parameters and the associated electric field parameters.
NASA Astrophysics Data System (ADS)
Eaves, Nick A.; Zhang, Qingan; Liu, Fengshan; Guo, Hongsheng; Dworkin, Seth B.; Thomson, Murray J.
2016-10-01
Mitigation of soot emissions from combustion devices is a global concern. For example, recent EURO 6 regulations for vehicles have placed stringent limits on soot emissions. In order to allow design engineers to achieve the goal of reduced soot emissions, they must have the tools to do so. Due to the complex nature of soot formation, which includes growth and oxidation, detailed numerical models are required to gain fundamental insights into the mechanisms of soot formation. A detailed description of the CoFlame FORTRAN code, which models sooting laminar coflow diffusion flames, is given. The code solves axial and radial velocity, temperature, species conservation, and soot aggregate and primary particle number density equations. The sectional particle dynamics model includes nucleation, PAH condensation and HACA surface growth, surface oxidation, coagulation, fragmentation, particle diffusion, and thermophoresis. The code utilizes a distributed memory parallelization scheme with strip-domain decomposition. The public release of the CoFlame code, which has been refined in terms of coding structure, to the research community accompanies this paper. CoFlame is validated against experimental data for reattachment length in an axi-symmetric pipe with a sudden expansion, and against ethylene-air and methane-air diffusion flames for multiple soot morphological parameters and gas-phase species. Finally, the parallel performance and computational costs of the code are investigated.
Lossless Compression of JPEG Coded Photo Collections.
Wu, Hao; Sun, Xiaoyan; Yang, Jingyu; Zeng, Wenjun; Wu, Feng
2016-04-06
The explosion of digital photos has posed a significant challenge to photo storage and transmission for both personal devices and cloud platforms. In this paper, we propose a novel lossless compression method to further reduce the size of a set of JPEG coded correlated images without any loss of information. The proposed method jointly removes inter/intra image redundancy in the feature, spatial, and frequency domains. For each collection, we first organize the images into a pseudo video by minimizing the global prediction cost in the feature domain. We then present a hybrid disparity compensation method to better exploit both the global and local correlations among the images in the spatial domain. Furthermore, the redundancy between each compensated signal and the corresponding target image is adaptively reduced in the frequency domain. Experimental results demonstrate the effectiveness of the proposed lossless compression method. Compared to the JPEG coded image collections, our method achieves average bit savings of more than 31%.
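The first stage, organizing a collection into a pseudo video by prediction cost in the feature domain, can be sketched with a greedy nearest-neighbour ordering. This is a deliberately simplified stand-in for the paper's global minimum-cost ordering, with synthetic feature vectors:

```python
import numpy as np

# Hypothetical global feature vectors for six correlated photos.
rng = np.random.default_rng(1)
base = rng.random(8)
features = np.array([base + 0.1 * k + 0.01 * rng.random(8)
                     for k in [0, 3, 1, 5, 2, 4]])

# Greedy nearest-neighbour chaining in the feature domain: each image is
# followed by the most similar remaining image.
order = [0]
remaining = set(range(1, len(features)))
while remaining:
    last = features[order[-1]]
    nxt = min(remaining, key=lambda i: np.linalg.norm(features[i] - last))
    order.append(nxt)
    remaining.remove(nxt)

print(order)
```

With images ordered this way, each one can be predicted from its neighbour, which is what makes the subsequent disparity compensation effective.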
Ontology patterns for tabular representations of biomedical knowledge on neglected tropical diseases
Santana, Filipe; Schober, Daniel; Medeiros, Zulma; Freitas, Fred; Schulz, Stefan
2011-01-01
Motivation: Ontology-like domain knowledge is frequently published in a tabular format embedded in scientific publications. We explore the re-use of such tabular content in the process of building NTDO, an ontology of neglected tropical diseases (NTDs), where the representation of the interdependencies between hosts, pathogens and vectors plays a crucial role. Results: As a proof of concept we analyzed a tabular compilation of knowledge about pathogens, vectors and geographic locations involved in the transmission of NTDs. After a thorough ontological analysis of the domain of interest, we formulated a comprehensive design pattern, rooted in the biomedical domain upper level ontology BioTop. This pattern was implemented in a VBA script which takes cell contents of an Excel spreadsheet and transforms them into OWL-DL. After minor manual post-processing, the correctness and completeness of the ontology was tested using pre-formulated competence questions as description logics (DL) queries. The expected results could be reproduced by the ontology. The proposed approach is recommended for optimizing the acquisition of ontological domain knowledge from tabular representations. Availability and implementation: Domain examples, source code and ontology are freely available on the web at http://www.cin.ufpe.br/~ntdo. Contact: fss3@cin.ufpe.br PMID:21685092
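The spreadsheet-to-OWL step can be pictured as mapping each table row onto a fixed triple pattern. The sketch below emits Turtle-style triples in Python rather than the paper's VBA/OWL-DL pipeline, and the relation names (`:transmittedBy`, `:occursIn`) are hypothetical placeholders, not NTDO's actual vocabulary:

```python
# Rows as they might appear in a tabular compilation: pathogen, vector, location.
rows = [
    ("Leishmania infantum", "Lutzomyia longipalpis", "Brazil"),
    ("Wuchereria bancrofti", "Culex quinquefasciatus", "Brazil"),
]

def to_turtle(pathogen, vector, location):
    """Instantiate one design-pattern 'cell' of the table as Turtle triples."""
    esc = lambda s: s.replace(" ", "_")
    return (f":{esc(pathogen)} :transmittedBy :{esc(vector)} ;\n"
            f"    :occursIn :{esc(location)} .")

triples = [to_turtle(*row) for row in rows]
print(triples[0])
```

A real implementation would additionally root each class in an upper-level ontology such as BioTop, as the paper describes.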
Modeling and inversion Matlab algorithms for resistivity, induced polarization and seismic data
NASA Astrophysics Data System (ADS)
Karaoulis, M.; Revil, A.; Minsley, B. J.; Werkema, D. D.
2011-12-01
We propose a 2D and 3D forward modeling and inversion package for DC resistivity, time-domain induced polarization (IP), frequency-domain IP, and seismic refraction data. For the resistivity and IP case, discretization is based on rectangular cells, where each cell has an unknown resistivity in DC modelling, resistivity and chargeability in time-domain IP modelling, and complex resistivity in spectral IP modelling. The governing partial-differential equations are solved with the finite element method, which can be applied to both the real and complex variables that are solved for. For the seismic case, forward modeling is based on solving the eikonal equation using a second-order fast marching method. The wavepaths are materialized by Fresnel volumes rather than by conventional rays. This approach accounts for complicated velocity models and is advantageous because it considers frequency effects on the velocity resolution. The inversion can accommodate data at a single time step, or as a time-lapse dataset if the geophysical data are gathered for monitoring purposes. The aim of time-lapse inversion is to find the change in the velocities or resistivities of each model cell as a function of time. Different time-lapse algorithms can be applied, such as independent inversion, difference inversion, 4D inversion, and 4D active time constraint inversion. The forward algorithms are benchmarked against analytical solutions and inversion results are compared with existing ones. The algorithms are packaged as Matlab codes with a simple Graphical User Interface.
Although the code is parallelized for multi-core CPUs, it is not as fast as compiled code. For large datasets, one should consider transferring parts of the code to C or Fortran through mex files. This code is available through EPA's website at the following link: http://www.epa.gov/esd/cmb/GeophysicsWebsite/index.html Although this work was reviewed by EPA and approved for publication, it may not necessarily reflect official Agency policy.
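The eikonal-based travel-time computation can be sketched with a Dijkstra-style shortest-path solver on a grid. This is a first-order stand-in for the paper's second-order fast marching method, written in Python rather than Matlab, with a made-up uniform-slowness model:

```python
import heapq
import numpy as np

def eikonal_dijkstra(slowness, src):
    """First-arrival travel times on a grid by Dijkstra's method,
    using edge costs averaged from the two cells' slowness."""
    ny, nx = slowness.shape
    t = np.full((ny, nx), np.inf)
    t[src] = 0.0
    heap = [(0.0, src)]
    while heap:
        tt, (i, j) = heapq.heappop(heap)
        if tt > t[i, j]:
            continue                      # stale heap entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < ny and 0 <= nj < nx:
                cand = tt + 0.5 * (slowness[i, j] + slowness[ni, nj])
                if cand < t[ni, nj]:
                    t[ni, nj] = cand
                    heapq.heappush(heap, (cand, (ni, nj)))
    return t

slowness = np.ones((50, 50))       # uniform medium, 1 s per cell
times = eikonal_dijkstra(slowness, (0, 0))
print(times[0, 10], times[10, 0])  # 10.0 along each axis
```

A fast marching method replaces the graph costs with a proper finite-difference solution of the eikonal equation, which removes the grid-axis bias this sketch exhibits.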
UTM, a universal simulator for lightcurves of transiting systems
NASA Astrophysics Data System (ADS)
Deeg, Hans
2009-02-01
The Universal Transit Modeller (UTM) is a light-curve simulator for all kinds of transiting or eclipsing configurations between arbitrary numbers of several types of objects, which may be stars, planets, planetary moons, and planetary rings. Applications of UTM to date have been mainly in the generation of light-curves for the testing of detection algorithms. For the preparation of such tests for the Corot Mission, a special version has been used to generate multicolour light-curves in Corot's passbands. A separate fitting program, UFIT (Universal Fitter), is part of the UTM distribution and may be used to derive best fits to light-curves for any set of continuously variable parameters. UTM/UFIT is written in IDL and its source is released into the public domain under the GNU General Public License.
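For detection-algorithm tests of the kind UTM supports, even a boxcar transit model generates usable light curves. The sketch below is far simpler than UTM's geometric modelling (no limb darkening, no ingress/egress), and all parameter values are invented:

```python
import numpy as np

def box_transit(t, t0, duration, depth, period):
    """Boxcar transit: periodic flux dips of given depth and duration."""
    phase = ((t - t0 + 0.5 * period) % period) - 0.5 * period
    flux = np.ones_like(t)
    flux[np.abs(phase) < 0.5 * duration] -= depth
    return flux

t = np.linspace(0.0, 10.0, 10001)       # time in days
flux = box_transit(t, t0=2.0, duration=0.2, depth=0.01, period=3.0)
print(flux.min(), flux.max())           # 0.99 in transit, 1.0 outside
```

Injecting such synthetic signals into noise is the standard way to measure a detection pipeline's recovery rate.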
NASA Technical Reports Server (NTRS)
Warren, Gary
1988-01-01
The SOS code is used to compute the resonance modes (frequency-domain information) of sample devices and, separately, to compute the transient behavior of the same devices. A code, DOT, is created to compute appropriate dot products of the time-domain and frequency-domain results. The transient behavior of individual modes in the device is then plotted. Modes in a coupled-cavity traveling-wave tube (CCTWT) section excited by a beam are analyzed in separate simulations. Mode energy vs. time and mode phase vs. time are computed, and it is determined whether the transient waves are forward or backward waves in each case. Finally, the hot-test mode frequencies of the CCTWT section are computed.
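The dot-product step amounts to projecting a transient field snapshot onto precomputed mode shapes. A minimal NumPy sketch of that projection (synthetic orthonormal "modes", not SOS/DOT output):

```python
import numpy as np

# Three orthonormal "mode shapes" on a 64-point grid, from a QR factorization.
rng = np.random.default_rng(2)
modes, _ = np.linalg.qr(rng.random((64, 3)))

# A synthetic transient snapshot built from known modal amplitudes.
amplitudes = np.array([1.0, -0.5, 0.25])
field = modes @ amplitudes

# One dot product per mode recovers the modal amplitudes.
recovered = modes.T @ field
print(recovered)
```

Repeating this projection at each time step yields the mode-energy and mode-phase histories described in the abstract.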
Finite difference time domain electromagnetic scattering from frequency-dependent lossy materials
NASA Technical Reports Server (NTRS)
Luebbers, Raymond J.; Beggs, John H.
1991-01-01
Four different FDTD computer codes and companion Radar Cross Section (RCS) conversion codes on magnetic media are submitted. A single three dimensional dispersive FDTD code for both dispersive dielectric and magnetic materials was developed, along with a user's manual. The extension of FDTD to more complicated materials was made. The code is efficient and is capable of modeling interesting radar targets using a modest computer workstation platform. RCS results for two different plate geometries are reported. The FDTD method was also extended to computing far zone time domain results in two dimensions. Also the capability to model nonlinear materials was incorporated into FDTD and validated.
NASA Technical Reports Server (NTRS)
Chang, C. Y.; Kwok, R.; Curlander, J. C.
1987-01-01
Five coding techniques in the spatial and transform domains have been evaluated for SAR image compression: linear three-point predictor (LTPP), block truncation coding (BTC), microadaptive picture sequencing (MAPS), adaptive discrete cosine transform (ADCT), and adaptive Hadamard transform (AHT). These techniques have been tested with Seasat data. Both LTPP and BTC spatial domain coding techniques provide very good performance at rates of 1-2 bits/pixel. The two transform techniques, ADCT and AHT, demonstrate the capability to compress the SAR imagery to less than 0.5 bits/pixel without visible artifacts. Tradeoffs such as the rate distortion performance, the computational complexity, the algorithm flexibility, and the controllability of compression ratios are also discussed.
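Of the spatial-domain techniques evaluated, block truncation coding is compact enough to sketch directly: each block is replaced by a bitmap plus two reconstruction levels chosen to preserve the block mean and variance. A minimal Python version (generic BTC, not the paper's SAR-tuned implementation):

```python
import numpy as np

def btc_block(block):
    """1-bit/pixel block truncation coding of one block: keep the mean and
    standard deviation, reconstruct with two quantization levels."""
    mu, sigma = block.mean(), block.std()
    bitmap = block >= mu
    q, m = bitmap.sum(), block.size
    if q in (0, m):                       # flat block: nothing to quantize
        return np.full_like(block, mu)
    lo = mu - sigma * np.sqrt(q / (m - q))
    hi = mu + sigma * np.sqrt((m - q) / q)
    return np.where(bitmap, hi, lo)

rng = np.random.default_rng(3)
block = rng.integers(0, 256, (4, 4)).astype(float)
recon = btc_block(block)
# BTC preserves the block mean (and variance) by construction.
print(abs(recon.mean() - block.mean()))
```

At one bit per pixel plus the two levels, this lands in the 1-2 bits/pixel regime the abstract reports for the spatial-domain techniques.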
Tyralis, Hristos; Karakatsanis, Georgios; Tzouka, Katerina; Mamassis, Nikos
2017-08-01
We present data and code for visualizing the electrical energy data and weather-, climate-related and socioeconomic variables in the time domain in Greece. The electrical energy data include hourly demand, weekly-ahead forecasted values of the demand provided by the Greek Independent Power Transmission Operator and pricing values in Greece. We also present the daily temperature in Athens and the Gross Domestic Product of Greece. The code combines the data to a single report, which includes all visualizations with combinations of all variables in multiple time scales. The data and code were used in Tyralis et al. (2017) [1].
Coutinho, Eduardo; Schuller, Björn
2017-01-01
Music and speech exhibit striking similarities in the communication of emotions in the acoustic domain, in such a way that the communication of specific emotions is achieved, at least to a certain extent, by means of shared acoustic patterns. From an Affective Sciences point of view, determining the degree of overlap between both domains is fundamental to understanding the shared mechanisms underlying this phenomenon. From a Machine Learning perspective, the overlap between acoustic codes for emotional expression in music and speech opens new possibilities to enlarge the amount of data available to develop music and speech emotion recognition systems. In this article, we investigate time-continuous predictions of emotion (Arousal and Valence) in music and speech, and the Transfer Learning between these domains. We establish a comparative framework including intra-domain (i.e., models trained and tested on the same modality, either music or speech) and cross-domain experiments (i.e., models trained in one modality and tested on the other). In the cross-domain context, we evaluated two strategies: the direct transfer between domains, and the contribution of Transfer Learning techniques (feature-representation-transfer based on Denoising Auto Encoders) for reducing the gap in the feature space distributions. Our results demonstrate an excellent cross-domain generalisation performance with and without feature representation transfer in both directions. In the case of music, cross-domain approaches outperformed intra-domain models for Valence estimation, whereas for speech intra-domain models achieve the best performance. This is the first demonstration of shared acoustic codes for emotional expression in music and speech in the time-continuous domain. PMID:28658285
Architecture for time or transform domain decoding of reed-solomon codes
NASA Technical Reports Server (NTRS)
Hsu, In-Shek (Inventor); Truong, Trieu-Kie (Inventor); Deutsch, Leslie J. (Inventor); Shao, Howard M. (Inventor)
1989-01-01
Two pipeline (255,223) RS decoders, one a time domain decoder and the other a transform domain decoder, use the same first part to develop an errata locator polynomial τ(x) and an errata evaluator polynomial A(x). Both the time domain decoder and the transform domain decoder have a modified GCD that uses an input multiplexer and an output demultiplexer to reduce the number of GCD cells required. The time domain decoder uses a Chien search and polynomial evaluator on the GCD outputs τ(x) and A(x) for the final decoding steps, while the transform domain decoder uses a transform error pattern algorithm operating on τ(x) and the initial syndrome computation S(x), followed by an inverse transform algorithm in sequence for the final decoding steps prior to adding the received RS coded message to produce a decoded output message.
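The Chien search of the time domain decoder simply evaluates the errata locator polynomial at every nonzero field element and records the roots. A small-field sketch over GF(16) (primitive polynomial x^4 + x + 1) rather than the GF(256) of the (255,223) code itself; the chosen error locations are arbitrary:

```python
# Build exp/log tables for GF(16) with primitive polynomial x^4 + x + 1.
exp = [0] * 30
val = 1
for i in range(15):
    exp[i] = exp[i + 15] = val
    val <<= 1
    if val & 0x10:
        val ^= 0x13                    # reduce modulo x^4 + x + 1
log = {exp[i]: i for i in range(15)}

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return exp[log[a] + log[b]]

# Errata locator with roots alpha^2 and alpha^5: (x + alpha^2)(x + alpha^5).
r1, r2 = exp[2], exp[5]
poly = [gf_mul(r1, r2), r1 ^ r2, 1]    # coefficients, low order first

def poly_eval(poly, x):
    """Horner evaluation over GF(16); addition is XOR."""
    acc = 0
    for c in reversed(poly):
        acc = gf_mul(acc, x) ^ c
    return acc

# Chien search: try every nonzero field element alpha^i.
roots = [i for i in range(15) if poly_eval(poly, exp[i]) == 0]
print(roots)  # [2, 5]
```

In the real decoder the recovered root exponents index the errata positions in the received codeword.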
Import Manipulate Plot RELAP5/MOD3 Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, K. R.
1999-10-05
XMGR5 was derived from an XY plotting tool called ACE/gr, which is copyrighted by Paul J. Turner and in the public domain. The interactive version of ACE/gr is xmgr, which includes a graphical interface to the X-windows system. Enhancements to xmgr have been developed which import, manipulate, and plot data from the RELAP5/MOD3, MELCOR, FRAPCON, and SINDA codes, and from NRC databank files. Capabilities include two-phase property table lookup functions, an equation interpreter, arithmetic library functions, and units conversion. Plot titles, labels, legends, and narrative can be displayed using Latin or Cyrillic alphabets.
Understanding Mixed Code and Classroom Code-Switching: Myths and Realities
ERIC Educational Resources Information Center
Li, David C. S.
2008-01-01
Background: Cantonese-English mixed code is ubiquitous in Hong Kong society, and yet using mixed code is widely perceived as improper. This paper presents evidence of mixed code being socially constructed as bad language behavior. In the education domain, an EDB guideline bans mixed code in the classroom. Teachers are encouraged to stick to…
NASA Astrophysics Data System (ADS)
Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.
2014-12-01
We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services that support multiple data intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000 CPU core research cloud. The development, maintenance and sustainability of VLs is best accomplished through modularisation and standardisation of interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or scientific tools that are accessible across domains. The data services can be used to manipulate, visualise and transform multiple data types, whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid and easy deployment of new codes or versions of codes. The goal is to build open source libraries/collections of scientific tools, scripts and modelling codes that can be combined in specially designed deployments. Additional services in development include: provenance, publication of results, monitoring, workflow tools, etc. The generic VL infrastructure will be hosted at NCI, but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology.
This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory (VHIRL); any enhancements and new capabilities will be incorporated into a generic VL infrastructure. At the same time, we are scoping seven new VLs and, in the process, identifying other common components to prioritise and focus development.
Vectorization, threading, and cache-blocking considerations for hydrocodes on emerging architectures
Fung, J.; Aulwes, R. T.; Bement, M. T.; ...
2015-07-14
This work reports on considerations for improving computational performance in preparation for current and expected changes to computer architecture. The algorithms studied include increasingly complex prototypes for radiation hydrodynamics codes, such as gradient routines and diffusion matrix assembly (e.g., in [1-6]). The meshes considered for the algorithms are structured or unstructured meshes. The considerations applied for performance improvements are meant to be general in terms of architecture (not specific to graphics processing units (GPUs) or multi-core machines, for example) and include techniques for vectorization, threading, tiling, and cache blocking. Out of a survey of optimization techniques on applications such as diffusion and hydrodynamics, we make general recommendations with a view toward making these techniques conceptually accessible to the applications code developer. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
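The tiling/cache-blocking technique recommended above can be illustrated with a minimal, architecture-neutral sketch (this is illustrative Python/NumPy, not code from the paper): a blocked array transpose processes the data in square tiles so that each tile's source and destination stay resident in cache.

```python
import numpy as np

def blocked_transpose(a, block=64):
    """Transpose a 2D array tile by tile (cache blocking).

    The tile size `block` would be tuned to the target CPU's cache;
    64 here is an arbitrary illustrative choice.
    """
    n, m = a.shape
    out = np.empty((m, n), dtype=a.dtype)
    for i0 in range(0, n, block):
        for j0 in range(0, m, block):
            tile = a[i0:i0 + block, j0:j0 + block]
            out[j0:j0 + block, i0:i0 + block] = tile.T
    return out

a = np.arange(200 * 300).reshape(200, 300)
assert np.array_equal(blocked_transpose(a, block=64), a.T)
```

The same loop structure (outer tile loops, inner dense work) is what vectorizing compilers and threading runtimes exploit on wide-vector and multi-core hardware.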
Sentiment analysis of political communication: combining a dictionary approach with crowdcoding.
Haselmayer, Martin; Jenny, Marcelo
2017-01-01
Sentiment is important in studies of news values, public opinion, negative campaigning, and political polarization, and the explosive expansion of digital textual data, together with fast progress in automated text analysis, provides vast opportunities for innovative social science research. Unfortunately, tools currently available for automated sentiment analysis are mostly restricted to English texts and require considerable contextual adaptation to produce valid results. We present a procedure for collecting fine-grained sentiment scores through crowdcoding to build a negative sentiment dictionary in a language and for a domain of choice. The dictionary enables the analysis of large text corpora that resource-intensive hand-coding struggles to cope with. We calculate the tonality of sentences from dictionary words and validate these estimates against results from manual coding. The results show that the crowd-based dictionary provides efficient and valid measurement of sentiment. Empirical examples illustrate its use by analyzing the tonality of party statements and media reports.
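The dictionary-based tonality calculation described above can be sketched in a few lines (the word list and scores below are invented for illustration, not the authors' crowd-coded dictionary):

```python
# Hypothetical crowd-coded negativity scores (0 = neutral, 1 = very negative).
negativity = {"crisis": 0.9, "failure": 0.8, "dispute": 0.6, "weak": 0.5}

def tonality(sentence):
    """Mean dictionary score over the words of a sentence; None if no hits."""
    words = sentence.lower().split()
    hits = [negativity[w] for w in words if w in negativity]
    return sum(hits) / len(hits) if hits else None

assert tonality("the budget dispute was a failure") == (0.6 + 0.8) / 2
assert tonality("all quiet today") is None
```

A real dictionary would also need tokenization, negation handling, and domain-specific entries, which is exactly what the crowdcoding step supplies.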
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-30
...] FDA's Public Database of Products With Orphan-Drug Designation: Replacing Non-Informative Code Names... replaced non- informative code names with descriptive identifiers on its public database of products that... on our public database with non-informative code names. After careful consideration of this matter...
The IHMC CmapTools software in research and education: a multi-level use case in Space Meteorology
NASA Astrophysics Data System (ADS)
Messerotti, Mauro
2010-05-01
The IHMC (Institute for Human and Machine Cognition, Florida University System, USA) CmapTools software is a powerful multi-platform tool for knowledge modelling in graphical form based on concept maps. In this work we present its application for the high-level development of a set of multi-level concept maps in the framework of Space Meteorology to act as the kernel of a space meteorology domain ontology. This is an example of a research use case, as a domain ontology coded in machine-readable form via e.g. OWL (Web Ontology Language) is suitable to be an active layer of any knowledge management system embedded in a Virtual Observatory (VO). Apart from being manageable at machine level, concept maps developed via CmapTools are intrinsically human-readable and can embed hyperlinks and objects of many kinds. Therefore they are suitable to be published on the web: the coded knowledge can be exploited for educational purposes by students and the public, as the level of information can be naturally organized among linked concept maps in progressively increasing complexity levels. Hence CmapTools and its advanced version COE (Concept-map Ontology Editor) represent effective and user-friendly software tools for high-level knowledge representation in research and education.
Coronado, Rogelio A.; Hill, Alexandra D.; Alappattu, Meryl J.
2018-01-01
Objective: The objective of this study was to examine the type and content of Journal of Women’s Health Physical Therapy (JWHPT) publications over the last decade. Study Design: Content and bibliometric analysis of published literature. Background: Component sections, such as the Section on Women’s Health (SoWH) of the American Physical Therapy Association, provide content expertise to areas of specialty physical therapist practice, thereby supporting the dissemination of evidence for physical therapists to use. Closely aligned with the mission of the SoWH, JWHPT provides evidence reflecting this practice area. The purpose of our analysis was to examine publications within JWHPT to determine how closely JWHPT is meeting the mission and focus of section members. Methods and Measures: We used established bibliographic methodology to code and review manuscripts published online between 2005 and 2015 in JWHPT using established domains (article type, participant type, research design, study purpose, and area of focus). Total publications and proportion of publications based on domain were described. Impact by citation and author was examined using bibliometric software. Results: Eighteen percent of the items published in JWHPT were original research papers submitted for the first time. Of those papers, the primary study design was cross-sectional experimental research, most commonly studying interventions. The primary practice area reported was management of incontinence. Conclusions: We suggest a continued need to increase efforts for the submission and publication of a greater proportion of randomized controlled trials and metric articles. PMID:29375281
Domain Specific Language Support for Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadayappan, Ponnuswamy
Domain-Specific Languages (DSLs) offer an attractive path to Exascale software since they provide expressive power through appropriate abstractions and enable domain-specific optimizations. But the advantages of a DSL compete with the difficulties of implementing a DSL, even for a narrowly defined domain. The DTEC project addresses how a variety of DSLs can be easily implemented to leverage existing compiler analysis and transformation capabilities within the ROSE open source compiler as part of a research program focusing on Exascale challenges. The OSU contributions to the DTEC project are in the area of code generation from high-level DSL descriptions, as well as verification of the automatically-generated code.
Code of Federal Regulations, 2011 CFR
2011-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
Code of Federal Regulations, 2010 CFR
2010-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
Code of Federal Regulations, 2013 CFR
2013-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
Code of Federal Regulations, 2012 CFR
2012-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems
NASA Astrophysics Data System (ADS)
Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.
2008-08-01
This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industry partners, active in the real-time embedded systems domains. The Gene-Auto code generator will significantly improve the current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.
WEC3: Wave Energy Converter Code Comparison Project: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien
This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.
Employing multi-GPU power for molecular dynamics simulation: an extension of GALAMOST
NASA Astrophysics Data System (ADS)
Zhu, You-Liang; Pan, Deng; Li, Zhan-Wei; Liu, Hong; Qian, Hu-Jun; Zhao, Yang; Lu, Zhong-Yuan; Sun, Zhao-Yan
2018-04-01
We describe the algorithm for employing multi-GPU power on the basis of Message Passing Interface (MPI) domain decomposition in a molecular dynamics code, GALAMOST, which is designed for the coarse-grained simulation of soft matter. The multi-GPU version is developed from our previous single-GPU version. In multi-GPU runs, one GPU takes charge of one domain and runs the single-GPU code path. The communication between neighbouring domains follows an algorithm similar to that of the CPU-based code LAMMPS, but is optimised specifically for GPUs. We employ a memory-saving design which can enlarge the maximum system size on the same device. An optimisation algorithm is employed to prolong the update period of the neighbour list. We demonstrate good performance of multi-GPU runs on the simulation of Lennard-Jones liquid, dissipative particle dynamics liquid, polymer and nanoparticle composite, and two-patch particles on a workstation. Good scaling across many nodes of a cluster is presented for two-patch particles.
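The domain-decomposition-with-ghost-exchange idea behind such multi-GPU runs can be sketched serially in plain Python (a conceptual stand-in for the MPI communication; GALAMOST's actual GPU-optimised implementation is not reproduced here):

```python
def decompose(positions, box, n):
    """Assign each 1D particle coordinate to the slab domain that owns it."""
    w = box / n
    domains = [[] for _ in range(n)]
    for x in positions:
        domains[min(int(x // w), n - 1)].append(x)
    return domains

def exchange_ghosts(domains, box, n, cutoff):
    """Collect, for each domain, the neighbour particles lying within
    `cutoff` of its boundaries (periodic box) -- the ghost/halo particles
    that the MPI version would receive from adjacent ranks."""
    w = box / n

    def pdist(x, b):  # periodic distance from coordinate x to boundary b
        d = abs(x - b)
        return min(d, box - d)

    ghosts = [[] for _ in range(n)]
    for d in range(n):
        lo, hi = d * w, (d + 1) * w
        for nd in {(d - 1) % n, (d + 1) % n}:  # neighbouring slabs
            for x in domains[nd]:
                if min(pdist(x, lo), pdist(x, hi)) < cutoff:
                    ghosts[d].append(x)
    return ghosts

doms = decompose([0.5, 4.8, 5.2, 9.9], box=10.0, n=2)
assert sorted(exchange_ghosts(doms, 10.0, 2, cutoff=1.0)[0]) == [5.2, 9.9]
```

In 3D the same pattern repeats along each axis, and the exchange happens once per neighbour-list rebuild rather than every step, which is why prolonging the neighbour-list update period pays off.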
Pereira, Joana; Johnson, Warren E.; O’Brien, Stephen J.; Jarvis, Erich D.; Zhang, Guojie; Gilbert, M. Thomas P.; Vasconcelos, Vitor; Antunes, Agostinho
2014-01-01
The Hedgehog (Hh) gene family codes for a class of secreted proteins composed of two active domains that act as signalling molecules during embryo development, namely for the development of the nervous and skeletal systems and the formation of the testis cord. While only one Hh gene is found typically in invertebrate genomes, most vertebrate species have three (Sonic hedgehog – Shh; Indian hedgehog – Ihh; and Desert hedgehog – Dhh), each with different expression patterns and functions, which likely helped promote the increasing complexity of vertebrates and their successful diversification. In this study, we used comparative genomic and adaptive evolutionary analyses to characterize the evolution of the Hh genes in vertebrates following the two major whole genome duplication (WGD) events. To overcome the lack of Hh-coding sequences in avian publicly available databases, we used an extensive dataset of 45 avian and three non-avian reptilian genomes to show that birds have all three Hh paralogs. Our findings suggest that following the WGD events, vertebrate Hh paralogous genes evolved independently within similar linkage groups and under different evolutionary rates, especially within the catalytic domain. The structural regions around the ion-binding site were identified to be under positive selection in the signaling domain. These findings contrast with those observed in invertebrates, where different lineages that experienced gene duplication retained similar selective constraints in the Hh orthologs. Our results provide new insights on the evolutionary history of the Hh gene family, the functional roles of these paralogs in vertebrate species, and on the location of mutational hotspots. PMID:25549322
Caetano-Anollés, Gustavo; Wang, Minglei; Caetano-Anollés, Derek
2013-01-01
The genetic code shapes the genetic repository. Its origin has puzzled molecular scientists for over half a century and remains a long-standing mystery. Here we show that the origin of the genetic code is tightly coupled to the history of aminoacyl-tRNA synthetase enzymes and their interactions with tRNA. A timeline of evolutionary appearance of protein domain families derived from a structural census in hundreds of genomes reveals the early emergence of the ‘operational’ RNA code and the late implementation of the standard genetic code. The emergence of codon specificities and amino acid charging involved tight coevolution of aminoacyl-tRNA synthetases and tRNA structures as well as episodes of structural recruitment. Remarkably, amino acid and dipeptide compositions of single-domain proteins appearing before the standard code suggest archaic synthetases with structures homologous to catalytic domains of tyrosyl-tRNA and seryl-tRNA synthetases were capable of peptide bond formation and aminoacylation. Results reveal that genetics arose through coevolutionary interactions between polypeptides and nucleic acid cofactors as an exacting mechanism that favored flexibility and folding of the emergent proteins. These enhancements of phenotypic robustness were likely internalized into the emerging genetic system with the early rise of modern protein structure. PMID:23991065
Domain Wall Fermion Inverter on Pentium 4
NASA Astrophysics Data System (ADS)
Pochinsky, Andrew
2005-03-01
A highly optimized domain wall fermion inverter has been developed as part of the SciDAC lattice initiative. By designing the code to minimize memory bus traffic, it achieves high cache reuse and performance in excess of 2 GFlops for out of L2 cache problem sizes on a GigE cluster with 2.66 GHz Xeon processors. The code uses the SciDAC QMP communication library.
Colour cyclic code for Brillouin distributed sensors
NASA Astrophysics Data System (ADS)
Le Floch, Sébastien; Sauser, Florian; Llera, Miguel; Rochat, Etienne
2015-09-01
For the first time, a colour cyclic coding (CCC) is theoretically and experimentally demonstrated for Brillouin optical time-domain analysis (BOTDA) distributed sensors. Compared to traditional intensity-modulated cyclic codes, the code presents an additional gain of √2 while keeping the same number of sequences as for a colour coding. A comparison with a standard BOTDA sensor is realized and validates the theoretical coding gain.
A Note on a Sampling Theorem for Functions over GF(q)^n Domain
NASA Astrophysics Data System (ADS)
Ukita, Yoshifumi; Saito, Tomohiko; Matsushima, Toshiyasu; Hirasawa, Shigeichi
In digital signal processing, the sampling theorem states that any real-valued function ƒ can be reconstructed from a sequence of values of ƒ that are discretely sampled with a frequency at least twice as high as the maximum frequency of the spectrum of ƒ. This theorem can also be applied to functions over a finite domain; the range of frequencies of ƒ can then be expressed in more detail by using a bounded set instead of the maximum frequency. A function whose range of frequencies is confined to a bounded set is referred to as a bandlimited function, and a sampling theorem for bandlimited functions over the Boolean domain has been obtained. It is important to obtain a sampling theorem for bandlimited functions not only over the Boolean domain (GF(2)^n domain) but also over the GF(q)^n domain, where q is a prime power and GF(q) is the Galois field of order q. For example, in experimental designs, although the model can be expressed as a linear combination of the Fourier basis functions and the levels of each factor can be represented by GF(q)^n, the number of levels often takes a value greater than two. However, a sampling theorem for bandlimited functions over the GF(q)^n domain has not been obtained. On the other hand, the sampling points are closely related to the codewords of a linear code, but the relation between the parity check matrix of a linear code and any distinct error vectors has not been obtained, although it is necessary for understanding the meaning of the sampling theorem for bandlimited functions. In this paper, we generalize the sampling theorem for bandlimited functions over the Boolean domain to a sampling theorem for bandlimited functions over the GF(q)^n domain. We also present a theorem for the relation between the parity check matrix of a linear code and any distinct error vectors. Lastly, we clarify the relation between the sampling theorem for functions over the GF(q)^n domain and linear codes.
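The connection between parity check matrices and distinct error vectors can be illustrated in the Boolean (q = 2) case: for the (7,4) Hamming code, distinct single-bit error vectors produce distinct nonzero syndromes under the parity check matrix, which is what makes them identifiable. A small NumPy check (illustrative only, not the paper's construction):

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code over GF(2);
# column j is the binary representation of j+1.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def syndrome(e):
    """Syndrome of an error vector e over GF(2)."""
    return tuple(H.dot(e) % 2)

# All seven single-bit error vectors yield distinct, nonzero syndromes.
errors = [np.eye(7, dtype=int)[i] for i in range(7)]
syndromes = {syndrome(e) for e in errors}
assert len(syndromes) == 7 and (0, 0, 0) not in syndromes
```

For general q the same idea holds over GF(q), with arithmetic modulo q replaced by field arithmetic when q is a prime power.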
Discriminative Transfer Subspace Learning via Low-Rank and Sparse Representation.
Xu, Yong; Fang, Xiaozhao; Wu, Jian; Li, Xuelong; Zhang, David
2016-02-01
In this paper, we address the problem of unsupervised domain transfer learning in which no labels are available in the target domain. We use a transformation matrix to transfer both the source and target data to a common subspace, where each target sample can be represented by a combination of source samples such that the samples from different domains can be well interlaced. In this way, the discrepancy of the source and target domains is reduced. By imposing joint low-rank and sparse constraints on the reconstruction coefficient matrix, the global and local structures of data can be preserved. To enlarge the margins between different classes as much as possible and provide more freedom to diminish the discrepancy, a flexible linear classifier (projection) is obtained by learning a non-negative label relaxation matrix that allows the strict binary label matrix to relax into a slack variable matrix. Our method can avoid a potentially negative transfer by using a sparse matrix to model the noise and, thus, is more robust to different types of noise. We formulate our problem as a constrained low-rankness and sparsity minimization problem and solve it by the inexact augmented Lagrange multiplier method. Extensive experiments on various visual domain adaptation tasks show the superiority of the proposed method over the state-of-the-art methods. The MATLAB code of our method will be publicly available at http://www.yongxu.org/lunwen.html.
A Smad action turnover switch operated by WW domain readers of a phosphoserine code
Aragón, Eric; Goerner, Nina; Zaromytidou, Alexia-Ileana; Xi, Qiaoran; Escobedo, Albert; Massagué, Joan; Macias, Maria J.
2011-01-01
When directed to the nucleus by TGF-β or BMP signals, Smad proteins undergo cyclin-dependent kinase 8/9 (CDK8/9) and glycogen synthase kinase-3 (GSK3) phosphorylations that mediate the binding of YAP and Pin1 for transcriptional action, and of ubiquitin ligases Smurf1 and Nedd4L for Smad destruction. Here we demonstrate that there is an order of events—Smad activation first and destruction later—and that it is controlled by a switch in the recognition of Smad phosphoserines by WW domains in their binding partners. In the BMP pathway, Smad1 phosphorylation by CDK8/9 creates binding sites for the WW domains of YAP, and subsequent phosphorylation by GSK3 switches off YAP binding and adds binding sites for Smurf1 WW domains. Similarly, in the TGF-β pathway, Smad3 phosphorylation by CDK8/9 creates binding sites for Pin1 and GSK3, then adds sites to enhance Nedd4L binding. Thus, a Smad phosphoserine code and a set of WW domain code readers provide an efficient solution to the problem of coupling TGF-β signal delivery to turnover of the Smad signal transducers. PMID:21685363
Wilson, Mandy L; Okumoto, Sakiko; Adam, Laura; Peccoud, Jean
2014-01-15
Expression vectors used in different biotechnology applications are designed with domain-specific rules. For instance, promoters, origins of replication or homologous recombination sites are host-specific. Similarly, chromosomal integration or viral delivery of an expression cassette imposes specific structural constraints. As de novo gene synthesis and synthetic biology methods permeate many biotechnology specialties, the design of application-specific expression vectors becomes the new norm. In this context, it is desirable to formalize vector design strategies applicable in different domains. Using the design of constructs to express genes in the chloroplast of Chlamydomonas reinhardtii as an example, we show that a vector design strategy can be formalized as a domain-specific language. We have developed a graphical editor of context-free grammars usable by biologists without prior exposure to language theory. This environment makes it possible for biologists to iteratively improve their design strategies throughout the course of a project. It is also possible to ensure that vectors designed with early iterations of the language are consistent with the latest iteration of the language. The context-free grammar editor is part of the GenoCAD application. A public instance of GenoCAD is available at http://www.genocad.org. GenoCAD source code is available from SourceForge and licensed under the Apache v2.0 open source license.
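A toy illustration of formalizing a vector design strategy as a context-free grammar (the rules and part names below are invented for illustration, not GenoCAD's actual grammar or parts):

```python
# Toy grammar: a Vector is one or more Cassettes; a Cassette is
# Promoter, Gene, Terminator in that order. Part names are hypothetical.
rules = {
    "Cassette": [["Promoter", "Gene", "Terminator"]],
    "Vector":   [["Cassette"], ["Cassette", "Vector"]],
}
categories = {"pRbcS2": "Promoter", "gfp": "Gene", "tPsaD": "Terminator"}

def derives(symbol, parts):
    """True if `symbol` can derive the sequence of part names `parts`."""
    if symbol in categories.values():  # terminal category
        return len(parts) == 1 and categories.get(parts[0]) == symbol
    for rhs in rules.get(symbol, []):
        if len(rhs) == 1 and derives(rhs[0], parts):
            return True
        if len(rhs) == 2:
            for k in range(1, len(parts)):
                if derives(rhs[0], parts[:k]) and derives(rhs[1], parts[k:]):
                    return True
        if len(rhs) == 3:
            for k in range(1, len(parts) - 1):
                for m in range(k + 1, len(parts)):
                    if (derives(rhs[0], parts[:k]) and derives(rhs[1], parts[k:m])
                            and derives(rhs[2], parts[m:])):
                        return True
    return False

assert derives("Vector", ["pRbcS2", "gfp", "tPsaD"])        # valid design
assert not derives("Vector", ["gfp", "pRbcS2", "tPsaD"])    # wrong order
```

A graphical grammar editor, as in GenoCAD, lets biologists evolve `rules` and `categories` over a project's lifetime while guaranteeing that older designs can be re-checked against the latest grammar.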
Parallelization of PANDA discrete ordinates code using spatial decomposition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humbert, P.
2006-07-01
We present the parallel method, based on spatial domain decomposition, implemented in the 2D and 3D versions of the discrete ordinates code PANDA. The spatial mesh is orthogonal and the spatial domain decomposition is Cartesian. For 3D problems a 3D Cartesian domain topology is created and the parallel method is based on a domain diagonal-plane ordered sweep algorithm. The parallel efficiency of the method is improved by pipelining of directions and octants. The implementation of the algorithm is straightforward using MPI blocking point-to-point communications. The efficiency of the method is illustrated by an application to the 3D-Ext C5G7 benchmark of the OECD/NEA.
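The diagonal-plane ordered sweep can be sketched as follows: subdomains are grouped into wavefronts of constant i + j + k, and for a sweep starting from the (0,0,0) corner the subdomains within one wavefront have no dependence on each other, so each wavefront can be processed in parallel (an illustrative sketch, not PANDA source code):

```python
from itertools import product

def diagonal_sweep_order(nx, ny, nz):
    """Order subdomains of a Cartesian topology into diagonal planes
    (wavefronts of constant i + j + k) for a corner-to-corner sweep."""
    cells = list(product(range(nx), range(ny), range(nz)))
    return sorted(cells, key=lambda c: sum(c))

order = diagonal_sweep_order(2, 2, 2)
assert order[0] == (0, 0, 0) and order[-1] == (1, 1, 1)
# Every subdomain is processed only after all earlier diagonal planes,
# so its upstream neighbours' boundary fluxes are already available.
planes = [sum(c) for c in order]
assert planes == sorted(planes)
```

Pipelining over directions and octants then keeps early wavefronts busy with the next angular direction while later wavefronts finish the current one.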
1DTempPro: analyzing temperature profiles for groundwater/surface-water exchange.
Voytek, Emily B; Drenkelfuss, Anja; Day-Lewis, Frederick D; Healy, Richard; Lane, John W; Werkema, Dale
2014-01-01
A new computer program, 1DTempPro, is presented for the analysis of vertical one-dimensional (1D) temperature profiles under saturated flow conditions. 1DTempPro is a graphical user interface to the U.S. Geological Survey code Variably Saturated 2-Dimensional Heat Transport (VS2DH), which numerically solves the flow and heat-transport equations. Pre- and postprocessor features allow the user to calibrate VS2DH models to estimate vertical groundwater/surface-water exchange and also hydraulic conductivity for cases where hydraulic head is known. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
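While 1DTempPro calibrates the numerical VS2DH model, the underlying idea of inferring vertical exchange from the shape of a temperature profile can be sketched with the classical analytic steady-state solution for combined conduction and advection (a simplification; the brute-force Peclet-number search below is illustrative, not the program's algorithm):

```python
import math

def steady_profile(z, L, T0, TL, beta):
    """Bredehoeft-Papadopulos steady-state temperature at depth z,
    between depths 0 and L with boundary temperatures T0 and TL;
    beta is the thermal Peclet number (positive for downward flow)."""
    return T0 + (TL - T0) * (math.exp(beta * z / L) - 1) / (math.exp(beta) - 1)

# Synthetic "observed" profile generated with beta = 2.0 ...
obs = [(z, steady_profile(z, 10.0, 10.0, 14.0, 2.0)) for z in (2.5, 5.0, 7.5)]

# ... and recovered by a least-squares search over candidate Peclet
# numbers, loosely mimicking the model-calibration loop of the real tool.
best = min((b / 10 for b in range(1, 51)),
           key=lambda b: sum((steady_profile(z, 10.0, 10.0, 14.0, b) - T) ** 2
                             for z, T in obs))
assert best == 2.0
```

From the fitted Peclet number, the vertical water flux follows once the sediment's thermal properties are known; the numerical VS2DH model relaxes the steady-state and homogeneity assumptions made here.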
Distributed databases for materials study of thermo-kinetic properties
NASA Astrophysics Data System (ADS)
Toher, Cormac
2015-03-01
High-throughput computational materials science provides researchers with the opportunity to rapidly generate large databases of materials properties. To rapidly add thermal properties to the AFLOWLIB consortium and Materials Project repositories, we have implemented an automated quasi-harmonic Debye model, the Automatic GIBBS Library (AGL). This enables us to screen thousands of materials for thermal conductivity, bulk modulus, thermal expansion and related properties. The search and sort functions of the online database can then be used to identify suitable materials for more in-depth study using more precise computational or experimental techniques. The AFLOW-AGL source code is in the public domain and will soon be released under the GNU-GPL license.
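The quasi-harmonic Debye model underlying AGL builds on the standard Debye heat-capacity integral; a minimal numerical sketch (not AGL source code; reduced units with k_B = 1):

```python
import math

def debye_cv(T, theta, n_atoms=1, k_b=1.0, steps=2000):
    """Debye-model heat capacity via the standard integral
    C_v = 9 n k (T/theta)^3 * integral_0^{theta/T} x^4 e^x / (e^x - 1)^2 dx,
    evaluated here with a simple midpoint rule."""
    xmax = theta / T
    h = xmax / steps
    integral = 0.0
    for i in range(steps):
        x = (i + 0.5) * h
        ex = math.exp(x)
        integral += x ** 4 * ex / (ex - 1.0) ** 2 * h
    return 9.0 * n_atoms * k_b * (T / theta) ** 3 * integral

# High-temperature limit recovers the Dulong-Petit value 3 n k.
assert abs(debye_cv(T=10000.0, theta=400.0) - 3.0) < 1e-3
```

In AGL the Debye temperature theta is itself derived from calculated elastic data at each volume, which is what makes the screening fully automatic.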
A domain specific language for performance portable molecular dynamics algorithms
NASA Astrophysics Data System (ADS)
Saunders, William Robert; Grant, James; Müller, Eike Hermann
2018-03-01
Developers of Molecular Dynamics (MD) codes face significant challenges when adapting existing simulation packages to new hardware. In a continuously diversifying hardware landscape it becomes increasingly difficult for scientists to be experts both in their own domain (physics/chemistry/biology) and specialists in the low level parallelisation and optimisation of their codes. To address this challenge, we describe a "Separation of Concerns" approach for the development of parallel and optimised MD codes: the science specialist writes code at a high abstraction level in a domain specific language (DSL), which is then translated into efficient computer code by a scientific programmer. In a related context, an abstraction for the solution of partial differential equations with grid based methods has recently been implemented in the (Py)OP2 library. Inspired by this approach, we develop a Python code generation system for molecular dynamics simulations on different parallel architectures, including massively parallel distributed memory systems and GPUs. We demonstrate the efficiency of the auto-generated code by studying its performance and scalability on different hardware and compare it to other state-of-the-art simulation packages. With growing data volumes the extraction of physically meaningful information from the simulation becomes increasingly challenging and requires equally efficient implementations. A particular advantage of our approach is the easy expression of such analysis algorithms. We consider two popular methods for deducing the crystalline structure of a material from the local environment of each atom, show how they can be expressed in our abstraction and implement them in the code generation framework.
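The "separation of concerns" idea described above, where the scientist writes only a per-pair kernel and the framework supplies the looping and parallelisation strategy, can be caricatured in a few lines of Python (an invented API, far simpler than the paper's code-generation system):

```python
def pairwise_apply(kernel, positions, acc):
    """Apply `kernel` to every unordered particle pair.

    The backend here is a plain double loop; in the DSL approach a code
    generator would emit an equivalent cell-list, MPI, or GPU version
    without the science code changing."""
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            kernel(i, j, positions, acc)
    return acc

def count_close(i, j, pos, acc, cutoff=1.5):
    """Example science kernel: count pairs closer than `cutoff` (1D)."""
    if abs(pos[i] - pos[j]) < cutoff:
        acc[0] += 1

assert pairwise_apply(count_close, [0.0, 1.0, 3.0], [0])[0] == 1
```

Analysis algorithms such as local-structure classifiers fit the same mould: they are just another per-pair (or per-neighbourhood) kernel handed to the framework.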
Histone Code Modulation by Oncogenic PWWP-Domain Protein in Breast Cancers
2010-06-01
…athanogene 4; DDHD2, DDHD domain containing 2; PPAPDC1B, phosphatidic acid phosphatase type 2 domain containing 1B; WHSC1L1, Wolf-Hirschhorn syndrome… …from alternative splicing of exon 10. The WHSC1L1 long isoform encodes a 1437 amino acid protein containing 2 PWWP domains, 2 PHD-type zinc finger motifs, a TANG2 domain, an AWS domain and a SET domain. The short isoform encodes a 645 amino acid protein containing a PWWP domain only. Our western…
CDinFusion – Submission-Ready, On-Line Integration of Sequence and Contextual Data
Hankeln, Wolfgang; Wendel, Norma Johanna; Gerken, Jan; Waldmann, Jost; Buttigieg, Pier Luigi; Kostadinov, Ivaylo; Kottmann, Renzo; Yilmaz, Pelin; Glöckner, Frank Oliver
2011-01-01
State-of-the-art (DNA) sequencing methods applied in “Omics” studies grant insight into the ‘blueprints’ of organisms from all domains of life. Sequencing is carried out around the globe and the data is submitted to the public repositories of the International Nucleotide Sequence Database Collaboration. However, the context in which these studies are conducted often gets lost, because experimental data, as well as information about the environment, are rarely submitted along with the sequence data. If these contextual data (metadata) are missing, key opportunities for comparison and analysis across studies and habitats are hampered or even impossible. To address this problem, the Genomic Standards Consortium (GSC) promotes checklists and standards to better describe our sequence data collection and to promote the capturing, exchange and integration of sequence data with contextual data. In a recent community effort the GSC has developed a series of recommendations for contextual data that should be submitted along with sequence data. To support the scientific community in significantly enhancing the quality and quantity of contextual data in the public sequence data repositories, specialized software tools are needed. In this work we present CDinFusion, a web-based tool to integrate contextual and sequence data in (Multi)FASTA format prior to submission. The tool is open source and available under the Lesser GNU Public License 3. A public installation is hosted and maintained at the Max Planck Institute for Marine Microbiology at http://www.megx.net/cdinfusion. The tool may also be installed locally using the open source code available at http://code.google.com/p/cdinfusion. PMID:21935468
Aerothermo-Structural Analysis of Low Cost Composite Nozzle/Inlet Components
NASA Technical Reports Server (NTRS)
Shivakumar, Kuwigai; Challa, Preeli; Sree, Dave; Reddy, D.
1999-01-01
This research is a cooperative effort among the Turbomachinery and Propulsion Division of NASA Glenn, CCMR of NC A&T State University, and Tuskegee University. NC A&T is the lead center and Tuskegee University is the participating institution. The objectives of the research were to develop an integrated aerodynamic, thermal and structural analysis code for design of aircraft engine components, such as nozzles and inlets made of textile composites; conduct design studies on typical inlets for hypersonic transportation vehicles and set up standard test examples; and finally manufacture a scaled-down composite inlet. These objectives are accomplished through the following seven tasks: (1) identify the relevant public domain codes for all three types of analysis; (2) evaluate the codes for the accuracy of results and computational efficiency; (3) develop aero-thermal and thermal-structural mapping algorithms; (4) integrate all the codes into one single code; (5) write a graphical user interface to improve the user friendliness of the code; (6) conduct test studies for a rocket-based combined-cycle engine inlet; and finally (7) fabricate a demonstration inlet model using textile preform composites. Tasks one, two and six are being pursued. We selected and evaluated NPARC for flow-field analysis, CSTEM for in-depth thermal analysis of inlets and nozzles, and FRAC3D for stress analysis. These codes have been independently verified for accuracy and performance. In addition, a graphical user interface based on micromechanics analysis for laminated as well as textile composites was developed. A demonstration of this code will be given at the conference. A rocket-based combined-cycle engine was selected for test studies. Flow fields of various inlet geometries were studied. Integration of the codes is continuing. The codes developed are being applied to a candidate example, the trailblazer engine proposed for space transportation.
Successful development of the code will provide a simpler, faster, and more user-friendly tool for conducting design studies of aircraft and spacecraft engines, applicable to high-speed civil transport and space missions.
Finite difference time domain grid generation from AMC helicopter models
NASA Technical Reports Server (NTRS)
Cravey, Robin L.
1992-01-01
A simple technique is presented which forms a cubic grid model of a helicopter from an Aircraft Modeling Code (AMC) input file. The AMC input file defines the helicopter fuselage as a series of polygonal cross sections. The cubic grid model is used as an input to a Finite Difference Time Domain (FDTD) code to obtain predictions of antenna performance on a generic helicopter model. The predictions compare reasonably well with measured data.
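The cross-section-to-cubic-grid step described above is essentially a voxelization problem. A minimal Python sketch of one slice, using an even-odd ray-casting point-in-polygon test (the function names, grid spacing, and square cross section are illustrative stand-ins, not taken from the AMC or FDTD codes):

```python
def point_in_polygon(x, y, poly):
    """Even-odd ray-casting test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Does this edge cross the horizontal ray extending right from (x, y)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def voxelize_slice(poly, nx, ny, dx):
    """Mark grid cells whose centers fall inside the polygonal cross section."""
    return [[point_in_polygon((i + 0.5) * dx, (j + 0.5) * dx, poly)
             for i in range(nx)] for j in range(ny)]

# Example: a square "fuselage" cross section on a coarse 4x4 grid
square = [(1.0, 1.0), (3.0, 1.0), (3.0, 3.0), (1.0, 3.0)]
grid = voxelize_slice(square, 4, 4, 1.0)
```

Stacking such slices along the fuselage axis yields the cubic grid consumed by an FDTD solver.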
A study on multiresolution lossless video coding using inter/intra frame adaptive prediction
NASA Astrophysics Data System (ADS)
Nakachi, Takayuki; Sawabe, Tomoko; Fujii, Tetsuro
2003-06-01
Lossless video coding is required in the fields of archiving and editing digital cinema or digital broadcasting contents. This paper combines a discrete wavelet transform and adaptive inter/intra-frame prediction in the wavelet transform domain to create multiresolution lossless video coding. The multiresolution structure offered by the wavelet transform facilitates interchange among several video source formats such as Super High Definition (SHD) images, HDTV, SDTV, and mobile applications. Adaptive inter/intra-frame prediction is an extension of JPEG-LS, a state-of-the-art lossless still image compression standard. Based on the image statistics of the wavelet transform domains in successive frames, inter/intra frame adaptive prediction is applied to the appropriate wavelet transform domain. This adaptation offers superior compression performance. This is achieved with low computational cost and no increase in additional information. Experiments on digital cinema test sequences confirm the effectiveness of the proposed algorithm.
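The inter/intra adaptation idea can be sketched outside the wavelet domain. The toy Python codec below chooses, per pixel, between an intra predictor (left neighbor) and an inter predictor (same pixel in the previous frame) using only causal statistics, so the decoder repeats the choice without side information; the paper's actual algorithm operates on wavelet subbands and extends JPEG-LS, which this sketch does not attempt:

```python
def encode(frame, prev_frame):
    """Per-pixel adaptive inter/intra prediction (illustrative only).

    The predictor is chosen from causal data, so the decoder can
    repeat the choice without any extra signaling.
    """
    h, w = len(frame), len(frame[0])
    residuals = [[0] * w for _ in range(h)]
    err_intra = err_inter = 1  # running absolute-error sums
    for y in range(h):
        for x in range(w):
            intra = frame[y][x - 1] if x > 0 else 128
            inter = prev_frame[y][x]
            pred = intra if err_intra <= err_inter else inter
            residuals[y][x] = frame[y][x] - pred
            # Update causal statistics for the next pixel.
            err_intra += abs(frame[y][x] - intra)
            err_inter += abs(frame[y][x] - inter)
    return residuals

def decode(residuals, prev_frame):
    """Mirror of encode(): reconstructs the frame exactly (lossless)."""
    h, w = len(residuals), len(residuals[0])
    frame = [[0] * w for _ in range(h)]
    err_intra = err_inter = 1
    for y in range(h):
        for x in range(w):
            intra = frame[y][x - 1] if x > 0 else 128
            inter = prev_frame[y][x]
            pred = intra if err_intra <= err_inter else inter
            frame[y][x] = residuals[y][x] + pred
            err_intra += abs(frame[y][x] - intra)
            err_inter += abs(frame[y][x] - inter)
    return frame
```

Because encoder and decoder update identical statistics, the round trip is exact, which is the defining property of lossless prediction coding.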
NASA Technical Reports Server (NTRS)
Zimmerman, Martin L.
1995-01-01
This manual explains the theory and operation of the finite-difference time domain code FDTD-ANT developed by Analex Corporation at the NASA Lewis Research Center in Cleveland, Ohio. This code can be used for solving electromagnetic problems that are electrically small or medium (on the order of 1 to 50 cubic wavelengths). Calculated parameters include transmission line impedance, relative effective permittivity, antenna input impedance, and far-field patterns in both the time and frequency domains. The maximum problem size may be adjusted according to the computer used. This code has been run on the DEC VAX and 486 PC's and on workstations such as the Sun Sparc and the IBM RS/6000.
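FDTD-ANT's own update equations are documented in the manual; as a generic illustration of the finite-difference time-domain method it implements, here is a one-dimensional Yee update loop in normalized units (all parameter values are illustrative):

```python
import math

def fdtd_1d(steps, n=200, src=100, courant=0.5):
    """Generic 1D Yee FDTD in free space with a Gaussian soft source.

    Illustrates the leapfrog E/H update pattern only; not FDTD-ANT code.
    """
    ez = [0.0] * n  # electric field at integer grid points
    hy = [0.0] * n  # magnetic field at staggered half-grid points
    for t in range(steps):
        for i in range(1, n):
            ez[i] += courant * (hy[i - 1] - hy[i])
        ez[src] += math.exp(-((t - 30.0) / 10.0) ** 2)  # injected pulse
        for i in range(n - 1):
            hy[i] += courant * (ez[i] - ez[i + 1])
    return ez

field = fdtd_1d(100)
```

A Courant number of 0.5 keeps the scheme stable; the injected Gaussian splits into pulses propagating away from the source cell.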
Coupling MHD and PIC models in 2 dimensions
NASA Astrophysics Data System (ADS)
Daldorff, L.; Toth, G.; Sokolov, I.; Gombosi, T. I.; Lapenta, G.; Brackbill, J. U.; Markidis, S.; Amaya, J.
2013-12-01
Even extended fluid plasma models, such as Hall, anisotropic-ion-pressure, and multi-fluid MHD, fail to capture many plasma phenomena well. For this reason, we have coupled the implicit Particle-In-Cell code iPIC3D with the BATSRUS global MHD code. The PIC solver is applied in a part of the computational domain, for example in the vicinity of reconnection sites, and overwrites the MHD solution; in turn, the fluid solver provides the boundary conditions for the PIC code. To demonstrate the use of the coupled codes for magnetospheric applications, we perform a 2D magnetosphere simulation in which BATSRUS solves Hall MHD in the whole domain except for the tail reconnection region, which is handled by iPIC3D.
Detecting well-being via computerized content analysis of brief diary entries.
Tov, William; Ng, Kok Leong; Lin, Han; Qiu, Lin
2013-12-01
Two studies evaluated the correspondence between self-reported well-being and codings of emotion and life content by the Linguistic Inquiry and Word Count (LIWC; Pennebaker, Booth, & Francis, 2011). Open-ended diary responses were collected from 206 participants daily for 3 weeks (Study 1) and from 139 participants twice a week for 8 weeks (Study 2). LIWC negative emotion consistently correlated with self-reported negative emotion. LIWC positive emotion correlated with self-reported positive emotion in Study 1 but not in Study 2. No correlations were observed with global life satisfaction. Using a co-occurrence coding method to combine LIWC emotion codings with life-content codings, we estimated the frequency of positive and negative events in 6 life domains (family, friends, academics, health, leisure, and money). Domain-specific event frequencies predicted self-reported satisfaction in all domains in Study 1 but not consistently in Study 2. We suggest that the correspondence between LIWC codings and self-reported well-being is affected by the number of writing samples collected per day as well as the target period (e.g., past day vs. past week) assessed by the self-report measure. Extensions and possible implications for the analyses of similar types of open-ended data (e.g., social media messages) are discussed. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
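The co-occurrence coding method can be sketched as counting sentences in which an emotion word and a life-domain word appear together. The word lists below are tiny illustrative stand-ins, not the LIWC dictionaries:

```python
# Illustrative mini-dictionaries (the real study used LIWC categories).
POSITIVE = {"happy", "glad", "enjoyed"}
NEGATIVE = {"sad", "angry", "stressed"}
DOMAINS = {
    "friends": {"friend", "friends"},
    "academics": {"exam", "class", "homework"},
}

def code_entry(text):
    """Count sentence-level co-occurrences of emotion and life-domain words."""
    counts = {(d, v): 0 for d in DOMAINS for v in ("pos", "neg")}
    for sentence in text.lower().split("."):
        words = set(sentence.split())
        has_pos = bool(words & POSITIVE)
        has_neg = bool(words & NEGATIVE)
        for domain, markers in DOMAINS.items():
            if words & markers:
                if has_pos:
                    counts[(domain, "pos")] += 1
                if has_neg:
                    counts[(domain, "neg")] += 1
    return counts

c = code_entry("I enjoyed dinner with friends. The exam made me stressed.")
```

Summing such counts over diary entries yields the domain-specific positive and negative event frequencies described above.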
NASA Astrophysics Data System (ADS)
Yang, Xinmai; Cleveland, Robin O.
2005-01-01
A time-domain numerical code (the so-called Texas code) that solves the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation has been extended from an axis-symmetric coordinate system to a three-dimensional (3D) Cartesian coordinate system. The code accounts for diffraction (in the parabolic approximation), nonlinearity and absorption and dispersion associated with thermoviscous and relaxation processes. The 3D time domain code was shown to be in agreement with benchmark solutions for circular and rectangular sources, focused and unfocused beams, and linear and nonlinear propagation. The 3D code was used to model the nonlinear propagation of diagnostic ultrasound pulses through tissue. The prediction of the second-harmonic field was sensitive to the choice of frequency-dependent absorption: a frequency squared f2 dependence produced a second-harmonic field which peaked closer to the transducer and had a lower amplitude than that computed for an f1.1 dependence. In comparing spatial maps of the harmonics we found that the second harmonic had dramatically reduced amplitude in the near field and also lower amplitude side lobes in the focal region than the fundamental. These findings were consistent for both uniform and apodized sources and could be contributing factors in the improved imaging reported with clinical scanners using tissue harmonic imaging.
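The sensitivity to the absorption law can be illustrated with a back-of-the-envelope comparison: under a power-law attenuation alpha(f) = a0·f^b, an f^2 law attenuates the second harmonic more strongly than an f^1.1 law over the same path. The coefficient, depth, and frequencies below are illustrative, not values from the paper:

```python
import math

# Illustrative numbers only: alpha(f) = a0 * f**b (Np/cm, f in MHz).
a0, d = 0.005, 5.0          # attenuation coefficient and depth (cm)
f1, f2 = 2.0, 4.0           # fundamental and second-harmonic frequencies (MHz)

def amplitude_ratio(b):
    """Fraction of initial amplitude remaining after depth d at each frequency."""
    return {f: math.exp(-a0 * f ** b * d) for f in (f1, f2)}

sq = amplitude_ratio(2.0)   # f^2 absorption law
lin = amplitude_ratio(1.1)  # f^1.1 law, closer to soft tissue
# The f^2 law leaves less second-harmonic amplitude than the f^1.1 law:
assert sq[f2] < lin[f2]
```

This directional effect (stronger high-frequency loss under f^2) is consistent with the abstract's observation that the f^2 model produced a weaker second-harmonic field.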
Tools for open geospatial science
NASA Astrophysics Data System (ADS)
Petras, V.; Petrasova, A.; Mitasova, H.
2017-12-01
Open science uses open source to deal with reproducibility challenges in data and computational sciences. However, merely using open source software or making the code public does not make research reproducible. Moreover, scientists face the challenge of learning new, unfamiliar tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools which make validation and reuse by a wider professional community possible. For novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and lab notebook. We will see how Git helps us not to get lost in revisions and how Docker is used to wrap all the parts together using a single text file, so that figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain which use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.
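The "figures with a single command" pattern mentioned above can be sketched as one Python entry point that registers and rebuilds every figure. This is a hypothetical layout (names invented for illustration); real projects would pair such a script with Git and a Dockerfile:

```python
# Registry mapping figure names to the functions that produce them.
FIGURES = {}

def figure(name):
    """Decorator: register a figure-producing function under a stable name."""
    def register(fn):
        FIGURES[name] = fn
        return fn
    return register

@figure("fig1_map")
def fig1_map():
    # A real project would read data and write an image file here.
    return "fig1_map.png"

@figure("fig2_profile")
def fig2_profile():
    return "fig2_profile.png"

def build_all():
    """Single entry point: one command regenerates every registered figure."""
    return sorted(fn() for fn in FIGURES.values())

outputs = build_all()
```

Running the script (e.g., `python analysis.py`) then reproduces all outputs deterministically, which is the property the course syllabus aims to teach.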
45 CFR 162.1011 - Valid code sets.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Valid code sets. 162.1011 Section 162.1011 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1011 Valid code sets. Each code set is valid within the dates...
45 CFR 162.1011 - Valid code sets.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Valid code sets. 162.1011 Section 162.1011 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1011 Valid code sets. Each code set is valid within the dates...
45 CFR 162.1011 - Valid code sets.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 1 2010-10-01 2010-10-01 false Valid code sets. 162.1011 Section 162.1011 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1011 Valid code sets. Each code set is valid within the dates...
45 CFR 162.1011 - Valid code sets.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Valid code sets. 162.1011 Section 162.1011 Public Welfare Department of Health and Human Services ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1011 Valid code sets. Each code set is valid within the dates...
45 CFR 162.1011 - Valid code sets.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Valid code sets. 162.1011 Section 162.1011 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1011 Valid code sets. Each code set is valid within the dates...
Meta-synthesis of health behavior change meta-analyses.
Johnson, Blair T; Scott-Sheldon, Lori A J; Carey, Michael P
2010-11-01
We integrated and compared meta-analytic findings across diverse behavioral interventions to characterize how well they have achieved change in health behavior. Outcomes from 62 meta-analyses of interventions for change in health behavior were quantitatively synthesized, including 1011 primary-level investigations with 599,559 participants. Content coding suggested 6 behavioral domains: eating and physical activity, sexual behavior, addictive behaviors, stress management, female-specific screening and intervention behaviors, and behaviors involving use of health services. Behavior change interventions were efficacious (mean effect sizes = 0.08-0.45). Behavior change was more evident in more recent meta-analyses; those that sampled older interventions and literatures or sampled more published articles; those that included studies that relied on self-report, used briefer interventions, or sampled fewer, older, or female participants; and in some domains (e.g., stress management) more than others (e.g., sexual behaviors). Interventions improved health-related behaviors; however, efficacy varied as a function of participant and intervention characteristics. This meta-synthesis provides information about the efficacy of behavioral change interventions across health domains and populations; this knowledge can inform the design and development of public health interventions and future meta-analyses of these studies.
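Combining effect sizes across studies is typically done with inverse-variance weighting. A fixed-effect sketch in Python (the meta-synthesis above used more elaborate models, and the effect sizes and variances below are made up for illustration):

```python
def pooled_effect(effects, variances):
    """Inverse-variance weighted mean effect size (fixed-effect sketch).

    A random-effects model would additionally estimate the between-study
    variance tau^2 and add it to each study's variance before weighting.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, effects)) / total
    se = (1.0 / total) ** 0.5
    return mean, (mean - 1.96 * se, mean + 1.96 * se)

# Three hypothetical studies: effect sizes and sampling variances.
mean, ci = pooled_effect([0.30, 0.10, 0.45], [0.01, 0.02, 0.05])
```

More precise studies (smaller variances) dominate the pooled estimate, which is why the weighted mean sits closest to the first study's effect.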
Quantifying the mechanisms of domain gain in animal proteins.
Buljan, Marija; Frankish, Adam; Bateman, Alex
2010-01-01
Protein domains are protein regions that are shared among different proteins and are frequently functionally and structurally independent from the rest of the protein. Novel domain combinations have a major role in evolutionary innovation. However, the relative contributions of the different molecular mechanisms that underlie domain gains in animals are still unknown. By using animal gene phylogenies we were able to identify a set of high confidence domain gain events and by looking at their coding DNA investigate the causative mechanisms. Here we show that the major mechanism for gains of new domains in metazoan proteins is likely to be gene fusion through joining of exons from adjacent genes, possibly mediated by non-allelic homologous recombination. Retroposition and insertion of exons into ancestral introns through intronic recombination are, in contrast to previous expectations, only minor contributors to domain gains and have accounted for less than 1% and 10% of high confidence domain gain events, respectively. Additionally, exonization of previously non-coding regions appears to be an important mechanism for addition of disordered segments to proteins. We observe that gene duplication has preceded domain gain in at least 80% of the gain events. The interplay of gene duplication and domain gain demonstrates an important mechanism for fast neofunctionalization of genes.
Code of Federal Regulations, 2014 CFR
2014-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... of public domain computer software. (a) General. This section prescribes the procedures for... software under section 805 of Public Law 101-650, 104 Stat. 5089 (1990). Documents recorded in the...
ERIC Educational Resources Information Center
Yamamoto, Kentaro; He, Qiwei; Shin, Hyo Jeong; von Davier, Mattias
2017-01-01
Approximately a third of the Programme for International Student Assessment (PISA) items in the core domains (math, reading, and science) are constructed-response items and require human coding (scoring). This process is time-consuming, expensive, and prone to error as often (a) humans code inconsistently, and (b) coding reliability in…
Green's function methods in heavy ion shielding
NASA Technical Reports Server (NTRS)
Wilson, John W.; Costen, Robert C.; Shinn, Judy L.; Badavi, Francis F.
1993-01-01
An analytic solution to the heavy ion transport in terms of Green's function is used to generate a highly efficient computer code for space applications. The efficiency of the computer code is accomplished by a nonperturbative technique extending Green's function over the solution domain. The computer code can also be applied to accelerator boundary conditions to allow code validation in laboratory experiments.
A valiant little terminal: A VLT user's manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weinstein, A.
1992-08-01
VLT came to be used at SLAC (Stanford Linear Accelerator Center), because SLAC wanted to assess the Amiga's usefulness as a color graphics terminal and TeX workstation. Before the project could really begin, the people at SLAC needed a terminal emulator which could successfully talk to the IBM 3081 (now the IBM ES9000-580) and all the VAXes on the site. Moreover, it had to compete in quality with the Ann Arbor Ambassador GXL terminals which were already in use at the laboratory. Unfortunately, at the time there was no commercial program which fit the bill. Luckily, Willy Langeveld had been independently hacking up a public domain VT100 emulator written by Dave Wecker et al., and the result, VLT, suited SLAC's purpose. Over the years, as the program was debugged and rewritten, the original code disappeared, so that now, in the present version of VLT, none of the original VT100 code remains.
A valiant little terminal: A VLT user's manual. Revision 4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weinstein, A.
1992-08-01
VLT came to be used at SLAC (Stanford Linear Accelerator Center), because SLAC wanted to assess the Amiga's usefulness as a color graphics terminal and TeX workstation. Before the project could really begin, the people at SLAC needed a terminal emulator which could successfully talk to the IBM 3081 (now the IBM ES9000-580) and all the VAXes on the site. Moreover, it had to compete in quality with the Ann Arbor Ambassador GXL terminals which were already in use at the laboratory. Unfortunately, at the time there was no commercial program which fit the bill. Luckily, Willy Langeveld had been independently hacking up a public domain VT100 emulator written by Dave Wecker et al., and the result, VLT, suited SLAC's purpose. Over the years, as the program was debugged and rewritten, the original code disappeared, so that now, in the present version of VLT, none of the original VT100 code remains.
Taki, M; Signorini, A; Oton, C J; Nannipieri, T; Di Pasquale, F
2013-10-15
We experimentally demonstrate the use of cyclic pulse coding for distributed strain and temperature measurements in hybrid Raman/Brillouin optical time-domain analysis (BOTDA) optical fiber sensors. The highly integrated proposed solution effectively addresses the strain/temperature cross-sensitivity issue affecting standard BOTDA sensors, allowing for simultaneous meter-scale strain and temperature measurements over 10 km of standard single mode fiber using a single narrowband laser source only.
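The cyclic-coding idea can be sketched in miniature: each measured trace is the response to a cyclic shift of a binary pulse sequence, and the single-pulse response is recovered by inverting the resulting circulant system. The 3-bit sequence below is purely illustrative; practical BOTDA coding uses much longer Simplex-type sequences and gains SNR by averaging many coded traces:

```python
from fractions import Fraction

def cyclic_code_matrix(seq):
    """Circulant matrix whose rows are cyclic shifts of the code sequence."""
    n = len(seq)
    return [[Fraction(seq[(j - i) % n]) for j in range(n)] for i in range(n)]

def solve(a, b):
    """Gauss-Jordan elimination with exact rational arithmetic."""
    n = len(a)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        piv = next(r for r in range(col, n) if m[r][col] != 0)
        m[col], m[piv] = m[piv], m[col]
        m[col] = [v / m[col][col] for v in m[col]]
        for r in range(n):
            if r != col and m[r][col] != 0:
                f = m[r][col]
                m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    return [row[-1] for row in m]

# Unknown single-pulse response (e.g., Brillouin gain vs. fiber position).
x = [Fraction(v) for v in (3, 7, 2)]
C = cyclic_code_matrix([1, 1, 0])  # each row: which pulses are "on"
y = [sum(c * xi for c, xi in zip(row, x)) for row in C]  # coded traces
assert solve(C, y) == x            # decoding recovers the response
```

Because several pulses are on in each codeword, each trace carries more optical energy than a single-pulse trace, which is the source of the coding gain.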
Dickinson, Dwight; Ramsey, Mary E; Gold, James M
2007-05-01
In focusing on potentially localizable cognitive impairments, the schizophrenia meta-analytic literature has overlooked the largest single impairment: on digit symbol coding tasks. Our objective was to compare the magnitude of the schizophrenia impairment on coding tasks with impairments on other traditional neuropsychological instruments. We searched the MEDLINE and PsycINFO electronic databases and reference lists from identified articles for English-language studies from 1990 to the present comparing the performance of patients with schizophrenia and healthy controls on coding tasks and cognitive measures representing at least 2 other cognitive domains. Of 182 studies identified, 40 met all criteria for inclusion in the meta-analysis. Means, standard deviations, and sample sizes were extracted for digit symbol coding and 36 other cognitive variables. In addition, we recorded potential clinical moderator variables, including chronicity/severity, medication status, age, and education, and potential study-design moderators, including coding-task variant, matching, and study publication date. Main analyses synthesized data from 37 studies comprising 1961 patients with schizophrenia and 1444 comparison subjects. Combining mean effect sizes across studies by means of a random-effects model yielded a weighted mean effect for digit symbol coding of g = -1.57 (95% confidence interval, -1.66 to -1.48). This effect compared with a grand mean effect of g = -0.98 and was significantly larger than effects for widely used measures of episodic memory, executive functioning, and working memory. Moderator variable analyses indicated that clinical and study-design differences between studies had little effect on the coding task effect. Comparison with previous meta-analyses suggested that the current results were representative of the broader literature. A subsidiary analysis of data from relatives of patients with schizophrenia also suggested prominent coding task impairments in this group.
The 5-minute digit symbol coding task, reliable and easy to administer, taps an information processing inefficiency that is a central feature of the cognitive deficit in schizophrenia and deserves systematic investigation.
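The effect sizes quoted above are Hedges' g values, a bias-corrected standardized mean difference. A short sketch of the computation (the means, SDs, and sample sizes below are invented for illustration, not drawn from the included studies):

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2)
                         / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)  # correction factor
    return j * d

# Hypothetical study: patients score lower on digit symbol coding.
g = hedges_g(45.0, 12.0, 50, 65.0, 13.0, 50)
```

Negative g values, as in the meta-analysis, indicate worse patient performance relative to controls.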
48 CFR 1501.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 1501.105-1 Section 1501.105-1 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY GENERAL GENERAL Purpose, Authority, Issuance 1501.105-1 Publication and code arrangement. The...
Meta4: a web application for sharing and annotating metagenomic gene predictions using web services.
Richardson, Emily J; Escalettes, Franck; Fotheringham, Ian; Wallace, Robert J; Watson, Mick
2013-01-01
Whole-genome shotgun metagenomics experiments produce DNA sequence data from entire ecosystems, and provide a huge amount of novel information. Gene discovery projects require up-to-date information about sequence homology and domain structure for millions of predicted proteins to be presented in a simple, easy-to-use system. There is a lack of simple, open, flexible tools that allow the rapid sharing of metagenomics datasets with collaborators in a format they can easily interrogate. We present Meta4, a flexible and extensible web application that can be used to share and annotate metagenomic gene predictions. Proteins and predicted domains are stored in a simple relational database, with a dynamic front-end which displays the results in an internet browser. Web services are used to provide up-to-date information about the proteins from homology searches against public databases. Information about Meta4 can be found on the project website, code is available on Github, a cloud image is available, and an example implementation can be seen at.
Libpsht - algorithms for efficient spherical harmonic transforms
NASA Astrophysics Data System (ADS)
Reinecke, M.
2011-02-01
Libpsht (or "library for performant spherical harmonic transforms") is a collection of algorithms for efficient conversion between spatial-domain and spectral-domain representations of data defined on the sphere. The package supports both transforms of scalars and spin-1 and spin-2 quantities, and can be used for a wide range of pixelisations (including HEALPix, GLESP, and ECP). It will take advantage of hardware features such as multiple processor cores and floating-point vector operations, if available. Even without this additional acceleration, the employed algorithms are among the most efficient (in terms of CPU time, as well as memory consumption) currently being used in the astronomical community. The library is written in strictly standard-conforming C90, ensuring portability to many different hard- and software platforms, and allowing straightforward integration with codes written in various programming languages like C, C++, Fortran, Python etc. Libpsht is distributed under the terms of the GNU General Public License (GPL) version 2 and can be downloaded from .
Libpsht: Algorithms for Efficient Spherical Harmonic Transforms
NASA Astrophysics Data System (ADS)
Reinecke, Martin
2010-10-01
Libpsht (or "library for Performing Spherical Harmonic Transforms") is a collection of algorithms for efficient conversion between spatial-domain and spectral-domain representations of data defined on the sphere. The package supports transforms of scalars as well as spin-1 and spin-2 quantities, and can be used for a wide range of pixelisations (including HEALPix, GLESP and ECP). It will take advantage of hardware features like multiple processor cores and floating-point vector operations, if available. Even without this additional acceleration, the employed algorithms are among the most efficient (in terms of CPU time as well as memory consumption) currently being used in the astronomical community. The library is written in strictly standard-conforming C90, ensuring portability to many different hard- and software platforms, and allowing straightforward integration with codes written in various programming languages like C, C++, Fortran, Python etc. Libpsht is distributed under the terms of the GNU General Public License (GPL) version 2. Development on this project has ended; its successor is libsharp (ascl:1402.033).
ERIC Educational Resources Information Center
Meadows, William C.
2011-01-01
Interest in North American Indian code talkers continues to increase. In addition to numerous works about the Navajo code talkers, several publications on other groups of Native American code talkers--including the Choctaw, Comanche, Hopi, Meskwaki, Canadian Cree--and about code talkers in general have appeared. This article chronicles recent…
48 CFR 1.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Publication and code arrangement. 1.105-1 Section 1.105-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1.105-1 Publication and code...
48 CFR 901.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Publication and code arrangement. 901.105-1 Section 901.105-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 901.105-1 Publication and code...
48 CFR 1.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Publication and code arrangement. 1.105-1 Section 1.105-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1.105-1 Publication and code...
48 CFR 901.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Publication and code arrangement. 901.105-1 Section 901.105-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 901.105-1 Publication and code...
48 CFR 2001.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 2001.104-1 Section 2001.104-1 Federal Acquisition Regulations System NUCLEAR REGULATORY... 2001.104-1 Publication and code arrangement. (a) The NRCAR and its subsequent changes are: (1...
48 CFR 1.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Publication and code arrangement. 1.105-1 Section 1.105-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1.105-1 Publication and code...
Scientific and Technical Publishing at Goddard Space Flight Center in Fiscal Year 1994
NASA Technical Reports Server (NTRS)
1994-01-01
This publication is a compilation of scientific and technical material that was researched, written, prepared, and disseminated by the Center's scientists and engineers during FY94. It is presented in numerical order of the GSFC author's sponsoring technical directorate; i.e., Code 300 is the Office of Flight Assurance, Code 400 is the Flight Projects Directorate, Code 500 is the Mission Operations and Data Systems Directorate, Code 600 is the Space Sciences Directorate, Code 700 is the Engineering Directorate, Code 800 is the Suborbital Projects and Operations Directorate, and Code 900 is the Earth Sciences Directorate. The publication database contains publication or presentation title, author(s), document type, sponsor, and organizational code. This is the second annual compilation for the Center.
A UML-based meta-framework for system design in public health informatics.
Orlova, Anna O; Lehmann, Harold
2002-01-01
The National Agenda for Public Health Informatics calls for standards in data and knowledge representation within public health, which requires a multi-level framework that links all aspects of public health. The literature of public health informatics and public health informatics applications was reviewed. A UML-based systems analysis was performed, and the face validity of the results was evaluated by analyzing the public health domain of lead poisoning. The core class of the UML-based system of public health is the Public Health Domain, which is associated with multiple Problems, for which Actors provide Perspectives. Actors take Actions that define, generate, utilize, and/or evaluate Data Sources. The life cycle of the domain is a sequence of activities attributed to its problems that spirals through multiple iterations and realizations within a domain. The proposed Public Health Informatics Meta-Framework broadens efforts in applying informatics principles to the field of public health.
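One hypothetical rendering of the meta-framework's core classes as code (the class names follow the abstract; the attributes and the lead-poisoning example instances are illustrative only):

```python
from dataclasses import dataclass, field

@dataclass
class DataSource:
    name: str

@dataclass
class Action:
    description: str
    data_sources: list = field(default_factory=list)

@dataclass
class Actor:
    name: str
    perspective: str
    actions: list = field(default_factory=list)

@dataclass
class Problem:
    description: str
    actors: list = field(default_factory=list)

@dataclass
class PublicHealthDomain:
    name: str
    problems: list = field(default_factory=list)

# Example instantiation for the lead-poisoning domain from the abstract.
lead = PublicHealthDomain("lead poisoning")
screening = Problem("identify exposed children")
dept = Actor("health department", perspective="surveillance")
dept.actions.append(Action("collect blood-lead tests",
                           [DataSource("lab reports")]))
screening.actors.append(dept)
lead.problems.append(screening)
```

The one-to-many associations (Domain to Problems, Problem to Actors, Actor to Actions, Action to Data Sources) mirror the class relationships the UML analysis identifies.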
48 CFR 3001.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Publication and code arrangement. 3001.105-1 Section 3001.105-1 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND... Purpose, Authority, Issuance 3001.105-1 Publication and code arrangement. (a) The HSAR is published in: (1...
48 CFR 1001.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Publication and code arrangement. 1001.105-1 Section 1001.105-1 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY....105-1 Publication and code arrangement. The DTAR and its subsequent changes will be published in the...
48 CFR 1601.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 1601.104-1 Section 1601.104-1 Federal Acquisition Regulations System OFFICE OF PERSONNEL... SYSTEM Purpose, Authority, Issuance 1601.104-1 Publication and code arrangement. (a) The FEHBAR and its...
48 CFR 1301.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Publication and code arrangement. 1301.105-1 Section 1301.105-1 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE... Publication and code arrangement. (a) The CAR is published in the Federal Register, in cumulative form in the...
48 CFR 2101.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 2101.104-1 Section 2101.104-1 Federal Acquisition Regulations System OFFICE OF PERSONNEL... REGULATIONS SYSTEM Purpose, Authority, Issuance 2101.104-1 Publication and code arrangement. (a) The LIFAR and...
48 CFR 1001.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Publication and code arrangement. 1001.105-1 Section 1001.105-1 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY....105-1 Publication and code arrangement. The DTAR and its subsequent changes will be published in the...
48 CFR 3001.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Publication and code arrangement. 3001.105-1 Section 3001.105-1 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND... Purpose, Authority, Issuance 3001.105-1 Publication and code arrangement. (a) The HSAR is published in: (1...
48 CFR 3001.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Publication and code arrangement. 3001.105-1 Section 3001.105-1 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND... Purpose, Authority, Issuance 3001.105-1 Publication and code arrangement. (a) The HSAR is published in: (1...
48 CFR 1901.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 1901.104-1 Section 1901.104-1 Federal Acquisition Regulations System BROADCASTING BOARD OF..., Issuance 1901.104-1 Publication and code arrangement. (a) The IAAR is published in the Federal Register and...
48 CFR 3001.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Publication and code arrangement. 3001.105-1 Section 3001.105-1 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND... Purpose, Authority, Issuance 3001.105-1 Publication and code arrangement. (a) The HSAR is published in: (1...
48 CFR 501.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Publication and code arrangement. 501.105-1 Section 501.105-1 Federal Acquisition Regulations System GENERAL SERVICES... 501.105-1 Publication and code arrangement. The GSAR is published in the following sources: (a) Daily...
48 CFR 1301.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Publication and code arrangement. 1301.105-1 Section 1301.105-1 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE... Publication and code arrangement. (a) The CAR is published in the Federal Register, in cumulative form in the...
48 CFR 501.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Publication and code arrangement. 501.105-1 Section 501.105-1 Federal Acquisition Regulations System GENERAL SERVICES... 501.105-1 Publication and code arrangement. The GSAR is published in the following sources: (a) Daily...
48 CFR 501.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Publication and code arrangement. 501.105-1 Section 501.105-1 Federal Acquisition Regulations System GENERAL SERVICES... 501.105-1 Publication and code arrangement. The GSAR is published in the following sources: (a) Daily...
Vargus-Adams, J N; Martin, L K
2011-03-01
The aim of this study was to assess the domains of importance in therapeutic intervention for cerebral palsy (CP) using categories of the International Classification of Functioning, Disability, and Health - Children and Youth Version (ICF-CY). A total of 17 youth, 19 parents and 39 medical professionals responded to the open-ended query: 'What are the things you find most important to consider when you evaluate the effects of an intervention for yourself/your child/your patient with cerebral palsy?' Surveys were either mailed or conducted on-line. Responses were coded by two reviewers using the ICF-CY and discrepancies were resolved. Responses were distributed across the ICF-CY domains of Body Functions and Structures, Activities and Participation, and Environmental Factors, as well as non-ICF-CY concepts including quality of life. The most common responses overall were pain, motor function, mobility, community life and public services. Youth identified strength, gait pattern, hand/arm use and use of assistive technologies as priorities whereas parents were concerned with motor function, communication, mobility and provision of public services. Medical professionals listed pain, function, mobility, community life and participation most often. All surveyed groups indicate a desire to see changes in body functions and structures (pain, mental function, strength, movement), activities and participation (communication, hand/arm use, walking, school, recreation/community life) and quality of life following therapeutic interventions for CP. These results demonstrate the multiple, varied concerns regarding CP across the spectrum of functioning and health. © 2010 Blackwell Publishing Ltd.
Bhasi, Ashwini; Philip, Philge; Manikandan, Vinu; Senapathy, Periannan
2009-01-01
We have developed ExDom, a unique database for the comparative analysis of the exon–intron structures of 96 680 protein domains from seven eukaryotic organisms (Homo sapiens, Mus musculus, Bos taurus, Rattus norvegicus, Danio rerio, Gallus gallus and Arabidopsis thaliana). ExDom provides integrated access to exon-domain data through a sophisticated web interface which has the following analytical capabilities: (i) intergenomic and intragenomic comparative analysis of exon–intron structure of domains; (ii) color-coded graphical display of the domain architecture of proteins correlated with their corresponding exon-intron structures; (iii) graphical analysis of multiple sequence alignments of amino acid and coding nucleotide sequences of homologous protein domains from seven organisms; (iv) comparative graphical display of exon distributions within the tertiary structures of protein domains; and (v) visualization of exon–intron structures of alternative transcripts of a gene correlated to variations in the domain architecture of corresponding protein isoforms. These novel analytical features are highly suited for detailed investigations on the exon–intron structure of domains and make ExDom a powerful tool for exploring several key questions concerning the function, origin and evolution of genes and proteins. ExDom database is freely accessible at: http://66.170.16.154/ExDom/. PMID:18984624
Universal Frequency Domain Baseband Receiver Structure for Future Military Software Defined Radios
2010-09-01
selective channels, i.e., it may have a poor performance at good conditions [4]. Military systems may require a direct sequence (DS) component for...frequency bins using a spreading code. This is called the MC-CDMA signal. Note that spreading does not need to cover all the subcarriers but just a few, like...preambles with appropriate frequency domain properties. A DS component can be added as usual. The FDP block then includes this code as a reference
NASA Astrophysics Data System (ADS)
Baran, Á.; Noszály, Cs.; Vertse, T.
2018-07-01
A renewed version of the computer code GAMOW (Vertse et al., 1982) is presented in which the difficulties in calculating broad neutron resonances are remedied. New types of phenomenological neutron potentials with strictly finite range are built in. The landscape of the S-matrix can be generated on a given domain of the complex wave-number plane, and the S-matrix poles in that domain are localized. Normalized Gamow wave functions and trajectories of given poles can be calculated optionally.
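The pole-localization step described above can be illustrated with a minimal sketch: scan the magnitude of a model S-matrix over a rectangle of the complex wave-number plane and take the grid maximum as a first pole estimate. The single-pole S-matrix and all numbers here are illustrative assumptions, not the GAMOW potentials.

```python
import numpy as np

# Model S-matrix with one resonance pole at k0 in the lower half of the
# complex wave-number plane (illustrative; not the GAMOW potential).
k0 = 1.234 - 0.317j
S = lambda k: 1.0 / (k - k0)

# Scan |S| on a rectangular domain and take the grid point where it peaks
# as a first estimate of the pole position (refinement would follow).
re = np.arange(0.0, 2.0, 0.01)
im = np.arange(-1.0, 0.0, 0.01)
K = re[None, :] + 1j * im[:, None]
i, j = np.unravel_index(np.argmax(np.abs(S(K))), K.shape)
pole_estimate = K[i, j]
```

In practice a landscape scan like this only brackets each pole; a local root-finder on 1/S(k) would then sharpen the estimate.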
myChEMBL: a virtual machine implementation of open data and cheminformatics tools.
Ochoa, Rodrigo; Davies, Mark; Papadatos, George; Atkinson, Francis; Overington, John P
2014-01-15
myChEMBL is a completely open platform, which combines public domain bioactivity data with open source database and cheminformatics technologies. myChEMBL consists of a Linux (Ubuntu) Virtual Machine featuring a PostgreSQL schema with the latest version of the ChEMBL database, as well as the latest RDKit cheminformatics libraries. In addition, a self-contained web interface is available, which can be modified and improved according to user specifications. The VM is available at: ftp://ftp.ebi.ac.uk/pub/databases/chembl/VM/myChEMBL/current. The web interface and web services code is available at: https://github.com/rochoa85/myChEMBL.
A distributed version of the NASA Engine Performance Program
NASA Technical Reports Server (NTRS)
Cours, Jeffrey T.; Curlett, Brian P.
1993-01-01
Distributed NEPP, a version of the NASA Engine Performance Program, uses the original NEPP code but executes it in a distributed computing environment. Multiple workstations connected by a network increase the program's speed and, more importantly, the complexity of the cases it can handle in a reasonable time. Distributed NEPP uses the public domain software package Parallel Virtual Machine (PVM), allowing it to execute on clusters of machines containing many different architectures. It includes the capability to link with other computers, allowing them to process NEPP jobs in parallel. This paper discusses the design issues and granularity considerations that entered into programming Distributed NEPP and presents the results of timing runs.
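The farm-out pattern described, independent engine cases dispatched to workers in parallel, can be sketched in modern Python. This is a stand-in illustration only: the original used PVM across networked workstations, whereas this sketch uses a local thread pool, and `run_case` is a hypothetical placeholder for a NEPP engine-cycle evaluation.

```python
from concurrent.futures import ThreadPoolExecutor

def run_case(params):
    # Stand-in for one independent engine-performance case; a real NEPP job
    # would run the full engine cycle analysis here.
    pressure_ratio, bypass_ratio = params
    return pressure_ratio * bypass_ratio   # placeholder figure of merit

cases = [(30.0, 5.0), (35.0, 8.0), (40.0, 10.0)]
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_case, cases))  # cases evaluated concurrently
```

Because the cases are independent, the granularity question the paper raises reduces to choosing how much work each dispatched job carries relative to the communication cost.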
Spatio-thermal depth correction of RGB-D sensors based on Gaussian processes in real-time
NASA Astrophysics Data System (ADS)
Heindl, Christoph; Pönitz, Thomas; Stübl, Gernot; Pichler, Andreas; Scharinger, Josef
2018-04-01
Commodity RGB-D sensors capture color images along with dense pixel-wise depth information in real-time. Typical RGB-D sensors are provided with a factory calibration and exhibit erratic depth readings due to coarse calibration values, ageing and thermal influence effects. This limits their applicability in computer vision and robotics. We propose a novel method to accurately calibrate depth considering spatial and thermal influences jointly. Our work is based on Gaussian Process Regression in a four dimensional Cartesian and thermal domain. We propose to leverage modern GPUs for dense depth map correction in real-time. For reproducibility we make our dataset and source code publicly available.
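The core regression step can be sketched with a minimal Gaussian Process in plain NumPy: training inputs live in a four-dimensional spatial-plus-thermal domain and the target is the depth error to be corrected. The kernel, length scale, synthetic error model, and noise level are all illustrative assumptions, not the calibrated values from the paper.

```python
import numpy as np

# Minimal Gaussian Process regression sketch for depth correction.
# Inputs are (x, y, raw_depth, temperature); the target is the depth error.
def rbf(A, B, ell=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 4))          # training inputs
y = 0.05 * X[:, 2] * (1.0 + X[:, 3])              # synthetic depth error

K = rbf(X, X) + 1e-6 * np.eye(len(X))             # kernel matrix + jitter
alpha = np.linalg.solve(K, y)                      # GP weights

def predict(Xq):
    return rbf(Xq, X) @ alpha                      # posterior mean

pred = predict(X[:5])
```

For real-time use the paper proposes evaluating the learned correction densely on the GPU; the posterior mean above is exactly the per-pixel quantity such a kernel would compute.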
An analytic approach to sunset diagrams in chiral perturbation theory: Theory and practice
NASA Astrophysics Data System (ADS)
Ananthanarayan, B.; Bijnens, Johan; Ghosh, Shayan; Hebbar, Aditya
2016-12-01
We demonstrate the use of several code implementations of the Mellin-Barnes method available in the public domain to derive analytic expressions for the sunset diagrams that arise in the two-loop contribution to the pion mass and decay constant in three-flavoured chiral perturbation theory. We also provide results for all possible two-mass configurations of the sunset integral, and derive a new one-dimensional integral representation for the one-mass sunset integral with arbitrary external momentum. Thoroughly annotated Mathematica notebooks are provided as ancillary files in the Electronic Supplementary Material, which may serve as pedagogical supplements to the methods described in this paper.
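The generic starting point for such codes is the standard one-fold Mellin-Barnes representation, which trades a massive denominator for a contour integral (shown here in its textbook form; the sunset application iterates it over the propagators):

```latex
\[
\frac{1}{(X+Y)^{\lambda}}
  = \frac{1}{\Gamma(\lambda)}\,\frac{1}{2\pi i}
    \int_{c-i\infty}^{c+i\infty}\mathrm{d}s\,
    \Gamma(-s)\,\Gamma(\lambda+s)\,\frac{Y^{s}}{X^{\lambda+s}},
\qquad -\operatorname{Re}\lambda < c < 0 .
\]
```

The contour separates the left poles of $\Gamma(\lambda+s)$ from the right poles of $\Gamma(-s)$; closing it to either side and summing residues yields the series expansions the analytic results are built from.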
Structural reliability analysis of laminated CMC components
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Palko, Joseph L.; Gyekenyesi, John P.
1991-01-01
For laminated ceramic matrix composite (CMC) materials to realize their full potential in aerospace applications, design methods and protocols are a necessity. The focus here is the time-independent failure response of these materials, and a reliability analysis associated with the initiation of matrix cracking is presented. A public domain computer algorithm is highlighted that was coupled with the laminate analysis of a finite element code and that serves as a design aid to analyze structural components made from laminated CMC materials. Issues relevant to the effect of component size are discussed, and a parameter estimation procedure is presented. The estimation procedure allows three parameters to be calculated from a failure population that has an underlying Weibull distribution.
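The estimation step, fitting Weibull parameters to a failure population, can be sketched for the two-parameter case: maximum likelihood reduces to a one-dimensional root-find for the shape, after which the scale follows in closed form. The three-parameter procedure in the paper is more involved; the sample sizes and tolerances below are illustrative.

```python
import numpy as np

# Two-parameter Weibull MLE sketch: bisection on the shape equation g(m)=0,
# which is monotonically increasing on the bracket, then scale in closed form.
def weibull_mle(x, lo=0.1, hi=20.0, iters=60):
    logx = np.log(x)
    def g(m):
        xm = x ** m
        return (xm * logx).sum() / xm.sum() - 1.0 / m - logx.mean()
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if g(mid) < 0 else (lo, mid)
    shape = 0.5 * (lo + hi)
    scale = (x ** shape).mean() ** (1.0 / shape)
    return shape, scale

rng = np.random.default_rng(1)
x = 3.0 * rng.weibull(2.0, 5000)       # synthetic failures: shape 2, scale 3
shape, scale = weibull_mle(x)
```

With a few thousand failure observations the estimates recover the generating parameters to within a few percent; small ceramic test populations carry correspondingly wider confidence bounds, which is why the size effect matters for component-level reliability.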
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makwana, K. D., E-mail: kirit.makwana@gmx.com; Cattaneo, F.; Zhdankin, V.
Simulations of decaying magnetohydrodynamic (MHD) turbulence are performed with a fluid and a kinetic code. The initial condition is an ensemble of long-wavelength, counter-propagating, shear-Alfvén waves, which interact and rapidly generate strong MHD turbulence. The total energy is conserved and the rate of turbulent energy decay is very similar in both codes, although the fluid code has numerical dissipation, whereas the kinetic code has kinetic dissipation. The inertial range power spectrum index is similar in both the codes. The fluid code shows a perpendicular wavenumber spectral slope of k⊥^(−1.3). The kinetic code shows a spectral slope of k⊥^(−1.5) for the smaller simulation domain, and k⊥^(−1.3) for the larger domain. We estimate that collisionless damping mechanisms in the kinetic code can account for the dissipation of the observed nonlinear energy cascade. Current sheets are geometrically characterized. Their lengths and widths are in good agreement between the two codes. The length scales linearly with the driving scale of the turbulence. In the fluid code, their thickness is determined by the grid resolution as there is no explicit diffusivity. In the kinetic code, their thickness is very close to the skin-depth, irrespective of the grid resolution. This work shows that kinetic codes can reproduce the MHD inertial range dynamics at large scales, while at the same time capturing important kinetic physics at small scales.
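Spectral indices like the k⊥^(−1.3) slope quoted above are conventionally extracted by a least-squares fit in log-log space over the inertial range. A minimal sketch on a synthetic power-law spectrum (the prefactor and wavenumber range are arbitrary):

```python
import numpy as np

# Fit an inertial-range spectral index in log-log space (illustrative data).
k = np.logspace(0, 2, 50)              # perpendicular wavenumbers
E = 7.0 * k ** -1.3                    # synthetic spectrum with slope -1.3

slope, intercept = np.polyfit(np.log(k), np.log(E), 1)
```

On simulation data the fit window must be restricted to the inertial range, excluding the driving scales at low k⊥ and the dissipation range near the grid or skin-depth scale.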
Flexible Generation of Kalman Filter Code
NASA Technical Reports Server (NTRS)
Richardson, Julian; Wilson, Edward
2006-01-01
Domain-specific program synthesis can automatically generate high quality code in complex domains from succinct specifications, but the range of programs which can be generated by a given synthesis system is typically narrow. Obtaining code which falls outside this narrow scope necessitates either 1) extension of the code generator, which is usually very expensive, or 2) manual modification of the generated code, which is often difficult and which must be redone whenever changes are made to the program specification. In this paper, we describe adaptations and extensions of the AUTOFILTER Kalman filter synthesis system which greatly extend the range of programs which can be generated. Users augment the input specification with a specification of code fragments and how those fragments should interleave with or replace parts of the synthesized filter. This allows users to generate a much wider range of programs without needing to modify the synthesis system or edit generated code. We demonstrate the usefulness of the approach by applying it to the synthesis of a complex state estimator which combines code from several Kalman filters with user-specified code. The work described in this paper allows the complex design decisions necessary for real-world applications to be reflected in the synthesized code. When executed on simulated input data, the generated state estimator was found to produce estimates comparable to those produced by a hand-coded estimator.
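The predict/update skeleton that such synthesis systems emit can be shown in its simplest scalar form. This is a generic textbook Kalman filter, not AUTOFILTER output; the constant-state model and all numeric values are illustrative.

```python
# Minimal scalar Kalman filter: a constant state observed with noise.
# Q is process noise, R measurement noise; values are illustrative.
def kalman(measurements, x=0.0, P=1.0, Q=0.0, R=0.1):
    for z in measurements:
        P = P + Q                      # predict (state model: x_k = x_{k-1})
        K = P / (P + R)                # Kalman gain
        x = x + K * (z - x)            # update with the innovation z - x
        P = (1.0 - K) * P              # posterior variance
    return x, P

x, P = kalman([1.1, 0.9, 1.05, 0.95, 1.0])
```

The fragment-interleaving mechanism described in the paper lets users splice domain code around exactly these steps, e.g. custom measurement preprocessing before the update, without touching the generator.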
Fernández-Lansac, Violeta; Crespo, María
2017-07-26
This study introduces a new coding system, the Coding and Assessment System for Narratives of Trauma (CASNOT), to analyse several language domains in narratives of autobiographical memories, especially in trauma narratives. The development of the coding system is described. It was applied to assess positive and traumatic/negative narratives in 50 battered women (trauma-exposed group) and 50 nontrauma-exposed women (control group). Three blind raters coded each narrative. Inter-rater reliability analyses were conducted for the CASNOT language categories (multirater Kfree coefficients) and dimensions (intraclass correlation coefficients). High levels of inter-rater agreement were found for most of the language domains. Categories that did not reach the expected reliability were mainly those related to cognitive processes, which reflects difficulties in operationalizing constructs such as lack of control or helplessness, control or planning, and rationalization or memory elaboration. Applications and limitations of the CASNOT are discussed to enhance narrative measures for autobiographical memories.
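Free-marginal multirater kappa, one common reading of the "multirater Kfree" coefficient reported above, can be computed from per-item agreement. The formula follows Randolph's free-marginal kappa; the ratings below are invented illustrative data, not CASNOT narratives.

```python
# Randolph-style free-marginal multirater kappa (illustrative data).
def kappa_free(ratings, k):
    # ratings: one list of category labels per item (one label per rater)
    agree = []
    for item in ratings:
        r = len(item)
        pairs = sum(item.count(c) * (item.count(c) - 1) for c in set(item))
        agree.append(pairs / (r * (r - 1)))    # pairwise agreement per item
    p_o = sum(agree) / len(agree)              # observed agreement
    p_e = 1.0 / k                              # chance agreement, free marginals
    return (p_o - p_e) / (1.0 - p_e)

# Three raters, four categories, six narrative segments.
ratings = [["a", "a", "a"], ["b", "b", "b"], ["a", "a", "b"],
           ["c", "c", "c"], ["d", "d", "d"], ["a", "b", "a"]]
kf = kappa_free(ratings, k=4)
```

Dimension-level scores in the study use intraclass correlation instead, which is the appropriate choice once the coded quantities are continuous counts rather than category labels.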
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code version D is a 3-D numerical electromagnetic scattering code based upon the finite difference time domain technique (FDTD). The manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction; description of the FDTD method; operation; resource requirements; version D code capabilities; a brief description of the default scattering geometry; a brief description of each subroutine; a description of the include file; a section briefly discussing Radar Cross Section computations; a section discussing some scattering results; a sample problem setup section; a new problem checklist; references and figure titles. The FDTD technique models transient electromagnetic scattering and interactions with objects of arbitrary shape and/or material composition. In the FDTD method, Maxwell's curl equations are discretized in time-space and all derivatives (temporal and spatial) are approximated by central differences.
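The discretization described in the last sentence can be sketched in one dimension: electric and magnetic fields leapfrog on a staggered grid, with every derivative replaced by a central difference. Normalized units and the grid size, source position, and Courant number are illustrative choices, not the Penn State code's defaults.

```python
import numpy as np

# Minimal 1D FDTD sketch: Maxwell's curl equations on a staggered (Yee) grid,
# central differences in time and space, normalized units.
nz, nt, src = 200, 150, 100
Ez = np.zeros(nz)
Hy = np.zeros(nz)
c = 0.5                                  # Courant number (stability: c <= 1)

for n in range(nt):
    Hy[:-1] += c * (Ez[1:] - Ez[:-1])    # update H from the curl of E
    Ez[1:] += c * (Hy[1:] - Hy[:-1])     # update E from the curl of H
    Ez[src] += np.exp(-0.5 * ((n - 30) / 8.0) ** 2)   # soft Gaussian source
```

A scattering code such as the one documented here adds material coefficients per cell, absorbing boundaries, and a scattered-field formulation on top of exactly this update loop.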
Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers
Wang, Bei; Ethier, Stephane; Tang, William; ...
2017-06-29
The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidth, interconnects and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.
NASA Technical Reports Server (NTRS)
Sanz, J. M.
1983-01-01
The method of complex characteristics and hodograph transformation for the design of shockless airfoils was extended to design supercritical cascades with high solidities and large inlet angles. This capability was achieved by introducing a conformal mapping of the hodograph domain onto an ellipse and expanding the solution in terms of Tchebycheff polynomials. A computer code was developed based on this idea. A number of airfoils designed with the code are presented. Various supercritical and subcritical compressor, turbine and propeller sections are shown. The lag-entrainment method for the calculation of a turbulent boundary layer was incorporated into the inviscid design code. The results of this calculation are shown for the airfoils described. The elliptic conformal transformation developed to map the hodograph domain onto an ellipse can be used to generate a conformal grid in the physical domain of a cascade of airfoils with open trailing edges with a single transformation. A grid generated with this transformation is shown for the Korn airfoil.
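The expansion device mentioned above, representing a smooth function by Tchebycheff (Chebyshev) polynomials, can be sketched with NumPy's Chebyshev module. The target function and degree here are illustrative stand-ins for the mapped hodograph solution.

```python
import numpy as np

# Expand a smooth function in Chebyshev polynomials on [-1, 1] using
# Chebyshev-distributed sample points (illustrative target function).
x = np.cos(np.linspace(0.0, np.pi, 101))        # Chebyshev-distributed nodes
f = np.exp(x)

coeffs = np.polynomial.chebyshev.chebfit(x, f, deg=12)
approx = np.polynomial.chebyshev.chebval(0.3, coeffs)
```

For analytic functions the coefficients decay geometrically, which is what makes a modest truncation of the series adequate for representing the design solution on the mapped ellipse.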
GBS: Global 3D simulation of tokamak edge region
NASA Astrophysics Data System (ADS)
Zhu, Ben; Fisher, Dustin; Rogers, Barrett; Ricci, Paolo
2012-10-01
A 3D two-fluid global code, the Global Braginskii Solver (GBS), is being developed to explore the physics of turbulent transport, confinement, self-consistent profile formation, pedestal scaling and related phenomena in the edge region of tokamaks. Aimed at solving the drift-reduced Braginskii equations [1] in complex magnetic geometry, GBS is used for turbulence simulation in the SOL region. In the recent upgrade, the simulation domain was expanded into the closed-flux region with twist-shift boundary conditions. Hence, the new GBS code is able to explore global transport physics in an annular full-torus domain from the top of the pedestal into the far SOL. We are in the process of identifying and analyzing the linear and nonlinear instabilities in the system using the new GBS code. Preliminary results will be presented and compared with other codes where possible. [1] A. Zeiler, J. F. Drake and B. Rogers, Phys. Plasmas 4, 2134 (1997)
Evolution of the snake body form reveals homoplasy in amniote Hox gene function.
Head, Jason J; Polly, P David
2015-04-02
Hox genes regulate regionalization of the axial skeleton in vertebrates, and changes in their expression have been proposed to be a fundamental mechanism driving the evolution of new body forms. The origin of the snake-like body form, with its deregionalized pre-cloacal axial skeleton, has been explained as either homogenization of Hox gene expression domains, or retention of standard vertebrate Hox domains with alteration of downstream expression that suppresses development of distinct regions. Both models assume a highly regionalized ancestor, but the extent of deregionalization of the primaxial domain (vertebrae, dorsal ribs) of the skeleton in snake-like body forms has never been analysed. Here we combine geometric morphometrics and maximum-likelihood analysis to show that the pre-cloacal primaxial domain of elongate, limb-reduced lizards and snakes is not deregionalized compared with limbed taxa, and that the phylogenetic structure of primaxial morphology in reptiles does not support a loss of regionalization in the evolution of snakes. We demonstrate that morphometric regional boundaries correspond to mapped gene expression domains in snakes, suggesting that their primaxial domain is patterned by a normally functional Hox code. Comparison of primaxial osteology in fossil and modern amniotes with Hox gene distributions within Amniota indicates that a functional, sequentially expressed Hox code patterned a subtle morphological gradient along the anterior-posterior axis in stem members of amniote clades and extant lizards, including snakes. The highly regionalized skeletons of extant archosaurs and mammals result from independent evolution in the Hox code and do not represent ancestral conditions for clades with snake-like body forms. The developmental origin of snakes is best explained by decoupling of the primaxial and abaxial domains and by increases in somite number, not by changes in the function of primaxial Hox genes.
Orthogonal Multi-Carrier DS-CDMA with Frequency-Domain Equalization
NASA Astrophysics Data System (ADS)
Tanaka, Ken; Tomeba, Hiromichi; Adachi, Fumiyuki
Orthogonal multi-carrier direct sequence code division multiple access (orthogonal MC DS-CDMA) is a combination of orthogonal frequency division multiplexing (OFDM) and time-domain spreading, while multi-carrier code division multiple access (MC-CDMA) is a combination of OFDM and frequency-domain spreading. In MC-CDMA, a good bit error rate (BER) performance can be achieved by using frequency-domain equalization (FDE), since the frequency diversity gain is obtained. On the other hand, the conventional orthogonal MC DS-CDMA fails to achieve any frequency diversity gain. In this paper, we propose a new orthogonal MC DS-CDMA that can obtain the frequency diversity gain by applying FDE. The conditional BER analysis is presented. The theoretical average BER performance in a frequency-selective Rayleigh fading channel is evaluated by the Monte-Carlo numerical computation method using the derived conditional BER and is confirmed by computer simulation of the orthogonal MC DS-CDMA signal transmission.
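The FDE step at the heart of the comparison above can be sketched as one-tap per-subcarrier equalization: after the cyclic prefix is removed, the multipath channel is diagonal in the frequency domain, so each bin is corrected by a single MMSE weight. The block length, channel taps, and BPSK mapping below are illustrative, and the sketch is noiseless, so MMSE coincides with zero-forcing.

```python
import numpy as np

# One-tap frequency-domain equalization over a cyclic multipath channel.
rng = np.random.default_rng(2)
x = 2.0 * rng.integers(0, 2, 64) - 1.0            # BPSK block
h = np.array([1.0, 0.5, 0.2])                     # multipath channel taps

H = np.fft.fft(h, 64)                             # channel frequency response
Y = np.fft.fft(x) * H                             # received block (circular model)
noise_var = 0.0                                   # noiseless for clarity
W = np.conj(H) / (np.abs(H) ** 2 + noise_var)     # per-subcarrier MMSE weight
x_hat = np.fft.ifft(Y * W).real
```

The frequency diversity gain discussed in the abstract comes from spreading each symbol across many such bins, so that deep fades in a few values of H no longer destroy the symbol.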
An Object-Oriented Approach to Writing Computational Electromagnetics Codes
NASA Technical Reports Server (NTRS)
Zimmerman, Martin; Mallasch, Paul G.
1996-01-01
Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.
42 CFR 24.8 - Applicability of provisions of Title 5, U.S. Code.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 1 2014-10-01 2014-10-01 false Applicability of provisions of Title 5, U.S. Code. 24.8 Section 24.8 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES PERSONNEL SENIOR BIOMEDICAL RESEARCH SERVICE § 24.8 Applicability of provisions of Title 5, U.S. Code. (a...
42 CFR 24.8 - Applicability of provisions of Title 5, U.S. Code.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 1 2010-10-01 2010-10-01 false Applicability of provisions of Title 5, U.S. Code. 24.8 Section 24.8 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES PERSONNEL SENIOR BIOMEDICAL RESEARCH SERVICE § 24.8 Applicability of provisions of Title 5, U.S. Code. (a...
42 CFR 24.8 - Applicability of provisions of Title 5, U.S. Code.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 1 2012-10-01 2012-10-01 false Applicability of provisions of Title 5, U.S. Code. 24.8 Section 24.8 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES PERSONNEL SENIOR BIOMEDICAL RESEARCH SERVICE § 24.8 Applicability of provisions of Title 5, U.S. Code. (a...
42 CFR 24.8 - Applicability of provisions of Title 5, U.S. Code.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 1 2011-10-01 2011-10-01 false Applicability of provisions of Title 5, U.S. Code. 24.8 Section 24.8 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES PERSONNEL SENIOR BIOMEDICAL RESEARCH SERVICE § 24.8 Applicability of provisions of Title 5, U.S. Code. (a...
42 CFR 24.8 - Applicability of provisions of Title 5, U.S. Code.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 1 2013-10-01 2013-10-01 false Applicability of provisions of Title 5, U.S. Code. 24.8 Section 24.8 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES PERSONNEL SENIOR BIOMEDICAL RESEARCH SERVICE § 24.8 Applicability of provisions of Title 5, U.S. Code. (a...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saad, Tony; Sutherland, James C.
2016-05-04
To address the coding and software challenges of modern hybrid architectures, we propose an approach to multiphysics code development for high-performance computing. This approach is based on using a Domain Specific Language (DSL) in tandem with a directed acyclic graph (DAG) representation of the problem to be solved that allows runtime algorithm generation. When coupled with a large-scale parallel framework, the result is a portable development framework capable of executing on hybrid platforms and handling the challenges of multiphysics applications. In addition, we share our experience developing a code in such an environment – an effort that spans an interdisciplinary team of engineers and computer scientists.
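The DAG idea behind this approach can be sketched with the standard library: each task declares its dependencies, a valid execution order is generated at runtime by topological sort, and the "algorithm" is never written out by hand. The task names and operations are illustrative, not the framework's API.

```python
from graphlib import TopologicalSorter

# Each entry maps a task name to (callable, dependencies); the execution
# order is derived at runtime from the dependency graph.
tasks = {
    "density":  (lambda v: 1.2, []),
    "velocity": (lambda v: 3.0, []),
    "momentum": (lambda v: v["density"] * v["velocity"],
                 ["density", "velocity"]),
}

graph = {name: deps for name, (_, deps) in tasks.items()}
values = {}
for name in TopologicalSorter(graph).static_order():
    fn, _ = tasks[name]
    values[name] = fn(values)     # dependencies are guaranteed computed
```

In a parallel setting the same graph also exposes which tasks are independent and may run concurrently, which is what makes the representation attractive on hybrid architectures.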
Chen, Huizhong; Li, Xin-Liang; Blum, David L; Ximenes, Eduardo A; Ljungdahl, Lars G
2003-01-01
A cDNA, designated celF, encoding a cellulase (CelF) was isolated from the anaerobic fungus Orpinomyces PC-2. The open reading frame contains regions coding for a signal peptide, a carbohydrate-binding module (CBM), a linker, and a catalytic domain. The catalytic domain was homologous to those of CelA and CelC of the same fungus and to that of the Neocallimastix patriciarum CELA, but CelF lacks a docking domain, characteristic for enzymes of cellulosomes. It was also homologous to the cellobiohydrolase IIs and endoglucanases of aerobic organisms. The gene has a 111-bp intron, located within the CBM-coding region. Some biochemical properties of the purified recombinant enzyme are described.
Certifying Domain-Specific Policies
NASA Technical Reports Server (NTRS)
Lowry, Michael; Pressburger, Thomas; Rosu, Grigore; Koga, Dennis (Technical Monitor)
2001-01-01
Proof-checking code for compliance to safety policies potentially enables a product-oriented approach to certain aspects of software certification. To date, previous research has focused on generic, low-level programming-language properties such as memory type safety. In this paper we consider proof-checking higher-level domain-specific properties for compliance to safety policies. The paper first describes a framework related to abstract interpretation in which compliance to a class of certification policies can be efficiently calculated. Membership equational logic is shown to provide a rich logic for carrying out such calculations, including partiality, for certification. The architecture for a domain-specific certifier is described, followed by an implemented case study. The case study considers consistency of abstract variable attributes in code that performs geometric calculations in aerospace systems.
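The attribute-consistency idea can be illustrated with a minimal sketch: an abstract interpreter that tracks a single domain-specific attribute (here, a hypothetical coordinate-frame label) through a toy program and flags operations that mix incompatible frames. The program representation, frame names, and rules below are illustrative assumptions, not the paper's actual framework or its membership-equational-logic machinery.

```python
# Minimal sketch (hypothetical example): propagate a domain-specific
# attribute -- a coordinate-frame label -- through assignments and flag
# operations that mix incompatible frames.
def check_frames(statements):
    """statements: list of (target, op, a, b) tuples.
    Frames are tracked abstractly; 'add' requires matching frames."""
    frames = {}   # abstract state: variable -> frame label
    errors = []
    for target, op, a, b in statements:
        if op == "const":      # (target, "const", frame_label, value)
            frames[target] = a
        elif op == "add":      # adding vectors only makes sense within one frame
            if frames.get(a) != frames.get(b):
                errors.append(f"{target}: mixed frames {frames.get(a)} and {frames.get(b)}")
            frames[target] = frames.get(a)
        elif op == "convert":  # (target, "convert", source_var, new_frame)
            frames[target] = b
    return errors

prog = [
    ("u", "const", "ECI", 1.0),
    ("v", "const", "ECEF", 2.0),
    ("w", "add", "u", "v"),       # frame mismatch -> flagged
    ("v2", "convert", "v", "ECI"),
    ("x", "add", "u", "v2"),      # fine after explicit conversion
]
print(check_frames(prog))
```

A real certifier would compute such abstract states over the program's control-flow graph and emit a checkable certificate rather than a list of strings; the shape of the calculation is the same.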
The electromagnetic modeling of thin apertures using the finite-difference time-domain technique
NASA Technical Reports Server (NTRS)
Demarest, Kenneth R.
1987-01-01
A technique which computes transient electromagnetic responses of narrow apertures in complex conducting scatterers was implemented as an extension of previously developed Finite-Difference Time-Domain (FDTD) computer codes. Although these apertures are narrow with respect to the wavelengths contained within the power spectrum of excitation, this technique does not require significantly more computer resources to attain the increased resolution at the apertures. In the report, an analytical technique which utilizes Babinet's principle to model the apertures is developed, and an FDTD computer code which utilizes this technique is described.
NASA Astrophysics Data System (ADS)
Jia, Shouqing; La, Dongsheng; Ma, Xuelian
2018-04-01
The finite difference time domain (FDTD) algorithm and Green function algorithm are implemented into the numerical simulation of electromagnetic waves in Schwarzschild space-time. FDTD method in curved space-time is developed by filling the flat space-time with an equivalent medium. Green function in curved space-time is obtained by solving transport equations. Simulation results validate both the FDTD code and Green function code. The methods developed in this paper offer a tool to solve electromagnetic scattering problems.
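For readers unfamiliar with FDTD, the core of any such solver (in flat space-time) is a leapfrog update of interleaved electric and magnetic fields on a staggered grid. The following is a minimal 1D sketch in normalized units; it is not the curved-space-time code described above, and the grid size, Courant number, and source are arbitrary choices.

```python
import numpy as np

# Minimal 1D FDTD (Yee) update in vacuum, normalized units (c = 1).
# Ez lives on integer grid points, Hy on half-integer points; the two
# fields are advanced alternately (leapfrog in time).
nx, nt = 200, 400
S = 0.5                     # Courant number dt/dx (1D stability needs S <= 1)
Ez = np.zeros(nx)
Hy = np.zeros(nx - 1)

for n in range(nt):
    Hy += S * (Ez[1:] - Ez[:-1])            # H update (half time step)
    Ez[1:-1] += S * (Hy[1:] - Hy[:-1])      # E update on the staggered grid
    Ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

print(float(np.max(np.abs(Ez))))
```

The thin-aperture and curved-space-time techniques in the records above modify these update equations locally (sub-cell models at the aperture, or an equivalent-medium coefficient per cell), while keeping this leapfrog structure.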
2007-10-16
... modulation pulse waveform (software-defined or cognitive). From an information-theoretic viewpoint, the two parts as a whole form so-called "pre-coding". The time domain system of Fig. 2.3(b) is based on a digital sampling oscilloscope (DSO), Tektronix TDS 7000E3. The time domain sounder has the capability
Weatherson, Katie A; McKay, Rhyann; Gainforth, Heather L; Jung, Mary E
2017-10-23
In British Columbia, Canada, a Daily Physical Activity (DPA) policy was mandated that requires elementary school teachers to provide students with opportunities to achieve 30 min of physical activity during the school day. However, the implementation of school-based physical activity policies is influenced by many factors. A theoretical examination of the factors that impede and enhance teachers' implementation of physical activity policies is necessary in order to develop strategies to improve policy practice and achieve desired outcomes. This study used the Theoretical Domains Framework (TDF) to understand teachers' barriers and facilitators to the implementation of the DPA policy in one school district. Additionally, barriers and facilitators were examined and compared according to how the teacher implemented the DPA policy during the instructional school day. Interviews were conducted with thirteen teachers and transcribed verbatim. One researcher performed barrier and facilitator extraction, with a second researcher independently extracting from a third of the interview transcripts. A deductive and inductive analytical approach in a two-stage process was employed whereby barriers and facilitators were deductively coded using TDF domains (content analysis) and analyzed for sub-themes within each domain. Two researchers performed coding. A total of 832 items were extracted from the interview transcripts. Some items were coded into multiple TDF domains, resulting in a total of 1422 observations. The most commonly coded TDF domains, accounting for 75% of the total, were Environmental context and resources (ECR; n = 250), Beliefs about consequences (n = 225), Social influences (n = 193), Knowledge (n = 100), and Intentions (n = 88). Teachers who implemented DPA during instructional time differed from those who relied on non-instructional time in relation to the Goals, Behavioural regulation, Social/professional role and identity, and Beliefs about consequences domains.
Forty-one qualitative sub-themes were identified across the fourteen domains and exemplary quotes were highlighted. Teachers identified barriers and facilitators relating to all TDF domains, with ECR, Beliefs about consequences, Social influences, Knowledge and Intentions being the most often discussed influencers of DPA policy implementation. Use of the TDF to understand the implementation factors can assist with the systematic development of future interventions to improve implementation.
Proceedings of the Third International Workshop on Proof-Carrying Code and Software Certification
NASA Technical Reports Server (NTRS)
Denney, Ewen W. (Editor); Jensen, Thomas (Editor)
2009-01-01
This NASA conference publication contains the proceedings of the Third International Workshop on Proof-Carrying Code and Software Certification, held as part of LICS in Los Angeles, CA, USA, on August 15, 2009. Software certification demonstrates the reliability, safety, or security of software systems in such a way that it can be checked by an independent authority with minimal trust in the techniques and tools used in the certification process itself. It can build on existing validation and verification (V&V) techniques but introduces the notion of explicit software certificates, which contain all the information necessary for an independent assessment of the demonstrated properties. One such example is proof-carrying code (PCC), which is an important and distinctive approach to enhancing trust in programs. It provides a practical framework for independent assurance of program behavior, especially where source code is not available, or the code author and user are unknown to each other. The workshop will address theoretical foundations of logic-based software certification as well as practical examples and work on alternative application domains. Here "certificate" is construed broadly, to include not just mathematical derivations and proofs but also safety and assurance cases, or any formal evidence that supports the semantic analysis of programs: that is, evidence about an intrinsic property of code and its behaviour that can be independently checked by any user, intermediary, or third party. These guarantees mean that software certificates raise trust in the code itself, distinct from and complementary to any existing trust in the creator of the code, the process used to produce it, or its distributor. In addition to the contributed talks, the workshop featured two invited talks, by Kelly Hayhurst and Andrew Appel. The PCC 2009 website can be found at http://ti.arc.nasa.gov/event/pcc09.
Moving from Batch to Field Using the RT3D Reactive Transport Modeling System
NASA Astrophysics Data System (ADS)
Clement, T. P.; Gautam, T. R.
2002-12-01
The public domain reactive transport code RT3D (Clement, 1997) is a general-purpose numerical code for solving coupled, multi-species reactive transport in saturated groundwater systems. The code uses MODFLOW to simulate flow and several modules of MT3DMS to simulate the advection and dispersion processes. RT3D employs the operator-split strategy, which allows the code to solve the coupled reactive transport problem in a modular fashion. The coupling between reaction and transport is defined through a separate module where the reaction equations are specified. The code supports a versatile user-defined reaction option that allows users to define their own reaction system through a Fortran-90 subroutine, known as the RT3D reaction package. Further, a utility code, known as BATCHRXN, allows users to independently test and debug their reaction package. To analyze a new reaction system at batch scale, users should first run BATCHRXN to test the ability of their reaction package to model the batch data. After testing, the reaction package can simply be ported to the RT3D environment to study the model response under 1-, 2-, or 3-dimensional transport conditions. This paper presents example problems that demonstrate the methods for moving from batch to field-scale simulations using the BATCHRXN and RT3D codes. References: 1) Clement, T.P., RT3D - A modular computer code for simulating reactive multi-species transport in 3-Dimensional groundwater aquifers, Battelle Pacific Northwest National Laboratory Research Report, PNNL-SA-28967, September 1997. Available at: http://bioprocess.pnl.gov/rt3d.htm. The first example describes a simple first-order reaction system for simulating the sequential degradation of tetrachloroethene (PCE) and its daughter products. The second example uses a relatively complex reaction system for describing the multiple degradation pathways of tetrachloroethane (PCA) and its daughter products.
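The operator-split strategy described above can be sketched in a few lines: each time step first transports every species, then integrates the coupled reactions cell by cell. The sketch below uses a first-order upwind advection step and an explicit-Euler PCE → TCE decay chain; the grid, velocity, and rate constants are illustrative assumptions, not RT3D's actual numerics.

```python
import numpy as np

# Operator splitting: within each time step, (1) transport every species
# independently, (2) integrate the coupled reaction system in every cell.
def transport_step(c, velocity, dt, dx):
    """First-order upwind advection for one species (velocity > 0)."""
    c_new = c.copy()
    c_new[1:] -= velocity * dt / dx * (c[1:] - c[:-1])
    return c_new

def reaction_step(pce, tce, k1, k2, dt):
    """Sequential first-order decay, PCE -> TCE -> (further products)."""
    d_pce = -k1 * pce * dt
    d_tce = (k1 * pce - k2 * tce) * dt
    return pce + d_pce, tce + d_tce

dx, dt, v = 1.0, 0.1, 0.5
pce = np.zeros(50); pce[0] = 1.0   # parent compound held at the inlet cell
tce = np.zeros(50)
for _ in range(100):
    pce = transport_step(pce, v, dt, dx)
    tce = transport_step(tce, v, dt, dx)
    pce, tce = reaction_step(pce, tce, k1=0.05, k2=0.02, dt=dt)

print(round(float(tce.sum()), 4))   # daughter product generated downstream
```

The modularity claimed for RT3D comes from exactly this separation: swapping the reaction system only means replacing `reaction_step` (in RT3D, the user's Fortran-90 reaction package), while the transport machinery is untouched.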
Makwana, K. D.; Zhdankin, V.; Li, H.; ...
2015-04-10
We performed simulations of decaying magnetohydrodynamic (MHD) turbulence with a fluid and a kinetic code. The initial condition is an ensemble of long-wavelength, counter-propagating, shear-Alfvén waves, which interact and rapidly generate strong MHD turbulence. The total energy is conserved and the rate of turbulent energy decay is very similar in both codes, although the fluid code has numerical dissipation, whereas the kinetic code has kinetic dissipation. The inertial range power spectrum index is similar in both codes. The fluid code shows a perpendicular wavenumber spectral slope of k⊥^(-1.3). The kinetic code shows a spectral slope of k⊥^(-1.5) for the smaller simulation domain, and k⊥^(-1.3) for the larger domain. We then estimate that collisionless damping mechanisms in the kinetic code can account for the dissipation of the observed nonlinear energy cascade. Current sheets are geometrically characterized. Their lengths and widths are in good agreement between the two codes. The length scales linearly with the driving scale of the turbulence. In the fluid code, their thickness is determined by the grid resolution, as there is no explicit diffusivity. In the kinetic code, their thickness is very close to the skin depth, irrespective of the grid resolution. Finally, this work shows that kinetic codes can reproduce the MHD inertial range dynamics at large scales, while at the same time capturing important kinetic physics at small scales.
Admiralty Inlet Advanced Turbulence Measurements: final data and code archive
Kilcher, Levi (ORCID:0000000183851131); Thomson, Jim (ORCID:0000000289290088); Harding, Samuel
2011-02-01
Data and code, not already available in a public location, that are used in Kilcher, Thomson, Harding, and Nylund (2017), "Turbulence Measurements from Compliant Moorings - Part II: Motion Correction", doi: 10.1175/JTECH-D-16-0213.1. The links point to Python source code used in the publication. All other files are source data used in the publication.
NASA Astrophysics Data System (ADS)
Wünderlich, D.; Mochalskyy, S.; Montellano, I. M.; Revel, A.
2018-05-01
Particle-in-cell (PIC) codes have been used since the early 1960s for calculating self-consistently the motion of charged particles in plasmas, taking into account external electric and magnetic fields as well as the fields created by the particles themselves. Because of the very small time steps used (on the order of the inverse plasma frequency) and the small mesh size, the computational requirements can be very high, and they increase drastically with increasing plasma density and size of the calculation domain. Thus, usually small computational domains and/or reduced dimensionality are used. In recent years, the available central processing unit (CPU) power has increased strongly. Together with a massive parallelization of the codes, it is now possible to describe in 3D the extraction of charged particles from a plasma, using calculation domains with an edge length of several centimeters, consisting of one extraction aperture, the plasma in the direct vicinity of the aperture, and a part of the extraction system. Large negative hydrogen or deuterium ion sources are essential parts of the neutral beam injection (NBI) system in future fusion devices like the international fusion experiment ITER and the demonstration reactor (DEMO). For ITER NBI, RF-driven sources with a source area of 0.9 × 1.9 m2 and 1280 extraction apertures will be used. The extraction of negative ions is accompanied by the co-extraction of electrons, which are deflected onto an electron dump. Typically, the maximum extracted negative ion current is limited by the amount and the temporal instability of the co-extracted electrons, especially for operation in deuterium. Different PIC codes are available for the extraction region of large RF-driven negative ion sources for fusion. Additionally, some effort is ongoing in developing codes that describe in a simplified manner (coarser mesh or reduced dimensionality) the plasma of the whole ion source.
The presentation first gives a brief overview of the current status of the ion source development for ITER NBI and of the PIC method. Different PIC codes for the extraction region are introduced as well as the coupling to codes describing the whole source (PIC codes or fluid codes). Presented and discussed are different physical and numerical aspects of applying PIC codes to negative hydrogen ion sources for fusion as well as selected code results. The main focus of future calculations will be the meniscus formation and identifying measures for reducing the co-extracted electrons, in particular for deuterium operation. The recent results of the 3D PIC code ONIX (calculation domain: one extraction aperture and its vicinity) for the ITER prototype source (1/8 size of the ITER NBI source) are presented.
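The particle-push kernel at the heart of a PIC code is compact. Below is a minimal sketch of the standard Boris scheme (half electric kick, magnetic rotation, half electric kick) for a single particle in normalized units; it is not taken from ONIX or any of the codes discussed, and it omits the field solve and charge deposition entirely.

```python
import numpy as np

# Boris particle push in normalized units (q = m = 1). The magnetic force
# is applied as an exact rotation, which is why the scheme conserves the
# particle speed in a pure magnetic field.
def boris_push(v, E, B, dt):
    """One velocity update: half E kick, B rotation, half E kick."""
    v_minus = v + 0.5 * dt * E
    t = 0.5 * dt * B
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)
    return v_plus + 0.5 * dt * E

B = np.array([0.0, 0.0, 1.0])   # uniform field along z
E = np.zeros(3)
v = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    v = boris_push(v, E, B, dt=0.01)

print(round(float(np.linalg.norm(v)), 6))  # |v| is conserved by the rotation
```

A full PIC cycle wraps this push with charge/current deposition onto the mesh, a field solve (e.g. Poisson), and interpolation of the fields back to the particle positions, repeated every plasma-frequency-scale time step, which is precisely why the computational cost grows so quickly with density and domain size.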
Discovery of rare protein-coding genes in model methylotroph Methylobacterium extorquens AM1.
Kumar, Dhirendra; Mondal, Anupam Kumar; Yadav, Amit Kumar; Dash, Debasis
2014-12-01
Proteogenomics involves the use of MS to refine annotation of protein-coding genes and discover genes in a genome. We carried out comprehensive proteogenomic analysis of Methylobacterium extorquens AM1 (ME-AM1) from publicly available proteomics data with a motive to improve annotation for methylotrophs; organisms capable of surviving in reduced carbon compounds such as methanol. Besides identifying 2482 (50%) proteins, 29 new genes were discovered and 66 annotated gene models were revised in the ME-AM1 genome. One such novel gene is identified with 75 peptides, lacks a homolog in other methylobacteria, but has glycosyl transferase and lipopolysaccharide biosynthesis protein domains, indicating its potential role in outer membrane synthesis. Many novel genes are present only in ME-AM1 among methylobacteria. Distant homologs of these genes in unrelated taxonomic classes and the low GC-content of a few genes suggest lateral gene transfer as a potential mode of their origin. Annotations of methylotrophy-related genes were also improved by the discovery of a short gene in the methylotrophy gene island and by redefining a gene important for pyrroloquinoline quinone synthesis, essential for methylotrophy. The combined use of proteogenomics and rigorous bioinformatics analysis greatly enhanced the annotation of protein-coding genes in the model methylotroph ME-AM1 genome. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Dai, Jin; Niemi, Antti J.; He, Jianfeng
2016-07-01
The Landau-Ginzburg-Wilson paradigm is proposed as a framework, to investigate the conformational landscape of intrinsically unstructured proteins. A universal Cα-trace Landau free energy is deduced from general symmetry considerations, with the ensuing all-atom structure modeled using publicly available reconstruction programs Pulchra and Scwrl. As an example, the conformational stability of an amyloid precursor protein intra-cellular domain (AICD) is inspected; the reference conformation is the crystallographic structure with code 3DXC in Protein Data Bank (PDB) that describes a heterodimer of AICD and a nuclear multi-domain adaptor protein Fe65. Those conformations of AICD that correspond to local or near-local minima of the Landau free energy are identified. For this, the response of the original 3DXC conformation to variations in the ambient temperature is investigated, using the Glauber algorithm. The conclusion is that in isolation the AICD conformation in 3DXC must be unstable. A family of degenerate conformations that minimise the Landau free energy is identified, and it is proposed that the native state of an isolated AICD is a superposition of these conformations. The results are fully in line with the presumed intrinsically unstructured character of isolated AICD and should provide a basis for a systematic analysis of AICD structure in future NMR experiments.
Hendriks, Anna-Marie; Habraken, Jolanda M.; Kremers, Stef P. J.; Jansen, Maria W. J.; van Oers, Hans; Schuit, Albertine J.
2016-01-01
Background. Limited physical activity (PA) is a risk factor for childhood obesity. In Netherlands, as in many other countries worldwide, local policy officials bear responsibility for integrated PA policies, involving both health and nonhealth domains. In practice, its development seems hampered. We explore which obstacles local policy officials perceive in their effort. Methods. Fifteen semistructured interviews were held with policy officials from health and nonhealth policy domains, working at strategic, tactic, and operational level, in three relatively large municipalities. Questions focused on exploring perceived barriers for integrated PA policies. The interviews were deductively coded by applying the Behavior Change Ball framework. Findings. Childhood obesity prevention appeared on the governmental agenda and all officials understood the multicausal nature. However, operational officials had not yet developed a tradition to develop integrated PA policies due to insufficient boundary-spanning skills and structural and cultural differences between the domains. Tactical level officials did not sufficiently support intersectoral collaboration and strategic level officials mainly focused on public-private partnerships. Conclusion. Developing integrated PA policies is a bottom-up innovation process that needs to be supported by governmental leaders through better guiding organizational processes leading to such policies. Operational level officials can assist in this by making progress in intersectoral collaboration visible. PMID:27668255
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-07
.... 100108014-0121-01] RIN 0694-AE82 Publicly Available Mass Market Encryption Software and Other Specified Publicly Available Encryption Software in Object Code AGENCY: Bureau of Industry and Security, Commerce... encryption object code software with a symmetric key length greater than 64-bits, and ``publicly available...
Identification of two allelic IgG1 C(H) coding regions (Cgamma1) of cat.
Kanai, T H; Ueda, S; Nakamura, T
2000-01-31
Two types of cDNA encoding IgG1 heavy chain (gamma1) were isolated from a single domestic short-hair cat. Sequence analysis indicated a higher level of similarity of these Cgamma1 sequences to the human Cgamma1 sequence (76.9 and 77.0%) than to the mouse sequence (70.0 and 69.7%) at the nucleotide level. Predicted primary structures of both the feline Cgamma1 genes, designated as Cgamma1a and Cgamma1b, were similar to that of the human Cgamma1 gene, for instance, as to the size of constant domains, the presence of six conserved cysteine residues involved in formation of the domain structure, and the location of a conserved N-linked glycosylation site. Sequence comparison between the two alleles showed that 7 out of 10 nucleotide differences were within the C(H)3 domain coding region, all leading to nonsynonymous changes in amino acid residues. Partial sequence analysis of genomic clones showed three nucleotide substitutions between the two Cgamma1 alleles in the intron between the C(H)2 and C(H)3 domain coding regions. In 12 domestic short-hair cats used in this study, the frequency of the Cgamma1a allele (62.5%) was higher than that of the Cgamma1b allele (37.5%).
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.
1987-01-01
The Hypercube Matrix Computation (1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms utilized would be different, thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency domain method of moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second code was a time domain finite difference solution of Maxwell's equations to solve for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate the performance of the two codes. First, the problem size possible on the hypercube, with 128 megabytes of memory in a 32-node configuration, was compared with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed for the computational speedup attained by the parallel architecture.
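Speedup results such as those sought in this task are commonly interpreted through Amdahl's law: with a parallel fraction p of the work distributed over N nodes, speedup = 1 / ((1 - p) + p/N). The numbers below are illustrative only, not measurements from the Mark III Hypercube.

```python
# Amdahl's law: the serial fraction (1 - p) bounds the achievable speedup,
# which is why both ported codes had to parallelize nearly all of their work
# to benefit from a 32-node machine.
def amdahl_speedup(p, n):
    """Ideal speedup for parallel fraction p on n nodes."""
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.90, 0.99):
    print(p, round(amdahl_speedup(p, 32), 2))
```

Even at 99% parallel work, 32 nodes yield well under a 32x speedup; the memory-capacity comparison in the abstract (128 MB vs. 4 to 8 MB) is often the more decisive benefit of such ports.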
Classifying Chinese Questions Related to Health Care Posted by Consumers Via the Internet.
Guo, Haihong; Na, Xu; Hou, Li; Li, Jiao
2017-06-20
In question answering (QA) system development, question classification is crucial for identifying information needs and improving the accuracy of returned answers. Although the questions are domain-specific, they are asked by non-professionals, making the question classification task more challenging. This study aimed to classify health care-related questions posted by the general public (Chinese speakers) on the Internet. A topic-based classification schema for health-related questions was built by manually annotating randomly selected questions. The Kappa statistic was used to measure the interrater reliability of multiple annotation results. Using the above corpus, we developed a machine-learning method to automatically classify these questions into one of the following six classes: Condition Management, Healthy Lifestyle, Diagnosis, Health Provider Choice, Treatment, and Epidemiology. The consumer health question schema was developed with four hierarchical levels of specificity, comprising 48 quaternary categories and 35 annotation rules. The 2000 sample questions were coded with 2000 major codes and 607 minor codes. Using natural language processing techniques, we expressed the Chinese questions as a set of lexical, grammatical, and semantic features. Furthermore, the effective features were selected to improve the question classification performance. From the 6-category classification results, we achieved an average precision of 91.41%, recall of 89.62%, and F1 score of 90.24%. In this study, we developed an automatic method to classify questions related to Chinese health care posted by the general public. It enables Artificial Intelligence (AI) agents to understand Internet users' information needs on health care. ©Haihong Guo, Xu Na, Li Hou, Jiao Li. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 20.06.2017.
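As a toy illustration of feature-based question classification, the sketch below implements a multinomial Naive Bayes classifier over bag-of-words features with Laplace smoothing. It is not the paper's pipeline (which used richer lexical, grammatical, and semantic features on Chinese text); the English training questions and class labels are invented for the example.

```python
import math
from collections import Counter

# Multinomial Naive Bayes over bag-of-words features: the simplest
# instance of the "represent questions as features, learn a classifier"
# approach described above. Training data here is illustrative only.
class NaiveBayes:
    def fit(self, docs, labels):
        self.classes = set(labels)
        self.prior = Counter(labels)                 # class frequencies
        self.counts = {c: Counter() for c in self.classes}
        for doc, c in zip(docs, labels):
            self.counts[c].update(doc.lower().split())
        self.vocab = {w for cnt in self.counts.values() for w in cnt}
        return self

    def predict(self, doc):
        def log_post(c):
            total = sum(self.counts[c].values()) + len(self.vocab)
            lp = math.log(self.prior[c] / sum(self.prior.values()))
            for w in doc.lower().split():
                lp += math.log((self.counts[c][w] + 1) / total)  # Laplace smoothing
            return lp
        return max(self.classes, key=log_post)

train = [("what treatment works for flu", "Treatment"),
         ("best treatment for a cough", "Treatment"),
         ("is this rash a symptom of measles", "Diagnosis"),
         ("how do I diagnose strep throat", "Diagnosis")]
nb = NaiveBayes().fit([d for d, _ in train], [c for _, c in train])
print(nb.predict("which treatment helps a cold"))
```

Real systems of the kind described would add grammatical and semantic features (and, for Chinese, word segmentation) on top of this bag-of-words baseline, then perform feature selection before training.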
The EMEP MSC-W chemical transport model - technical description
NASA Astrophysics Data System (ADS)
Simpson, D.; Benedictow, A.; Berge, H.; Bergström, R.; Emberson, L. D.; Fagerli, H.; Flechard, C. R.; Hayman, G. D.; Gauss, M.; Jonson, J. E.; Jenkin, M. E.; Nyíri, A.; Richter, C.; Semeena, V. S.; Tsyro, S.; Tuovinen, J.-P.; Valdebenito, Á.; Wind, P.
2012-08-01
The Meteorological Synthesizing Centre-West (MSC-W) of the European Monitoring and Evaluation Programme (EMEP) has been performing model calculations in support of the Convention on Long Range Transboundary Air Pollution (CLRTAP) for more than 30 years. The EMEP MSC-W chemical transport model is still one of the key tools within European air pollution policy assessments. Traditionally, the model has covered all of Europe with a resolution of about 50 km × 50 km, and extending vertically from ground level to the tropopause (100 hPa). The model has changed extensively over the last ten years, however, with flexible processing of chemical schemes, meteorological inputs, and with nesting capability: the code is now applied on scales ranging from local (ca. 5 km grid size) to global (with 1 degree resolution). The model is used to simulate photo-oxidants and both inorganic and organic aerosols. In 2008 the EMEP model was released for the first time as public domain code, along with all required input data for model runs for one year. The second release of the EMEP MSC-W model became available in mid 2011, and a new release is targeted for summer 2012. This publication is intended to document this third release of the EMEP MSC-W model. The model formulations are given, along with details of input data-sets which are used, and a brief background on some of the choices made in the formulation is presented. The model code itself is available at www.emep.int, along with the data required to run for a full year over Europe.
Preparing a collection of radiology examinations for distribution and retrieval.
Demner-Fushman, Dina; Kohli, Marc D; Rosenman, Marc B; Shooshan, Sonya E; Rodriguez, Laritza; Antani, Sameer; Thoma, George R; McDonald, Clement J
2016-03-01
Clinical documents made available for secondary use play an increasingly important role in discovery of clinical knowledge, development of research methods, and education. An important step in facilitating secondary use of clinical document collections is easy access to descriptions and samples that represent the content of the collections. This paper presents an approach to developing a collection of radiology examinations, including both the images and radiologist narrative reports, and making them publicly available in a searchable database. The authors collected 3996 radiology reports from the Indiana Network for Patient Care and 8121 associated images from the hospitals' picture archiving systems. The images and reports were de-identified automatically and then the automatic de-identification was manually verified. The authors coded the key findings of the reports and empirically assessed the benefits of manual coding on retrieval. The automatic de-identification of the narrative was aggressive and achieved 100% precision at the cost of rendering a few findings uninterpretable. Automatic de-identification of images was not quite as perfect. Images for two of 3996 patients (0.05%) showed protected health information. Manual encoding of findings improved retrieval precision. Stringent de-identification methods can remove all identifiers from text radiology reports. DICOM de-identification of images does not remove all identifying information and needs special attention to images scanned from film. Adding manual coding to the radiologist narrative reports significantly improved relevancy of the retrieved clinical documents. The de-identified Indiana chest X-ray collection is available for searching and downloading from the National Library of Medicine (http://openi.nlm.nih.gov/). Published by Oxford University Press on behalf of the American Medical Informatics Association 2015. This work is written by US Government employees and is in the public domain in the US.
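Pattern-based de-identification of the kind described can be sketched with a few regular-expression rules that aggressively replace identifier-like spans. The three patterns below (dates, phone numbers, MRN-style IDs) are illustrative assumptions; production systems use far larger rule sets plus the manual verification step the authors describe.

```python
import re

# Toy de-identification pass: aggressively replace spans that look like
# protected health information with category tokens. Illustrative rules
# only -- real pipelines cover names, addresses, and many more formats.
PATTERNS = [
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
    (re.compile(r"\b\d{3}-\d{3}-\d{4}\b"), "[PHONE]"),
    (re.compile(r"\bMRN[:\s]*\d+\b", re.IGNORECASE), "[ID]"),
]

def deidentify(text):
    for pattern, token in PATTERNS:
        text = pattern.sub(token, text)
    return text

report = "Exam 3/14/2015 for MRN: 123456; call 555-867-5309 with questions."
print(deidentify(report))
```

Being aggressive buys precision (no identifiers leak) at the cost the abstract notes: an occasional genuine finding becomes uninterpretable because it matched an identifier-like pattern.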
Parallelisation study of a three-dimensional environmental flow model
NASA Astrophysics Data System (ADS)
O'Donncha, Fearghal; Ragnoli, Emanuele; Suits, Frank
2014-03-01
There are many simulation codes in the geosciences that are serial and cannot take advantage of the parallel computational resources commonly available today. One model important for our work in coastal ocean current modelling is EFDC, a Fortran 77 code configured for optimal deployment on vector computers. In order to take advantage of our cache-based, blade computing system we restructured EFDC from serial to parallel, thereby allowing us to run existing models more quickly, and to simulate larger and more detailed models that were previously impractical. Since the source code for EFDC is extensive and involves detailed computation, it is important to do such a port in a manner that limits changes to the files, while achieving the desired speedup. We describe a parallelisation strategy involving surgical changes to the source files to minimise error-prone alteration of the underlying computations, while allowing load-balanced domain decomposition for efficient execution on a commodity cluster. The use of conjugate gradient posed particular challenges: its implicit non-local communication hinders standard domain partitioning schemes, and a number of techniques are discussed to address this in a feasible, computationally efficient manner. The parallel implementation demonstrates good scalability in combination with a novel domain partitioning scheme that specifically handles mixed water/land regions commonly found in coastal simulations. The approach presented here represents a practical methodology to rejuvenate legacy code on a commodity blade cluster with reasonable effort; our solution has direct application to other similar codes in the geosciences.
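The paper's actual partitioning algorithm is not given in the abstract. The sketch below illustrates one simple way a load-balanced decomposition could account for mixed water/land grids: split the grid's columns into contiguous strips so each rank receives roughly the same number of active (water) cells, so land-heavy strips get more columns. All names and the column-strip scheme are hypothetical.

```python
import numpy as np

def partition_columns(water_mask, n_ranks):
    """Split grid columns into n_ranks contiguous strips holding roughly
    equal numbers of active (water) cells, so that strips dominated by
    land receive more columns. Returns (start, stop) column ranges."""
    work_per_col = water_mask.sum(axis=0)        # active cells per column
    cumulative = np.cumsum(work_per_col)
    total = int(cumulative[-1])
    bounds = [0]
    for r in range(1, n_ranks):
        target = total * r / n_ranks
        # first column index where the cumulative work reaches the target
        bounds.append(int(np.searchsorted(cumulative, target)) + 1)
    bounds.append(water_mask.shape[1])
    return list(zip(bounds[:-1], bounds[1:]))
```

For a grid whose left half is land, two ranks end up with unequal column counts but equal water-cell counts, which is the load-balancing property the abstract describes.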
Ready or not? Pharmacist perceptions of a changing injection scope of practice before it happens
Foong, Esther Ai-Leng; Edwards, David J.; Houle, Sherilyn; Grindrod, Kelly A.
2017-01-01
Background: Since 2012, Ontario pharmacists have been authorized to administer the influenza vaccine. In April 2016, the Ontario College of Pharmacists (OCP) proposed to expand the Pharmacy Act to allow pharmacists to vaccinate against 13 additional conditions. The OCP held an online public consultation and invited pharmacists, members of the public and organizations to weigh in on the proposed changes. Our objective was to explore the factors influencing how Ontario pharmacists may adopt or reject an expanding scope of practice, using data from the public consultation. Methods: We coded the responses to the public consultation in 2 ways: 1) sentiment analysis and 2) an integrative approach to coding using Rogers’s diffusion of innovations theory across 5 domains: relative advantage, compatibility, complexity, trialability and observability. Results: Responses from pharmacists, the public and organizations were moderately positive on average. Pharmacists most commonly mentioned relative advantages, including benefits for patients, pharmacists, physicians and the health system. Positive responses focused on accessibility for patients, improved vaccine coverage, lower health care spending and freed physician time but cited lack of prescribing privileges as a barrier to the proposed changes. Negative responses focused on increased workload, patient safety concerns and the complexity of travel medicine. Conclusions: The expanded immunization services are likely to be well received by most pharmacists. Convenience and accessibility for patients were commonly cited benefits, but the changes will be only a slight improvement over the current system unless pharmacists can prescribe these vaccines. Although employers responded positively, the question remains whether they will support pharmacists in a way that aligns with pharmacists’ values and expectations. 
Decision makers must pay close attention to the pharmacy infrastructure and how this will affect uptake of these services. Recognition of this, combined with pharmacists’ positive perceptions of the expanded scope, will facilitate smooth integration of this legislation into Ontario pharmacy practice. PMID:29123598
The structure of transcription termination factor Nrd1 reveals an original mode for GUAA recognition
Franco-Echevarría, Elsa; González-Polo, Noelia; Zorrilla, Silvia; Martínez-Lumbreras, Santiago; Santiveri, Clara M.; Campos-Olivas, Ramón; Sánchez, Mar; Calvo, Olga
2017-01-01
Abstract Transcription termination of non-coding RNAs is regulated in yeast by a complex of three RNA binding proteins: Nrd1, Nab3 and Sen1. Nrd1 is central in this process by interacting with Rpb1 of RNA polymerase II, Trf4 of TRAMP and GUAA/G terminator sequences. We lack structural data for the last of these binding events. We determined the structures of the Nrd1 RNA binding domain and its complexes with three GUAA-containing RNAs, characterized RNA binding energetics and tested rationally designed mutants in vivo. The Nrd1 structure shows an RRM domain fused with a second α/β domain that we name split domain (SD), because it is formed by two non-consecutive segments at each side of the RRM. The GUAA interacts with both domains and with a pocket of water molecules, trapped between the two stacking adenines and the SD. Comprehensive binding studies demonstrate for the first time that Nrd1 has a slight preference for GUAA over GUAG, and genetic and functional studies suggest that the Nrd1 RNA binding domain might play further roles in non-coding RNA transcription termination. PMID:28973465
Mapping of non-numerical domains on space: a systematic review and meta-analysis.
Macnamara, Anne; Keage, Hannah A D; Loetscher, Tobias
2018-02-01
The spatial numerical association of response code (SNARC) effect is characterized by low numbers mapped to the left side of space and high numbers mapped to the right side of space. In addition to numbers, SNARC-like effects have been found in non-numerical magnitude domains such as time, size, letters, luminance, and more, whereby the smaller/earlier and larger/later magnitudes are typically mapped to the left and right of space, respectively. The purpose of this systematic and meta-analytic review was to identify and summarise all empirical papers that have investigated horizontal (left-right) SNARC-like mappings using non-numerical stimuli. A systematic search was conducted using EMBASE, Medline, and PsycINFO, where 2216 publications were identified, with 57 papers meeting the inclusion criteria (representing 112 experiments). Ninety-five of these experiments were included in a meta-analysis, resulting in an overall effect size of d = .488 for a SNARC-like effect. Additional analyses revealed a significant effect size advantage for explicit instruction tasks compared with implicit instructions, yet yielded no difference for the role of expertise on SNARC-like effects. There was clear evidence for a publication bias in the field, but the impact of this bias is likely to be modest, and it is unlikely that the SNARC-like effect is a pure artefact of this bias. The similarities in the response properties for the spatial mappings of numerical and non-numerical domains support the concept of a general higher order magnitude system. Yet, further research will need to be conducted to identify all the factors modulating the strength of the spatial associations.
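The pooling model behind the reported overall d = .488 is not detailed in the abstract. A standard fixed-effect inverse-variance combination, which many meta-analyses use as a baseline, can be sketched as follows (the function name and numbers are illustrative only):

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling: each study's effect size is
    weighted by the reciprocal of its variance. Returns the pooled
    effect and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se
```

With equal study variances this reduces to the plain mean; more precise studies (smaller variances) otherwise pull the pooled estimate toward their own effect sizes.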
Content and bibliometric analyses of the Journal of Manual & Manipulative Therapy
Simon, Corey B; Coronado, Rogelio A; Wurtzel, Wendy A; Riddle, Daniel L; George, Steven Z
2014-01-01
Background: Article characteristics and trends have been elucidated for other physical therapy-focused journals using content and bibliometric analysis. These findings are important for assessing the current state of a journal and for guiding future publication of research. To date, these analyses have not been performed for the Journal of Manual & Manipulative Therapy (JMMT). Objective: To describe content and trends for articles published in JMMT over a 20-year period (1993–2012). Methods: Journal articles were coded using previously established domains (article type, participant type, research design, study purpose, and clinical condition). Total publications and proportion of publications based on domain were described. Articles specific to manual therapy intervention were examined and compared to data from other physical therapy-focused journals. Impact by citation and author was examined using bibliometric software. Results: Journal of Manual & Manipulative Therapy was found to have a recent acceleration in the number of articles published annually. Over time, topical reviews have decreased in favor of research reports. However, rigorous study designs have represented only a small portion of total journal content, and case reports have maintained a consistent publication presence. Manual therapy intervention articles in JMMT are predominantly case designs; however, they are similar in characteristics to manual therapy intervention articles published in other physical therapy-focused journals. For JMMT articles overall and manual therapy intervention articles across journals, young to middle-aged symptomatic adults with low back and/or neck pain were the most common study participants. Discussion: Increases in the number of papers and a move toward research reports were observed in JMMT over the 20-year period. Considerations for the future were outlined, including the publication of articles with more rigorous research designs.
Manual therapy research for adolescents and older adults and for upper and lower extremity conditions should also be considered as priorities for the future. PMID:25395826
Synthesizing Safety Conditions for Code Certification Using Meta-Level Programming
NASA Technical Reports Server (NTRS)
Eusterbrock, Jutta
2004-01-01
In code certification the code consumer publishes a safety policy and the code producer generates a proof that the produced code is in compliance with the published safety policy. This paper takes a novel viewpoint, presenting a reuse-oriented implementation framework for code certification. It adopts ingredients from Necula's approach for proof-carrying code, but in this work safety properties can be analyzed on a higher code level than assembly language instructions. It consists of three parts: (1) The specification language is extended to include generic pre-conditions that shall ensure safety at all states that can be reached during program execution. Actual safety requirements can be expressed by providing domain-specific definitions for the generic predicates which act as interface to the environment. (2) The Floyd-Hoare inductive assertion method is refined to obtain proof rules that allow the derivation of the proof obligations in terms of the generic safety predicates. (3) A meta-interpreter is designed and experimentally implemented that enables automatic synthesis of proof obligations for submitted programs by applying the modified Floyd-Hoare rules. The proof obligations have two separate conjuncts, one for functional correctness and another for the generic safety obligations. Proof of the generic obligations, having provided the actual safety definitions as context, ensures domain-specific safety of program execution in a particular environment and is simpler than full program verification.
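The paper's meta-interpreter is not reproduced here. As a rough illustration of parts (2) and (3), the following hypothetical Python sketch derives proof obligations in the weakest-precondition style, conjoining a generic safe(...) predicate for every evaluated expression; naive string substitution stands in for proper term rewriting, and all names are invented for this example.

```python
# Hypothetical sketch of Floyd-Hoare style proof-obligation synthesis:
# the weakest precondition of an assignment substitutes the right-hand
# side into the postcondition, and a generic safety predicate safe(e)
# is conjoined for each evaluated expression. Formulas are plain
# strings; a real system would rewrite terms, not text.

def wp(stmt, post):
    """Return the proof obligation (as a string) for stmt with
    postcondition post, in the form safety-conjunct and functional
    conjunct."""
    kind = stmt[0]
    if kind == "assign":                       # ("assign", var, expr)
        _, var, expr = stmt
        functional = post.replace(var, f"({expr})")
        return f"safe({expr}) and ({functional})"
    if kind == "seq":                          # ("seq", s1, s2)
        _, s1, s2 = stmt
        return wp(s1, wp(s2, post))
    raise ValueError(f"unknown statement kind: {kind}")
```

For the assignment x := y+1 with postcondition x > 0, this yields the two conjuncts the abstract describes: a generic safety obligation safe(y+1) and the functional obligation (y+1) > 0.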
An Infrastructure for UML-Based Code Generation Tools
NASA Astrophysics Data System (ADS)
Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.
The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a way to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach, which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects, which have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.
Making Homes Healthy: International Code Council Processes and Patterns.
Coyle, Edward C; Isett, Kimberley R; Rondone, Joseph; Harris, Rebecca; Howell, M Claire Batten; Brandus, Katherine; Hughes, Gwendolyn; Kerfoot, Richard; Hicks, Diana
2016-01-01
Americans spend more than 90% of their time indoors, so it is important that homes are healthy environments. Yet many homes contribute to preventable illnesses via poor air quality, pests, safety hazards, and others. Efforts have been made to promote healthy housing through code changes, but results have been mixed. In support of such efforts, we analyzed the International Code Council's (ICC) building code change process to uncover patterns of content and context that may contribute to successful adoptions of model codes. Objective: To discover patterns of facilitators of and barriers to code amendment proposals. Design: A mixed-methods study of ICC records of past code change proposals (N = 2660). There were 4 possible outcomes for each code proposal studied: accepted as submitted, accepted as modified, accepted as modified by public comment, and denied. We found numerous correlates for final adoption of model codes proposed to the ICC. The number of proponents listed on a proposal was inversely correlated with success. Organizations that submitted more than 15 proposals had a higher chance of success than those that submitted fewer than 15. Proposals submitted by federal agencies correlated with a higher chance of success. Public comments in favor of a proposal correlated with an increased chance of success, while negative public comment had an even stronger negative correlation. To increase the chance of success, public health officials should submit their code changes through internal ICC committees or a federal agency, limit the number of cosponsors of the proposal, work with (or become) an active proposal submitter, and encourage public comment in favor of passage through their broader coalition.
A comparison of VLSI architectures for time and transform domain decoding of Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Hsu, I. S.; Truong, T. K.; Deutsch, L. J.; Satorius, E. H.; Reed, I. S.
1988-01-01
It is well known that the Euclidean algorithm or its equivalent, continued fractions, can be used to find the error locator polynomial needed to decode a Reed-Solomon (RS) code. It is shown that this algorithm can be used for both time and transform domain decoding by replacing its initial conditions with the Forney syndromes and the erasure locator polynomial. By this means both the errata locator polynomial and the errata evaluator polynomial can be obtained with the Euclidean algorithm. With these ideas, both time and transform domain Reed-Solomon decoders for correcting errors and erasures are simplified and compared. As a consequence, the architectures of Reed-Solomon decoders for correcting both errors and erasures can be made more modular, regular, simple, and naturally suitable for VLSI implementation.
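As an illustration of the key step, the sketch below runs the extended Euclidean algorithm on x^{2t} and a syndrome polynomial, stopping when the remainder degree drops below t; the remainder then plays the role of the errata evaluator and the Bezout coefficient that of the errata locator, each up to a scalar. A real Reed-Solomon decoder works over GF(2^m) and would use the Forney syndromes and erasure locator as initial conditions, as the abstract describes; here exact rationals stand in for the field so only the structure of the iteration is shown.

```python
from fractions import Fraction

# Polynomials are coefficient lists, lowest order first.

def poly_sub(a, b):
    n = max(len(a), len(b))
    out = [(a[i] if i < len(a) else 0) - (b[i] if i < len(b) else 0)
           for i in range(n)]
    while len(out) > 1 and out[-1] == 0:
        out.pop()
    return out

def poly_mul(a, b):
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    while len(out) > 1 and out[-1] == 0:
        out.pop()
    return out

def poly_divmod(a, b):
    """Polynomial long division: return (quotient, remainder)."""
    q = [Fraction(0)] * max(1, len(a) - len(b) + 1)
    r = [Fraction(c) for c in a]
    while len(r) > 1 and r[-1] == 0:
        r.pop()
    while len(r) >= len(b) and any(r):
        shift = len(r) - len(b)
        coef = r[-1] / Fraction(b[-1])
        q[shift] += coef
        for i, bc in enumerate(b):
            r[shift + i] -= coef * bc
        while len(r) > 1 and r[-1] == 0:
            r.pop()
    return q, r

def solve_key_equation(two_t_poly, syndrome, t):
    """Extended Euclid on x^{2t} and S(x), stopped once the remainder
    degree is below t: the remainder is (a scalar multiple of) the
    errata evaluator, the Bezout coefficient the errata locator."""
    r0, r1 = [Fraction(c) for c in two_t_poly], [Fraction(c) for c in syndrome]
    t0, t1 = [Fraction(0)], [Fraction(1)]
    while len(r1) - 1 >= t and any(r1):
        q, r = poly_divmod(r0, r1)
        r0, r1 = r1, r
        t0, t1 = t1, poly_sub(t0, poly_mul(q, t1))
    return t1, r1
```

The defining invariant of the iteration, locator(x)·S(x) ≡ evaluator(x) (mod x^{2t}), holds at every step, which is what makes the early stop at degree t valid.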
Optical digital chaos cryptography
NASA Astrophysics Data System (ADS)
Arenas-Pingarrón, Álvaro; González-Marcos, Ana P.; Rivas-Moscoso, José M.; Martín-Pereda, José A.
2007-10-01
In this work we present a new way to mask the data in a one-user communication system when direct sequence - code division multiple access (DS-CDMA) techniques are used. The code is generated by a digital chaotic generator, originally proposed by us and previously reported for a chaos cryptographic system. It is demonstrated that if the user's data signal is encoded with a bipolar phase-shift keying (BPSK) technique, usual in DS-CDMA, it can be easily recovered from a time-frequency domain representation. To avoid this situation, a new system is presented in which a previous dispersive stage is applied to the data signal. A time-frequency domain analysis is performed, and the devices required at the transmitter and receiver end, both user-independent, are presented for the optical domain.
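For reference, plain DS-CDMA spreading and correlation despreading of BPSK data, the baseline that the paper's dispersive stage is designed to harden, can be sketched as follows. A random ±1 code stands in for the chaotic generator described above, and all names and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical +/-1 spreading sequence; in the paper's system the chips
# would come from the digital chaotic generator instead.
CHIPS_PER_BIT = 64
code = rng.choice([-1, 1], size=CHIPS_PER_BIT)

def spread(bits):
    """BPSK spreading: map bits {0,1} to symbols {-1,+1}, then multiply
    each symbol by the full chip sequence."""
    symbols = 2 * np.asarray(bits) - 1
    return np.repeat(symbols, CHIPS_PER_BIT) * np.tile(code, len(bits))

def despread(signal):
    """Correlate each chip-length block with the code; the sign of the
    correlation recovers the transmitted bit."""
    blocks = signal.reshape(-1, CHIPS_PER_BIT)
    return ((blocks @ code) > 0).astype(int)

tx = spread([1, 0, 1, 1, 0])
```

Because each bit is carried by a fixed chip pattern, per-bit structure remains visible to a time-frequency analysis, which is the vulnerability the paper addresses by dispersing the data signal before spreading.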
A Concept for Run-Time Support of the Chapel Language
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
A document presents a concept for run-time implementation of other concepts embodied in the Chapel programming language. (Now undergoing development, Chapel is intended to become a standard language for parallel computing that would surpass older such languages in both computational performance and in the efficiency with which pre-existing code can be reused and new code written.) The aforementioned other concepts are those of distributions, domains, allocations, and access, as defined in a separate document called "A Semantic Framework for Domains and Distributions in Chapel" and linked to a language specification defined in another separate document called "Chapel Specification 0.3." The concept presented in the instant report is recognition that a data domain that was invented for Chapel offers a novel approach to distributing and processing data in a massively parallel environment. The concept is offered as a starting point for development of working descriptions of functions and data structures that would be necessary to implement interfaces to a compiler for transforming the aforementioned other concepts from their representations in Chapel source code to their run-time implementations.
Simulation of Spiral Slot Antennas on Composite Platforms
NASA Technical Reports Server (NTRS)
Volakis, John L.
1996-01-01
The project goals, plan and accomplishments up to this point are summarized in the viewgraphs. Among the various accomplishments, the most important have been: the development of the prismatic finite element code for doubly curved platforms and its validation with many different antenna configurations; the design and fabrication of a new slot spiral antenna suitable for automobile cellular, GPS and PCS communications; the investigation and development of various mesh truncation schemes, including the perfectly matched absorber and various fast integral equation methods; and the introduction of a frequency domain extrapolation technique (AWE) for predicting broadband responses using only a few samples of the response. This report contains several individual reports, most of which have been submitted for publication to refereed journals. For a report on the frequency extrapolation technique, the reader is referred to the UM Radiation Laboratory report. A total of 14 papers have been published or accepted for publication with the full or partial support of this grant. Several more papers are in preparation.
NASA Parts Selection List (NPSL) WWW Site http://nepp.nasa.gov/npsl
NASA Technical Reports Server (NTRS)
Brusse, Jay
2000-01-01
The NASA Parts Selection List (NPSL) is an on-line resource for electronic parts selection tailored for use by spaceflight projects. The NPSL provides a list of commonly used electronic parts that have a history of satisfactory use in spaceflight applications. The objective of this www site is to provide NASA projects, contractors, university experimenters, et al. with an easy-to-use resource that provides a baseline of electronic parts from which designers are encouraged to select. The NPSL is an ongoing resource produced by Code 562 in support of the NASA HQ funded NASA Electronic Parts and Packaging (NEPP) Program. The NPSL is produced as an electronic format deliverable made available via the referenced www site administered by Code 562. The NPSL does not provide information pertaining to patented or proprietary information. All of the information contained in the NPSL is available through various other public domain resources such as US Military procurement specifications for electronic parts, NASA GSFC's Preferred Parts List (PPL-21), and NASA's Standard Parts List (MIL-STD-975).
NASA Technical Reports Server (NTRS)
Denny, Barbara A.; McKenney, Paul E., Sr.; Lee, Danny
1994-01-01
This document is Volume 3 of the final technical report on the work performed by SRI International (SRI) on SRI Project 8600. The document includes source listings for all software developed by SRI under this effort. Since some of our work involved the use of ST-II and the Sun Microsystems, Inc. (Sun) High-Speed Serial Interface (HSI/S) driver, we have included some of the source developed by LBL and BBN as well. In most cases, our decision to include source developed by other contractors depended on whether it was necessary to modify the original code. If we have modified the software in any way, it is included in this document. In the case of the Traffic Generator (TG), however, we have included all the ST-II software, even though BBN performed the integration, because the ST-II software is part of the standard TG release. It is important to note that all the code developed by other contractors is in the public domain, so that all software developed under this effort can be re-created from the source included here.
The Definition, Dimensions, and Domain of Public Relations.
ERIC Educational Resources Information Center
Hutton, James G.
1999-01-01
Discusses how the field of public relations has left itself vulnerable to other fields that are making inroads into public relations' traditional domain, and to critics who are filling in their own definitions of public relations. Proposes a definition and a three-dimensional framework to compare competing philosophies of public relations and to…
75 FR 56528 - EPA's Role in Advancing Sustainable Products
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-16
... services (NAICS code 72). Other services, except public administration (NAICS code 81). Public... Rm. 3334, EPA West Bldg., 1301 Constitution Ave., NW., Washington, DC. The EPA/DC Public Reading Room... holidays. The telephone number of the EPA/DC Public Reading Room is (202) 566-1744, and the telephone...
[Harassment in the public sector].
Puech, Paloma; Pitcho, Benjamin
2013-01-01
The French Labour Code, which provides full protection against moral and sexual harassment, is not applicable to public sector workers. The public hospital is however not exempt from such behaviour, which could go unpunished. Public sector workers are therefore protected by the French General Civil Service Regulations and the penal code.
1 CFR 8.6 - Forms of publication.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 1 General Provisions 1 2014-01-01 2012-01-01 true Forms of publication. 8.6 Section 8.6 General... FEDERAL REGULATIONS § 8.6 Forms of publication. (a) Under section 1506 of title 44, United States Code, the Administrative Committee authorizes publication of the Code of Federal Regulations in the...
1 CFR 8.6 - Forms of publication.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 1 General Provisions 1 2013-01-01 2012-01-01 true Forms of publication. 8.6 Section 8.6 General... FEDERAL REGULATIONS § 8.6 Forms of publication. (a) Under section 1506 of title 44, United States Code, the Administrative Committee authorizes publication of the Code of Federal Regulations in the...
Selected DOE headquarters publications
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1979-04-01
This publication provides listings of (mainly policy and programmatic) publications which have been issued by headquarters organizations of the Department of Energy; assigned a DOE/XXX- type report number code, where XXX is the 1- to 4-letter code for the issuing headquarters organization; received by the Energy Library; and made available to the public.
BioC implementations in Go, Perl, Python and Ruby.
Liu, Wanli; Islamaj Doğan, Rezarta; Kwon, Dongseop; Marques, Hernani; Rinaldi, Fabio; Wilbur, W John; Comeau, Donald C
2014-01-01
As part of a communitywide effort for evaluating text mining and information extraction systems applied to the biomedical domain, BioC is focused on the goal of interoperability, currently a major barrier to wide-scale adoption of text mining tools. BioC is a simple XML format, specified by DTD, for exchanging data for biomedical natural language processing. With initial implementations in C++ and Java, BioC provides libraries of code for reading and writing BioC text documents and annotations. We extend BioC to Perl, Python, Go and Ruby. We used SWIG to extend the C++ implementation for Perl and one Python implementation. A second Python implementation and the Ruby implementation use native data structures and libraries. BioC is also implemented in the Google language Go. BioC modules are functional in all of these languages, which can facilitate text mining tasks. BioC implementations are freely available through the BioC site: http://bioc.sourceforge.net. Database URL: http://bioc.sourceforge.net/ Published by Oxford University Press 2014. This work is written by US Government employees and is in the public domain in the US.
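A minimal example of consuming BioC's XML from Python's standard library: the collection below is a hand-written toy document following the general collection/document/passage/text structure, while real files come from BioC-producing tools and are validated against the DTD.

```python
import xml.etree.ElementTree as ET

# Toy BioC-style collection; a real collection would carry source, date,
# key, offsets, and annotation elements per the BioC DTD.
BIOC_XML = """<collection>
  <source>example</source>
  <document>
    <id>PMC1</id>
    <passage>
      <offset>0</offset>
      <text>BRCA1 is a human tumor suppressor gene.</text>
    </passage>
  </document>
</collection>"""

def passage_texts(xml_string):
    """Return the text content of every passage in a BioC collection."""
    root = ET.fromstring(xml_string)
    return [t.text for t in root.iter("text")]
```

The official BioC libraries in C++, Java, Perl, Python, Go and Ruby wrap this same structure in language-native objects; the point of the format is that a few lines like these suffice to exchange documents between tools.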
Are plant formins integral membrane proteins?
Cvrcková, F
2000-01-01
The formin family of proteins has been implicated in signaling pathways of cellular morphogenesis in both animals and fungi; in the latter case, at least, they participate in communication between the actin cytoskeleton and the cell surface. Nevertheless, they appear to be cytoplasmic or nuclear proteins, and it is not clear whether they communicate with the plasma membrane, and if so, how. Because nothing is known about formin function in plants, I performed a systematic search for putative Arabidopsis thaliana formin homologs. I found eight putative formin-coding genes in the publicly available part of the Arabidopsis genome sequence and analyzed their predicted protein sequences. Surprisingly, some of them lack parts of the conserved formin-homology 2 (FH2) domain and the majority of them seem to have signal sequences and putative transmembrane segments that are not found in yeast or animal formins. Plant formins define a distinct subfamily. The presence in most Arabidopsis formins of sequence motifs typical of transmembrane proteins suggests a mechanism of membrane attachment that may be specific to plant formins, and indicates an unexpected evolutionary flexibility of the conserved formin domain.
Services for domain specific developments in the Cloud
NASA Astrophysics Data System (ADS)
Schwichtenberg, Horst; Gemuend, André
2015-04-01
We will discuss and demonstrate the possibilities of new Cloud services in which the complete development cycle, from programming to testing, takes place in the Cloud. Such services can also be combined with dedicated, research-domain-specific services that hide the burden of accessing the available infrastructures. As an example, we will show a service that is intended to complement the services of the VERCE project's infrastructure, a service that utilizes Cloud resources to offer simplified execution of data pre- and post-processing scripts. It offers users access to the ObsPy seismological toolbox for processing data with the Python programming language, executed on virtual Cloud resources in a secured sandbox. The solution encompasses a frontend with a modern graphical user interface, a messaging infrastructure as well as Python worker nodes for background processing. All components are deployable in the Cloud and have been tested on different environments based on OpenStack and OpenNebula. Deployments on commercial, public Clouds will be tested in the future.
Locality-preserving logical operators in topological stabilizer codes
NASA Astrophysics Data System (ADS)
Webster, Paul; Bartlett, Stephen D.
2018-01-01
Locality-preserving logical operators in topological codes are naturally fault tolerant, since they preserve the correctability of local errors. Using a correspondence between such operators and gapped domain walls, we describe a procedure for finding all locality-preserving logical operators admitted by a large and important class of topological stabilizer codes. In particular, we focus on those equivalent to a stack of a finite number of surface codes of any spatial dimension, where our procedure fully specifies the group of locality-preserving logical operators. We also present examples of how our procedure applies to codes with different boundary conditions, including color codes and toric codes, as well as more general codes such as Abelian quantum double models and codes with fermionic excitations in more than two dimensions.
The Code of the Street and Romantic Relationships: A dyadic analysis
Barr, Ashley B.; Simons, Ronald L.; Stewart, Eric A.
2012-01-01
Since its publication, Elijah Anderson’s (1999) code of the street thesis has found support in studies connecting disadvantage to the internalization of street-oriented values and an associated lifestyle of violent/deviant behavior. This primary emphasis on deviance in public arenas has precluded researchers from examining the implications of the code of the street for less public arenas, like intimate relationships. In an effort to understand if and how the endorsement of the street code may infiltrate such relationships, the present study examines the associations between the code of the street and relationship satisfaction and commitment among young adults involved in heterosexual romantic relationships. Using a dyadic approach, we find that street code orientation, in general, negatively predicts satisfaction and commitment, in part due to increased relationship hostility/conflict associated with the internalization of the code. Gender differences in these associations are considered and discussed at length. PMID:23504000
Murphy, Nada; Epstein, Amy; Leonard, Helen; Davis, Elise; Reddihough, Dinah; Whitehouse, Andrew; Jacoby, Peter; Bourke, Jenny; Williams, Katrina; Downs, Jenny
There are many challenges to health, functioning, and participation for children with Down syndrome; yet, the quality-of-life (QOL) domains important for this group have never been clearly articulated. This study investigated parental observations to identify QOL domains in children with Down syndrome and determined whether domains differed between children and adolescents. The sample comprised 17 families whose child with Down syndrome was aged 6 to 18 years. Primary caregivers took part in semistructured telephone interviews to explore aspects of their child's life that were satisfying or challenging. Qualitative thematic analysis was implemented using a grounded theory framework to identify domains. The coded data set was divided into 2 groups (childhood and adolescence) at 3 age cut points to observe whether differences existed between the coded domains and domain elements: (1) 6 to 11 years with 12 to 18 years; (2) 6 to 13 years with 14 to 18 years; and (3) 6 to 15 years with 16 to 18 years. Eleven domains were identified: physical health, behavior and emotion, personal value, communication, movement and physical activity, routines and predictability, independence and autonomy, social connectedness and relationships, variety of activities, nature and outdoors, and access to services. No differences in domains and domain elements were identified across childhood and adolescence. Our data form a preliminary framework from which to design investigations of the child's perspectives on life quality and suggest a range of necessary supports and services.
Multi-Zone Liquid Thrust Chamber Performance Code with Domain Decomposition for Parallel Processing
NASA Technical Reports Server (NTRS)
Navaz, Homayun K.
2002-01-01
Computational Fluid Dynamics (CFD) has evolved considerably in the last decade. There are many computer programs that can perform computations on viscous internal or external flows with chemical reactions. CFD has become a commonly used tool in the design and analysis of gas turbines, ramjet combustors, turbo-machinery, inlet ducts, rocket engines, jet interaction, missiles, and ramjet nozzles. One of the problems of interest to NASA has always been performance prediction for rocket and air-breathing engines. Due to the complexity of the flow in these engines, it is necessary to resolve the flowfield on a fine mesh to capture quantities like turbulence and heat transfer. However, calculation on a high-resolution grid carries a prohibitive computational cost that can limit the value of CFD for practical engineering calculations. The Liquid Thrust Chamber Performance (LTCP) code was developed for NASA/MSFC (Marshall Space Flight Center) to perform liquid rocket engine performance calculations. This code is a 2D/axisymmetric full Navier-Stokes (NS) solver with fully coupled finite-rate chemistry and Eulerian treatment of liquid fuel and/or oxidizer droplets. One of the advantages of this code has been the resemblance of its input file to that of the JANNAF (Joint Army Navy NASA Air Force Interagency Propulsion Committee) standard TDK code, and its automatic grid generation for JANNAF-defined combustion chamber wall geometry. These options minimize the learning effort for TDK users and make the code a good candidate for performing engineering calculations. Although the LTCP code was developed for liquid rocket engines, it is a general-purpose code and has been used for solving many engineering problems. However, the single-zone formulation has limited the code's applicability to problems with complex geometry.
Furthermore, the computational time becomes prohibitively large for high-resolution problems with chemistry, two-equation turbulence model, and two-phase flow. To overcome these limitations, the LTCP code is rewritten to include the multi-zone capability with domain decomposition that makes it suitable for parallel processing, i.e., enabling the code to run every zone or sub-domain on a separate processor. This can reduce the run time by a factor of 6 to 8, depending on the problem.
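The multi-zone idea can be sketched in a few lines: the grid is split into zones, each zone is advanced by its own worker, and halo values carry boundary information between zones. The following is an illustrative Python sketch only, not LTCP code; threads stand in for the separate processors, and a single Jacobi-style smoothing pass stands in for the Navier-Stokes solve:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def relax_zone(args):
    """Per-zone kernel: one Jacobi-style smoothing pass using halo values."""
    zone, left_halo, right_halo = args
    padded = np.concatenate(([left_halo], zone, [right_halo]))
    return 0.5 * (padded[:-2] + padded[2:])    # average of left/right neighbours

def relax_multizone(u, n_zones):
    """One global smoothing step, computed zone by zone in parallel."""
    bounds = np.linspace(0, len(u), n_zones + 1, dtype=int)
    tasks = []
    for z in range(n_zones):
        lo, hi = bounds[z], bounds[z + 1]
        left = u[lo - 1] if lo > 0 else u[0]   # replicate the end values
        right = u[hi] if hi < len(u) else u[-1]
        tasks.append((u[lo:hi].copy(), left, right))
    with ThreadPoolExecutor(max_workers=n_zones) as pool:
        parts = list(pool.map(relax_zone, tasks))
    return np.concatenate(parts)

u = np.sin(np.linspace(0, np.pi, 64))
print(np.allclose(relax_multizone(u, 1), relax_multizone(u, 4)))  # True: the zone split is transparent
```

The check at the end illustrates the key requirement of any such decomposition: running four zones in parallel must produce the same answer as a single zone, because the halo exchange supplies exactly the neighbour data each zone would otherwise read directly.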
Genomics and the Public Health Code of Ethics
Thomas, James C.; Irwin, Debra E.; Zuiker, Erin Shaugnessy; Millikan, Robert C.
2005-01-01
We consider the public health applications of genomic technologies as viewed through the lens of the public health code of ethics. We note, for example, the potential for genomics to increase our appreciation for the public health value of interdependence, the potential for some genomic tools to exacerbate health disparities because of their inaccessibility by the poor and the way in which genomics forces public health to refine its notions of prevention. The public health code of ethics sheds light on concerns raised by commercial genomic products that are not discussed in detail by more clinically oriented perspectives. In addition, the concerns raised by genomics highlight areas of our understanding of the ethical principles of public health in which further refinement may be necessary. PMID:16257942
Translating MAPGEN to ASPEN for MER
NASA Technical Reports Server (NTRS)
Rabideau, Gregg R.; Knight, Russell L.; Lenda, Matthew; Maldague, Pierre F.
2013-01-01
This software translates MAPGEN (Europa and APGEN) domains to ASPEN, and the resulting domain can be used to perform planning for the Mars Exploration Rover (MER). In other words, this is a conversion of two distinct planning languages (one declarative and one procedural) to a third (declarative) planning language, solving the problem of faithful translation from mixed-domain representations into the ASPEN Modeling Language. The MAPGEN planning system is an example of a hybrid procedural/declarative system where the advantages of each are leveraged to produce an effective planner/scheduler for MER tactical planning. Adapting the ASPEN planning system was investigated, and, with some translation, much of the procedural knowledge encoding proved amenable to declarative knowledge encoding. The approach was to compose translators from the core languages used for adapting MAPGEN, which consist of Europa and APGEN. Europa is a constraint-based planner/scheduler where domains are encoded using a declarative model. APGEN is also constraint-based, in that it tracks constraints on resources, states, and other variables. Domains are encoded in both constraints and code snippets that execute according to a forward sweep through the plan. Europa and APGEN communicate with each other using proxy activities in APGEN that represent constraints and/or tokens in Europa. Composing a translator from Europa to ASPEN was fairly straightforward, as ASPEN is also a declarative planning system, and the specific uses of Europa for the MER domain matched ASPEN's native encoding fairly closely. On the other hand, translating from APGEN to ASPEN was considerably more involved. On the surface, the types of activities and resources one encodes in APGEN appear to map one-to-one to the activities, state variables, and resources in ASPEN.
But, when looking into the definitions of how resources are profiled and activities are expanded, one sees code snippets that, during planning, access information available for the moment in time being planned in order to decide what the appropriate profile or expansion is. APGEN is actually a forward (in time) sweeping discrete event simulator, where the model is composed of code snippets that are artfully interleaved by the engine to produce a plan/schedule. To solve this problem, representative code is simulated as a declarative series of task expansions. Predominantly, three types of procedural models were translated: loops, if statements, and code blocks. Loops and if statements were handled using controlled task expansion, and code blocks were handled using constraint networks that reproduce the results a procedural representation's order of execution would generate. One performance advantage of MAPGEN is the use of APGEN's GUI. This GUI is written in C++ and Motif, and performs very well for large plans.
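The controlled task expansion used for loops can be illustrated with a toy example (hypothetical activities, not MAPGEN or ASPEN syntax): a procedural loop, and an equivalent declarative form in which a task expands into one activity plus a smaller copy of itself until a controlling parameter reaches zero:

```python
# Illustrative sketch only: a procedural code snippet of the kind APGEN executes
# during its forward sweep, and a declarative recursive task expansion of the
# kind a translator could emit for a planner such as ASPEN.

def procedural_heatup(steps):
    """Procedural form: an imperative loop run during the forward sweep."""
    temps = []
    t = 0.0
    for _ in range(steps):
        t += 5.0            # each iteration raises the temperature by 5 units
        temps.append(t)
    return temps

def expand_heatup(steps, t=0.0):
    """Declarative form: a task expands into one activity plus a smaller self."""
    if steps == 0:
        return []           # base case: the expansion terminates
    activity = ("heat_step", t + 5.0)
    return [activity] + expand_heatup(steps - 1, t + 5.0)

print(procedural_heatup(3))                # [5.0, 10.0, 15.0]
print([t for _, t in expand_heatup(3)])    # [5.0, 10.0, 15.0]
```

Both forms yield the same schedule of values; the difference is that the declarative expansion is a static structure a planner can reason over, rather than code that must be executed in sweep order.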
Cross-domain expression recognition based on sparse coding and transfer learning
NASA Astrophysics Data System (ADS)
Yang, Yong; Zhang, Weiyi; Huang, Yong
2017-05-01
Traditional facial expression recognition methods usually assume that the training set and the test set are independent and identically distributed. However, in actual expression recognition applications, the conditions of independent and identical distribution are rarely satisfied for the training set and test set because of differences in lighting, shading, race, and so on. In order to solve this problem and improve the performance of expression recognition in actual applications, a novel method based on transfer learning and sparse coding is applied to facial expression recognition. First, a common primitive model, i.e., the dictionary, is learned. Then, based on the idea of transfer learning, the learned primitive pattern is transferred to facial expressions and the corresponding feature representation is obtained by sparse coding. Experimental results on the CK+, JAFFE, and NVIE databases show that the sparse-coding-based transfer learning method can effectively improve the recognition rate in cross-domain expression recognition tasks and is suitable for practical facial expression recognition applications.
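The sparse-coding step itself can be sketched as follows. This is an illustrative toy, not the paper's method: an orthonormal dictionary is used so that greedy matching pursuit recovers the code exactly, whereas real applications learn an overcomplete dictionary from data and reuse ("transfer") it across domains:

```python
import numpy as np

def matching_pursuit(x, D, n_nonzero):
    """Greedy sparse code of x over dictionary D (columns = unit-norm atoms)."""
    code = np.zeros(D.shape[1])
    residual = x.astype(float).copy()
    for _ in range(n_nonzero):
        k = np.argmax(np.abs(D.T @ residual))   # atom best matching the residual
        w = D[:, k] @ residual
        code[k] += w
        residual -= w * D[:, k]
    return code

rng = np.random.default_rng(0)
D, _ = np.linalg.qr(rng.normal(size=(16, 16)))  # orthonormal atoms: exact recovery
x = 2.0 * D[:, 5] - 1.5 * D[:, 9]               # signal built from two atoms
c = matching_pursuit(x, D, n_nonzero=2)
print(np.allclose(D @ c, x))                    # True
```

The resulting sparse vector `c` is the kind of feature representation that a downstream classifier would consume; the transfer-learning idea is that `D`, learned once on a source domain, is reused to encode samples from a different target domain.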
Little, Elizabeth A; Presseau, Justin; Eccles, Martin P
2015-06-17
Behavioural theory can be used to better understand the effects of behaviour change interventions targeting healthcare professional behaviour to improve quality of care. However, the explicit use of theory is rarely reported, despite interventions inevitably involving at least an implicit idea of what factors to target to implement change. There is a quality of care gap in the post-fracture investigation (bone mineral density (BMD) scanning) and management (bisphosphonate prescription) of patients at risk of osteoporosis. We aimed to use the Theoretical Domains Framework (TDF) within a systematic review of interventions to improve quality of care in post-fracture investigation. Our objectives were to explore which theoretical factors the interventions in the review may have been targeting and how this might be related to the size of the effect on rates of BMD scanning and osteoporosis treatment with bisphosphonate medication. A behavioural scientist and a clinician independently coded TDF domains in intervention and control groups. Quantitative analyses explored the relationship between intervention effect size and both the total number of times domains were targeted and the number of different domains targeted. Nine randomised controlled trials (RCTs) (10 interventions) were analysed. The five theoretical domains most frequently coded as being targeted by the interventions in the review were "memory, attention and decision processes", "knowledge", "environmental context and resources", "social influences" and "beliefs about consequences". Each intervention targeted a combination of at least four of these five domains. Analyses identified an inverse relationship between effect size for BMD scanning and both the number of times domains were coded and the number of different domains coded, but no such relationship for bisphosphonate prescription, suggesting that the more domains an intervention targeted, the lower the observed effect size.
When explicit use of theory to inform interventions is absent, it is possible to retrospectively identify the likely targeted factors using theoretical frameworks such as the TDF. In osteoporosis management, this suggested that several likely determinants of healthcare professional behaviour appear not yet to have been considered in implementation interventions. This approach may serve as a useful basis for using theory-based frameworks such as the TDF to retrospectively identify targeted factors within systematic reviews of implementation interventions in other implementation contexts.
Single Airfoil Gust Response Problem: Category 3, Problem 1
NASA Technical Reports Server (NTRS)
Scott, James R.
2004-01-01
An unsteady aerodynamic code, called GUST3D (ref. 3), has been developed to solve equation (8) for flows with periodic vortical disturbances. The code uses a frequency-domain approach with second-order central differences and a pressure radiation condition in the far field. GUST3D requires as input certain mean flow quantities which are calculated separately by a potential flow solver. The solver calculates the mean flow using a Gothert's Rule approximation (ref. 3). On the airfoil surface, it uses the solution calculated by the potential code FLO36 (ref. 4). Figures 1-2 show the mean pressure along the airfoil surface for the two airfoil geometries. In Figures 3-8, we present the RMS pressure on the airfoil surface. Each figure shows three GUST3D solutions (calculated on grids with different far-field boundary locations). Three solutions are shown to provide some indication of the numerical uncertainty in the results. Figures 9-13 present the acoustic intensity. We again show three solutions per case. Note that no results are presented for the k1 = k2 = 2.0 loaded airfoil case, as an acceptable solution could not be obtained. A few comments need to be made about the results shown. First, since the last Workshop, the GUST3D code has been substantially upgraded. This includes implementing a more accurate far-field boundary condition (ref. 5) and developing improved gridding capabilities. This is the reason for any differences that may exist between the present results and results from the last Workshop. Second, the intensity results on the circle R = 4C were obtained using a Kirchhoff method (ref. 6). The Kirchhoff surface was the circle R = 2C. Finally, the GUST3D code is most accurate for low reduced frequencies. A new domain decomposition approach (ref. 7) has been developed to improve accuracy. Both the single domain and domain decomposition approaches were used in generating the present results.
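The essence of a frequency-domain approach with second-order central differences can be shown on a toy analogue (not GUST3D itself): for a single-frequency disturbance the time dependence drops out, leaving a Helmholtz-type equation for the complex amplitude that is solved as one linear system. Here a 1-D version with Dirichlet boundaries is checked against a manufactured solution:

```python
import numpy as np

def helmholtz_1d(f, k, n):
    """Solve u'' + k^2 u = f on [0,1], u(0)=u(1)=0, on n interior points,
    using second-order central differences."""
    h = 1.0 / (n + 1)
    main = np.full(n, -2.0 / h**2 + k**2)
    off = np.full(n - 1, 1.0 / h**2)
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.solve(A, f)

# Manufactured solution u = sin(pi x) implies f = (k^2 - pi^2) sin(pi x).
k, n = 2.0, 199
x = np.linspace(0, 1, n + 2)[1:-1]
f = (k**2 - np.pi**2) * np.sin(np.pi * x)
u = helmholtz_1d(f, k, n)
print(np.max(np.abs(u - np.sin(np.pi * x))) < 1e-3)  # True: second-order accurate
```

The real code replaces the Dirichlet conditions with a far-field radiation condition and works in 2-D/3-D, but the structure is the same: one steady linear solve per frequency instead of time marching.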
NASA Astrophysics Data System (ADS)
Grunloh, Timothy P.
The objective of this dissertation is to develop a 3-D domain-overlapping coupling method that leverages the superior flow field resolution of the Computational Fluid Dynamics (CFD) code STAR-CCM+ and the fast execution of the System Thermal Hydraulic (STH) code TRACE to efficiently and accurately model thermal hydraulic transport properties in nuclear power plants under complex conditions of regulatory and economic importance. The primary contribution is the novel Stabilized Inertial Domain Overlapping (SIDO) coupling method, which allows for on-the-fly correction of TRACE solutions for local pressures and velocity profiles inside multi-dimensional regions based on the results of the CFD simulation. The method is found to outperform the more frequently used domain decomposition coupling methods. An STH code such as TRACE is designed to simulate large, diverse component networks, requiring simplifications to the fluid flow equations for reasonable execution times. Empirical correlations are therefore required for many sub-grid processes. The coarse grids used by TRACE diminish sensitivity to small-scale geometric details such as Reactor Pressure Vessel (RPV) internals. A CFD code such as STAR-CCM+ uses much finer computational meshes that are sensitive to the geometric details of reactor internals. In turbulent flows it is infeasible to fully resolve the flow solution, but the correlations used to model turbulence operate at a much lower, more local level, so the CFD code can resolve smaller-scale flow processes. The development of a 3-D coupling method was carried out with the intention of improving predictive capabilities for transport properties in the downcomer and lower plenum regions of an RPV in reactor safety calculations. These regions are responsible for the multi-dimensional mixing effects that determine the distribution at the core inlet of quantities with reactivity implications, such as fluid temperature and dissolved neutron absorber concentration.
Liu, Guoyuan; Li, Xue; Guo, Liping; Zhang, Xuexian; Qi, Tingxiang; Wang, Hailin; Tang, Huini; Qiao, Xiuqin; Zhang, Jinfa; Xing, Chaozhu; Wu, Jianyong
2017-01-01
The RNA editing occurring in plant organellar genomes mainly involves the change of cytidine to uridine. This process involves a deamination reaction, with cytidine deaminase as the catalyst. Pentatricopeptide repeat (PPR) proteins with a C-terminal DYW domain are reportedly associated with cytidine deamination, similar to members of the deaminase superfamily. PPR genes are involved in many cellular functions and biological processes including fertility restoration to cytoplasmic male sterility (CMS) in plants. In this study, we identified 227 and 211 DYW deaminase-coding PPR genes for the cultivated tetraploid cotton species G. hirsutum and G. barbadense (2n = 4x = 52), respectively, as well as 126 and 97 DYW deaminase-coding PPR genes in the ancestral diploid species G. raimondii and G. arboreum (2n = 26), respectively. The 227 G. hirsutum PPR genes were predicted to encode 52–2016 amino acids, 203 of which were mapped onto 26 chromosomes. Most DYW deaminase genes lacked introns, and their proteins were predicted to target the mitochondria or chloroplasts. Additionally, the DYW domain differed from the complete DYW deaminase domain, which contained part of the E domain and the entire E+ domain. The types and number of DYW tripeptides may have been influenced by evolutionary processes, with some tripeptides being lost. Furthermore, a gene ontology analysis revealed that DYW deaminase functions were mainly related to binding as well as hydrolase and transferase activities. The G. hirsutum DYW deaminase expression profiles varied among different cotton tissues and developmental stages, and no differentially expressed DYW deaminase-coding PPRs were directly associated with the male sterility and restoration in the CMS-D2 system. 
Our study provides important information regarding the structural and evolutionary characteristics of Gossypium DYW-containing PPR genes coding for deaminases and will be useful for characterizing the DYW deaminase gene family in cotton biology and breeding. PMID:28339482
Causes of deaths data, linkages and big data perspectives.
Rey, Grégoire; Bounebache, Karim; Rondet, Claire
2018-07-01
The study of cause-specific mortality data is one of the main sources of information for public health monitoring. In most industrialized countries, when a death occurs, it is a legal requirement that a medical certificate based on the international form recommended by the World Health Organization (WHO) is filled in by a physician. The physician reports the causes that directly led or contributed to the death on the death certificate. The death certificate is then forwarded to a coding office, where each cause is coded and one underlying cause is defined, using the rules of the International Classification of Diseases and Related Health Problems, now in its 10th Revision (ICD-10). Recently, a growing number of countries have adopted, or have decided to adopt, the coding software Iris, developed and maintained by an international consortium. This standardized production process results in a high and constantly increasing international comparability of cause-specific mortality data. While these data can be used for international comparisons and benchmarking of the global burden of disease, quality of care, and prevention policies, there are also many other ways and methods to explore their richness, especially when they are linked with other data sources. Some of these methods belong to the so-called "big data" field. These methods could be applied to the production of the data, to their statistical processing, and, even more, to processing these data once linked to other databases. In the present note, we describe the main domains in which this new field of methods could be applied. We focus specifically on the context of France, a country of 65 million inhabitants with a centralized health data system. Finally, we emphasize the importance of data quality and the specific problems related to death certification in the forensic medicine domain. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
1 CFR 5.4 - Publication not authorized.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 1 General Provisions 1 2014-01-01 2012-01-01 true Publication not authorized. 5.4 Section 5.4... Publication not authorized. (a) Chapter 15 of title 44, United States Code, does not apply to treaties...) Chapter 15 of title 44, United States Code, prohibits the publication in the Federal Register of comments...
1 CFR 5.4 - Publication not authorized.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 1 General Provisions 1 2013-01-01 2012-01-01 true Publication not authorized. 5.4 Section 5.4... Publication not authorized. (a) Chapter 15 of title 44, United States Code, does not apply to treaties...) Chapter 15 of title 44, United States Code, prohibits the publication in the Federal Register of comments...
48 CFR 2501.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Publication and code arrangement. 2501.104-1 Section 2501.104-1 Federal Acquisition Regulations System NATIONAL SCIENCE FOUNDATION... code arrangement. (a) The NSFAR is published in the daily issues of the Federal Register and, in...
48 CFR 2501.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Publication and code arrangement. 2501.104-1 Section 2501.104-1 Federal Acquisition Regulations System NATIONAL SCIENCE FOUNDATION... code arrangement. (a) The NSFAR is published in the daily issues of the Federal Register and, in...
48 CFR 2501.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Publication and code arrangement. 2501.104-1 Section 2501.104-1 Federal Acquisition Regulations System NATIONAL SCIENCE FOUNDATION... code arrangement. (a) The NSFAR is published in the daily issues of the Federal Register and, in...
Gollust, Sarah E; Dwyer, Anne M
2013-12-01
Cancer experts engage in public communication whenever they promote their research or practice, respond to media inquiries, or use social media. In a changing communication landscape characterized by new technologies and heightened attention to cancer controversies, these activities may pose ethical challenges. This study was designed to evaluate existing resources to help clinicians navigate their public communication activities. We conducted a systematic, qualitative content analysis of codes of ethics, policy statements, and similar documents disseminated by professional medical and nursing societies for their members. We examined these documents for four types of content related to public communication: communication via traditional media; communication via social media; other communication to the public, policy, and legal spheres; and nonspecific language regarding public communication. We identified 46 documents from 23 professional societies for analysis. Five societies had language about traditional news media communication, five had guidance about social media, 11 had guidance about other communication domains, and 15 societies offered general language about public communication. The limited existing guidance focused on ethical issues related to patients (such as privacy violations) or clinicians (such as accuracy and professional boundaries), with less attention to population or policy impact of communication. Cancer-related professional societies might consider establishing more specific guidance for clinicians concerning their communication activities in light of changes to the communication landscape. Additional research is warranted to understand the extent to which clinicians face ethical challenges in public communication.
HERCULES: A Pattern Driven Code Transformation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing
2012-01-01
New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist separate the two concerns, which improves code maintenance and facilitates performance optimization. The system combines three technologies, code patterns, transformation scripts and compiler plugins, to provide the scientist with an environment to quickly implement code transformations that suit his needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation, and an initial evaluation of HERCULES.
Progress in The Semantic Analysis of Scientific Code
NASA Technical Reports Server (NTRS)
Stewart, Mark
2000-01-01
This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
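A miniature, hypothetical version of this idea can be sketched in a few lines (the abstract does not show the actual parsers, so the rule below is deliberately crude and illustrative): given semantic declarations for primitive variables, a rule-based parser inspects user code and recognizes a physics formula:

```python
import ast

# Hypothetical user-supplied semantic declarations for primitive variables.
SEMANTICS = {"rho": "density", "v": "velocity"}

def recognize(expr):
    """Crude expert-parser rule: flag 0.5 * rho * v**2 as dynamic pressure
    when the declared semantics of the participating variables match."""
    tree = ast.parse(expr, mode="eval").body
    names = {n.id for n in ast.walk(tree) if isinstance(n, ast.Name)}
    consts = {c.value for c in ast.walk(tree) if isinstance(c, ast.Constant)}
    if ({"rho", "v"} <= names and 0.5 in consts
            and SEMANTICS.get("rho") == "density"
            and SEMANTICS.get("v") == "velocity"):
        return "dynamic pressure"
    return "unrecognized"

print(recognize("0.5 * rho * v**2"))   # dynamic pressure
print(recognize("rho * g * h"))        # unrecognized
```

A production version would check the expression's structure, not just its constituents, and would carry rules for many formulae across physics, numerics, mathematics, and geometry, as the abstract describes.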
Energy information data base: report number codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1979-09-01
Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with the codes each has used. (RWR)
Pan, Mei; Zhu, Yi-Xuan; Wu, Kai; Chen, Ling; Hou, Ya-Jun; Yin, Shao-Yun; Wang, Hai-Ping; Fan, Ya-Nan; Su, Cheng-Yong
2017-11-13
Core-shell or striped heteroatomic lanthanide metal-organic framework hierarchical single crystals were obtained by liquid-phase anisotropic epitaxial growth, maintaining identical periodic organization while simultaneously exhibiting spatially segregated structure. Different types of domain and orientation-controlled multicolor photophysical models are presented, which show either visually distinguishable or visible/near infrared (NIR) emissive colors. This provides a new bottom-up strategy toward the design of hierarchical molecular systems, offering high-throughput and multiplexed luminescence color tunability and readability. The unique capability of combining spectroscopic coding with 3D (three-dimensional) microscale spatial coding is established, providing potential applications in anti-counterfeiting, color barcoding, and other types of integrated and miniaturized optoelectronic materials and devices. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Efficient convolutional sparse coding
Wohlberg, Brendt
2017-06-20
Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M^3 N) to O(M N log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
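The key trick, solving the main linear system in the frequency domain, can be illustrated on a simpler cousin of the ADMM subproblem (this is a sketch, not Wohlberg's implementation): a Tikhonov-regularized deconvolution. Circular convolution diagonalizes under the DFT, so the normal equations decouple into one scalar equation per frequency:

```python
import numpy as np

def fft_solve(d, s, rho):
    """argmin_x ||d (*) x - s||^2 + rho ||x||^2, (*) = circular convolution,
    solved per frequency in closed form: X = conj(D) S / (|D|^2 + rho)."""
    D, S = np.fft.fft(d), np.fft.fft(s)
    return np.real(np.fft.ifft(np.conj(D) * S / (np.abs(D)**2 + rho)))

def dense_solve(d, s, rho):
    """Reference: build the circulant operator explicitly and solve the
    O(N^3) normal equations directly."""
    n = len(d)
    C = np.column_stack([np.roll(d, i) for i in range(n)])  # circulant matrix
    return np.linalg.solve(C.T @ C + rho * np.eye(n), C.T @ s)

rng = np.random.default_rng(1)
d, s = rng.normal(size=64), rng.normal(size=64)
print(np.allclose(fft_solve(d, s, 0.1), dense_solve(d, s, 0.1)))  # True
```

The FFT route costs O(N log N) against O(N^3) for the dense solve, which is the same kind of saving the abstract reports for the multi-element dictionary case.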
NASA Technical Reports Server (NTRS)
Shankar, V.; Rowell, C.; Hall, W. F.; Mohammadian, A. H.; Schuh, M.; Taylor, K.
1992-01-01
Accurate and rapid evaluation of radar signature for alternative aircraft/store configurations would be of substantial benefit in the evolution of integrated designs that meet radar cross-section (RCS) requirements across the threat spectrum. Finite-volume time-domain methods offer the possibility of modeling the whole aircraft, including penetrable regions and stores, at longer wavelengths on today's gigaflop supercomputers and at typical airborne radar wavelengths on the teraflop computers of tomorrow. A structured-grid finite-volume time-domain computational fluid dynamics (CFD)-based RCS code has been developed at the Rockwell Science Center, and this code incorporates modeling techniques for general radar-absorbing materials and structures. Using this work as a base, the goal of the CFD-based computational electromagnetics (CEM) effort is to define, implement, and evaluate various code development issues suitable for rapid prototype signature prediction.
NASA Astrophysics Data System (ADS)
Kim, Sungwon
Ferroelectric LiNbO3 and LiTaO3 crystals have developed over the last 50 years as key materials for integrated and nonlinear optics due to their large electro-optic and nonlinear optical coefficients and a broad transparency range from 0.4 μm to 4.5 μm. Applications include high-speed optical modulation and switching in the 40 GHz range, second harmonic generation, optical parametric amplification, pulse compression, and so on. Ferroelectric domain microengineering has led to electro-optic scanners, dynamic focusing lenses, total internal reflection switches, and quasi-phase matched (QPM) frequency doublers. Most of these applications have so far been on non-stoichiometric compositions of these crystals. Recent breakthroughs in crystal growth have, however, opened up an entirely new window of opportunity from both scientific and technological viewpoints. The growth of stoichiometric-composition crystals has led to the discovery of many fascinating effects arising from the presence or absence of atomic defects, such as order-of-magnitude changes in coercive fields, internal fields, and domain backswitching and stabilization phenomena. On the nanoscale, unexpected features such as the presence of wide regions of optical contrast and strain have been discovered at 180° domain walls. Such strong influence of small amounts of nonstoichiometric defects on material properties has led to new device applications, particularly those involving domain patterning and shaping, such as QPM devices in thick bulk crystals and compositions with improved resistance to photorefractive damage. The central focus of this dissertation is to explore the role of nonstoichiometry and its precise influence on macroscale and nanoscale properties in lithium niobate and tantalate. Macroscale properties are studied using a combination of in-situ and high-speed electro-optic imaging microscopy and electrical switching experiments.
Local static and dynamic strain properties at individual domain walls are studied using X-ray synchrotron imaging with and without in-situ electric fields. Nanoscale optical properties are studied using Near-Field Scanning Optical Microscopy (NSOM). Finite Difference Time Domain (FDTD) codes, Beam Propagation Method (BPM) codes, and X-ray tracing codes have been developed to simulate NSOM images and X-ray topography images, and thereby to extract the local optical and strain properties, respectively. A 3-D ferroelectric domain simulation code based on Time-Dependent Ginzburg-Landau (TDGL) theory and group theory has been developed to understand the nature of these local wall strains and the preferred wall orientations. By combining these experimental and numerical tools, we have also proposed a defect-dipole model and a mechanism by which the defects interact with the domain walls. This thesis has thus built a more comprehensive picture of the influence of defects on domain walls at the nanoscale and macroscale, and raises new scientific questions about the exact nature of domain wall-defect interactions. Beyond the specific problem of ferroelectrics, the experimental and simulation tools developed in this thesis will have wider application in the area of materials science.
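To give a flavor of the FDTD technique mentioned above, here is a generic 1-D Yee-scheme sketch (parameters and field setup are illustrative and unrelated to the dissertation's actual simulation codes): the electric and magnetic fields live on staggered grids and are advanced in a leapfrog fashion, with stability guaranteed by keeping the Courant number at or below 1:

```python
import numpy as np

def fdtd_1d(n_cells=200, n_steps=300, src=100):
    """Leapfrog update of E and H on a staggered 1-D grid (vacuum, normalized
    units, perfectly conducting ends)."""
    e = np.zeros(n_cells)
    h = np.zeros(n_cells - 1)
    c = 0.5                                      # Courant number <= 1 for stability
    for t in range(n_steps):
        h += c * np.diff(e)                      # H update (half step behind E)
        e[1:-1] += c * np.diff(h)                # E update from the curl of H
        e[src] += np.exp(-((t - 30) / 10.0)**2)  # soft Gaussian source
    return e

field = fdtd_1d()
print(np.all(np.isfinite(field)))  # True: stable run at Courant number 0.5
```

Real simulations of NSOM imaging extend this to 2-D/3-D with material parameters and absorbing boundaries, but the staggered-grid leapfrog core is the same.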
The moving mesh code SHADOWFAX
NASA Astrophysics Data System (ADS)
Vandenbroucke, B.; De Rijcke, S.
2016-07-01
We introduce the moving mesh code SHADOWFAX, which can be used to evolve a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. The code is written in C++ and its source code is made available to the scientific community under the GNU Affero General Public Licence. We outline the algorithm and the design of our implementation, and demonstrate its validity through the results of a set of basic test problems, which are also part of the public version. We also compare SHADOWFAX with a number of other publicly available codes using different hydrodynamical integration schemes, illustrating the advantages and disadvantages of the moving mesh technique.
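The advantage of the moving mesh technique over a fixed grid can be illustrated with a minimal 1D advection sketch (purely illustrative; not taken from SHADOWFAX, and the scheme and parameters are assumptions): a first-order upwind scheme on a static grid diffuses a profile, while cells that move with the flow carry it exactly.

```python
import numpy as np

# 1D advection u_t + a*u_x = 0 of a Gaussian pulse on the periodic domain [0, 1).
N, a, dt, steps = 100, 1.0, 0.004, 100
x = np.linspace(0.0, 1.0, N, endpoint=False)
u0 = np.exp(-200.0 * (x - 0.3) ** 2)
shift = a * dt * steps

# Static (Eulerian) grid: first-order upwind, numerically diffusive.
u = u0.copy()
c = a * dt / (x[1] - x[0])            # CFL number, 0.4 here
for _ in range(steps):
    u = u - c * (u - np.roll(u, 1))

# Moving mesh limit: cells travel with the flow, so in the comoving frame
# nothing changes; only the cell positions shift.
x_moved = (x + shift) % 1.0

exact = np.exp(-200.0 * (((x - shift) % 1.0) - 0.3) ** 2)
err_eulerian = np.max(np.abs(u - exact))   # substantial numerical diffusion
err_moving = np.max(np.abs(u0 - np.exp(-200.0 * (((x_moved - shift) % 1.0) - 0.3) ** 2)))
```

In this linear-advection limit the moving mesh reproduces the exact solution up to rounding error, which is the intuition behind the technique's low advection errors.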
High performance Python for direct numerical simulations of turbulent flows
NASA Astrophysics Data System (ADS)
Mortensen, Mikael; Langtangen, Hans Petter
2016-06-01
Direct Numerical Simulation (DNS) of the Navier-Stokes equations is an invaluable research tool in fluid dynamics. Still, there are few publicly available research codes and, due to the heavy number crunching implied, available codes are usually written in low-level languages such as C/C++ or Fortran. In this paper we describe a pure scientific Python pseudo-spectral DNS code that nearly matches the performance of C++ for thousands of processors and billions of unknowns. We also describe a version optimized through Cython that is found to match the speed of C++. The solvers are written from scratch in Python, including the mesh, the MPI domain decomposition, and the temporal integrators. The solvers have been verified and benchmarked on the Shaheen supercomputer at the KAUST supercomputing laboratory, and we are able to show very good scaling up to several thousand cores. A very important part of the implementation is the mesh decomposition (we implement both slab and pencil decompositions) and 3D parallel Fast Fourier Transforms (FFT). The mesh decomposition and FFT routines have been implemented in Python using serial FFT routines (either NumPy, pyFFTW or any other serial FFT module), NumPy array manipulations and with MPI communications handled by MPI for Python (mpi4py). We show how we are able to execute a 3D parallel FFT in Python for a slab mesh decomposition using 4 lines of compact Python code, for which the parallel performance on Shaheen is found to be slightly better than similar routines provided through the FFTW library. For a pencil mesh decomposition, 7 lines of code are required to execute a transform.
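The slab-decomposed parallel FFT described above can be emulated in a few lines of serial NumPy (a sketch; the mpi4py Alltoall exchange of the real code is stood in for by a simple concatenation): each "rank" transforms its local x-slabs over the two in-slab axes, the data are exchanged, and the final transform runs along x.

```python
import numpy as np

def slab_fft3(u, nproc):
    # Emulate a slab-decomposed 3D FFT: each of `nproc` ranks owns N/nproc
    # planes along x. FFT locally over the two in-slab axes, then gather
    # (the step that is an MPI Alltoall in the real code) and transform x.
    N = u.shape[0]
    slabs = [u[r * N // nproc:(r + 1) * N // nproc] for r in range(nproc)]
    slabs = [np.fft.fft2(s, axes=(1, 2)) for s in slabs]   # local 2D FFTs
    full = np.concatenate(slabs, axis=0)                   # stand-in for Alltoall
    return np.fft.fft(full, axis=0)                        # final 1D FFT along x

N = 16
u = np.random.default_rng(1).standard_normal((N, N, N))
uh = slab_fft3(u, nproc=4)
```

Because the multidimensional FFT factorizes axis by axis, the slab result agrees with `np.fft.fftn(u)` to rounding error.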
NASA Technical Reports Server (NTRS)
Kandula, Max; Pearce, Daniel
1989-01-01
A steady incompressible three-dimensional (3-D) viscous flow analysis was conducted for the Space Shuttle Main Propulsion External Tank (ET)/Orbiter (ORB) propellant feed line quick separable 17-inch disconnect flapper valves for liquid oxygen (LO2) and liquid hydrogen (LH2). The main objectives of the analysis were to predict and correlate the hydrodynamic stability of the flappers and pressure drop with available water test data. Computational Fluid Dynamics (CFD) computer codes were procured at no cost from the public domain, and were modified and extended to carry out the disconnect flow analysis. The grid generator codes SVTGD3D and INGRID were obtained. NASA Ames Research Center supplied the flow solution code INS3D, and the color graphics code PLOT3D. A driver routine was developed to automate the grid generation process. Components such as pipes, elbows, and flappers can be generated with simple commands, and flapper angles can be varied easily. The flow solver INS3D code was modified to treat interior flappers, and other interfacing routines were developed, which include a turbulence model, a force/moment routine, a time-step routine, and initial and boundary conditions. In particular, an under-relaxation scheme was implemented to enhance the solution stability. Major physical assumptions and simplifications made in the analysis include the neglect of linkages, slightly reduced flapper diameter, and smooth solid surfaces. A grid size of 54 x 21 x 25 was employed for both the LO2 and LH2 units. Mixing length theory applied to turbulent shear flow in pipes formed the basis for the simple turbulence model. Results of the analysis are presented for LO2 and LH2 disconnects.
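The under-relaxation scheme mentioned for stabilizing the INS3D solution follows a generic pattern; a minimal sketch (the fixed-point map and relaxation factor below are illustrative, not from the report) blends each iterate with its update to damp oscillations:

```python
import numpy as np

def relaxed_fixed_point(g, x0, omega=0.5, tol=1e-10, maxit=500):
    # Under-relaxed iteration: x <- (1 - omega)*x + omega*g(x).
    # omega < 1 trades convergence speed for stability.
    x = x0
    for _ in range(maxit):
        x_new = (1.0 - omega) * x + omega * g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: the classic fixed point of cos(x), near 0.739
root = relaxed_fixed_point(np.cos, 1.0)
```

In a flow solver the same blending is applied field-wide between outer iterations, which is what keeps an otherwise marginally stable iteration from diverging.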
Debugging Techniques Used by Experienced Programmers to Debug Their Own Code.
1990-09-01
...Davis, and Schultz (1987) also compared experts and novices, but focused on the way a computer program is represented cognitively and how that...of theories in the emerging computer programming domain (Fisher, 1987). In protocol analysis, subjects are asked to talk/think aloud as they solve...
Nodes and Codes: The Reality of Cyber Warfare
2012-05-17
Nodes and Codes explores the reality of cyber warfare through the story of Stuxnet, a string of weaponized code that reached through a domain...nodes. Stuxnet served as a proof-of-concept for cyber weapons and provided a comparative laboratory to study the reality of cyber warfare from the...military powers most often associated with advanced, offensive cyber attack capabilities. The reality of cyber warfare holds significant operational
From Theory to Practice: Measuring end-of-life communication quality using multiple goals theory.
Van Scoy, L J; Scott, A M; Reading, J M; Chuang, C H; Chinchilli, V M; Levi, B H; Green, M J
2017-05-01
To describe how multiple goals theory can be used as a reliable and valid measure (i.e., coding scheme) of the quality of conversations about end-of-life issues. We analyzed 17 conversations in which 68 participants (mean age = 51 years) played a game that prompted discussion in response to open-ended questions about end-of-life issues. Conversations (mean duration = 91 min) were audio-recorded and transcribed. Communication quality was assessed by three coders who assigned numeric scores rating how well individuals accomplished task, relational, and identity goals in the conversation. The coding measure, which results in a quantifiable outcome, yielded strong reliability (intra-class correlation range = 0.73-0.89 and Cronbach's alpha range = 0.69-0.89 for each of the coded domains) and validity (using multilevel nonlinear modeling, we detected significant variability in scores between games for each of the coded domains, all p-values <0.02). Our coding scheme provides a theory-based measure of end-of-life conversation quality that is superior to other methods of measuring communication quality. Our description of the coding method enables researchers to adapt and apply this measure to communication interventions in other clinical contexts. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
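The Cronbach's alpha reliability reported per coded domain follows a standard formula; a minimal sketch across coders, with hypothetical scores (not the study's data), is:

```python
import numpy as np

def cronbach_alpha(scores):
    # scores: rows = conversations, columns = coders (treated as "items").
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Hypothetical scores from three coders rating four conversations
scores = np.array([[3., 4., 3.],
                   [5., 5., 4.],
                   [2., 2., 2.],
                   [4., 4., 5.]])
alpha = cronbach_alpha(scores)   # high inter-coder agreement -> alpha near 1
```

With perfectly agreeing coders the statistic reaches exactly 1, which is why values near the study's 0.69-0.89 range indicate acceptable to strong consistency.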
Dimitrieva, Slavica; Anisimova, Maria
2014-01-01
In protein-coding genes, synonymous mutations are often thought not to affect fitness and therefore are not subject to natural selection. Yet increasingly, cases of non-neutral evolution at certain synonymous sites have been reported over the last decade. To evaluate the extent and the nature of site-specific selection on synonymous codons, we computed the site-to-site synonymous rate variation (SRV) and identified gene properties that make SRV more likely in a large database of protein-coding gene families and protein domains. To our knowledge, this is the first study that explores the determinants and patterns of the SRV in real data. We show that the SRV is widespread in the evolution of protein-coding sequences, casting doubt on the validity of the synonymous rate as a standard neutral proxy. While protein domains rarely undergo adaptive evolution, the SRV appears to play an important role in optimizing the domain function at the level of DNA. In contrast, protein families are more likely to evolve by positive selection, but are less likely to exhibit SRV. Stronger SRV was detected in genes with stronger codon bias and tRNA reusage, those coding for proteins with a larger number of interactions or forming a larger number of structures, those located in intracellular components, and those involved in typically conserved complex processes and functions. Genes with extreme SRV show higher expression levels in nearly all tissues. This indicates that codon bias in a gene, which often correlates with gene expression, may often be a site-specific phenomenon regulating the speed of translation along the sequence, consistent with the co-translational folding hypothesis. Strikingly, genes with SRV were strongly overrepresented for metabolic pathways and those associated with several genetic diseases, particularly cancers and diabetes.
NASA Astrophysics Data System (ADS)
Khadra, Wisam M.; Stuyfzand, Pieter J.
2018-03-01
To date, there has been no agreement on the best way to simulate saltwater intrusion (SWI) in karst aquifers. An equivalent porous medium (EPM) is usually assumed without justification of its applicability. In this paper, SWI in a poorly karstified aquifer in Lebanon is simulated in various ways and compared to measurements. Time series analysis of rainfall and aquifer response is recommended to decide whether quickflow through conduits can be safely ignored. This aids in justifying the selection of the exemplified EPM model. To examine the improvement of SWI representation when discrete features (DFs) are embedded in the model domain, the results of a coupled discrete-continuum (CDC) approach (a hybrid EPM-DF approach) are compared to the EPM model. The two approaches yielded reasonable patterns of hydraulic head and groundwater salinity, which seem trustworthy enough for management purposes. The CDC model also reproduced some local anomalous chloride patterns, being more adaptable with respect to the measurements. It improved the overall accuracy of salinity predictions at wells and better represented the fresh-brackish water interface. Therefore, the CDC approach can be beneficial in modeling SWI in poorly karstified aquifers, and should be compared with the results of the EPM method to decide whether the differences in the outcome at local scale warrant its (more complicated) application. The simulation utilized the SEAWAT code since it is density dependent and public domain, and it enjoys widespread application. Including DFs necessitated manual handling because the selected code has no built-in option for such features.
Fox, Michael H; Krahn, Gloria L; Sinclair, Lisa B; Cahill, Anthony
2015-07-01
Surveillance on paralysis prevalence has been conceptually and methodologically challenging. Numerous methods have been used to approximate population-level paralysis prevalence estimates leading to widely divergent prevalence estimates. To describe three phases in use of the International Classification of Functioning, Disability and Health (ICF) as a framework and planning tool for defining paralysis and developing public health surveillance of this condition. Description of the surveillance methodology covers four steps: an assessment of prior data collection efforts that included a review of existing surveys, registries and other data collection efforts designed to capture both case definitions in use and prevalence of paralysis; use of a consensus conference of experts to develop a case definition of paralysis based on the ICF rather than medical diagnostic criteria; explanation of use of the ICF framework for domains of interest to develop, cognitively test, validate and administer a brief self-report questionnaire for telephone administration on a population; and development and administration of a Paralysis Prevalence and Health Disparities Survey that used content mapping to back code items from existing national surveys to operationalize key domains. ICF coding led to a national population-based survey of paralysis that produced accurate estimates of prevalence and identification of factors related to the health of people in the U.S. living with paralysis. The ICF can be a useful tool for developing valid and reliable surveillance strategies targeting subgroups of individuals with functional disabilities such as people with paralysis and others. Published by Elsevier Inc.
A methodology for the rigorous verification of plasma simulation codes
NASA Astrophysics Data System (ADS)
Riva, Fabio
2016-10-01
The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for the code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate the plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
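The solution-verification step based on Richardson extrapolation can be sketched on a toy discretization (a second-order central difference, not the GBS operators): refining the grid by a factor of two yields the observed order of accuracy, and combining the two levels cancels the leading error term.

```python
import numpy as np

x0 = 1.0
f = lambda h: (np.sin(x0 + h) - np.sin(x0 - h)) / (2.0 * h)  # central difference
exact = np.cos(x0)

# Solution verification: observed order of accuracy from grid refinement
h = 0.1
e1, e2 = abs(f(h) - exact), abs(f(h / 2) - exact)
p = np.log(e1 / e2) / np.log(2.0)      # should approach 2 for this scheme

# Richardson extrapolation: eliminate the O(h^2) term using both levels
extrap = (2.0 ** 2 * f(h / 2) - f(h)) / (2.0 ** 2 - 1)
```

In practice the exact solution is unknown, so the same ratio is formed from three grid levels; the principle of estimating the order and the discretization error is identical.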
Yafremava, Liudmila S; Di Giulio, Massimo; Caetano-Anollés, Gustavo
2013-01-01
Amino acid substitution patterns between the nonbarophilic Pyrococcus furiosus and its barophilic relative P. abyssi confirm that hydrostatic pressure asymmetry indices reflect the extent to which amino acids are preferred by barophilic archaeal organisms. Substitution patterns in entire protein sequences, shared protein domains defined at fold superfamily level, domains in homologous sequence pairs, and domains of very ancient and very recent origin now provide further clues about the environment that led to the genetic code and diversified life. The pyrococcal proteomes are very similar and share a very early ancestor. Relative amino acid abundance analyses showed that biases in the use of amino acids are due to their shared fold superfamilies. Within these repertoires, only two of the five amino acids that are preferentially barophilic, aspartic acid and arginine, displayed this preference significantly and consistently across structure and in domains appearing in the ancestor. The more primordial asparagine, lysine and threonine displayed a consistent preference for nonbarophily across structure and in the ancestor. Since barophilic preferences are already evident in ancient domains that are at least ~3 billion years old, we conclude that barophily is a very ancient trait that unfolded concurrently with genetic idiosyncrasies in convergence towards a universal code.
48 CFR 401.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Publication and code arrangement. 401.105-1 Section 401.105-1 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE GENERAL AGRICULTURE ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance 401.105-1 Publication and...
48 CFR 2501.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Publication and code arrangement. 2501.104-1 Section 2501.104-1 Federal Acquisition Regulations System NATIONAL SCIENCE FOUNDATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 2501.104-1 Publication and...
48 CFR 1201.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Publication and code arrangement. 1201.105-1 Section 1201.105-1 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1201.105-1 Publication and...
48 CFR 401.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Publication and code arrangement. 401.105-1 Section 401.105-1 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE GENERAL AGRICULTURE ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance 401.105-1 Publication and...
48 CFR 1201.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Publication and code arrangement. 1201.105-1 Section 1201.105-1 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1201.105-1 Publication and...
48 CFR 401.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Publication and code arrangement. 401.105-1 Section 401.105-1 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE GENERAL AGRICULTURE ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance 401.105-1 Publication and...
48 CFR 401.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Publication and code arrangement. 401.105-1 Section 401.105-1 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE GENERAL AGRICULTURE ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance 401.105-1 Publication and...
48 CFR 2501.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 2501.104-1 Section 2501.104-1 Federal Acquisition Regulations System NATIONAL SCIENCE FOUNDATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 2501.104-1 Publication and...
The Role of Ontologies in Schema-based Program Synthesis
NASA Technical Reports Server (NTRS)
Bures, Tomas; Denney, Ewen; Fischer, Bernd; Nistor, Eugen C.
2004-01-01
Program synthesis is the process of automatically deriving executable code from (non-executable) high-level specifications. It is more flexible and powerful than conventional code generation techniques that simply translate algorithmic specifications into lower-level code or only create code skeletons from structural specifications (such as UML class diagrams). Key to building a successful synthesis system is specializing to an appropriate application domain. The AUTOBAYES and AUTOFILTER systems, under development at NASA Ames, operate in the two domains of data analysis and state estimation, respectively. The central concept of both systems is the schema, a representation of reusable computational knowledge. This can take various forms, including high-level algorithm templates, code optimizations, datatype refinements, or architectural information. A schema also contains applicability conditions that are used to determine when it can be applied safely. These conditions can refer to the initial specification, to intermediate results, or to elements of the partially-instantiated code. Schema-based synthesis uses AI technology to recursively apply schemas to gradually refine a specification into executable code. This process proceeds in two main phases. A front-end gradually transforms the problem specification into a program represented in an abstract intermediate code. A backend then compiles this further down into a concrete target programming language of choice. A core engine applies schemas on the initial problem specification, then uses the output of those schemas as the input for other schemas, until the full implementation is generated. Since there might be different schemas that implement different solutions to the same problem this process can generate an entire solution tree. AUTOBAYES and AUTOFILTER have reached the level of maturity where they enable users to solve interesting application problems, e.g., the analysis of Hubble Space Telescope images. 
They are large (in total around 100 kLoC of Prolog), knowledge-intensive systems that employ complex symbolic reasoning to generate a wide range of non-trivial programs for complex application domains. Their schemas can have complex interactions, which make it hard to change them in isolation or even understand what an existing schema actually does. Adding more capabilities by increasing the number of schemas will only worsen this situation, ultimately leading to the entropy death of the synthesis system. The root cause of this problem is that the domain knowledge is scattered throughout the entire system and only represented implicitly in the schema implementations. In our current work, we are addressing this problem by making explicit the knowledge from different parts of the synthesis system. Here, we discuss how Gruber's definition of an ontology as an explicit specification of a conceptualization matches our efforts in identifying and explicating the domain-specific concepts. We outline the three roles ontologies play in schema-based synthesis and argue that they address different audiences and serve different purposes. Their first role is descriptive: they serve as explicit documentation, and help to understand the internal structure of the system. Their second role is prescriptive: they provide the formal basis against which the other parts of the system (e.g., schemas) can be checked. Their final role is referential: ontologies also provide semantically meaningful "hooks" which allow schemas and tools to access the internal state of the program derivation process (e.g., fragments of the generated code) in domain-specific rather than language-specific terms, and thus to modify it in a controlled fashion. For discussion purposes we use AUTOLINEAR, a small synthesis system we are currently experimenting with, which can generate code for solving a system of linear equations, Az = b.
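The schema application described above can be caricatured in a few lines (a toy sketch with hypothetical guards and templates, not the actual AUTOLINEAR schemas): each schema pairs an applicability condition with a code template, and synthesis applies the first schema whose guard accepts the specification.

```python
def synthesize(spec, schemas):
    # Apply the first schema whose applicability condition accepts the spec;
    # a real system would recurse on the template's sub-specifications.
    for guard, template in schemas:
        if guard(spec):
            return template(spec)
    return spec  # spec is already concrete code

# Hypothetical schemas for a tiny linear-solver domain.
schemas = [
    # Specialized schema: symmetric positive definite systems get Cholesky.
    (lambda s: s[:2] == ('solve', 'spd'),
     lambda s: f"scipy.linalg.cho_solve(scipy.linalg.cho_factor({s[2]}), {s[3]})"),
    # Fallback schema for general systems.
    (lambda s: s[0] == 'solve',
     lambda s: f"numpy.linalg.solve({s[2]}, {s[3]})"),
]

code = synthesize(('solve', 'spd', 'A', 'b'), schemas)
```

The guards here play the role of the applicability conditions in the text: they decide when a specialized refinement is safe, falling back to a general schema otherwise.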
NASA Astrophysics Data System (ADS)
Elgaud, M. M.; Zan, M. S. D.; Abushagur, A. G.; Bakar, A. Ashrif A.
2017-07-01
This paper reports the use of the autocorrelation properties of Golay complementary codes (GCC) to enhance the performance of a time-division-multiplexed fiber Bragg grating (TDM-FBG) sensing network. By encoding the laser light with a non-return-to-zero (NRZ) stream of GCC and launching it into the sensing area containing the FBG sensors, we found that the FBG signals can be decoded correctly with autocorrelation calculations, confirming the successful demonstration of a coded TDM-FBG sensor network. The OptiGrating and OptiSystem simulators were used to design customized FBG sensors and to perform the coded TDM-FBG sensor simulations, respectively. Results substantiate the theoretical dependence of the SNR enhancement on the GCC code length, with a maximum SNR improvement of about 9 dB achievable using 256 bits of GCC compared to the 4-bit case. Furthermore, the GCC also extended the measurable strain range up to 30% beyond the maximum of the conventional single-pulse case. Overall, the employment of GCC in the TDM-FBG sensor system provides performance enhancement over the conventional single-pulse case under the same conditions.
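The key property exploited here is that the aperiodic autocorrelations of a Golay complementary pair sum to an ideal delta, so range sidelobes cancel exactly; a short sketch (standard recursive construction, code length illustrative):

```python
import numpy as np

def golay_pair(m):
    # Standard recursive construction of a Golay complementary pair
    # of length 2**m: (a, b) -> (a|b, a|-b).
    a, b = np.array([1.0]), np.array([1.0])
    for _ in range(m):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

def autocorr(x):
    # Aperiodic (linear) autocorrelation at all lags.
    return np.correlate(x, x, mode='full')

a, b = golay_pair(4)                  # 16-bit complementary pair
s = autocorr(a) + autocorr(b)         # 2N at zero lag, 0 at every other lag
```

Doubling the code length doubles the zero-lag peak while the sidelobes stay identically zero, which is the origin of the SNR gain with longer GCC.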
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Analysis of Search Results for the Clarification and Identification of Technology Emergence (AR-CITE) computer code examines a scientometric model that tracks the emergence of an identified technology from initial discovery (via original scientific and conference literature), through critical discoveries (via original scientific and conference literature and patents), transitioning through Technology Readiness Levels (TRLs) and ultimately on to commercial application. Patterns of citations, collaboration indicators, and on-line news are identified. The combination of four distinct and separate searchable on-line networked sources (i.e., scholarly publications and citations, world patents, news archives, and on-line mapping networks) is assembled into one collective network (a dataset for analysis of relations). This established network becomes the basis from which to quickly analyze the temporal flow of activity (searchable events) for the subject domain to be clarified and identified.
Michigan Magnetic and Gravity Maps and Data: A Website for the Distribution of Data
Daniels, David L.; Kucks, Robert P.; Hill, Patricia L.; Snyder, Stephen L.
2009-01-01
This web site provides the best available, public-domain, aeromagnetic and gravity data in the State of Michigan and merges these data into composite grids that are available for downloading. The magnetic grid is compiled from 25 separate magnetic surveys that have been knit together to form a single composite digital grid and map. The magnetic survey grids have been continued to 305 meters (1,000 feet) above ground and merged together to form the State compilation. A separate map shows the location of the aeromagnetic surveys, color-coded to the survey flight-line spacing. In addition, a complete Bouguer gravity anomaly grid and map were generated from more than 20,000 gravity station measurements from 33 surveys. A table provides the facts about each gravity survey where known.
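The upward continuation to 305 m mentioned above is conventionally done in the wavenumber domain, multiplying the grid's spectrum by exp(-|k| Δz); a minimal sketch (grid size and spacing here are illustrative, not the Michigan survey parameters):

```python
import numpy as np

def upward_continue(grid, dz, dx):
    # Upward-continue a gridded potential-field anomaly by height dz
    # (same units as the grid spacing dx) using the wavenumber-domain
    # operator exp(-|k| * dz).
    ny, nx = grid.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dx)
    k = np.hypot(kx[None, :], ky[:, None])
    return np.real(np.fft.ifft2(np.fft.fft2(grid) * np.exp(-k * dz)))

g = np.random.default_rng(0).standard_normal((32, 32))   # synthetic anomaly grid
up = upward_continue(g, dz=305.0, dx=500.0)
```

Continuation is a low-pass operation: the mean (zero-wavenumber) level is preserved while short-wavelength content is attenuated, which is what lets surveys flown at different heights be merged consistently.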
Initial Low-Reynolds Number Iced Aerodynamic Performance for CRM Wing
NASA Technical Reports Server (NTRS)
Woodard, Brian; Diebold, Jeff; Broeren, Andy; Potapczuk, Mark; Lee, Sam; Bragg, Michael
2015-01-01
NASA, FAA, ONERA, and other partner organizations have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large scale, three-dimensional swept wings. These are extremely complex phenomena important to the design, certification and safe operation of small and large transport aircraft. There is increasing demand to balance trade-offs in aircraft efficiency, cost and noise that tend to compete directly with allowable performance degradations over an increasing range of icing conditions. Computational fluid dynamics codes have reached a level of maturity that they are being proposed by manufacturers for use in certification of aircraft for flight in icing. However, sufficient high-quality data to evaluate their performance on iced swept wings are not currently available in the public domain and significant knowledge gaps remain.
Probing the Milky Way electron density using multi-messenger astronomy
NASA Astrophysics Data System (ADS)
Breivik, Katelyn; Larson, Shane
2015-04-01
Multi-messenger observations of ultra-compact binaries in both gravitational waves and electromagnetic radiation supply highly complementary information, providing new ways of characterizing the internal dynamics of these systems, as well as new probes of the galaxy itself. Electron density models, used in pulsar distance measurements via the electron dispersion measure, are currently not well constrained. Simultaneous radio and gravitational wave observations of pulsars in binaries provide a method of measuring the average electron density along the line of sight to the pulsar, thus giving a new method for constraining current electron density models. We present this method and assess its viability with simulations of the compact binary component of the Milky Way using the public domain binary evolution code, BSE. This work is supported by NASA Award NNX13AM10G.
Proceedings of the Workshop on software tools for distributed intelligent control systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herget, C.J.
1990-09-01
The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.
Trends in newspaper coverage of mental illness in Canada: 2005-2010.
Whitley, Rob; Berry, Sarah
2013-02-01
Much research suggests that the general public relies on the popular media as a primary source of information about mental illness. We assessed the broad content of articles relating to mental illness in major Canadian newspapers over a 6-year period. We also sought to assess if such content has changed over time. We conducted a retrospective analysis of Canadian newspaper coverage from 2005 to 2010. Research assistants used a standardized guide to code 11 263 newspaper articles that mention the terms mental health, mental illness, schizophrenia, or schizophrenic. Once the articles were coded, descriptive statistics were produced for overarching themes and time trend analyses from 2005 to 2010. Danger, violence, and criminality were direct themes in 40% of newspaper articles. Treatment for a mental illness was discussed in only 19% of newspaper articles, and in only 18% was recovery or rehabilitation a significant theme. Eighty-three per cent of articles coded lacked a quotation from someone with a mental illness. We did not observe any significant changes over time from 2005 to 2010 in any domain measured. There is scope for more balanced, accurate, and informative coverage of mental health issues in Canada. Newspaper articles infrequently reflect the common realities of mental illness phenomenology, course, and outcome. Currently, clinicians may direct patients and family members to other resources for more comprehensive and accurate information about mental illness.
Bergna, Miguel A; García, Gabriel R; Alchapar, Ramon; Altieri, Hector; Casas, Juan C Figueroa; Larrateguy, Luis; Nannini, Luis J; Pascansky, Daniel; Grabre, Pedro; Zabert, Gustavo; Miravitlles, Marc
2015-06-01
The CODE questionnaire (COPD detection questionnaire), a simple screening questionnaire with binary (yes/no) responses, was developed for the identification of patients with chronic obstructive pulmonary disease (COPD). We conducted a survey of 468 subjects with a smoking history in 10 public hospitals in Argentina. Patients with a previous diagnosis of COPD, asthma or other respiratory illness were excluded. Items that measured conceptual domains in terms of symptom characteristics, smoking history and demographic data were considered. Ninety-six (20.5%) subjects had a diagnosis of COPD according to the 2010 Global Initiative for Chronic Obstructive Lung Disease strategy document. The variables selected for the final questionnaire were based on univariate and multivariate analyses and clinical criteria. Finally, we selected the presence or absence of six variables (age ≥50 years, smoking history ≥30 pack-years, male sex, chronic cough, chronic phlegm and dyspnoea). Of patients without any of these six variables (0 points), none had COPD. The ability of the CODE questionnaire to discriminate between subjects with and without COPD was good (the area under the receiver operating characteristic curve was 0.75). Higher scores were associated with a greater probability of COPD. The CODE questionnaire is a brief, accurate questionnaire that can identify smoking individuals likely to have COPD. Copyright ©ERS 2015.
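The abstract implies a simple count of positive items, with a score of 0 carrying no COPD diagnoses in the study sample. A minimal sketch of that scoring, assuming one point per item present (the paper's exact weighting and any referral cutoff are not given in the abstract):

```python
# Hedged sketch of the CODE questionnaire's six-item screening score.
# One point per positive item is an assumption inferred from the abstract
# ("0 points" = no items present); it is not taken verbatim from the paper.

def code_score(age, pack_years, male, chronic_cough, chronic_phlegm, dyspnoea):
    """Return the number of positive CODE items (0-6)."""
    items = [
        age >= 50,            # age >= 50 years
        pack_years >= 30,     # smoking history >= 30 pack-years
        bool(male),           # male sex
        bool(chronic_cough),
        bool(chronic_phlegm),
        bool(dyspnoea),
    ]
    return sum(items)

# Higher scores were associated with a greater probability of COPD.
print(code_score(age=62, pack_years=40, male=True,
                 chronic_cough=True, chronic_phlegm=False, dyspnoea=True))  # 5
```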
Data publishing - visions of the future
NASA Astrophysics Data System (ADS)
Schäfer, Leonie; Klump, Jens; Bertelmann, Roland; Klar, Jochen; Enke, Harry; Rathmann, Torsten; Koudela, Daniela; Köhler, Klaus; Müller-Pfefferkorn, Ralph; van Uytvanck, Dieter; Strathmann, Stefan; Engelhardt, Claudia
2013-04-01
This poster describes future scenarios of information infrastructures in science and other fields of research. The scenarios presented are based on practical experience resulting from interaction with research data in a research center and its library, and further enriched by the results of a baseline study of existing data repositories and data infrastructures. The baseline study was conducted as part of the project "Requirements for a multi-disciplinary research data infrastructure (Radieschen)", which is funded by the German Research Foundation (DFG). Current changes in information infrastructures pose new challenges to libraries and scientific journals, which both act as information service providers that facilitate access to digital media, support publication of research data and enable its long-term archiving. Digital media and research data open new aspects in the fields of activity of libraries and scientific journals. What will a library of the future look like? Will a library serve purely as an interface to data centres? Will libraries and data centres merge into a new service unit? Will a future library be the interface to academic cloud services? Scientific journals have already moved from predominantly print editions to combined print and electronic publication. What type of journals will emerge in the future? Is there a role for data-centred journals? Will there be journals to publish software code, making this type of research result citable and a part of the record of science? Just as users evolve from being consumers of information into producers, the role of information service providers, such as libraries, changes from a purely supporting to a contributing role. Furthermore, the role of the library changes from a central point of access for the search of publications to an important link in the value-adding chain from author to publication. Journals for software publication might be another vision for the future in data publishing.
Software forms the missing link between the big data collected by experiments, monitoring or simulation and the published interpretation of those data. In order to verify the results presented, a paper should also report on the process of data analysis applied to the data sets stored at data centers. In this case data, software, and interpretation supplement each other as a trustworthy, reproducible presentation of research results. Another approach is suggested by researchers of the EU-funded project "Liquid Publications" (1). Instead of traditional publications the researchers propose liquid journals as evolving collections of links and material, and recommend new methods in reviewing and assessing publications. Another point of interest is workflows in data publication. The commonly used model to depict the data life cycle might look appealing but does not necessarily represent reality. The model proposed by Treloar et al. (2) offers a better approach to depict transition of research data between different domains of use, e.g. from the group domain to the public domain. However, several questions need to be addressed, such as how to avoid the loss of contextual information during transitions between domains, and the influence of the size of the data on the workflow process. This poster presents different scenarios of the future from the points of view of researchers, libraries and scientific journals, and is intended to invite further discussion. (1) LiquidPub Green Paper, https://dev.liquidpub.org/svn/liquidpub/papers/deliverables/LPGreenPaper.pdf (2) Treloar, A., Harboe-Ree, C. (2008). Data management and the curation continuum: how the Monash experience is informing repository relationships. In VALA2008, Melbourne, Australia. Retrieved from http://www.valaconf.org.au/vala2008/papers2008/111_Treloar_Final.pdf
Fukui, Sadaaki; Matthias, Marianne S; Salyers, Michelle P
2015-01-01
Shared decision-making (SDM) is imperative to person-centered care, yet little is known about what aspects of SDM are targeted during psychiatric visits. This secondary data analysis (191 psychiatric visits with 11 providers, coded with a validated SDM coding system) revealed two factors (scientific and preference-based discussions) underlying SDM communication. Preference-based discussion occurred less. Both provider and consumer initiation of SDM elements and decision complexity were associated with greater discussions in both factors, but were more strongly associated with scientific discussion. Longer visit length correlated with only scientific discussion. Providers' understanding of core domains could facilitate engaging consumers in SDM.
Wakefield Computations for the CLIC PETS using the Parallel Finite Element Time-Domain Code T3P
DOE Office of Scientific and Technical Information (OSTI.GOV)
Candel, A; Kabel, A.; Lee, L.
In recent years, SLAC's Advanced Computations Department (ACD) has developed the high-performance parallel 3D electromagnetic time-domain code, T3P, for simulations of wakefields and transients in complex accelerator structures. T3P is based on advanced higher-order Finite Element methods on unstructured grids with quadratic surface approximation. Optimized for large-scale parallel processing on leadership supercomputing facilities, T3P allows simulations of realistic 3D structures with unprecedented accuracy, aiding the design of the next generation of accelerator facilities. Applications to the Compact Linear Collider (CLIC) Power Extraction and Transfer Structure (PETS) are presented.
A mirror code for protein-cholesterol interactions in the two leaflets of biological membranes
NASA Astrophysics Data System (ADS)
Fantini, Jacques; di Scala, Coralie; Evans, Luke S.; Williamson, Philip T. F.; Barrantes, Francisco J.
2016-02-01
Cholesterol controls the activity of a wide range of membrane receptors through specific interactions and identifying cholesterol recognition motifs is therefore critical for understanding signaling receptor function. The membrane-spanning domains of the paradigm neurotransmitter receptor for acetylcholine (AChR) display a series of cholesterol consensus domains (referred to as “CARC”). Here we use a combination of molecular modeling, lipid monolayer/mutational approaches and NMR spectroscopy to study the binding of cholesterol to a synthetic CARC peptide. The CARC-cholesterol interaction is of high affinity, lipid-specific, concentration-dependent, and sensitive to single-point mutations. The CARC motif is generally located in the outer membrane leaflet and its reverse sequence CRAC in the inner one. Their simultaneous presence within the same transmembrane domain obeys a “mirror code” controlling protein-cholesterol interactions in the outer and inner membrane leaflets. Deciphering this code enabled us to elaborate guidelines for the detection of cholesterol-binding motifs in any membrane protein. Several representative examples of neurotransmitter receptors and ABC transporters with the dual CARC/CRAC motifs are presented. The biological significance and potential clinical applications of the mirror code are discussed.
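The abstract's guideline, detecting CARC in the outer-leaflet orientation and its mirror CRAC in the inner one, can be illustrated with a simple sequence scan. The consensus patterns below ((K/R)-X1–5-(Y/F)-X1–5-(L/V) for CARC and its reverse for CRAC) follow commonly cited definitions and are an assumption; the paper's exact motif rules may differ, and the example sequence is hypothetical.

```python
import re

# Illustrative scan for CARC/CRAC-like cholesterol-recognition patterns.
# Both the regex consensus and the transmembrane sequence are assumptions
# for demonstration, not the authors' published detection guidelines.

CARC = re.compile(r"[KR].{1,5}[YF].{1,5}[LV]")   # N->C, outer-leaflet orientation
CRAC = re.compile(r"[LV].{1,5}Y.{1,5}[KR]")      # mirror motif, inner leaflet

def find_motifs(seq, pattern):
    """Return (start, matched_substring) for each (possibly overlapping) hit."""
    hits = []
    for i in range(len(seq)):
        m = pattern.match(seq, i)   # anchored match starting at position i
        if m:
            hits.append((i, m.group()))
    return hits

tm_segment = "MRKWVFYLIVAGLLRT"  # hypothetical transmembrane stretch
print(find_motifs(tm_segment, CARC))
```

A transmembrane domain carrying both a CARC hit and a CRAC hit would, under the "mirror code" described above, present cholesterol-binding surfaces to both leaflets.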
Wang, Jiajia; Li, Hu; Dai, Renhuai
2017-12-01
Here, we describe the first complete mitochondrial genome (mitogenome) sequence of the leafhopper Taharana fasciana (Coelidiinae). The mitogenome sequence contains 15,161 bp with an A + T content of 77.9%. It includes 13 protein-coding genes, two ribosomal RNA genes, 22 transfer RNA genes, and one non-coding (A + T-rich) region; in addition, a repeat region is also present (GenBank accession no. KY886913). These genes/regions are in the same order as in the inferred insect ancestral mitogenome. All protein-coding genes have ATN as the start codon, and TAA or single T as the stop codons, except the gene ND3, which ends with TAG. Furthermore, we predicted the secondary structures of the rRNAs in T. fasciana. Six domains (domain III is absent in arthropods) and 41 helices were predicted for 16S rRNA, and 12S rRNA comprised three structural domains and 24 helices. Phylogenetic tree analysis confirmed that T. fasciana and other members of the Cicadellidae are clustered into a clade, and it identified the relationships among the subfamilies Deltocephalinae, Coelidiinae, Idiocerinae, Cicadellinae, and Typhlocybinae.
Practices in source code sharing in astrophysics
NASA Astrophysics Data System (ADS)
Shamir, Lior; Wallin, John F.; Allen, Alice; Berriman, Bruce; Teuben, Peter; Nemiroff, Robert J.; Mink, Jessica; Hanisch, Robert J.; DuPrie, Kimberly
2013-02-01
While software and algorithms have become increasingly important in astronomy, the majority of authors who publish computational astronomy research do not share the source code they develop, making it difficult to replicate and reuse the work. In this paper we discuss the importance of sharing scientific source code with the entire astrophysics community, and propose that journals require authors to make their code publicly available when a paper is published. That is, we suggest that a paper that involves a computer program not be accepted for publication unless the source code becomes publicly available. The adoption of such a policy by editors, editorial boards, and reviewers will improve the ability to replicate scientific results, and will also make computational astronomy methods more available to other researchers who wish to apply them to their data.
Database Resources of the BIG Data Center in 2018
Xu, Xingjian; Hao, Lili; Zhu, Junwei; Tang, Bixia; Zhou, Qing; Song, Fuhai; Chen, Tingting; Zhang, Sisi; Dong, Lili; Lan, Li; Wang, Yanqing; Sang, Jian; Hao, Lili; Liang, Fang; Cao, Jiabao; Liu, Fang; Liu, Lin; Wang, Fan; Ma, Yingke; Xu, Xingjian; Zhang, Lijuan; Chen, Meili; Tian, Dongmei; Li, Cuiping; Dong, Lili; Du, Zhenglin; Yuan, Na; Zeng, Jingyao; Zhang, Zhewen; Wang, Jinyue; Shi, Shuo; Zhang, Yadong; Pan, Mengyu; Tang, Bixia; Zou, Dong; Song, Shuhui; Sang, Jian; Xia, Lin; Wang, Zhennan; Li, Man; Cao, Jiabao; Niu, Guangyi; Zhang, Yang; Sheng, Xin; Lu, Mingming; Wang, Qi; Xiao, Jingfa; Zou, Dong; Wang, Fan; Hao, Lili; Liang, Fang; Li, Mengwei; Sun, Shixiang; Zou, Dong; Li, Rujiao; Yu, Chunlei; Wang, Guangyu; Sang, Jian; Liu, Lin; Li, Mengwei; Li, Man; Niu, Guangyi; Cao, Jiabao; Sun, Shixiang; Xia, Lin; Yin, Hongyan; Zou, Dong; Xu, Xingjian; Ma, Lina; Chen, Huanxin; Sun, Yubin; Yu, Lei; Zhai, Shuang; Sun, Mingyuan; Zhang, Zhang; Zhao, Wenming; Xiao, Jingfa; Bao, Yiming; Song, Shuhui; Hao, Lili; Li, Rujiao; Ma, Lina; Sang, Jian; Wang, Yanqing; Tang, Bixia; Zou, Dong; Wang, Fan
2018-01-01
Abstract The BIG Data Center at Beijing Institute of Genomics (BIG) of the Chinese Academy of Sciences provides freely open access to a suite of database resources in support of worldwide research activities in both academia and industry. With the vast amounts of omics data generated at ever-greater scales and rates, the BIG Data Center is continually expanding, updating and enriching its core database resources through big-data integration and value-added curation, including BioCode (a repository archiving bioinformatics tool codes), BioProject (a biological project library), BioSample (a biological sample library), Genome Sequence Archive (GSA, a data repository for archiving raw sequence reads), Genome Warehouse (GWH, a centralized resource housing genome-scale data), Genome Variation Map (GVM, a public repository of genome variations), Gene Expression Nebulas (GEN, a database of gene expression profiles based on RNA-Seq data), Methylation Bank (MethBank, an integrated databank of DNA methylomes), and Science Wikis (a series of biological knowledge wikis for community annotations). In addition, three featured web services are provided, viz., BIG Search (search as a service; a scalable inter-domain text search engine), BIG SSO (single sign-on as a service; a user access control system to gain access to multiple independent systems with a single ID and password) and Gsub (submission as a service; a unified submission service for all relevant resources). All of these resources are publicly accessible through the home page of the BIG Data Center at http://bigd.big.ac.cn. PMID:29036542
Multivariate assessment of event-related potentials with the t-CWT method.
Bostanov, Vladimir
2015-11-05
Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
Fast ITTBC using pattern code on subband segmentation
NASA Astrophysics Data System (ADS)
Koh, Sung S.; Kim, Hanchil; Lee, Kooyoung; Kim, Hongbin; Jeong, Hun; Cho, Gangseok; Kim, Chunghwa
2000-06-01
Iterated Transformation Theory-Based Coding suffers from very high computational complexity in the encoding phase, due to its exhaustive search. In this paper, our proposed image coding algorithm preprocesses an original image into a subband segmentation image by wavelet transform before image coding to reduce encoding complexity. A similar block is searched for by using the 24 block pattern codes, which encode the edge information in the image block, on the domain pool of the subband segmentation. As a result, numerical data show that the encoding time of the proposed coding method can be reduced to 98.82% of that of Jacquin's method, while the loss in quality relative to Jacquin's is about 0.28 dB in PSNR, which is visually negligible.
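The principle behind pattern-code search can be sketched simply: label each block with a coarse edge code, then compare a range block only against domain blocks sharing its code instead of the whole pool. The paper uses 24 edge-based pattern codes; the 5-code gradient scheme below is a simplified stand-in assumption.

```python
import numpy as np

# Simplified illustration of pattern-code restricted search in fractal-style
# block matching. The 5-code gradient classifier is an assumption standing in
# for the paper's 24 edge-based pattern codes.

def pattern_code(block):
    """Classify a block as flat (0) or one of 4 coarse edge orientations (1-4)."""
    gy, gx = np.gradient(block.astype(float))
    if np.hypot(gx, gy).mean() < 0.5:
        return 0                                  # flat block
    angle = np.arctan2(gy.mean(), gx.mean()) % np.pi
    return 1 + int(angle / (np.pi / 4)) % 4

def best_match(range_block, domain_blocks):
    """Search only domain blocks with the same pattern code; fall back to all."""
    code = pattern_code(range_block)
    candidates = [(i, d) for i, d in enumerate(domain_blocks)
                  if pattern_code(d) == code] or list(enumerate(domain_blocks))
    errs = [(np.mean((range_block - d) ** 2), i) for i, d in candidates]
    return min(errs)[1]

rng = np.random.default_rng(1)
domains = [rng.normal(size=(8, 8)) for _ in range(50)]
target = domains[7].copy()
print(best_match(target, domains))  # 7: exact copy is found within its code bucket
```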
GNormPlus: An Integrative Approach for Tagging Genes, Gene Families, and Protein Domains
Lu, Zhiyong
2015-01-01
The automatic recognition of gene names and their associated database identifiers from biomedical text has been widely studied in recent years, as these tasks play an important role in many downstream text-mining applications. Despite significant previous research, only a small number of tools are publicly available and these tools are typically restricted to detecting only mention level gene names or only document level gene identifiers. In this work, we report GNormPlus: an end-to-end and open source system that handles both gene mention and identifier detection. We created a new corpus of 694 PubMed articles to support our development of GNormPlus, containing manual annotations for not only gene names and their identifiers, but also closely related concepts useful for gene name disambiguation, such as gene families and protein domains. GNormPlus integrates several advanced text-mining techniques, including SimConcept for resolving composite gene names. As a result, GNormPlus compares favorably to other state-of-the-art methods when evaluated on two widely used public benchmarking datasets, achieving 86.7% F1-score on the BioCreative II Gene Normalization task dataset and 50.1% F1-score on the BioCreative III Gene Normalization task dataset. The GNormPlus source code and its annotated corpus are freely available, and the results of applying GNormPlus to the entire PubMed are freely accessible through our web-based tool PubTator. PMID:26380306
Sztuba-Solinska, Joanna; Diaz, Larissa; Kumar, Mia R; Kolb, Gaëlle; Wiley, Michael R; Jozwick, Lucas; Kuhn, Jens H; Palacios, Gustavo; Radoshitzky, Sheli R; J Le Grice, Stuart F; Johnson, Reed F
2016-11-16
Ebola virus (EBOV) is a single-stranded negative-sense RNA virus belonging to the Filoviridae family. The leader and trailer non-coding regions of the EBOV genome likely regulate its transcription, replication, and progeny genome packaging. We investigated the cis-acting RNA signals involved in RNA-RNA and RNA-protein interactions that regulate replication of eGFP-encoding EBOV minigenomic RNA and identified heat shock cognate protein family A (HSC70) member 8 (HSPA8) as an EBOV trailer-interacting host protein. Mutational analysis of the trailer HSPA8 binding motif revealed that this interaction is essential for EBOV minigenome replication. Selective 2'-hydroxyl acylation analyzed by primer extension analysis of the secondary structure of the EBOV minigenomic RNA indicates formation of a small stem-loop composed of the HSPA8 motif, a 3' stem-loop (nucleotides 1868-1890) that is similar to a previously identified structure in the replicative intermediate (RI) RNA and a panhandle domain involving a trailer-to-leader interaction. Results of minigenome assays and an EBOV reverse genetic system rescue support a role for both the panhandle domain and HSPA8 motif 1 in virus replication. Published by Oxford University Press on behalf of Nucleic Acids Research 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Performance evaluation of MPEG internet video coding
NASA Astrophysics Data System (ADS)
Luo, Jiajia; Wang, Ronggang; Fan, Kui; Wang, Zhenyu; Li, Ge; Wang, Wenmin
2016-09-01
Internet Video Coding (IVC) has been developed in MPEG by combining well-known existing technology elements and new coding tools with royalty-free declarations. In June 2015, the IVC project was approved as ISO/IEC 14496-33 (MPEG-4 Internet Video Coding). It is believed that this standard can be highly beneficial for video services in the Internet domain. This paper evaluates the objective and subjective performances of IVC by comparing it against Web Video Coding (WVC), Video Coding for Browsers (VCB) and AVC High Profile. Experimental results show that IVC's compression performance is approximately equal to that of the AVC High Profile for typical operational settings, both for streaming and low-delay applications, and is better than WVC and VCB.
NASA Technical Reports Server (NTRS)
Farassat, F.; Dunn, M. H.; Padula, S. L.
1986-01-01
The development of a high speed propeller noise prediction code at Langley Research Center is described. The code utilizes two recent acoustic formulations in the time domain for subsonic and supersonic sources. The structure and capabilities of the code are discussed. A grid size study for accuracy and speed of execution on a computer is also presented. The code is tested against an earlier Langley code. Considerable increases in accuracy and speed of execution are observed. Some examples of noise prediction of a high speed propeller for which acoustic test data are available are given. A brief derivation of the formulations used is given in an appendix.
Online Information About Harmful Tobacco Constituents: A Content Analysis.
Margolis, Katherine A; Bernat, Jennifer K; Keely O'Brien, Erin; Delahanty, Janine C
2017-10-01
Tobacco products and smoke contain more than 7000 chemicals (ie, constituents). Research shows that consumers have poor understanding of tobacco constituents and find communication about them to be confusing. The current content analysis describes how information is communicated about tobacco constituents online in terms of source, target audience, and message. A search was conducted in September 2015 using tobacco constituent and tobacco terms and identified 226 relevant Web sites for coding. Web sites were coded for type, target audience, reading level, constituent information, type of tobacco product, health effects, and emotional valence by two coders who independently coded half of the sample. There was a 20% overlap to assess interrater reliability, which was high (κ = .83, p < .001). The mean reading grade level of information online was 8.2 (SD = 2.8) with 81.7% of Web sites above the sixth grade reading level. Nearly all Web sites presented information in a qualitative narrative format (93%) and almost half (48.2%) presented information in a quantitative format. Nicotine (59.3%) and nitrosamines (28.8%) were the most frequently mentioned tobacco constituents. Cancer was the most frequently mentioned health effect (51.3%). Nearly a quarter (23%) of the Web sites did not explicitly state that tobacco constituents or tobacco products are associated with health effects. Large gaps exist in online information about tobacco constituents including incomplete information about tobacco constituent-related health effects and limited information about tobacco products other than cigarettes and smokeless tobacco. This study highlights opportunities to improve the content and presentation of information related to tobacco constituents. The US Food and Drug Administration (FDA) is required to publicly display a list of tobacco constituents in tobacco products and tobacco smoke by brand.
However, little is known about tobacco constituent information available to the public. This is the first systematic content analysis of online information about tobacco constituents. The analysis reveals that although information about tobacco constituents is available online, large information gaps exist, including incomplete information about tobacco constituent-related health effects. This study highlights opportunities to improve the content and presentation of public information related to tobacco constituents. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
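The interrater reliability reported above (κ = .83 on the 20% double-coded overlap) is Cohen's kappa: observed agreement corrected for the agreement expected by chance from each coder's marginals. A small sketch with toy labels (not the study's data):

```python
from collections import Counter

# Cohen's kappa: kappa = (p_o - p_e) / (1 - p_e), where p_o is observed
# agreement and p_e the chance agreement from the coders' marginal frequencies.
# The labels below are illustrative, not the content-analysis data.

def cohens_kappa(coder1, coder2):
    n = len(coder1)
    p_o = sum(a == b for a, b in zip(coder1, coder2)) / n
    m1, m2 = Counter(coder1), Counter(coder2)
    p_e = sum(m1[c] * m2[c] for c in m1) / n**2
    return (p_o - p_e) / (1 - p_e)

c1 = ["yes", "yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes"]
c2 = ["yes", "yes", "no", "yes", "yes", "no", "yes", "yes", "no", "yes"]
print(round(cohens_kappa(c1, c2), 2))  # 0.78: 90% raw agreement, chance-corrected
```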
Parallel Adaptive Mesh Refinement Library
NASA Technical Reports Server (NTRS)
MacNeice, Peter; Olson, Kevin
2005-01-01
Parallel Adaptive Mesh Refinement Library (PARAMESH) is a package of Fortran 90 subroutines designed to provide a computer programmer with an easy route to extension of (1) a previously written serial code that uses a logically Cartesian structured mesh into (2) a parallel code with adaptive mesh refinement (AMR). Alternatively, in its simplest use, and with minimal effort, PARAMESH can operate as a domain-decomposition tool for users who want to parallelize their serial codes but who do not wish to utilize adaptivity. The package builds a hierarchy of sub-grids to cover the computational domain of a given application program, with spatial resolution varying to satisfy the demands of the application. The sub-grid blocks form the nodes of a tree data structure (a quad-tree in two or an oct-tree in three dimensions). Each grid block has a logically Cartesian mesh. The package supports one-, two- and three-dimensional models.
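The block hierarchy described above, a tree of logically Cartesian sub-grids refined where the application demands resolution, can be sketched as a toy quad-tree. The error indicator and tolerance below are illustrative assumptions, not PARAMESH's API.

```python
# Toy quad-tree refinement in the spirit of PARAMESH's block hierarchy:
# each block covers a square patch; blocks whose error indicator exceeds a
# tolerance are split into four children (an oct-tree in three dimensions).

def refine(x, y, size, indicator, tol, max_depth, depth=0):
    """Return leaf blocks (x, y, size, depth) of the adaptive hierarchy."""
    if depth == max_depth or indicator(x, y, size) <= tol:
        return [(x, y, size, depth)]
    half = size / 2
    leaves = []
    for dx in (0, half):
        for dy in (0, half):
            leaves += refine(x + dx, y + dy, half, indicator, tol, max_depth, depth + 1)
    return leaves

# Hypothetical indicator peaking near (0.7, 0.7): resolution concentrates there.
def indicator(x, y, size):
    cx, cy = x + size / 2, y + size / 2
    return size / (0.01 + (cx - 0.7) ** 2 + (cy - 0.7) ** 2)

leaves = refine(0.0, 0.0, 1.0, indicator, tol=5.0, max_depth=4)
print(len(leaves), "leaf blocks; finest depth =", max(d for *_, d in leaves))
```

In PARAMESH proper each leaf carries its own logically Cartesian mesh and the tree is distributed across processors; the sketch shows only the refinement geometry.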
A numerical algorithm for MHD of free surface flows at low magnetic Reynolds numbers
NASA Astrophysics Data System (ADS)
Samulyak, Roman; Du, Jian; Glimm, James; Xu, Zhiliang
2007-10-01
We have developed a numerical algorithm and computational software for the study of magnetohydrodynamics (MHD) of free surface flows at low magnetic Reynolds numbers. The governing system of equations is a coupled hyperbolic-elliptic system in moving and geometrically complex domains. The numerical algorithm employs the method of front tracking and the Riemann problem for material interfaces, second order Godunov-type hyperbolic solvers, and the embedded boundary method for the elliptic problem in complex domains. The numerical algorithm has been implemented as an MHD extension of FronTier, a hydrodynamic code with free interface support. The code is applicable for numerical simulations of free surface flows of conductive liquids or weakly ionized plasmas. The code has been validated through the comparison of numerical simulations of a liquid metal jet in a non-uniform magnetic field with experiments and theory. Simulations of the Muon Collider/Neutrino Factory target have also been discussed.
CUDA Fortran acceleration for the finite-difference time-domain method
NASA Astrophysics Data System (ADS)
Hadi, Mohammed F.; Esmaeili, Seyed A.
2013-05-01
A detailed description of programming the three-dimensional finite-difference time-domain (FDTD) method to run on graphical processing units (GPUs) using CUDA Fortran is presented. Two FDTD-to-CUDA thread-block mapping designs are investigated and their performances compared. Comparative assessment of trade-offs between GPU's shared memory and L1 cache is also discussed. This presentation is for the benefit of FDTD programmers who work exclusively with Fortran and are reluctant to port their codes to C in order to utilize GPU computing. The derived CUDA Fortran code is compared with an optimized CPU version that runs on a workstation-class CPU to present a realistic GPU to CPU run time comparison and thus help in making better informed investment decisions on FDTD code redesigns and equipment upgrades. All analyses are mirrored with CUDA C simulations to put in perspective the present state of CUDA Fortran development.
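The stencil such a GPU kernel parallelizes is the Yee update, in which each thread advances one field cell per time step. A language-neutral 1D numpy sketch of that update (normalized units, "magic" time step; the source placement and grid size are illustrative):

```python
import numpy as np

# 1D Yee-scheme FDTD sketch: the per-cell updates below are exactly what a
# CUDA Fortran (or CUDA C) kernel maps onto one thread per array element.
# Normalized units with Courant number 1; boundaries act as perfect conductors.

n, steps = 200, 150
ez = np.zeros(n)      # electric field on integer grid points
hy = np.zeros(n - 1)  # magnetic field, staggered half a cell

for step in range(steps):
    hy += np.diff(ez)                             # H update from the curl of E
    ez[1:-1] += np.diff(hy)                       # E update from the curl of H
    ez[100] += np.exp(-((step - 30) ** 2) / 100)  # soft Gaussian source at mid-grid

print("peak |Ez| =", float(np.abs(ez).max()))
```

On a GPU, the two `np.diff`-based updates become kernels whose thread-block shape is precisely the mapping-design question investigated in the paper.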
Laser-plasma interactions with a Fourier-Bessel particle-in-cell method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andriyash, Igor A., E-mail: igor.andriyash@gmail.com; LOA, ENSTA ParisTech, CNRS, Ecole polytechnique, Université Paris-Saclay, 828 bd des Maréchaux, 91762 Palaiseau cedex; Lehe, Remi
A new spectral particle-in-cell (PIC) method for plasma modeling is presented and discussed. In the proposed scheme, the Fourier-Bessel transform is used to translate the Maxwell equations to the quasi-cylindrical spectral domain. In this domain, the equations are solved analytically in time, and the spatial derivatives are approximated with high accuracy. In contrast to the finite-difference time domain (FDTD) methods that are commonly used in PIC, the developed method does not produce numerical dispersion and does not involve grid staggering for the electric and magnetic fields. These features are especially valuable in modeling the wakefield acceleration of particles in plasmas. The proposed algorithm is implemented in the code PLARES-PIC, and the test simulations of laser plasma interactions are compared to the ones done with the quasi-cylindrical FDTD PIC code CALDER-CIRC.
NASA Astrophysics Data System (ADS)
Yamamoto, Tetsuya; Takeda, Kazuki; Adachi, Fumiyuki
Frequency-domain equalization (FDE) based on the minimum mean square error (MMSE) criterion can provide a better bit error rate (BER) performance than rake combining. To further improve the BER performance, cyclic delay transmit diversity (CDTD) can be used. CDTD simultaneously transmits the same signal from different antennas after adding different cyclic delays to increase the number of equivalent propagation paths. Although a joint use of CDTD and MMSE-FDE for direct sequence code division multiple access (DS-CDMA) achieves larger frequency diversity gain, the BER performance improvement is limited by the residual inter-chip interference (ICI) after FDE. In this paper, we propose joint FDE and despreading for DS-CDMA using CDTD. Equalization and despreading are simultaneously performed in the frequency-domain to suppress the residual ICI after FDE. A theoretical conditional BER analysis is presented for the given channel condition. The BER analysis is confirmed by computer simulation.
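The MMSE-FDE step at the heart of this scheme can be sketched for a single received block: per-bin weights w_k = conj(H_k) / (|H_k|² + 1/SNR) applied in the frequency domain, then transformed back. The channel, SNR, and BPSK chips below are toy values, and despreading and CDTD are omitted for brevity.

```python
import numpy as np

# Sketch of MMSE frequency-domain equalization over a cyclic-prefix block.
# With a cyclic prefix the channel is circular, hence diagonal in frequency,
# so equalization is a per-bin multiplication. All parameters are toy values.

rng = np.random.default_rng(2)
n, snr = 64, 100.0                                  # block size, linear SNR
symbols = rng.choice([-1.0, 1.0], size=n)           # BPSK chips
h = np.array([1.0, 0.5, 0.25])                      # 3-path channel impulse response

# Circular convolution with the channel plus additive noise.
received = np.real(np.fft.ifft(np.fft.fft(symbols) * np.fft.fft(h, n)))
received += rng.normal(scale=np.sqrt(1 / snr), size=n)

H = np.fft.fft(h, n)
w = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)       # per-bin MMSE weights
equalized = np.real(np.fft.ifft(w * np.fft.fft(received)))

errors = int(np.sum(np.sign(equalized) != symbols))
print("bit errors after MMSE-FDE:", errors)
```

The paper's contribution goes further, performing equalization and despreading jointly in the frequency domain to suppress the residual inter-chip interference this per-bin weighting leaves behind.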
Generating Code Review Documentation for Auto-Generated Mission-Critical Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2009-01-01
Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements, and formally verifies that the autogenerated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.
Deciphering the BAR code of membrane modulators.
Salzer, Ulrich; Kostan, Julius; Djinović-Carugo, Kristina
2017-07-01
The BAR domain is the eponymous domain of the "BAR-domain protein superfamily", a large and diverse set of mostly multi-domain proteins that play eminent roles at the membrane cytoskeleton interface. BAR domain homodimers are the functional units that peripherally associate with lipid membranes and are involved in membrane sculpting activities. Differences in their intrinsic curvatures and lipid-binding properties account for a large variety in membrane modulating properties. Membrane activities of BAR domains are further modified and regulated by intramolecular or inter-subunit domains, by intermolecular protein interactions, and by posttranslational modifications. Rather than providing detailed cell biological information on single members of this superfamily, this review focuses on biochemical, biophysical, and structural aspects and on recent findings that paradigmatically promote our understanding of processes driven and modulated by BAR domains.
Guide to AERO2S and WINGDES Computer Codes for Prediction and Minimization of Drag Due to Lift
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Chu, Julio; Ozoroski, Lori P.; McCullers, L. Arnold
1997-01-01
The computer codes, AERO2S and WINGDES, are now widely used for the analysis and design of airplane lifting surfaces under conditions that tend to induce flow separation. These codes have undergone continued development to provide additional capabilities since the introduction of the original versions over a decade ago. This code development has been reported in a variety of publications (NASA technical papers, NASA contractor reports, and society journals). Some modifications have not been publicized at all. Users of these codes have suggested the desirability of combining in a single document the descriptions of the code development, an outline of the features of each code, and suggestions for effective code usage. This report is intended to meet that need.
Uniform Policy/Dress Codes: School Staff and Parent Perceptions of Need and Impact.
ERIC Educational Resources Information Center
Stevenson, Zollie, Jr.; Chunn, Eva Wells
This study examines the impact of uniform/dress codes and practices on school climate, educational attainment, and student affective and cognitive domains in Washington (District of Columbia) schools. Information was drawn from surveys of 301 principals and teachers and 268 parents. The following findings are presented: (1) reasons cited for…
Advanced turboprop noise prediction based on recent theoretical results
NASA Technical Reports Server (NTRS)
Farassat, F.; Padula, S. L.; Dunn, M. H.
1987-01-01
The development of a high speed propeller noise prediction code at Langley Research Center is described. The code utilizes two recent acoustic formulations in the time domain for subsonic and supersonic sources. The structure and capabilities of the code are discussed. A grid size study for accuracy and speed of execution on a computer is also presented. The code is tested against an earlier Langley code. Considerable increases in accuracy and speed of execution are observed. Some examples of noise prediction for a high speed propeller for which acoustic test data are available are given. A brief derivation of the formulations used is given in an appendix.
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Srivastava, R.
1996-01-01
This guide describes the input data required for using the MSAP2D (Multi Stage Aeroelastic analysis Program - Two Dimensional) computer code. MSAP2D can be used for steady aerodynamic, unsteady aerodynamic, and aeroelastic (flutter and forced response) analysis of bladed disks arranged in multiple blade rows, such as those found in compressors, turbines, counter-rotating propellers, or propfans. The code can also be run for a single blade row. The MSAP2D code is an extension of the original NPHASE code for multi-blade-row aerodynamic and aeroelastic analysis. Euler equations are used to obtain aerodynamic forces. The structural dynamic equations are written for a rigid typical section undergoing pitching (torsion) and plunging (bending) motion. The aeroelastic equations are solved in the time domain. For single-blade-row analysis, a frequency domain analysis is also provided to obtain the unsteady aerodynamic coefficients required in an eigenanalysis for flutter. In this manual, sample input and output are provided for a single-blade-row example and for two-blade-row examples with equal and unequal numbers of blades in the blade rows.
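As a hedged illustration of the structural half of such an analysis (not the MSAP2D implementation; all parameter values and the placeholder load are assumptions), the rigid typical-section equations in pitch and plunge can be time-marched as follows:

```python
import numpy as np

# Sketch of a time-domain march of the rigid typical-section equations,
# M q'' + K q = F(t), with q = [h, alpha] (plunge and pitch). MSAP2D
# supplies F from an Euler solver; here a placeholder load is used.
m, I_a, S_a = 1.0, 0.25, 0.1            # mass, pitch inertia, static imbalance (assumed)
k_h, k_a = 100.0, 50.0                  # plunge and torsion stiffnesses (assumed)
M = np.array([[m, S_a], [S_a, I_a]])    # coupled mass matrix
K = np.diag([k_h, k_a])                 # stiffness matrix

def aero_load(t, q, qdot):
    """Placeholder aerodynamic load; a coupled solver would supply this."""
    return np.zeros(2)

q, qdot, dt = np.array([0.01, 0.0]), np.zeros(2), 1.0e-3
Minv = np.linalg.inv(M)
for n in range(5000):                   # semi-implicit (symplectic) Euler march
    t = n * dt
    qdot = qdot + dt * (Minv @ (aero_load(t, q, qdot) - K @ q))
    q = q + dt * qdot
print(q)  # free vibration of the section after 5 s of simulated time
```

With zero aerodynamic load the march reduces to free vibration of the coupled section, and the symplectic update keeps the oscillation amplitudes bounded.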
41 CFR 102-173.50 - What is the naming convention for States?
Code of Federal Regulations, 2014 CFR
2014-01-01
...-INTERNET GOV DOMAIN Registration § 102-173.50 What is the naming convention for States? (a) To register any second-level domain within dot-gov, State government entities must register the full State name or clearly indicate the State postal code within the name. Examples of acceptable names include virginia.gov...
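As a rough illustration of the convention (a hypothetical helper for illustration only, not part of any registrar tooling), a mechanical check of the "full State name or State postal code" rule might look like:

```python
import re

# Hypothetical sketch of the 41 CFR 102-173.50 convention: a second-level
# dot-gov domain for a State must contain the full State name or clearly
# indicate the State postal code. The actual registrar applies judgment
# beyond any mechanical check like this one.
def acceptable_state_domain(domain: str, state: str, postal_code: str) -> bool:
    label = domain.lower().rsplit(".gov", 1)[0].strip(".")
    has_full_name = state.lower().replace(" ", "") in label.replace("-", "")
    has_postal = re.search(rf"\b{re.escape(postal_code.lower())}\b",
                           label.replace("-", " ")) is not None
    return has_full_name or has_postal

print(acceptable_state_domain("virginia.gov", "Virginia", "VA"))  # True
```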
41 CFR 102-173.50 - What is the naming convention for States?
Code of Federal Regulations, 2011 CFR
2011-01-01
...-INTERNET GOV DOMAIN Registration § 102-173.50 What is the naming convention for States? (a) To register any second-level domain within dot-gov, State government entities must register the full State name or clearly indicate the State postal code within the name. Examples of acceptable names include virginia.gov...
41 CFR 102-173.50 - What is the naming convention for States?
Code of Federal Regulations, 2010 CFR
2010-07-01
...-INTERNET GOV DOMAIN Registration § 102-173.50 What is the naming convention for States? (a) To register any second-level domain within dot-gov, State government entities must register the full State name or clearly indicate the State postal code within the name. Examples of acceptable names include virginia.gov...
41 CFR 102-173.50 - What is the naming convention for States?
Code of Federal Regulations, 2013 CFR
2013-07-01
...-INTERNET GOV DOMAIN Registration § 102-173.50 What is the naming convention for States? (a) To register any second-level domain within dot-gov, State government entities must register the full State name or clearly indicate the State postal code within the name. Examples of acceptable names include virginia.gov...
41 CFR 102-173.50 - What is the naming convention for States?
Code of Federal Regulations, 2012 CFR
2012-01-01
...-INTERNET GOV DOMAIN Registration § 102-173.50 What is the naming convention for States? (a) To register any second-level domain within dot-gov, State government entities must register the full State name or clearly indicate the State postal code within the name. Examples of acceptable names include virginia.gov...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erez, Mattan; Yelick, Katherine; Sarkar, Vivek
The Dynamic, Exascale Global Address Space programming environment (DEGAS) project will develop the next generation of programming models and runtime systems to meet the challenges of Exascale computing. Our approach is to provide an efficient and scalable programming model that can be adapted to application needs through the use of dynamic runtime features and domain-specific languages for computational kernels. We address the following technical challenges: Programmability: a rich set of programming constructs based on a Hierarchical Partitioned Global Address Space (HPGAS) model, demonstrated in UPC++. Scalability: hierarchical locality control, lightweight communication (extended GASNet), and efficient synchronization mechanisms (Phasers). Performance Portability: just-in-time specialization (SEJITS) for generating hardware-specific code and scheduling libraries for domain-specific adaptive runtimes (Habanero). Energy Efficiency: communication-optimal code generation to optimize energy efficiency by reducing data movement. Resilience: Containment Domains for flexible, domain-specific resilience, using state capture mechanisms and lightweight, asynchronous recovery mechanisms. Interoperability: runtime and language interoperability with MPI and OpenMP to encourage broad adoption.
Selected DOE headquarters publications
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1979-07-01
Selected DOE Headquarters Publications provides cumulative listings, from October 1, 1977 onward, of two groups of publications issued by headquarters organizations of the Department of Energy, and an index to their title keywords. The two groups consist of publications assigned a DOE/XXX-type report number code and headquarters contractor publications, prepared by contractors (and published by DOE) to describe research and development work they have performed for the Department. Publications such as pamphlets, fact sheets, bulletins, newsletters, and telephone directories are omitted, as are publications issued under the DOE-tr, CONF, DOE/JPL, and DOE/NASA codes. (RWR)
Accuracy of injury coding under ICD‐9 for New Zealand public hospital discharges
Langley, J; Stephenson, S; Thorpe, C; Davie, G
2006-01-01
Objective To determine the level of accuracy in coding for injury principal diagnosis and the first external cause code for public hospital discharges in New Zealand and determine how these levels vary by hospital size. Method A simple random sample of 1800 discharges was selected from the period 1996–98 inclusive. Records were obtained from hospitals and an accredited coder coded the discharge independently of the codes already recorded in the national database. Results Five percent of the principal diagnoses, 18% of the first four digits of the E‐codes, and 8% of the location codes (5th digit of the E‐code), were incorrect. There were no substantive differences in the level of incorrect coding between large and small hospitals. Conclusions Users of New Zealand public hospital discharge data can have a high degree of confidence in the injury diagnoses coded under ICD‐9‐CM‐A. A similar degree of confidence is warranted for E‐coding at the group level (for example, fall), but not, in general, at higher levels of specificity (for example, type of fall). For those countries continuing to use ICD‐9 the study provides insight into potential problems of coding and thus guidance on where the focus of coder training should be placed. For those countries that have historical data coded according to ICD‐9 it suggests that some specific injury and external cause incidence estimates may need to be treated with more caution. PMID:16461421
A novel potential/viscous flow coupling technique for computing helicopter flow fields
NASA Technical Reports Server (NTRS)
Summa, J. Michael; Strash, Daniel J.; Yoo, Sungyul
1993-01-01
The primary objective of this work was to demonstrate the feasibility of a new potential/viscous flow coupling procedure for reducing computational effort while maintaining solution accuracy. This closed-loop, overlapped velocity-coupling concept has been developed in a new two-dimensional code, ZAP2D (Zonal Aerodynamics Program - 2D), a three-dimensional code for wing analysis, ZAP3D (Zonal Aerodynamics Program - 3D), and a three-dimensional code for isolated helicopter rotors in hover, ZAPR3D (Zonal Aerodynamics Program for Rotors - 3D). Comparisons with large domain ARC3D solutions and with experimental data for a NACA 0012 airfoil have shown that the required domain size can be reduced to a few tenths of a percent chord for the low Mach and low angle of attack cases and to less than 2-5 chords for the high Mach and high angle of attack cases while maintaining solution accuracies to within a few percent. This represents CPU time reductions by a factor of 2-4 compared with ARC2D. The current ZAP3D calculation for a rectangular plan-form wing of aspect ratio 5 with an outer domain radius of about 1.2 chords represents a speed-up in CPU time over the ARC3D large domain calculation by about a factor of 2.5 while maintaining solution accuracies to within a few percent. A ZAPR3D simulation for a two-bladed rotor in hover with a reduced grid domain of about two chord lengths was able to capture the wake effects and compared accurately with the experimental pressure data. Further development is required in order to substantiate the promise of computational improvements due to the ZAPR3D coupling concept.
Patient Self-Defined Goals: Essentials of Person-Centered Care for Serious Illness.
Schellinger, Sandra Ellen; Anderson, Eric Worden; Frazer, Monica Schmitz; Cain, Cindy Lynn
2018-01-01
This research, a descriptive qualitative analysis of self-defined serious illness goals, expands the knowledge of what goals are important beyond the physical, making existing disease-specific guidelines more holistic. Integration of goals-of-care discussions and documentation is standard for quality palliative care but not consistently executed in general and specialty practice. Over 14 months, lay health-care workers (care guides) provided monthly supportive visits for 160 patients with advanced heart failure, cancer, and dementia expected to die in 2 to 3 years. Care guides explored what was most important to patients and documented their self-defined goals on a medical record flow sheet. Using definitions of an expanded set of whole-person domains adapted from the National Consensus Project (NCP) Clinical Practice Guidelines for Quality Palliative Care, 999 goals and their associated plans were deductively coded and examined. Four themes were identified: medical, nonmedical, multiple, and global. Forty percent of goals were coded into the medical domain; 40% were coded to nonmedical domains: social (9%), ethical (7%), family (6%), financial/legal (5%), psychological (5%), housing (3%), legacy/bereavement (3%), spiritual (1%), and end-of-life care (1%). Sixteen percent of the goals were complex and reflected a mix of medical and nonmedical domains ("multiple" goals). The remaining goals (4%) were too global to attribute to an NCP domain. Self-defined serious illness goals express experiences beyond physical health and extend into all aspects of the whole person. It is feasible to elicit and record serious illness goals. This approach to goals can support meaningful person-centered care, decision-making, and planning that accords with individual preferences in late life.
Coded Statutory Data Sets for Evaluation of Public Health Law
ERIC Educational Resources Information Center
Costich, Julia Field
2012-01-01
Background and objectives: The evaluation of public health law requires reliable accounts of underlying statutes and regulations. States often enact public health-related statutes with nonuniform provisions, and variation in the structure of state legal codes can foster inaccuracy in evaluating the impact of specific categories of law. The optimal…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-24
... announces a public meeting to receive comments and recommendations (including accompanying data on which recommendations are based) from the public on the appropriate basis for establishing payment amounts for new or substantially revised Healthcare Common Procedure Coding System (HCPCS) codes being considered for Medicare...
NASA Astrophysics Data System (ADS)
Barranco, Joseph
2006-03-01
We have developed a three-dimensional (3D) spectral hydrodynamic code to study vortex dynamics in rotating, shearing, stratified systems (e.g., the atmosphere of gas giant planets, protoplanetary disks around newly forming protostars). The time-independent background state is stably stratified in the vertical direction and has a unidirectional linear shear flow aligned with one horizontal axis. Superposed on this background state is an unsteady, subsonic flow that is evolved with the Euler equations subject to the anelastic approximation to filter acoustic phenomena. A Fourier-Fourier basis in a set of quasi-Lagrangian coordinates that advect with the background shear is used for spectral expansions in the two horizontal directions. For the vertical direction, two different sets of basis functions have been implemented: (1) Chebyshev polynomials on a truncated, finite domain, and (2) rational Chebyshev functions on an infinite domain. Use of this latter set is equivalent to transforming the infinite domain to a finite one with a cotangent mapping, and using cosine and sine expansions in the mapped coordinate. The nonlinear advection terms are time integrated explicitly, whereas the Coriolis force, buoyancy terms, and pressure/enthalpy gradient are integrated semi-implicitly. We show that internal gravity waves can be damped by adding new terms to the Euler equations. The code exhibits excellent parallel performance with the Message Passing Interface (MPI). As a demonstration of the code, we simulate vortex dynamics in protoplanetary disks and the Kelvin-Helmholtz instability in the dusty midplanes of protoplanetary disks.
A 3D spectral anelastic hydrodynamic code for shearing, stratified flows
NASA Astrophysics Data System (ADS)
Barranco, Joseph A.; Marcus, Philip S.
2006-11-01
We have developed a three-dimensional (3D) spectral hydrodynamic code to study vortex dynamics in rotating, shearing, stratified systems (e.g., the atmosphere of gas giant planets, protoplanetary disks around newly forming protostars). The time-independent background state is stably stratified in the vertical direction and has a unidirectional linear shear flow aligned with one horizontal axis. Superposed on this background state is an unsteady, subsonic flow that is evolved with the Euler equations subject to the anelastic approximation to filter acoustic phenomena. A Fourier-Fourier basis in a set of quasi-Lagrangian coordinates that advect with the background shear is used for spectral expansions in the two horizontal directions. For the vertical direction, two different sets of basis functions have been implemented: (1) Chebyshev polynomials on a truncated, finite domain, and (2) rational Chebyshev functions on an infinite domain. Use of this latter set is equivalent to transforming the infinite domain to a finite one with a cotangent mapping, and using cosine and sine expansions in the mapped coordinate. The nonlinear advection terms are time-integrated explicitly, the pressure/enthalpy terms are integrated semi-implicitly, and the Coriolis force and buoyancy terms are treated semi-analytically. We show that internal gravity waves can be damped by adding new terms to the Euler equations. The code exhibits excellent parallel performance with the message passing interface (MPI). As a demonstration of the code, we simulate the merger of two 3D vortices in the midplane of a protoplanetary disk.
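The cotangent mapping mentioned above can be demonstrated in a few lines (an illustrative sketch, not the authors' code). Under y = L cot(t), a smooth function on the infinite domain becomes a short cosine series in the mapped coordinate; the test function f(y) = 1/(1 + y^2) is an assumed example, which with L = 1 maps exactly to sin^2(t) = 1/2 - cos(2t)/2.

```python
import numpy as np

# Sketch of the cotangent mapping behind rational Chebyshev functions:
# with y = L*cot(t), an expansion on the infinite domain becomes a plain
# cosine series in t. Here f(y) = 1/(1 + y^2) (assumed for illustration)
# maps to 1/2 - cos(2t)/2, so only the n = 0 and n = 2 coefficients survive.
L, N = 1.0, 16
t = np.pi * (np.arange(N) + 0.5) / N           # interior collocation points
y = L / np.tan(t)                               # cotangent map to (-inf, inf)
f = 1.0 / (1.0 + y**2)

# cosine coefficients a_n from collocation values (discrete orthogonality)
a = np.array([2.0 / N * np.sum(f * np.cos(n * t)) for n in range(N)])
a[0] /= 2.0
print(np.round(a[:4], 12))   # expect ~[0.5, 0, -0.5, 0]
```

Because the mapped function is exactly band-limited, the series terminates after two terms to machine precision, illustrating why such mappings are attractive for spectral methods on unbounded domains.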
Testing First-Order Logic Axioms in AutoCert
NASA Technical Reports Server (NTRS)
Ahn, Ki Yung; Denney, Ewen
2009-01-01
AutoCert [2] is a formal verification tool for machine generated code in safety critical domains, such as aerospace control code generated from MathWorks Real-Time Workshop. AutoCert uses Automated Theorem Provers (ATPs) [5] based on First-Order Logic (FOL) to formally verify safety and functional correctness properties of the code. These ATPs try to build proofs based on user-provided domain-specific axioms, which can be arbitrary First-Order Formulas (FOFs). These axioms are the most crucial part of the trusted base, since proofs can be submitted to a proof checker, removing the need to trust the prover, and AutoCert itself plays the part of checking the code generator. However, formulating axioms correctly (i.e., precisely as the user had really intended) is non-trivial in practice. The challenge of axiomatization arises along several dimensions. First, the domain knowledge has its own complexity. AutoCert has been used to verify mathematical requirements on navigation software that carries out various geometric coordinate transformations involving matrices and quaternions. Axiomatic theories for such constructs are complex enough that mistakes are not uncommon. Second, adjusting axioms for ATPs can add even more complexity. The axioms frequently need to be modified in order to have them in a form suitable for use with ATPs. Such modifications tend to obscure the axioms further. Third, judging the validity of the axioms from the output of existing ATPs is very hard, since theorem provers typically do not give any examples or counterexamples.
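The random-testing idea motivating the paper can be illustrated with a small sketch (the matrix-algebra axioms and helper names here are illustrative assumptions, not the AutoCert interface): evaluating a candidate axiom on random concrete instances quickly exposes a common mis-axiomatization that an ATP alone would report only as a failed proof, with no counterexample.

```python
import numpy as np

# Sketch: test candidate axioms on random instances before handing them
# to an ATP. The domain here is matrix algebra, one of the theories the
# abstract mentions; the axiom encodings are illustrative assumptions.
rng = np.random.default_rng(0)

def holds(axiom, trials=100):
    """Check a candidate axiom on random 3x3 matrix instances."""
    return all(axiom(rng.standard_normal((3, 3)), rng.standard_normal((3, 3)))
               for _ in range(trials))

correct = lambda A, B: np.allclose((A @ B).T, B.T @ A.T)   # (AB)^T = B^T A^T
mistake = lambda A, B: np.allclose((A @ B).T, A.T @ B.T)   # plausible-looking error

print(holds(correct))  # True: no counterexample found
print(holds(mistake))  # False: random testing exposes the error
```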
Selected DOE Headquarters publications, October 1977-September 1979
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1979-11-01
This sixth issue of cumulative listings of DOE Headquarters publications covers the first two years of the Department's operation (October 1, 1977 - September 30, 1979). It lists two groups of publications issued by then-existing Headquarters organizations and provides an index to their title keywords. The two groups of publications are publications assigned a DOE/XXX-type report number code and Headquarters contractor reports prepared by contractors (and published by DOE) to describe research and development work they have performed for the Department. Certain publications are omitted. They include such items as pamphlets, fact sheets, bulletins, newsletters, and telephone directories, as well as headquarters publications issued under the DOE-tr (DOE translation) and CONF (conference proceedings) codes, and technical reports from the Jet Propulsion Laboratory and NASA issued under DOE/JPL and DOE/NASA codes. The contents of this issue will not be repeated in subsequent issues of DOE/AD-0010. (RWR)
Parser for Sabin-to-Mahoney Transition Model of Quasispecies Replication
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ecale Zhou, Carol
2016-01-03
This code is a data parser for preparing output from the Qspp agent-based stochastic simulation model for plotting in Excel. This code is specific to a set of simulations that were run for the purpose of preparing data for a publication. It is necessary to make this code open-source in order to publish the model code (Qspp), which has already been released. There is a necessity of assuring that results from using Qspp for a publication
26 CFR 1.0-1 - Internal Revenue Code of 1954 and regulations.
Code of Federal Regulations, 2011 CFR
2011-04-01
... INCOME TAXES § 1.0-1 Internal Revenue Code of 1954 and regulations. (a) Enactment of law. The Internal Revenue Code of 1954 which became law upon enactment of Public Law 591, 83d Congress, approved August 16... references. The date of enactment, bill number, public law number, and chapter number, shall be printed as a...
26 CFR 1.0-1 - Internal Revenue Code of 1954 and regulations.
Code of Federal Regulations, 2014 CFR
2014-04-01
... INCOME TAXES § 1.0-1 Internal Revenue Code of 1954 and regulations. (a) Enactment of law. The Internal Revenue Code of 1954 which became law upon enactment of Public Law 591, 83d Congress, approved August 16... references. The date of enactment, bill number, public law number, and chapter number, shall be printed as a...
26 CFR 1.0-1 - Internal Revenue Code of 1954 and regulations.
Code of Federal Regulations, 2012 CFR
2012-04-01
... INCOME TAXES § 1.0-1 Internal Revenue Code of 1954 and regulations. (a) Enactment of law. The Internal Revenue Code of 1954 which became law upon enactment of Public Law 591, 83d Congress, approved August 16... references. The date of enactment, bill number, public law number, and chapter number, shall be printed as a...
26 CFR 1.0-1 - Internal Revenue Code of 1954 and regulations.
Code of Federal Regulations, 2010 CFR
2010-04-01
... INCOME TAXES § 1.0-1 Internal Revenue Code of 1954 and regulations. (a) Enactment of law. The Internal Revenue Code of 1954 which became law upon enactment of Public Law 591, 83d Congress, approved August 16... references. The date of enactment, bill number, public law number, and chapter number, shall be printed as a...
26 CFR 1.0-1 - Internal Revenue Code of 1954 and regulations.
Code of Federal Regulations, 2013 CFR
2013-04-01
... INCOME TAXES § 1.0-1 Internal Revenue Code of 1954 and regulations. (a) Enactment of law. The Internal Revenue Code of 1954 which became law upon enactment of Public Law 591, 83d Congress, approved August 16... references. The date of enactment, bill number, public law number, and chapter number, shall be printed as a...
45 CFR Appendix B to Part 73 - Code of Ethics for Government Service
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Code of Ethics for Government Service B Appendix B to Part 73 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION STANDARDS OF CONDUCT Pt. 73, App. B Appendix B to Part 73—Code of Ethics for Government Service Any person in...
45 CFR Appendix B to Part 73 - Code of Ethics for Government Service
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Code of Ethics for Government Service B Appendix B to Part 73 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION STANDARDS OF CONDUCT Pt. 73, App. B Appendix B to Part 73—Code of Ethics for Government Service Any person in...
45 CFR Appendix B to Part 73 - Code of Ethics for Government Service
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Code of Ethics for Government Service B Appendix B to Part 73 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION STANDARDS OF CONDUCT Pt. 73, App. B Appendix B to Part 73—Code of Ethics for Government Service Any person in...
45 CFR Appendix B to Part 73 - Code of Ethics for Government Service
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Code of Ethics for Government Service B Appendix B to Part 73 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION STANDARDS OF CONDUCT Pt. 73, App. B Appendix B to Part 73—Code of Ethics for Government Service Any person in...
45 CFR Appendix B to Part 73 - Code of Ethics for Government Service
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 1 2010-10-01 2010-10-01 false Code of Ethics for Government Service B Appendix B to Part 73 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION STANDARDS OF CONDUCT Pt. 73, App. B Appendix B to Part 73—Code of Ethics for Government Service Any person in...
Game-Coding Workshops in New Zealand Public Libraries: Evaluation of a Pilot Project
ERIC Educational Resources Information Center
Bolstad, Rachel
2016-01-01
This report evaluates a game coding workshop offered to young people and adults in seven public libraries around New Zealand. Participants were taken step by step through the process of creating their own simple 2D videogame, learning the basics of coding, computational thinking, and digital game design. The workshops were free and drew 426 people…
Haeny, Angela M.
2014-01-01
Previous literature has documented the general issues psychologists often face while balancing their personal and professional lives. The struggle stems from attempting to satisfy the need to maintain a life outside of work while having the professional obligation to follow the American Psychological Association’s (APA’s) Ethical Principles of Psychologists and Code of Conduct (Ethics Code) to prevent their personal lives from interfering with their professional roles and relationships. The present paper analyzes the subject of psychologists taking a public position on controversial public issues. Although the APA Ethics Code does not restrict how psychologists conduct themselves during their personal time, taking a public stance on a controversial issue could potentially strain professional relationships and inadvertently reflect negatively on the profession. The present paper examines ethical issues that a) should be taken into account before psychologists take a public position on a controversial issue, and b) are in conflict with APA’s Ethics Code or current research. PMID:25342876
Imran, Noreen; Seet, Boon-Chong; Fong, A C M
2015-01-01
Distributed video coding (DVC) is a relatively new video coding architecture that originated from two fundamental theorems, namely the Slepian-Wolf and Wyner-Ziv theorems. Recent research developments have made DVC attractive for applications in the emerging domain of wireless video sensor networks (WVSNs). This paper reviews the state-of-the-art DVC architectures with a focus on understanding their opportunities and gaps in addressing the operational requirements and application needs of WVSNs.
DCU@TRECMed 2012: Using Ad-Hoc Baselines for Domain-Specific Retrieval
2012-11-01
description to extend the query, for example: Patients with complicated GERD who receive endoscopy will be extended with Gastroesophageal reflux disease ... Diseases and Related Health Problems, version 9) for the patient’s admission or discharge status [1, 5]; treating negation (e.g. negative test results or...codes were mapped to a description of the code, usually a short phrase/sentence. For instance, the ICD9 code 253.5 corresponds to the disease Diabetes
Identification of pneumonia and influenza deaths using the death certificate pipeline
2012-01-01
Background Death records are a rich source of data, which can be used to assist with public surveillance and/or decision support. However, to use this type of data for such purposes it has to be transformed into a coded format to make it computable. Because the cause of death in the certificates is reported as free text, encoding the data is currently the single largest barrier to using death certificates for surveillance. Therefore, the purpose of this study was to demonstrate the feasibility of using a pipeline, composed of a detection rule and a natural language processor, for the real-time encoding of death certificates, using the identification of pneumonia and influenza cases as an example and demonstrating that its accuracy is comparable to existing methods. Results A Death Certificates Pipeline (DCP) was developed to automatically code death certificates and identify pneumonia and influenza cases. The pipeline used MetaMap to code death certificates from the Utah Department of Health for the year 2008. The output of MetaMap was then accessed by detection rules which flagged pneumonia and influenza cases based on the Centers for Disease Control and Prevention (CDC) case definition. The output from the DCP was compared with the current method used by the CDC and with a keyword search. Recall, precision, positive predictive value, and F-measure with respect to the CDC method were calculated for the two other methods considered here. The two techniques compared here with the CDC method showed the following recall/precision results: DCP: 0.998/0.98; keyword searching: 0.96/0.96. The F-measures were 0.99 and 0.96, respectively (DCP and keyword searching). Both the keyword search and the DCP can run in interactive form with modest computer resources, but the DCP showed superior performance. Conclusion The pipeline proposed here for coding death certificates and detecting cases is feasible and can be extended to other conditions.
This method provides an alternative that allows for coding free-text death certificates in real time, which may increase their utilization not only in the public health domain but also for biomedical researchers and developers. Trial Registration This study did not involve any clinical trials. PMID:22569097
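The reported F-measures follow directly from the reported recall and precision; a quick check, assuming the standard F1 definition (harmonic mean of precision and recall):

```python
# F1 = 2*P*R / (P + R): verify the reported F-measures from the reported
# precision/recall pairs (DCP: recall 0.998, precision 0.98; keyword: 0.96/0.96).
def f1(precision, recall):
    return 2 * precision * recall / (precision + recall)

print(round(f1(0.98, 0.998), 2))  # DCP: 0.99
print(round(f1(0.96, 0.96), 2))   # keyword search: 0.96
```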
NASA Astrophysics Data System (ADS)
Chakravarthi, V.; Sastry, S. Rajeswara; Ramamma, B.
2013-07-01
Based on the principles of modeling and inversion, two interpretation methods are developed in the space domain, along with a GUI-based JAVA code, MODTOHAFSD, to analyze the gravity anomalies of strike-limited sedimentary basins using a prescribed exponential density contrast-depth function. A stack of vertical prisms, all having equal widths but each possessing its own limited strike length and thickness, describes the structure of a sedimentary basin above the basement complex. The thicknesses of the prisms represent the depths to the basement and are the unknown parameters to be estimated from the observed gravity anomalies. Forward modeling is realized in the space domain using a combination of analytical and numerical approaches. The algorithm estimates the initial depths of a sedimentary basin and improves them, iteratively, based on the differences between the observed and modeled gravity anomalies within the specified convergence criteria. The present code, built on the Model-View-Controller (MVC) pattern, reads the Bouguer gravity anomalies, constructs/modifies the regional gravity background in an interactive approach, estimates residual gravity anomalies, and performs automatic modeling or inversion based on user specification for basement topography. Besides generating output in both ASCII and graphical forms, the code displays (i) the changes in the depth structure, (ii) the nature of fit between the observed and modeled gravity anomalies, (iii) changes in misfit, and (iv) variation of density contrast with iteration, in animated forms. The code is used to analyze both synthetic and real field gravity anomalies. The proposed technique yielded information that is consistent with the assumed parameters in the case of the synthetic structure and with available drilling depths in the case of the field example.
The advantage of the code is that it can be used to analyze the gravity anomalies of sedimentary basins even when the profile along which the interpretation is intended fails to bisect the strike length.
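The iterative scheme described above (forward-model the anomaly, compare with observations, update the depths) can be sketched in simplified form. The sketch below is an illustrative stand-in, not the MODTOHAFSD prism code: it uses an infinite-slab approximation with the exponential density contrast-depth function and a Newton-style depth update, and all parameter values are assumed.

```python
import numpy as np

G = 6.674e-11  # gravitational constant (SI units)

def slab_gravity(depth_m, rho0=-400.0, lam=1e-4):
    # infinite-slab gravity for an exponential density contrast
    # rho(z) = rho0 * exp(-lam * z), integrated from 0 to depth:
    # g(d) = 2*pi*G*rho0*(1 - exp(-lam*d))/lam
    return 2.0 * np.pi * G * rho0 * (1.0 - np.exp(-lam * depth_m)) / lam

def invert_depths(g_obs, rho0=-400.0, lam=1e-4, n_iter=50):
    # Newton-style iteration: start from zero depth and update each
    # station's depth from the anomaly misfit until convergence
    depths = np.zeros_like(g_obs, dtype=float)
    for _ in range(n_iter):
        misfit = g_obs - slab_gravity(depths, rho0, lam)
        depths += misfit / (2.0 * np.pi * G * rho0 * np.exp(-lam * depths))
        depths = np.maximum(depths, 0.0)
    return depths
```

Because the slab response is monotone in depth, the Newton update recovers the basement depths from the anomalies; the real code replaces the slab formula with finite-strike prism responses.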
Jeong, Dahn; Presseau, Justin; ElChamaa, Rima; Naumann, Danielle N; Mascaro, Colin; Luconi, Francesca; Smith, Karen M; Kitto, Simon
2018-04-10
This scoping review explored the barriers and facilitators that influence engagement in and implementation of self-directed learning (SDL) in continuing professional development (CPD) for physicians in Canada. This review followed the six-stage scoping review framework of Arksey and O'Malley and of Daudt et al. In 2015, the authors searched eight online databases for English-language Canadian articles published January 2005-December 2015. To chart and analyze the data from the 17 included studies, they employed a two-step analysis process of conventional content analysis followed by directed coding guided by the Theoretical Domains Framework (TDF). Conventional content analysis generated five categories of barriers and facilitators: individual, program, technological, environmental, and workplace/organizational. Directed coding guided by the TDF allowed analysis of barriers and facilitators to behavior change according to two key groups: physicians engaging in SDL and SDL developers designing and implementing SDL programs. Of the 318 total barriers and facilitators coded, 290 (91.2%) were coded for physicians and 28 (8.8%) for SDL developers. The majority (209; 65.7%) were coded in four key TDF domains: environmental context and resources, social influences, beliefs about consequences, and behavioral regulation. This scoping review identified five categories of barriers and facilitators in the literature and four key TDF domains where most factors related to behavior change of physicians and SDL developers regarding SDL programs in CPD were coded. There was a significant gap in the literature about factors that may contribute to SDL developers' capacity to design and implement SDL programs in CPD.
Graham-Rowe, E; Lorencatto, F; Lawrenson, J G; Burr, J M; Grimshaw, J M; Ivers, N M; Presseau, J; Vale, L; Peto, T; Bunce, C; Francis, J J
2018-05-23
To identify and synthesize studies reporting modifiable barriers/enablers associated with retinopathy screening attendance in people with Type 1 or Type 2 diabetes, and to identify those most likely to influence attendance. We searched MEDLINE, EMBASE, PsycINFO, the Cochrane Library and the 'grey literature' for quantitative and qualitative studies to February 2017. Data (i.e. participant quotations, interpretive summaries, survey results) reporting barriers/enablers were extracted and deductively coded into domains from the Theoretical Domains Framework, with domains representing categories of theoretical barriers/enablers proposed to mediate behaviour change. Inductive thematic analysis was conducted within domains to describe the role each domain plays in facilitating or hindering screening attendance. Domains that were more frequently coded and for which more themes were generated were judged more likely to influence attendance. Sixty-nine primary studies were included. We identified six theoretical domains ['environmental context and resources' (75% of included studies), 'social influences' (51%), 'knowledge' (50%), 'memory, attention, decision processes' (50%), 'beliefs about consequences' (38%) and 'emotions' (33%)] as the key mediators of diabetic retinopathy screening attendance. Examples of barriers populating these domains included inaccurate diabetic registers and confusion between routine eye care and retinopathy screening. Recommendations by healthcare professionals and community-level media coverage acted as enablers. Across a variety of contexts, we found common barriers to and enablers of retinopathy screening that could be targeted in interventions aiming to increase screening attendance. This article is protected by copyright. All rights reserved.
Malay public attitudes toward epilepsy (PATE) scale: translation and psychometric evaluation.
Lim, Kheng Seang; Choo, Wan Yuen; Wu, Cathie; Tan, Chong Tin
2013-11-01
None of the quantitative scales for public attitudes toward epilepsy had been translated into the Malay language. This study aimed to translate and test the validity and reliability of a Malay version of the Public Attitudes Toward Epilepsy (PATE) scale. The translation was performed according to standard principles and tested in 140 Malay-speaking adults older than 18 years for psychometric validation. The items in each domain had similar standard deviations (equal item variance), ranging from 0.90 to 1.00 in the personal domain and from 0.87 to 1.23 in the general domain. The correlation between an item and its domain was 0.4 or above for all items and was higher than the correlation with the other domain. Multitrait analysis showed that the Malay PATE had variance, floor and ceiling effects, and relative relationships between the domains similar to those of the original PATE. The Malay PATE scale showed a similar correlation with almost all demographic variables except age. Items generally clustered in the factor analysis into the hypothesized domains, except items 1 and 2. The Cronbach's α values were within the acceptable range (0.757 and 0.716 for the general and personal domains, respectively). The Malay PATE scale is a valid and reliable translated version for measuring public attitudes toward epilepsy.
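Cronbach's α values like those reported above are computed directly from the item-score matrix. A minimal sketch of the standard formula, using toy data rather than the PATE responses:

```python
import numpy as np

def cronbach_alpha(items):
    # items: (n_respondents, n_items) matrix of item scores
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)
```

As the items become more strongly intercorrelated, α approaches 1; values around 0.7, as in the Malay PATE domains, are conventionally taken as acceptable internal consistency.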
Congdon, Lauren M; Sims, Jennifer K; Tuzon, Creighton T; Rice, Judd C
2014-04-01
PR-Set7/Set8/KMT5a is the sole histone H4 lysine 20 monomethyltransferase (H4K20me1) in metazoans and is essential for proper cell division and genomic stability. We unexpectedly discovered that normal cellular levels of monomethylated histone H3 lysine 9 (H3K9me1) were also dependent on PR-Set7, but independent of its catalytic activity. This observation suggested that PR-Set7 interacts with an H3K9 monomethyltransferase to establish the previously reported H4K20me1-H3K9me1 trans-tail 'histone code'. Here we show that PR-Set7 specifically and directly binds the C-terminus of the Riz1/PRDM2/KMT8 tumor suppressor and demonstrate that the N-terminal PR/SET domain of Riz1 preferentially monomethylates H3K9. The PR-Set7 binding domain was required for Riz1 nuclear localization and maintenance of the H4K20me1-H3K9me1 trans-tail 'histone code'. Although Riz1 can function as a repressor, Riz1/H3K9me1 was dispensable for the repression of genes regulated by PR-Set7/H4K20me1. Frameshift mutations resulting in a truncated Riz1 incapable of binding PR-Set7 occur frequently in various aggressive cancers. In these cancer cells, expression of wild-type Riz1 restored tumor suppression by decreasing proliferation and increasing apoptosis. These phenotypes were not observed in cells expressing either the Riz1 PR/SET domain or PR-Set7 binding domain indicating that Riz1 methyltransferase activity and PR-Set7 binding domain are both essential for Riz1 tumor suppressor function.
An Experiment in Scientific Program Understanding
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.; Owen, Karl (Technical Monitor)
2000-01-01
This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines, including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. Results are shown for three intensively studied codes and seven blind test cases; all test cases are state-of-the-art scientific codes. These techniques may apply to a wider range of scientific codes. If so, they could reduce the time, risk, and effort required to develop and modify scientific codes.
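One kind of semantic check in the spirit of the approach above is dimensional consistency: declare physical dimensions for primitive variables, then verify that recognized formulae are dimensionally consistent. The sketch below is a toy illustration under assumed variable names, not the paper's expert parsers.

```python
# declared dimensions for primitive variables (unit -> exponent);
# the variable names rho, u, p are illustrative assumptions
UNITS = {
    "rho": {"kg": 1, "m": -3},            # density
    "u":   {"m": 1, "s": -1},             # velocity
    "p":   {"kg": 1, "m": -1, "s": -2},   # pressure
}

def dim_mul(a, b):
    # multiply two quantities: add unit exponents, drop zeroed units
    out = dict(a)
    for unit, exp in b.items():
        out[unit] = out.get(unit, 0) + exp
    return {u: e for u, e in out.items() if e != 0}

def consistent(lhs, rhs):
    # a formula is dimensionally valid if both sides match exactly
    return lhs == rhs

# dynamic pressure rho*u*u should carry the dimensions of pressure
dyn_p = dim_mul(dim_mul(UNITS["rho"], UNITS["u"]), UNITS["u"])
```

A full system would walk the parse tree of each expression and flag any operator whose operands fail such a check, which is one way "program semantic errors" can be located statically.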
Sunita, S; Schwartz, Samantha L; Conn, Graeme L
2015-11-20
Double-stranded RNA (dsRNA)-activated protein kinase (PKR) is an important component of the innate immune system that presents a crucial first line of defense against viral infection. PKR has a modular architecture comprising a regulatory N-terminal dsRNA binding domain and a C-terminal kinase domain interposed by an unstructured ∼80-residue interdomain linker (IDL). Guided by sequence alignment, we created IDL deletions in human PKR (hPKR) and regulatory/kinase domain swap human-rat chimeric PKRs to assess the contributions of each domain and the IDL to regulation of the kinase activity by RNA. Using circular dichroism spectroscopy, limited proteolysis, kinase assays, and isothermal titration calorimetry, we show that each PKR protein is properly folded with similar domain boundaries and that each exhibits comparable polyinosinic-polycytidylic (poly(rI:rC)) dsRNA activation profiles and binding affinities for adenoviral virus-associated RNA I (VA RNAI) and HIV-1 trans-activation response (TAR) RNA. From these results we conclude that the IDL of PKR is not required for RNA binding or mediating changes in protein conformation or domain interactions necessary for PKR regulation by RNA. In contrast, inhibition of rat PKR by VA RNAI and TAR RNA was found to be weaker than for hPKR by 7- and >300-fold, respectively, and each human-rat chimeric domain-swapped protein showed intermediate levels of inhibition. These findings indicate that PKR sequence or structural elements in the kinase domain, present in hPKR but absent in rat PKR, are exploited by viral non-coding RNAs to accomplish efficient inhibition of PKR. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.
Identification and correction of abnormal, incomplete and mispredicted proteins in public databases.
Nagy, Alinda; Hegyi, Hédi; Farkas, Krisztina; Tordai, Hedvig; Kozma, Evelin; Bányai, László; Patthy, László
2008-08-27
Despite significant improvements in computational annotation of genomes, sequences of abnormal, incomplete or incorrectly predicted genes and proteins remain abundant in public databases. Since the majority of incomplete, abnormal or mispredicted entries are not annotated as such, these errors seriously affect the reliability of these databases. Here we describe the MisPred approach that may provide an efficient means for the quality control of databases. The current version of the MisPred approach uses five distinct routines for identifying abnormal, incomplete or mispredicted entries based on the principle that a sequence is likely to be incorrect if some of its features conflict with our current knowledge about protein-coding genes and proteins: (i) conflict between the predicted subcellular localization of proteins and the absence of the corresponding sequence signals; (ii) presence of extracellular and cytoplasmic domains and the absence of transmembrane segments; (iii) co-occurrence of extracellular and nuclear domains; (iv) violation of domain integrity; (v) chimeras encoded by two or more genes located on different chromosomes. Analyses of predicted EnsEMBL protein sequences of nine deuterostome (Homo sapiens, Mus musculus, Rattus norvegicus, Monodelphis domestica, Gallus gallus, Xenopus tropicalis, Fugu rubripes, Danio rerio and Ciona intestinalis) and two protostome species (Caenorhabditis elegans and Drosophila melanogaster) have revealed that the absence of expected signal peptides and violation of domain integrity account for the majority of mispredictions. Analyses of sequences predicted by NCBI's GNOMON annotation pipeline show that the rates of mispredictions are comparable to those of EnsEMBL. Interestingly, even the manually curated UniProtKB/Swiss-Prot dataset is contaminated with mispredicted or abnormal proteins, although to a much lesser extent than UniProtKB/TrEMBL or the EnsEMBL or GNOMON-predicted entries. 
MisPred works efficiently in identifying errors in predictions generated by the most reliable gene prediction tools such as the EnsEMBL and NCBI's GNOMON pipelines and also guides the correction of errors. We suggest that application of the MisPred approach will significantly improve the quality of gene predictions and the associated databases.
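The five MisPred routines can be caricatured as simple rule checks over annotated entries. In the sketch below the record fields are hypothetical flags invented for illustration; the real MisPred routines derive these properties from sequence analysis rather than reading pre-computed annotations.

```python
def mispred_flags(entry):
    # apply the five MisPred-style consistency rules to one record;
    # every field name here is a hypothetical stand-in
    flags = []
    if entry.get("predicted_secreted") and not entry.get("has_signal_peptide"):
        flags.append("secreted but no signal peptide")
    if (entry.get("extracellular_domains") and entry.get("cytoplasmic_domains")
            and not entry.get("transmembrane_segments")):
        flags.append("extracellular + cytoplasmic domains without TM segment")
    if entry.get("extracellular_domains") and entry.get("nuclear_domains"):
        flags.append("co-occurring extracellular and nuclear domains")
    if entry.get("truncated_domain"):
        flags.append("violation of domain integrity")
    if len(set(entry.get("source_chromosomes", []))) > 1:
        flags.append("chimera spanning multiple chromosomes")
    return flags
```

An entry that triggers any rule conflicts with current knowledge about protein-coding genes and is a candidate for correction, mirroring the paper's principle.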
78 FR 13338 - Exposure Modeling Public Meeting; Notice of Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-27
... code 22 Professional, Scientific and Technical NAICS code 54 B. How can I get copies of this document... dates and abstract requests are announced through the "empmlist" forum on the LYRIS list server at...
NASA Astrophysics Data System (ADS)
He, Lirong; Cui, Guangmang; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting
2015-03-01
Coded exposure photography makes motion deblurring a well-posed problem. The integration pattern of light is modulated by opening and closing the shutter within the exposure time, changing the traditional shutter frequency spectrum into a wider frequency band in order to preserve more image information in the frequency domain. The method of searching for the optimal code is significant for coded exposure. In this paper, an improved criterion for optimal code searching is proposed by analyzing the relationship between code length and the number of ones in the code, and by considering the effect of noise on code selection with an affine noise model. The optimal code is then obtained using a genetic search algorithm based on the proposed selection criterion. Experimental results show that the time required to search for the optimal code decreases with the presented method. The restored image shows better subjective quality and superior objective evaluation values.
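The idea of preserving a broad frequency band can be illustrated by scoring candidate shutter codes on the magnitude of their DFT. The score below is a generic heuristic and the random search is a stand-in for the paper's genetic algorithm; the exact selection criterion and affine noise model are not reproduced here.

```python
import numpy as np

def code_score(code):
    # invertibility heuristic: a good exposure code keeps its DFT
    # magnitude away from zero and relatively flat (the weighting
    # factor is illustrative, not the paper's criterion)
    mag = np.abs(np.fft.fft(code))
    return mag.min() - 0.1 * mag.var()

def random_search(length=32, ones=16, trials=2000, seed=0):
    # stand-in for the genetic search: sample binary codes with a
    # fixed number of ones and keep the best-scoring one
    rng = np.random.default_rng(seed)
    best, best_score = None, -np.inf
    for _ in range(trials):
        code = np.zeros(length)
        code[rng.choice(length, size=ones, replace=False)] = 1.0
        score = code_score(code)
        if score > best_score:
            best, best_score = code, score
    return best, best_score
```

For comparison, a conventional box shutter (a solid block of ones) has exact zeros in its DFT, which is precisely what makes traditional motion deblurring ill-posed.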
The complete mitochondrial genome of Pholis nebulosus (Perciformes: Pholidae).
Wang, Zhongquan; Qin, Kaili; Liu, Jingxi; Song, Na; Han, Zhiqiang; Gao, Tianxiang
2016-11-01
In this study, the complete mitochondrial genome (mitogenome) sequence of Pholis nebulosus has been determined by long polymerase chain reaction and primer-walking methods. The mitogenome is a circular molecule of 16 524 bp in length, including the typical structure of 13 protein-coding genes, 2 ribosomal RNA genes, 22 transfer RNA genes and 2 non-coding regions (L-strand replication origin and control region), the gene contents of which are identical to those observed in most bony fishes. Within the control region, we identified the termination-associated sequence domain (TAS), and the conserved sequence block domain (CSB-F, CSB-E, CSB-D, CSB-C, CSB-B, CSB-A, CSB-1, CSB-2, CSB-3).
Galerkin-collocation domain decomposition method for arbitrary binary black holes
NASA Astrophysics Data System (ADS)
Barreto, W.; Clemente, P. C. M.; de Oliveira, H. P.; Rodriguez-Mueller, B.
2018-05-01
We present a new computational framework for the Galerkin-collocation method for a double domain in the context of the ADM 3+1 approach in numerical relativity. This work enables us to perform high-resolution calculations for initial data sets of two arbitrary black holes. We use the Bowen-York method for binary systems and the puncture method to solve the Hamiltonian constraint. The nonlinear numerical code solves the set of equations for the spectral modes using the standard Newton-Raphson method, LU decomposition and Gaussian quadratures. We show convergence of our code for the conformal factor and the ADM mass. We also display features of the conformal factor for different masses, spins and linear momenta.
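The nonlinear solve at the core of such spectral codes pairs Newton-Raphson iteration with an LU-based linear solve for the mode coefficients. A generic sketch follows, using a finite-difference Jacobian and a toy residual rather than the Hamiltonian-constraint equations (numpy's solve routine performs the LU factorization internally).

```python
import numpy as np

def newton_raphson(residual, x0, tol=1e-12, max_iter=50):
    # Newton iteration for a nonlinear system residual(x) = 0;
    # the Jacobian is approximated by forward finite differences
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        n, h = len(x), 1e-7
        J = np.empty((n, n))
        for j in range(n):
            xp = x.copy()
            xp[j] += h
            J[:, j] = (residual(xp) - r) / h
        x = x - np.linalg.solve(J, r)  # LU-based linear solve
    return x
```

In the actual code the unknowns are the spectral modes of the conformal factor and the residual is the collocation form of the Hamiltonian constraint; here any smooth nonlinear system serves to demonstrate the iteration.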
Ab initio Study of Transition metal binding to the Prion Protein
NASA Astrophysics Data System (ADS)
Cox, Daniel L.; Singh, Rajiv R. P.; Pan, Jianping
2004-03-01
Fundamental understanding of the prion protein (PrP) is of critical public health importance in view of mad cow and chronic wasting diseases. In recent years, it has been shown that the normal form (PrP^c) binds copper^1), and the structure of the copper binding domain has been elaborated. Hypotheses about toxicity associated with the binding of other metals (notably manganese) have been put forward. Accordingly, using the ab initio SIESTA density functional theory code^2), we calculated the binding energy E_B(M) of M-(PrP) complexes relative to initially uncomplexed M ions, with M = Cu, Ni, Zn, Mn and (PrP)* the minimal binding domain. The binding energy trend is E_B(Ni)>E_B(Cu)>E_B(Zn)>E_B(Mn), consistent with recent experiments apart from the surprising stability of Ni. We will also present preliminary results for the binding of initially complexed M ions. *Supported by U.S. DOE, Office of Basic Energy Sciences, Division of Materials Research. 1) G.S. Jackson et al., Proc. Nat. Acad. Sci. (USA) 98, 8531 (2001). 2) P. Ordejón et al., Phys. Rev. B 53, R10441 (1996); J.M. Soler et al., J. Phys. Cond. Matt. 14, 2745 (2002).
Retained energy-based coding for EEG signals.
Bazán-Prieto, Carlos; Blanco-Velasco, Manuel; Cárdenas-Barrera, Julián; Cruz-Roldán, Fernando
2012-09-01
The recent use of long-term records in electroencephalography is becoming more frequent due to its diagnostic potential and the growth of novel signal processing methods that deal with these types of recordings. In these cases, the considerable volume of data to be managed makes compression necessary to reduce the bit rate for transmission and storage applications. In this paper, a new compression algorithm specifically designed to encode electroencephalographic (EEG) signals is proposed. Cosine-modulated filter banks are used to decompose the EEG signal into a set of subbands well adapted to the characteristic frequency bands of the EEG. Given that no regular pattern can be easily extracted from the signal in the time domain, a thresholding-based method is applied to quantize samples. The retained-energy method is designed to compute the threshold efficiently in the decomposition domain, which, at the same time, allows the quality of the reconstructed EEG to be controlled. The experiments are conducted over a large set of signals taken from two public databases available at PhysioNet, and the results show that the compression scheme yields better compression than other reported methods.
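The retained-energy rule can be sketched directly: choose the smallest magnitude threshold whose surviving coefficients keep a target fraction of the signal energy, then zero out everything below it. This is a minimal illustration of the principle, not the paper's filter-bank codec.

```python
import numpy as np

def energy_threshold(coeffs, retained=0.99):
    # smallest magnitude threshold such that coefficients with
    # |c| >= threshold keep at least `retained` of the total energy
    mags = np.sort(np.abs(coeffs))[::-1]           # descending magnitudes
    cum = np.cumsum(mags ** 2)                     # running retained energy
    idx = np.searchsorted(cum, retained * cum[-1]) # first index meeting target
    return mags[min(idx, len(mags) - 1)]

def threshold_quantize(coeffs, retained=0.99):
    # zero out coefficients below the retained-energy threshold
    t = energy_threshold(coeffs, retained)
    return np.where(np.abs(coeffs) >= t, coeffs, 0.0)
```

Raising the retained-energy target trades compression for reconstruction quality, which is how the scheme lets the quality of the reconstructed EEG be controlled.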
A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data.
Goldstein, Markus; Uchida, Seiichi
2016-01-01
Anomaly detection is the process of identifying unexpected items or events in datasets which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied to unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection and fraud detection, as well as in the life science and medical domains. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides anomaly detection performance, computational effort, the impact of parameter settings, and global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks.
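As a minimal example of the unsupervised setting (not one of the 19 evaluated reference implementations), a global k-nearest-neighbor anomaly score can be computed from pairwise distances alone, with no labels involved:

```python
import numpy as np

def knn_anomaly_scores(X, k=5):
    # global k-NN score: mean distance to the k nearest neighbors;
    # larger scores mark points far from the bulk of the data
    diffs = X[:, None, :] - X[None, :, :]
    dist = np.linalg.norm(diffs, axis=-1)
    np.fill_diagonal(dist, np.inf)           # exclude self-distance
    nearest = np.sort(dist, axis=1)[:, :k]
    return nearest.mean(axis=1)
```

This is a "global" method in the paper's terminology; local methods such as LOF instead compare each point's neighborhood density to that of its neighbors.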
NASA Astrophysics Data System (ADS)
Miki, Nobuhiko; Kishiyama, Yoshihisa; Higuchi, Kenichi; Sawahashi, Mamoru; Nakagawa, Masao
In the Evolved UTRA (UMTS Terrestrial Radio Access) downlink, Orthogonal Frequency Division Multiplexing (OFDM) based radio access was adopted because of its inherent immunity to multipath interference and flexible accommodation of different spectrum arrangements. This paper presents the optimum adaptive modulation and channel coding (AMC) scheme when multiple resource blocks (RBs) are simultaneously assigned to the same user under frequency- and time-domain channel-dependent scheduling in downlink OFDMA radio access with single-antenna transmission. We start by presenting selection methods for the modulation and coding scheme (MCS) employing mutual information, both for RB-common and RB-dependent modulation schemes. Simulation results show that, irrespective of the application of power adaptation to RB-dependent modulation, the improvement in the achievable throughput of the RB-dependent modulation scheme compared to that of the RB-common modulation scheme is slight, i.e., 4 to 5%. In addition, the number of required control signaling bits in the RB-dependent modulation scheme becomes greater than that for the RB-common modulation scheme. Therefore, we conclude that the RB-common modulation and channel coding rate scheme is preferred when multiple RBs of the same coded stream are assigned to one user in the case of single-antenna transmission.
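The RB-common selection idea can be sketched as follows: average a mutual-information proxy across the assigned RBs, then pick the highest MCS whose requirement is met. The MCS table and thresholds below are hypothetical, and the Gaussian capacity formula stands in for the paper's mutual-information computation:

```python
import math

# hypothetical MCS table: (name, required average mutual information
# in bits/symbol); thresholds are illustrative, not 3GPP values
MCS_TABLE = [
    ("QPSK 1/2",  1.0),
    ("16QAM 1/2", 2.0),
    ("16QAM 3/4", 3.0),
    ("64QAM 3/4", 4.5),
]

def select_mcs(rb_snrs_db):
    # RB-common selection: average a capacity-based mutual-information
    # proxy over all assigned RBs, then pick the highest supported MCS
    mi = sum(math.log2(1.0 + 10.0 ** (snr / 10.0)) for snr in rb_snrs_db)
    mi /= len(rb_snrs_db)
    chosen = MCS_TABLE[0][0]
    for name, required in MCS_TABLE:
        if mi >= required:
            chosen = name
    return chosen
```

Because one MCS covers all of a user's RBs, only a single MCS index must be signaled, which is the control-overhead advantage the abstract cites for the RB-common scheme.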
Engineering High Assurance Distributed Cyber Physical Systems
2015-01-15
decisions: number of interacting agents and co-dependent decisions made in real-time without causing interference. To engineer a high assurance DART...environment specification, architecture definition, domain-specific languages, design patterns, code-generation, analysis, test-generation, and simulation...include synchronization between the models and source code, debugging at the model level, expression of the design intent, and quality of service
Coding SNP in tenascin-C Fn-III-D domain associates with adult asthma.
Matsuda, Akira; Hirota, Tomomitsu; Akahoshi, Mitsuteru; Shimizu, Makiko; Tamari, Mayumi; Miyatake, Akihiko; Takahashi, Atsushi; Nakashima, Kazuko; Takahashi, Naomi; Obara, Kazuhiko; Yuyama, Noriko; Doi, Satoru; Kamogawa, Yumiko; Enomoto, Tadao; Ohshima, Koichi; Tsunoda, Tatsuhiko; Miyatake, Shoichiro; Fujita, Kimie; Kusakabe, Moriaki; Izuhara, Kenji; Nakamura, Yusuke; Hopkin, Julian; Shirakawa, Taro
2005-10-01
The extracellular matrix glycoprotein tenascin-C (TNC) has been accepted as a valuable histopathological subepithelial marker for evaluating the severity of asthmatic disease and the therapeutic response to drugs. We found an association between adult asthma and an SNP encoding the TNC fibronectin type III-D (Fn-III-D) domain in a case-control study of a Japanese population including 446 adult asthmatic patients and 658 normal healthy controls. The SNP (44513A/T in exon 17) strongly associates with adult bronchial asthma (chi-squared test, P=0.00019, odds ratio=1.76, 95% confidence interval=1.31-2.36). This coding SNP induces an amino acid substitution (Leu1677Ile) within the Fn-III-D domain of the alternative splicing region. Computer-assisted protein structure modeling suggests that the substituted amino acid is located at the outer edge of the beta-sheet in the Fn-III-D domain and destabilizes this beta-sheet. As the TNC fibronectin-III domain has molecular elasticity, the structural change may affect the integrity and stiffness of asthmatic airways. In addition, TNC expression in lung fibroblasts increases with Th2 immune cytokine stimulation. Thus, Leu1677Ile may be a valuable marker for evaluating the risk of developing asthma and may play a role in its pathogenesis.
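Effect sizes of the kind reported above (odds ratio 1.76, 95% CI 1.31-2.36) follow from the standard 2x2-table calculation with a log-scale Wald interval. A sketch with illustrative counts, not the study's genotype data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # 2x2 table: a = exposed cases, b = unexposed cases,
    # c = exposed controls, d = unexposed controls
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of the log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

A 95% interval that excludes 1, as in the study, indicates a statistically significant association between carrier status and disease.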
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borziak, Kirill; Jouline, Igor B
2007-01-01
Motivation: Sensory domains that are conserved among Bacteria, Archaea and Eucarya are important detectors of common signals detected by living cells. Due to their high sequence divergence, sensory domains are difficult to identify. We systematically look for novel sensory domains using sensitive profile-based searches initiated with regions of signal transduction proteins where no known domains can be identified by current domain models. Results: Using profile searches followed by multiple sequence alignment, structure prediction, and domain architecture analysis, we have identified a novel sensory domain termed FIST, which is present in signal transduction proteins from Bacteria, Archaea and Eucarya. Remote similarity to a known ligand-binding fold and chromosomal proximity of FIST-encoding genes to those coding for proteins involved in amino acid metabolism and transport suggest that FIST domains bind small ligands, such as amino acids.
Varieties of Musical Experience
ERIC Educational Resources Information Center
Bharucha, J. Jamshed; Curtis, Meagan; Paroo, Kaivon
2006-01-01
In this paper, we argue that music cognition involves the use of acoustic and auditory codes to evoke a variety of conscious experiences. The variety of domains that are encompassed by music is so diverse that it is unclear whether a single domain of structure or experience is defining. Music is best understood as a form of communication in which…
Reactor Dosimetry Applications Using RAPTOR-M3G:. a New Parallel 3-D Radiation Transport Code
NASA Astrophysics Data System (ADS)
Longoni, Gianluca; Anderson, Stanwood L.
2009-08-01
The numerical solution of the Linearized Boltzmann Equation (LBE) via the Discrete Ordinates (SN) method requires extensive computational resources for large 3-D neutron and gamma transport applications, due to the concurrent discretization of the angular, spatial, and energy domains. This paper discusses the development of RAPTOR-M3G (RApid Parallel Transport Of Radiation - Multiple 3D Geometries), a new 3-D parallel radiation transport code, and its application to the calculation of ex-vessel neutron dosimetry responses in the cavity of a commercial 2-loop Pressurized Water Reactor (PWR). RAPTOR-M3G is based on domain decomposition algorithms, where the spatial and angular domains are allocated and processed on multi-processor computer architectures. As compared to traditional single-processor applications, this approach reduces the computational load as well as the memory requirement per processor, yielding an efficient solution methodology for large 3-D problems. Measured neutron dosimetry responses in the reactor cavity air gap will be compared to the RAPTOR-M3G predictions. This paper is organized as follows: Section 1 discusses the RAPTOR-M3G methodology; Section 2 describes the 2-loop PWR model and the numerical results obtained; Section 3 addresses the parallel performance of the code; and Section 4 concludes with final remarks and future work.
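The spatial side of such a domain decomposition can be illustrated with a toy partitioner that assigns contiguous slabs of mesh cells to processors; this sketches the general load- and memory-splitting idea, not RAPTOR-M3G's actual allocation scheme.

```python
def decompose_1d(n_cells, n_parts):
    # split one spatial axis into contiguous, near-equal slabs,
    # one per processor; leftover cells go to the lowest ranks
    base, extra = divmod(n_cells, n_parts)
    bounds, start = [], 0
    for rank in range(n_parts):
        size = base + (1 if rank < extra else 0)
        bounds.append((start, start + size))
        start += size
    return bounds
```

Applying the same splitting along each axis (and over the discrete-ordinate directions) gives each processor a fraction of both the work and the memory footprint, which is the efficiency argument made in the abstract.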
NASA,FAA,ONERA Swept-Wing Icing and Aerodynamics: Summary of Research and Current Status
NASA Technical Reports Server (NTRS)
Broeren, Andy
2015-01-01
NASA, FAA, ONERA, and other partner organizations have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large scale, three-dimensional swept wings. These are extremely complex phenomena important to the design, certification and safe operation of small and large transport aircraft. There is increasing demand to balance trade-offs in aircraft efficiency, cost and noise that tend to compete directly with allowable performance degradations over an increasing range of icing conditions. Computational fluid dynamics codes have reached a level of maturity that they are being proposed by manufacturers for use in certification of aircraft for flight in icing. However, sufficient high-quality data to evaluate their performance on iced swept wings are not currently available in the public domain and significant knowledge gaps remain.
NASA Astrophysics Data System (ADS)
Basu, Sukanta; Nunalee, Christopher G.; He, Ping; Fiorino, Steven T.; Vorontsov, Mikhail A.
2014-10-01
In this paper, we reconstruct the meteorological and optical environment at the time of the Titanic disaster utilizing a state-of-the-art meteorological model, a ray-tracing code, and a unique public-domain dataset called the Twentieth Century Global Reanalysis. With high fidelity, our simulation captured the occurrence of an unusually strong Arctic high-pressure system over the disaster site with calm wind. It also reproduced the movement of a polar cold front through the region, bringing a rapid drop in air temperature. The simulated results also suggest that unusual meteorological conditions persisted for several hours prior to the Titanic disaster, which contributed to super-refraction and intermittent optical turbulence. However, according to the simulations, such anomalous conditions were not present at the time of Titanic's collision with an iceberg.
Klein, Max; Sharma, Rati; Bohrer, Chris H; Avelis, Cameron M; Roberts, Elijah
2017-01-15
Data-parallel programming techniques can dramatically decrease the time needed to analyze large datasets. While these methods have provided significant improvements for sequencing-based analyses, other areas of biological informatics have not yet adopted them. Here, we introduce Biospark, a new framework for performing data-parallel analysis on large numerical datasets. Biospark builds upon the open source Hadoop and Spark projects, bringing domain-specific features for biology. Source code is licensed under the Apache 2.0 open source license and is available at the project website: https://www.assembla.com/spaces/roberts-lab-public/wiki/Biospark. Contact: eroberts@jhu.edu. Supplementary information: Supplementary data are available at Bioinformatics online.
Oregon Magnetic and Gravity Maps and Data: A Web Site for Distribution of Data
Roberts, Carter W.; Kucks, Robert P.; Hill, Patricia L.
2008-01-01
This web site gives the results of a USGS project to acquire the best available public-domain aeromagnetic and gravity data in the United States and merge these data into uniform, composite grids for each State. The results for the State of Oregon are presented here. Files of the aeromagnetic and gravity grids and images are available for downloading. In Oregon, 49 magnetic surveys have been knit together to form a single digital grid and map. Also, a complete Bouguer gravity anomaly grid and map were generated from 40,665 gravity station measurements in and adjacent to Oregon. In addition, a map shows the locations of the aeromagnetic surveys, color-coded by survey flight-line spacing. This project was supported by the Mineral Resources Program of the USGS.
Rodríguez-Salvador, Marisela; Rio-Belver, Rosa María; Garechana-Anacabe, Gaizka
2017-01-01
This research proposes an innovative data model to determine the landscape of emerging technologies. It is based on a competitive technology intelligence methodology that incorporates the assessment of scientific publications and patent analysis production, and is further supported by experts’ feedback. It enables the definition of the growth rate of scientific and technological output in terms of the top countries, institutions and journals producing knowledge within the field as well as the identification of main areas of research and development by analyzing the International Patent Classification codes including keyword clusterization and co-occurrence of patent assignees and patent codes. This model was applied to the evolving domain of 3D bioprinting. Scientific documents from the Scopus and Web of Science databases, along with patents from 27 authorities and 140 countries, were retrieved. In total, 4782 scientific publications and 706 patents were identified from 2000 to mid-2016. The number of scientific documents published and patents in the last five years showed an annual average growth of 20% and 40%, respectively. Results indicate that the most prolific nations and institutions publishing on 3D bioprinting are the USA and China, including the Massachusetts Institute of Technology (USA), Nanyang Technological University (Singapore) and Tsinghua University (China), respectively. Biomaterials and Biofabrication are the predominant journals. The most prolific patenting countries are China and the USA, while Organovo Holdings Inc. (USA) and Tsinghua University (China) are the leading institutions. International Patent Classification codes reveal that most 3D bioprinting inventions intended for medical purposes apply porous or cellular materials or biologically active materials. Knowledge clusters and expert drivers indicate that there is a research focus on tissue engineering including the fabrication of organs, bioinks and new 3D bioprinting systems.
Our model offers a guide to researchers to understand the knowledge production of pioneering technologies, in this case 3D bioprinting. PMID:28662187
Rodríguez-Salvador, Marisela; Rio-Belver, Rosa María; Garechana-Anacabe, Gaizka
2017-01-01
This research proposes an innovative data model to determine the landscape of emerging technologies. It is based on a competitive technology intelligence methodology that incorporates the assessment of scientific publications and patent analysis production, and is further supported by experts' feedback. It enables the definition of the growth rate of scientific and technological output in terms of the top countries, institutions and journals producing knowledge within the field as well as the identification of main areas of research and development by analyzing the International Patent Classification codes including keyword clusterization and co-occurrence of patent assignees and patent codes. This model was applied to the evolving domain of 3D bioprinting. Scientific documents from the Scopus and Web of Science databases, along with patents from 27 authorities and 140 countries, were retrieved. In total, 4782 scientific publications and 706 patents were identified from 2000 to mid-2016. The number of scientific documents published and patents in the last five years showed an annual average growth of 20% and 40%, respectively. Results indicate that the most prolific nations and institutions publishing on 3D bioprinting are the USA and China, including the Massachusetts Institute of Technology (USA), Nanyang Technological University (Singapore) and Tsinghua University (China), respectively. Biomaterials and Biofabrication are the predominant journals. The most prolific patenting countries are China and the USA; while Organovo Holdings Inc. (USA) and Tsinghua University (China) are the institutions leading. International Patent Classification codes reveal that most 3D bioprinting inventions intended for medical purposes apply porous or cellular materials or biologically active materials. Knowledge clusters and expert drivers indicate that there is a research focus on tissue engineering including the fabrication of organs, bioinks and new 3D bioprinting systems. 
Our model offers a guide to researchers to understand the knowledge production of pioneering technologies, in this case 3D bioprinting.
Hedegaard, Holly; Schoenbaum, Michael; Claassen, Cynthia; Crosby, Alex; Holland, Kristin; Proescholdbell, Scott
2018-02-01
Suicide and intentional self-harm are among the leading causes of death in the United States. To study this public health issue, epidemiologists and researchers often analyze data coded using the International Classification of Diseases (ICD). Prior to October 1, 2015, health care organizations and providers used the clinical modification of the Ninth Revision of ICD (ICD-9-CM) to report medical information in electronic claims data. The transition in October 2015 to use of the clinical modification of the Tenth Revision of ICD (ICD-10-CM) resulted in the need to update methods and selection criteria previously developed for ICD-9-CM coded data. This report provides guidance on the use of ICD-10-CM codes to identify cases of nonfatal suicide attempts and intentional self-harm in ICD-10-CM coded data sets. ICD-10-CM codes for nonfatal suicide attempts and intentional self-harm include: X71-X83, intentional self-harm due to drowning and submersion, firearms, explosive or thermal material, sharp or blunt objects, jumping from a high place, jumping or lying in front of a moving object, crashing of motor vehicle, and other specified means; T36-T50 with a 6th character of 2 (except for T36.9, T37.9, T39.9, T41.4, T42.7, T43.9, T45.9, T47.9, and T49.9, which are included if the 5th character is 2), intentional self-harm due to drug poisoning (overdose); T51-T65 with a 6th character of 2 (except for T51.9, T52.9, T53.9, T54.9, T56.9, T57.9, T58.0, T58.1, T58.9, T59.9, T60.9, T61.0, T61.1, T61.9, T62.9, T63.9, T64.0, T64.8, and T65.9, which are included if the 5th character is 2), intentional self-harm due to toxic effects of nonmedicinal substances; T71 with a 6th character of 2, intentional self-harm due to asphyxiation, suffocation, strangulation; and T14.91, Suicide attempt. Issues to consider when selecting records for nonfatal suicide attempts and intentional self-harm from ICD-10-CM coded administrative data sets are also discussed. 
All material appearing in this report is in the public domain and may be reproduced or copied without permission; citation as to source, however, is appreciated.
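The selection criteria above can be expressed as a small screening function. The sketch below is illustrative only: the exception lists for the poisoning categories (T36.9, T51.9, etc., which require a 5th character of 2) are abbreviated, so a production implementation would need the complete lists from the report.

```python
def is_self_harm(code):
    """Return True if an ICD-10-CM code indicates a nonfatal suicide
    attempt or intentional self-harm, per the report's main criteria.

    Simplified sketch: the exception lists for 4-character poisoning
    categories are omitted, so borderline codes may be misclassified.
    """
    c = code.replace(".", "").upper()
    # X71-X83: self-harm by drowning, firearms, sharp objects, jumping, etc.
    if c[0] == "X" and c[1:3].isdigit() and 71 <= int(c[1:3]) <= 83:
        return True
    # T14.91: suicide attempt, unspecified means
    if c.startswith("T1491"):
        return True
    # T71 with 6th character 2: asphyxiation, suffocation, strangulation
    if c.startswith("T71"):
        return len(c) >= 6 and c[5] == "2"
    # T36-T50 (drug poisoning) and T51-T65 (toxic effects of nonmedicinal
    # substances) with 6th character 2
    if c[0] == "T" and c[1:3].isdigit() and 36 <= int(c[1:3]) <= 65:
        return len(c) >= 6 and c[5] == "2"
    return False
```

Applied to, say, `T39.012A` (poisoning by aspirin, intentional self-harm, initial encounter), the 6th character of the undotted code is 2 and the function returns True; the accidental-intent sibling `T39.011A` returns False.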
CPIC: a curvilinear Particle-In-Cell code for plasma-material interaction studies
NASA Astrophysics Data System (ADS)
Delzanno, G.; Camporeale, E.; Moulton, J. D.; Borovsky, J. E.; MacDonald, E.; Thomsen, M. F.
2012-12-01
We present a recently developed Particle-In-Cell (PIC) code in curvilinear geometry called CPIC (Curvilinear PIC) [1], where the standard PIC algorithm is coupled with a grid generation/adaptation strategy. Through the grid generator, which maps the physical domain to a logical domain where the grid is uniform and Cartesian, the code can simulate domains of arbitrary complexity, including the interaction of complex objects with a plasma. At present the code is electrostatic. Poisson's equation (in logical space) can be solved either iteratively, with the Conjugate Gradient (CG) or Generalized Minimal Residual (GMRES) method coupled with a multigrid preconditioner, or directly with multigrid. The multigrid strategy is critical for the solver to perform optimally or nearly optimally as the dimension of the problem increases. CPIC also features a hybrid particle mover, where the computational particles are characterized by position in logical space and velocity in physical space. The advantage of a hybrid mover, as opposed to more conventional movers that move particles directly in physical space, is that the interpolation of the particles in logical space is straightforward and computationally inexpensive, since one does not have to track the position of the particle. We will present our latest progress on the development of the code and document the code performance on standard plasma-physics tests. Then we will present the (preliminary) application of the code to a basic dynamic-charging problem, namely the charging and shielding of a spherical spacecraft in a magnetized plasma for various levels of magnetization, including the pulsed emission of an electron beam from the spacecraft. The dynamical evolution of the sheath and the time-dependent current collection will be described.
This study is in support of the ConnEx mission concept to use an electron beam from a magnetospheric spacecraft to trace magnetic field lines from the magnetosphere to the ionosphere [2]. [1] G.L. Delzanno, E. Camporeale, "CPIC: a new Particle-in-Cell code for plasma-material interaction studies", in preparation (2012). [2] J.E. Borovsky, D.J. McComas, M.F. Thomsen, J.L. Burch, J. Cravens, C.J. Pollock, T.E. Moore, and S.B. Mende, "Magnetosphere-Ionosphere Observatory (MIO): A multisatellite mission designed to solve the problem of what generates auroral arcs," Eos. Trans. Amer. Geophys. Union 79 (45), F744 (2000).
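The hybrid-mover idea can be illustrated with a toy one-dimensional sketch: the particle position lives in logical space, its velocity in physical space, so the push divides the velocity by the mapping Jacobian dx/dξ. The function and mapping below are invented for illustration; CPIC's actual 3D mover is considerably more involved.

```python
def hybrid_push(xi, v, dt, dx_dxi):
    """Toy 1D hybrid particle push.

    xi      -- particle position in logical space
    v       -- particle velocity in physical space
    dt      -- time step
    dx_dxi  -- callable giving the mapping Jacobian dx/dxi at xi

    Since x = f(xi), d(xi)/dt = v / (dx/dxi): advancing in logical
    space makes locating the particle on the (uniform, Cartesian)
    logical grid trivial.
    """
    return xi + dt * v / dx_dxi(xi)
```

For a uniform stretch x = 2ξ (so dx/dξ = 2), a physical velocity of 2 corresponds to a logical velocity of 1, and a step of dt = 0.5 moves the particle from ξ = 0 to ξ = 0.5.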
NASA Astrophysics Data System (ADS)
Fabien-Ouellet, Gabriel; Gloaguen, Erwan; Giroux, Bernard
2017-03-01
Full Waveform Inversion (FWI) aims at recovering the elastic parameters of the Earth by matching recordings of the ground motion with the direct solution of the wave equation. Modeling the wave propagation for realistic scenarios is computationally intensive, which limits the applicability of FWI. The current hardware evolution brings increasing parallel computing power that can speed up the computations in FWI. However, to take advantage of the diversity of parallel architectures presently available, new programming approaches are required. In this work, we explore the use of OpenCL to develop a portable code that can take advantage of the many parallel processor architectures now available. We present a program called SeisCL for 2D and 3D viscoelastic FWI in the time domain. The code computes the forward and adjoint wavefields using finite differences and outputs the gradient of the misfit function given by the adjoint state method. To demonstrate the code portability on different architectures, the performance of SeisCL is tested on three different devices: Intel CPUs, NVidia GPUs and Intel Xeon Phi. Results show that the use of GPUs with OpenCL can speed up the computations by nearly two orders of magnitude over a single-threaded application on the CPU. Although OpenCL allows code portability, we show that some device-specific optimization is still required to get the best performance out of a specific architecture. Using OpenCL in conjunction with MPI allows the domain decomposition of large models on several devices located on different nodes of a cluster. For large enough models, the speedup of the domain decomposition varies quasi-linearly with the number of devices. Finally, we investigate two different approaches to compute the gradient by the adjoint state method and show the significant advantages of using OpenCL for FWI.
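The finite-difference kernels that such a code ports to OpenCL reduce, in the simplest 1D acoustic case, to an explicit three-level time update. This toy NumPy version is a stand-in for the structure only, not SeisCL's actual viscoelastic kernel:

```python
import numpy as np

def fd_step(p_prev, p_curr, c, dt, dx):
    """One explicit step of the 1D acoustic wave equation
    p_tt = c^2 p_xx, second order in time and space.

    p_prev, p_curr -- pressure fields at the two previous time levels
    c, dt, dx      -- wave speed, time step, grid spacing
    Boundary points are held fixed (Laplacian set to zero there).
    """
    lap = np.zeros_like(p_curr)
    lap[1:-1] = (p_curr[2:] - 2.0 * p_curr[1:-1] + p_curr[:-2]) / dx**2
    return 2.0 * p_curr - p_prev + (c * dt) ** 2 * lap
```

Each grid point depends only on its neighbors at the previous two levels, which is why the update maps naturally onto one GPU work-item per point.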
Alternative Fuels Data Center: Federal Laws and Incentives for Electricity
Schroedinger’s code: Source code availability and transparency in astrophysics
NASA Astrophysics Data System (ADS)
Ryan, PW; Allen, Alice; Teuben, Peter
2018-01-01
Astronomers use software for their research, but how many of the codes they use are available as source code? We examined a sample of 166 papers from 2015 for clearly identified software use, then searched for source code for the software packages mentioned in these research papers. We categorized the software to indicate whether source code is available for download and whether there are restrictions to accessing it, and if source code was not available, whether some other form of the software, such as a binary, was. Over 40% of the source code for the software used in our sample was not available for download. As URLs have often been used as proxy citations for software, we also extracted URLs from one journal's 2015 research articles, removed those from certain long-term, reliable domains, and tested the remainder to determine what percentage of these URLs were still accessible in September and October 2017.
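A link-rot check of this kind can be scripted in a few lines of standard-library Python. The domain list below is a placeholder, not the authors' actual exclusion set, and `is_accessible` is a generic HEAD-request probe rather than the study's exact methodology:

```python
import urllib.error
import urllib.parse
import urllib.request

# Placeholder list of long-term, reliable domains to exclude from testing;
# the study's actual exclusion set is not reproduced here.
STABLE_DOMAINS = ("doi.org", "arxiv.org", "adsabs.harvard.edu")

def is_stable_domain(url):
    """True if the URL lives on (a subdomain of) an excluded domain."""
    host = urllib.parse.urlparse(url).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in STABLE_DOMAINS)

def is_accessible(url, timeout=10):
    """HEAD-request the URL and report whether it still responds."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except (urllib.error.URLError, ValueError):
        return False
```

Filtering the extracted URLs with `is_stable_domain` before probing the remainder with `is_accessible` mirrors the two-step procedure described in the abstract.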
ELEFANT: a user-friendly multipurpose geodynamics code
NASA Astrophysics Data System (ADS)
Thieulot, C.
2014-07-01
A new finite element code for the solution of the Stokes and heat transport equations is presented. It has purposely been designed to address geological flow problems in two and three dimensions at crustal and lithospheric scales. The code relies on the Marker-in-Cell technique and Lagrangian markers are used to track materials in the simulation domain, which allows recording of the integrated history of deformation; their (number) density is variable and dynamically adapted. A variety of rheologies has been implemented, including nonlinear thermally activated dislocation and diffusion creep and brittle (or plastic) frictional models. The code is built on the Arbitrary Lagrangian Eulerian kinematic description: the computational grid deforms vertically and allows for a true free surface while the computational domain remains of constant width in the horizontal direction. The solution to the large system of algebraic equations resulting from the finite element discretisation and linearisation of the set of coupled partial differential equations to be solved is obtained by means of the efficient parallel direct solver MUMPS, whose performance is thoroughly tested, or by means of the WISMP and AGMG iterative solvers. The code accuracy is assessed by means of many geodynamically relevant benchmark experiments which highlight specific features or algorithms, e.g., the implementation of the free surface stabilisation algorithm, the (visco-)plastic rheology implementation, the temperature advection, and the capacity of the code to handle large viscosity contrasts. A two-dimensional application to salt tectonics presented as a case study illustrates the potential of the code to model large-scale, high-resolution thermo-mechanically coupled free surface flows.
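Marker advection in a Marker-in-Cell code amounts to integrating each Lagrangian marker through the computed velocity field. A minimal midpoint (RK2) sketch is below; ELEFANT's actual update scheme may differ, and the velocity callable is an assumption of this illustration:

```python
def advect_markers(xm, ym, velocity, dt):
    """Advance Lagrangian marker coordinates (xm, ym) one time step
    through the velocity field using a midpoint (RK2) scheme.

    velocity(x, y) must return the velocity components (u, v) at (x, y),
    e.g. by interpolation from the finite element solution.
    """
    u1, v1 = velocity(xm, ym)                                   # trial step
    u2, v2 = velocity(xm + 0.5 * dt * u1, ym + 0.5 * dt * v1)   # midpoint
    return xm + dt * u2, ym + dt * v2
```

For a solid-body rotation field, the scheme keeps a marker's radius nearly constant over a small step, which is the basic fidelity one wants from marker advection.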
Reuse: A knowledge-based approach
NASA Technical Reports Server (NTRS)
Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui
1992-01-01
This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper describes in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.
ATHENA 3D: A finite element code for ultrasonic wave propagation
NASA Astrophysics Data System (ADS)
Rose, C.; Rupin, F.; Fouquet, T.; Chassignole, B.
2014-04-01
The understanding of wave propagation phenomena requires use of robust numerical models. 3D finite element (FE) models are generally prohibitively time-consuming. However, advances in computing processor speed and memory allow them to be more and more competitive. In this context, EDF R&D developed the 3D version of the well-validated FE code ATHENA2D. The code is dedicated to the simulation of wave propagation in all kinds of elastic media and in particular, heterogeneous and anisotropic materials like welds. It is based on solving elastodynamic equations in the calculation zone expressed in terms of stress and particle velocities. The particularity of the code relies on the fact that the discretization of the calculation domain uses a Cartesian regular 3D mesh, while defects of complex geometry can be described using a separate (2D) mesh using the fictitious domains method. This allows combining the rapidity of regular-mesh computation with the capability of modelling arbitrarily shaped defects. Furthermore, the calculation domain is discretized with a quasi-explicit time evolution scheme. Thereby only local linear systems of small size have to be solved. The final step to reduce the computation time relies on the fact that ATHENA3D has been parallelized and adapted to the use of HPC resources. In this paper, the validation of the 3D FE model is discussed. A cross-validation of ATHENA 3D and CIVA is proposed for several inspection configurations. The performances in terms of calculation time are also presented in the cases of both local computer and computation cluster use.
Mutational analysis in a patient with a variant form of Gaucher disease caused by SAP-2 deficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rafi, M.A.; Gala, G. de; Xunling Zhang
1993-01-01
It is now clear that the lysosomal hydrolysis of sphingolipids requires both lysosomal enzymes and so-called sphingolipid activator proteins (SAPs). One gene, called prosaposin, codes for a precursor protein that is proteolytically cut into four putative SAPs. These four SAPs, of about 80 amino acids, share some structural features but differ somewhat in their specificity. Domain 3 of prosaposin mRNA contains the coding region for SAP-2, an activator of glucocerebrosidase. While most patients with Gaucher disease store glucosylceramide due to defects in glucocerebrosidase, a few patients store this lipid in the presence of normal enzyme levels. In this paper the authors describe the identification of a point mutation in domain 3 of a patient who died with this variant form of Gaucher disease. Polymerase chain reaction amplification was performed on the small amount of genomic DNA available using primers generated from the intronic sequence surrounding domain 3. The patient was found to have a T-to-G substitution at position 1144 (counting from the A of the ATG initiation codon) in half of the M13 recombinant clones. This changes the codon for cysteine-382 to glycine. His father and unaffected brother also had this mutation, but his mother did not. She was found to have half of the normal amount of mRNA for prosaposin in her cultured skin fibroblasts. Therefore, this child inherited a point mutation in domain 3 from his father and a deficiency of all four SAPs coded for by prosaposin from his mother. 29 refs., 3 figs., 1 tab.
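The position-to-codon arithmetic in the abstract checks out and is quick to verify. The sketch below assumes the TGT spelling of the cysteine codon; the TGC spelling would mutate to GGC, which is also glycine, so the amino acid conclusion holds either way:

```python
pos = 1144                         # coding position of the T-to-G change
codon_number = (pos - 1) // 3 + 1  # 1-based codon index
offset = (pos - 1) % 3             # 0 means the first base of that codon

# Relevant entries of the standard genetic code
codon_table = {"TGT": "Cys", "TGC": "Cys", "GGT": "Gly", "GGC": "Gly"}

wild_type = "TGT"                  # assumed spelling of the Cys codon
mutant = "G" + wild_type[1:]       # T-to-G at the codon's first base

print(codon_number, codon_table[wild_type], "->", codon_table[mutant])
# position 1144 falls on the first base of codon 382: Cys -> Gly
```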
41 CFR 105-72.502 - Codes of conduct.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Codes of conduct. 105-72.502 Section 105-72.502 Public Contracts and Property Management Federal Property Management Regulations System (Continued) GENERAL SERVICES ADMINISTRATION Regional Offices-General Services...
48 CFR 1601.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Publication and code arrangement. 1601.104-1 Section 1601.104-1 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS...
48 CFR 1601.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Publication and code arrangement. 1601.104-1 Section 1601.104-1 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS...
48 CFR 1601.104-1 - Publication and code arrangement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Publication and code arrangement. 1601.104-1 Section 1601.104-1 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS...
48 CFR 501.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Publication and code arrangement. 501.105-1 Section 501.105-1 Federal Acquisition Regulations System GENERAL SERVICES ADMINISTRATION GENERAL GENERAL SERVICES ADMINISTRATION ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance...
Grid Generation Techniques Utilizing the Volume Grid Manipulator
NASA Technical Reports Server (NTRS)
Alter, Stephen J.
1998-01-01
This paper presents grid generation techniques available in the Volume Grid Manipulation (VGM) code. The VGM code is designed to manipulate existing line, surface and volume grids to improve the quality of the data. It embodies an easy-to-read, rich language of commands that enables such alterations as topology changes, grid adaption and smoothing. Additionally, the VGM code can be used to construct simplified straight lines, splines, and conic sections, which are common curves used in the generation and manipulation of points, lines, surfaces and volumes (i.e., grid data). These simple geometric curves are essential in the construction of domain discretizations for computational fluid dynamic simulations. By comparison to previously established methods of generating these curves interactively, the VGM code provides control of slope continuity and grid point-to-point stretchings as well as quick changes in the controlling parameters. The VGM code offers the capability to couple the generation of these geometries with an extensive manipulation methodology in a scripting language. The scripting language allows parametric studies of a vehicle geometry to be efficiently performed to evaluate favorable trends in the design process. As examples of the powerful capabilities of the VGM code, a wake flow field domain will be appended to an existing X33 Venturestar volume grid; negative volumes resulting from grid expansions to enable flow field capture on a simple geometry will be corrected; and geometrical changes to a vehicle component of the X33 Venturestar will be shown.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MORIDIS, GEORGE
2016-05-02
MeshMaker v1.5 is a code that describes the system geometry and discretizes the domain in problems of flow and transport through porous and fractured media that are simulated using the TOUGH+ [Moridis and Pruess, 2014] or TOUGH2 [Pruess et al., 1999; 2012] families of codes. It is a significantly modified and drastically enhanced version of an earlier simpler facility that was embedded in the TOUGH2 codes [Pruess et al., 1999; 2012], from which it could not be separated. The code (MeshMaker.f90) is a stand-alone product written in FORTRAN 95/2003, is written according to the tenets of Object-Oriented Programming, has a modular structure and can perform a number of mesh generation and processing operations. It can generate two-dimensional radially symmetric (r,z) meshes, and one-, two-, and three-dimensional rectilinear (Cartesian) grids in (x,y,z). The code generates the file MESH, which includes all the elements and connections that describe the discretized simulation domain, conforming to the requirements of the TOUGH+ and TOUGH2 codes. Multiple-porosity processing for simulation of flow in naturally fractured reservoirs can be invoked by means of the keyword MINC, which stands for Multiple INteracting Continua. The MINC process operates on the data of the primary (porous medium) mesh as provided on disk file MESH, and generates a secondary mesh containing fracture and matrix elements with identical data formats on file MINC.
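The elements-and-connections structure of such a mesh is easy to sketch. The toy generator below mimics the idea for a rectilinear grid; it does not reproduce the actual MESH file format of TOUGH+/TOUGH2, and the cell indexing is an invention of this example:

```python
def cartesian_mesh(nx, ny, nz):
    """Enumerate the elements (cells) and face connections of an
    nx x ny x nz rectilinear grid, in the spirit of a TOUGH-style
    MESH description (elements plus element-to-element connections).
    """
    elements = [(i, j, k)
                for i in range(nx) for j in range(ny) for k in range(nz)]
    connections = []
    for i, j, k in elements:
        # Connect each cell to its +x, +y, +z neighbors (each shared
        # face is listed exactly once).
        if i + 1 < nx:
            connections.append(((i, j, k), (i + 1, j, k)))
        if j + 1 < ny:
            connections.append(((i, j, k), (i, j + 1, k)))
        if k + 1 < nz:
            connections.append(((i, j, k), (i, j, k + 1)))
    return elements, connections
```

A 2 x 2 x 2 grid yields 8 elements and 12 interior face connections, matching the count one gets by hand (four shared faces per axis).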
QSL Squasher: A Fast Quasi-separatrix Layer Map Calculator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tassev, Svetlin; Savcheva, Antonia, E-mail: svetlin.tassev@cfa.harvard.edu
Quasi-Separatrix Layers (QSLs) are a useful proxy for the locations where current sheets can develop in the solar corona, and give valuable information about the connectivity in complicated magnetic field configurations. However, calculating QSL maps, even for two-dimensional slices through three-dimensional models of coronal magnetic fields, is a non-trivial task, as it usually involves tracing out millions of magnetic field lines with immense precision. Thus, extending QSL calculations to three dimensions has rarely been done until now. In order to address this challenge, we present QSL Squasher—a public, open-source code, which is optimized for calculating QSL maps in both two and three dimensions on graphics processing units. The code achieves large processing speeds for three reasons, each of which results in an order-of-magnitude speed-up. (1) The code is parallelized using OpenCL. (2) The precision requirements for the QSL calculation are drastically reduced by using perturbation theory. (3) A new boundary detection criterion between quasi-connectivity domains is used, which quickly identifies possible QSL locations that need to be finely sampled by the code. That boundary detection criterion relies on finding the locations of abrupt field-line length changes, which we do by introducing a new Field-line Length Edge (FLEDGE) map. We find FLEDGE maps useful on their own as a quick-and-dirty substitute for QSL maps. QSL Squasher allows construction of high-resolution 3D FLEDGE maps in a matter of minutes, which is two orders of magnitude faster than calculating the corresponding 3D QSL maps. We include a sample of calculations done using QSL Squasher to demonstrate its capabilities as a QSL calculator, as well as to compare QSL and FLEDGE maps.
QSL Squasher: A Fast Quasi-separatrix Layer Map Calculator
NASA Astrophysics Data System (ADS)
Tassev, Svetlin; Savcheva, Antonia
2017-05-01
Quasi-Separatrix Layers (QSLs) are a useful proxy for the locations where current sheets can develop in the solar corona, and give valuable information about the connectivity in complicated magnetic field configurations. However, calculating QSL maps, even for two-dimensional slices through three-dimensional models of coronal magnetic fields, is a non-trivial task, as it usually involves tracing out millions of magnetic field lines with immense precision. Thus, extending QSL calculations to three dimensions has rarely been done until now. In order to address this challenge, we present QSL Squasher—a public, open-source code, which is optimized for calculating QSL maps in both two and three dimensions on graphics processing units. The code achieves large processing speeds for three reasons, each of which results in an order-of-magnitude speed-up. (1) The code is parallelized using OpenCL. (2) The precision requirements for the QSL calculation are drastically reduced by using perturbation theory. (3) A new boundary detection criterion between quasi-connectivity domains is used, which quickly identifies possible QSL locations that need to be finely sampled by the code. That boundary detection criterion relies on finding the locations of abrupt field-line length changes, which we do by introducing a new Field-line Length Edge (FLEDGE) map. We find FLEDGE maps useful on their own as a quick-and-dirty substitute for QSL maps. QSL Squasher allows construction of high-resolution 3D FLEDGE maps in a matter of minutes, which is two orders of magnitude faster than calculating the corresponding 3D QSL maps. We include a sample of calculations done using QSL Squasher to demonstrate its capabilities as a QSL calculator, as well as to compare QSL and FLEDGE maps.
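The FLEDGE idea, flagging abrupt changes in field-line length, can be sketched with a simple gradient filter over a precomputed length map. This is a minimal stand-in for intuition, not QSL Squasher's actual boundary detection criterion:

```python
import numpy as np

def fledge_map(L):
    """Toy Field-line Length Edge (FLEDGE) map from a 2D array L of
    field-line lengths: large gradients of log-length flag candidate
    boundaries between quasi-connectivity domains.
    """
    gx, gy = np.gradient(np.log(L))
    return np.hypot(gx, gy)
```

On a synthetic length map with a sharp jump between two connectivity domains, the edge signal peaks exactly at the jump and vanishes in the smooth interiors, which is the behavior that lets such a map cheaply locate regions worth sampling finely.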
Willems, Patrick; Ndah, Elvis; Jonckheere, Veronique; Stael, Simon; Sticker, Adriaan; Martens, Lennart; Van Breusegem, Frank; Gevaert, Kris; Van Damme, Petra
2017-06-01
Proteogenomics is an emerging research field yet lacking a uniform method of analysis. Proteogenomic studies in which N-terminal proteomics and ribosome profiling are combined, suggest that a high number of protein start sites are currently missing in genome annotations. We constructed a proteogenomic pipeline specific for the analysis of N-terminal proteomics data, with the aim of discovering novel translational start sites outside annotated protein coding regions. In summary, unidentified MS/MS spectra were matched to a specific N-terminal peptide library encompassing protein N termini encoded in the Arabidopsis thaliana genome. After a stringent false discovery rate filtering, 117 protein N termini compliant with N-terminal methionine excision specificity and indicative of translation initiation were found. These include N-terminal protein extensions and translation from transposable elements and pseudogenes. Gene prediction provided supporting protein-coding models for approximately half of the protein N termini. Besides the prediction of functional domains (partially) contained within the newly predicted ORFs, further supporting evidence of translation was found in the recently released Araport11 genome re-annotation of Arabidopsis and computational translations of sequences stored in public repositories. Most interestingly, complementary evidence by ribosome profiling was found for 23 protein N termini. Finally, by analyzing protein N-terminal peptides, an in silico analysis demonstrates the applicability of our N-terminal proteogenomics strategy in revealing protein-coding potential in species with well- and poorly-annotated genomes. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc. PMID:28432195
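The N-terminal methionine excision (NME) compliance check mentioned in the abstract reduces to a simple rule on the first residue. The permissive-residue set below follows the canonical NME specificity (small penultimate side chains) and is an assumption of this sketch, not necessarily the authors' exact filter:

```python
# Residues with small side chains that permit excision of the initiator
# methionine (canonical NME rule; assumed, not the pipeline's exact set).
NME_PERMISSIVE = set("ACGPSTV")

def compliant_start(peptide):
    """Could this N-terminal peptide mark a translation initiation site?

    Either the initiator Met is retained (peptide starts with M), or the
    Met was excised, in which case the exposed first residue must have
    been NME-permissive as the penultimate residue.
    """
    return peptide[0] == "M" or peptide[0] in NME_PERMISSIVE
```

A peptide beginning with, say, lysine cannot result from initiator-Met excision under this rule, so such an N terminus would be rejected as evidence of translation initiation.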
Forde, Arnell S.; Flocks, James G.; Wiese, Dana S.; Fredericks, Jake J.
2016-03-29
The archived trace data are in standard SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in American Standard Code for Information Interchange (ASCII) format instead of Extended Binary Coded Decimal Interchange Code (EBCDIC) format. The SEG Y files are available on the DVD version of this report or online, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. The Web version of this archive does not contain the SEG Y trace files. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The printable profiles are provided as Graphics Interchange Format (GIF) images processed and gained using SU software and can be viewed from the Profiles page or by using the links located on the trackline maps; refer to the Software page for links to example SU processing scripts.
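The ASCII-versus-EBCDIC distinction in the 3,200-byte card-image header can be sniffed cheaply when reading such files: each of the 40 card images conventionally begins with the letter "C", which is 0x43 in ASCII but 0xC3 in EBCDIC. A minimal sketch (a heuristic, not a full SEG Y reader):

```python
def header_encoding(header_bytes):
    """Guess whether a SEG Y textual header is ASCII or EBCDIC.

    header_bytes -- the first bytes of the 3,200-byte card image block.
    Each card image conventionally starts with 'C': 0x43 in ASCII,
    0xC3 in EBCDIC. Heuristic only; non-conventional headers fall
    through to 'unknown'.
    """
    first = header_bytes[0]
    if first == 0x43:
        return "ASCII"
    if first == 0xC3:
        return "EBCDIC"
    return "unknown"
```

Python's built-in codecs can then decode accordingly, e.g. `header.decode("ascii")` versus `header.decode("cp037")` for EBCDIC.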
Sonnenberg, A; Turner, K O; Genta, R M
2017-01-01
Inflammatory bowel disease (IBD) and microscopic colitis are characterized by different geographical distributions across the USA. In this cross-sectional study we utilized demographic and socio-economic information associated with individual ZIP codes to further delineate the epidemiological characteristics of the two diseases. A total of 813 057 patients who underwent colonoscopy between 2008 and 2014 were extracted from an electronic database of histopathology reports. The prevalence of patients with IBD or microscopic colitis was expressed as percentage of the population associated with specific demographic (age, sex, ethnicity) and socio-economic characteristics (population size, housing value, annual income, tertiary education). Both diseases were more common among subjects from ZIP codes with predominantly White residents and less common among subjects from ZIP codes with predominantly non-White residents such as Black, Hispanic and Asian. These ethnic variations were more pronounced in microscopic colitis than IBD. Markers of affluence, such as average residential house value and annual income, were positively associated with IBD and negatively with microscopic colitis. The prevalence of both diseases was positively correlated with tertiary education. The occurrence of both IBD and microscopic colitis is influenced by environmental risk factors. The differences in the demographic, ethnic and socio-economic distributions of the two diseases suggest that different sets of risk factors affect the two diseases and that their aetiology is unrelated. Published [2016]. This article is a U.S. Government work and is in the public domain in the USA.
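The core prevalence computation (the percentage of colonoscopy patients with a given diagnosis within each ZIP-code-derived stratum) can be sketched as follows; all records and stratum labels here are hypothetical, not the study's data:

```python
from collections import defaultdict

# Sketch: share of colonoscopy patients carrying a diagnosis within each
# socio-economic stratum derived from ZIP-code characteristics.
def prevalence_by_stratum(records, diagnosis):
    counts = defaultdict(lambda: [0, 0])  # stratum -> [cases, total]
    for rec in records:
        cases_total = counts[rec["stratum"]]
        cases_total[1] += 1
        if diagnosis in rec["diagnoses"]:
            cases_total[0] += 1
    return {s: 100.0 * c / t for s, (c, t) in counts.items()}

records = [
    {"stratum": "high-income", "diagnoses": {"IBD"}},
    {"stratum": "high-income", "diagnoses": set()},
    {"stratum": "low-income", "diagnoses": {"microscopic colitis"}},
    {"stratum": "low-income", "diagnoses": set()},
]
prev = prevalence_by_stratum(records, "IBD")
```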
The coupling of fluids, dynamics, and controls on advanced architecture computers
NASA Technical Reports Server (NTRS)
Atwood, Christopher
1995-01-01
This grant provided for the demonstration of coupled controls, body dynamics, and fluids computations in a workstation cluster environment; and an investigation of the impact of peer-to-peer communication on flow solver performance and robustness. The findings of these investigations were documented in the conference articles. The attached publication, 'Towards Distributed Fluids/Controls Simulations', documents the solution and scaling of the coupled Navier-Stokes, Euler rigid-body dynamics, and state feedback control equations for a two-dimensional canard-wing. The poor scaling shown was due to serialized grid connectivity computation and Ethernet bandwidth limits. The scaling of a peer-to-peer communication flow code on an IBM SP-2 was also shown. The scaling of the code on the switched fabric-linked nodes was good, with a 2.4 percent loss due to communication of intergrid boundary point information. The code performance on 30 worker nodes was 1.7 μs/point/iteration, or a factor of three over a Cray C-90 head. The attached paper, 'Nonlinear Fluid Computations in a Distributed Environment', documents the effect of several computational rate enhancing methods on convergence. For the cases shown, the highest throughput was achieved using boundary updates at each step, with the manager process performing communication tasks only. Constrained domain decomposition of the implicit fluid equations did not degrade the convergence rate or final solution. The scaling of a coupled body/fluid dynamics problem on an Ethernet-linked cluster was also shown.
Domain Decomposition By the Advancing-Partition Method for Parallel Unstructured Grid Generation
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.; Zagaris, George
2009-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
Domain Decomposition By the Advancing-Partition Method
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.
2008-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
Rapid Aeroelastic Analysis of Blade Flutter in Turbomachines
NASA Technical Reports Server (NTRS)
Trudell, J. J.; Mehmed, O.; Stefko, G. L.; Bakhle, M. A.; Reddy, T. S. R.; Montgomery, M.; Verdon, J.
2006-01-01
The LINFLUX-AE computer code predicts flutter and forced responses of blades and vanes in turbomachines under subsonic, transonic, and supersonic flow conditions. The code solves the Euler equations of unsteady flow in a blade passage under the assumption that the blades vibrate harmonically at small amplitudes. The steady-state nonlinear Euler equations are solved by a separate program, then equations for unsteady flow components are obtained through linearization around the steady-state solution. A structural-dynamics analysis (see figure) is performed to determine the frequencies and mode shapes of blade vibrations, a preprocessor interpolates mode shapes from the structural-dynamics mesh onto the LINFLUX computational-fluid-dynamics mesh, and an interface code is used to convert the steady-state flow solution to a form required by LINFLUX. Then LINFLUX solves the linearized equations in the frequency domain to calculate the unsteady aerodynamic pressure distribution for a given vibration mode, frequency, and interblade phase angle. A post-processor uses the unsteady pressures to calculate generalized aerodynamic forces, response amplitudes, and eigenvalues (which determine the flutter frequency and damping). In comparison with the TURBO-AE aeroelastic-analysis code, which solves the equations in the time domain, LINFLUX-AE is 6 to 7 times faster.
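The linearization step described above can be written schematically (a generic small-perturbation form; LINFLUX's actual discrete equations are more elaborate):

```latex
% Decompose the flow into a steady state plus a small harmonic perturbation:
q(\mathbf{x},t) = \bar{q}(\mathbf{x}) + \tilde{q}(\mathbf{x},t),
\qquad
\tilde{q}(\mathbf{x},t) = \mathrm{Re}\!\left[\hat{q}(\mathbf{x})\, e^{i\omega t}\right].
% Substituting into the nonlinear Euler equations and dropping terms of
% second order in the perturbation leaves a linear frequency-domain system
%   L(\bar{q};\,\omega,\,\sigma)\,\hat{q} = 0
% solved once per vibration mode, frequency \omega, and interblade phase
% angle \sigma, which is what makes the approach faster than time marching.
```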
Varela, Andrea Ramirez; Pratt, Michael; Harris, Jenine; Lecy, Jesse; Salvo, Deborah; Brownson, Ross C; Hallal, Pedro C
2018-06-01
Little has been published about the historical development of scientific evidence in the physical activity (PA) and public health research field. The study aimed to examine the evolution of knowledge in this field. A structured literature review using formal citation network analysis methods was conducted in June 2016. Using a list of influential PA publications identified by domain experts, a snowball sampling technique was used to build a compact citation network of 141 publications that represents the backbone of the field. Articles were coded by study type and research team characteristics, then analyzed by visualizing the citation network and identifying research clusters to trace the evolution of the field. The field started in the 1950s, with a health sciences focus and strong North American and European leadership. Health outcome studies appeared most frequently in the network, while policy and intervention studies appeared least. Critical articles on objective measurement and public policy have influenced the progress from an emphasis on health outcomes research at early stages in the field to the more recent emerging built environment and global monitoring foci. There is only modest cross-citation across types of study. To our knowledge, this paper is the first to systematically describe the development of research on PA and public health. The key publications include fundamental ideas that remain citable over time, but notable research and dissemination gaps exist and should be addressed. Increasing collaboration and communication between study areas, encouraging female researchers, and increasing studies on interventions, evaluation of interventions and policy are recommended. Copyright © 2017 Elsevier Inc. All rights reserved.
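The snowball sampling step can be sketched as a breadth-first expansion over a citation map (toy data; the study's actual network construction and its 141-publication cap involved expert curation and backbone extraction):

```python
# Sketch of snowball sampling over a citation graph (all data hypothetical).
# Starting from expert-identified seed papers, repeatedly add papers that
# the current set cites, until no new papers appear or a size cap is hit.
def snowball(seeds, cites, max_size=141):
    """cites: dict mapping a paper id to the ids it cites."""
    network, frontier = set(seeds), list(seeds)
    while frontier and len(network) < max_size:
        paper = frontier.pop(0)
        for ref in cites.get(paper, []):
            if ref not in network:
                network.add(ref)
                frontier.append(ref)
    return network

cites = {"A": ["B", "C"], "B": ["C", "D"], "C": [], "D": []}
net = snowball(["A"], cites)
```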
PHoToNs–A parallel heterogeneous and threads oriented code for cosmological N-body simulation
NASA Astrophysics Data System (ADS)
Wang, Qiao; Cao, Zong-Yan; Gao, Liang; Chi, Xue-Bin; Meng, Chen; Wang, Jie; Wang, Long
2018-06-01
We introduce a new code for cosmological simulations, PHoToNs, which incorporates features for performing massive cosmological simulations on heterogeneous high-performance computing (HPC) systems and threads-oriented programming. PHoToNs adopts a hybrid scheme to compute gravitational force, with the conventional Particle-Mesh (PM) algorithm to compute the long-range force, the Tree algorithm to compute the short-range force, and the direct-summation Particle-Particle (PP) algorithm to compute gravity from very close particles. A self-similar space-filling Peano-Hilbert curve is used to decompose the computing domain. Threads programming is used to manage the domain communication, PM calculation, and synchronization more flexibly, as well as the Dual Tree Traversal on the CPU+MIC platform. PHoToNs scales well, and the efficiency of the PP kernel achieves 68.6% of peak performance on MIC and 74.4% on CPU platforms. We also tested the accuracy of the code against the widely used Gadget-2 and found excellent agreement.
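The Peano-Hilbert decomposition idea (ordering particles along a space-filling curve and cutting the ordered list into contiguous chunks, so that each domain stays spatially compact) can be sketched in 2-D; PHoToNs itself works in 3-D, and this generic illustration is not its actual implementation:

```python
def hilbert_key(order, x, y):
    """Map a 2-D cell (x, y) on a 2**order grid to its Hilbert-curve index
    (the classic bitwise construction)."""
    rx = ry = d = 0
    s = 2 ** (order - 1)
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate the quadrant so the curve stays continuous.
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d

def decompose(particles, order, ndomains):
    """Sort particles by Hilbert key, then cut the ordered list into
    near-equal chunks; neighbors on the curve land in the same domain."""
    keyed = sorted(particles, key=lambda p: hilbert_key(order, p[0], p[1]))
    size = -(-len(keyed) // ndomains)  # ceiling division
    return [keyed[i:i + size] for i in range(0, len(keyed), size)]

domains = decompose([(0, 0), (1, 0), (0, 1), (1, 1)], 1, 2)
```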
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aldridge, David Franklin; Collier, Sandra L.; Marlin, David H.
2005-05-01
This document is intended to serve as a users guide for the time-domain atmospheric acoustic propagation suite (TDAAPS) program developed as part of the Department of Defense High-Performance Modernization Office (HPCMP) Common High-Performance Computing Scalable Software Initiative (CHSSI). TDAAPS performs staggered-grid finite-difference modeling of the acoustic velocity-pressure system with the incorporation of spatially inhomogeneous winds. Wherever practical the control structure of the codes is written in C++ using an object-oriented design. Sections of code where a large number of calculations are required are written in C or F77 in order to enable better compiler optimization of these sections. The TDAAPS program conforms to a UNIX-style calling interface. Most of the actions of the codes are controlled by adding flags to the invoking command line. This document presents a large number of examples and provides new users with the necessary background to perform acoustic modeling with TDAAPS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
TESP combines existing domain simulators in the electric power grid with new transactive agents, growth models, and evaluation scripts. The existing domain simulators include GridLAB-D for the distribution grid and single-family residential buildings, MATPOWER for transmission and bulk generation, and EnergyPlus for large buildings. More are planned for subsequent versions of TESP. The new elements are: TEAgents - simulate market participants and transactive systems for market clearing; some of this functionality was extracted from GridLAB-D and implemented in Python for customization by PNNL and others. Growth Model - a means for simulating system changes over a multiyear period, including both normal load growth and specific investment decisions; customizable in Python code. Evaluation Script - a means of evaluating different transactive systems through customizable post-processing in Python code. TESP provides a method for other researchers and vendors to design transactive systems and test them in a virtual environment. It allows customization of the key components by modifying Python code.
QR Codes: Outlook for Food Science and Nutrition.
Sanz-Valero, Javier; Álvarez Sabucedo, Luis M; Wanden-Berghe, Carmina; Santos Gago, Juan M
2016-01-01
QR codes open up the possibility of developing simple-to-use, cost-effective, and functional systems based on the optical recognition of inexpensive tags attached to physical objects. These systems, combined with Web platforms, can provide advanced services that are already broadly used in many contexts of everyday life. Because the approach is based on the automatic recognition of messages embedded in simple graphics by means of common devices such as mobile phones, QR codes are very convenient for the average user. Regrettably, their potential has not yet been fully exploited in the domains of food science and nutrition. This paper points out some applications to make the most of this technology for these domains in a straightforward manner. Given these characteristics, we address systems with low barriers to entry and high scalability for deployment. Therefore, their launch among professional and end users is quite simple. The paper also provides high-level indications for the evaluation of the technological frame required to implement the identified possibilities of use.
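In systems of this kind, the tag itself typically encodes only a URL that a Web platform resolves to the product's data. A sketch of composing and parsing such a payload (the endpoint and field names are hypothetical; actually rendering the QR image would use a library such as the `qrcode` package):

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical payload for a QR tag attached to a food product: the tag
# stores only a URL; the Web platform behind it serves the nutrition data.
BASE = "https://example.org/food"  # hypothetical service endpoint

def make_tag_url(product_id, batch, best_before):
    query = urlencode({"id": product_id, "batch": batch, "bb": best_before})
    return f"{BASE}?{query}"

def parse_tag_url(url):
    q = parse_qs(urlparse(url).query)
    return {k: v[0] for k, v in q.items()}

url = make_tag_url("olive-oil-500ml", "B2016-07", "2017-01-01")
info = parse_tag_url(url)
```

Keeping the payload to a short URL keeps the QR symbol small and easy to scan, with all per-product logic living on the server side.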
Open-source chemogenomic data-driven algorithms for predicting drug-target interactions.
Hao, Ming; Bryant, Stephen H; Wang, Yanli
2018-02-06
While novel technologies such as high-throughput screening have advanced, together with significant investment by pharmaceutical companies during the past decades, the success rate for drug development has not improved, prompting researchers to look for new strategies of drug discovery. Drug repositioning is a potential approach to solve this dilemma. However, experimental identification and validation of potential drug targets encoded by the human genome is both costly and time-consuming. Therefore, effective computational approaches have been proposed to facilitate drug repositioning, and these have proved successful in drug discovery. Doubtlessly, the availability of open-accessible data from basic chemical biology research and the success of human genome sequencing are crucial to develop effective in silico drug repositioning methods allowing the identification of potential targets for existing drugs. In this work, we review several chemogenomic data-driven computational algorithms with publicly accessible source codes for predicting drug-target interactions (DTIs). We organize these algorithms by model properties and model evolutionary relationships. We re-implemented five representative algorithms in the R programming language and compared them by means of mean percentile ranking, a new recall-based evaluation metric in the DTI prediction research field. We anticipate that this review will be objective and helpful to researchers who would like to further improve existing algorithms or need to choose appropriate algorithms to infer potential DTIs in their projects. The source codes for DTI predictions are available at: https://github.com/minghao2016/chemogenomicAlg4DTIpred. Published by Oxford University Press 2018. This work is written by US Government employees and is in the public domain in the US.
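Mean percentile ranking can be sketched as the average position of each known target within its drug's score-ranked candidate list, lower being better (toy scores; the paper's exact formulation may differ in detail):

```python
def mean_percentile_rank(scores, true_pairs):
    """scores: dict drug -> {target: predicted score}.
    true_pairs: set of (drug, target) known interactions.
    Returns the average percentile position of each true target in its
    drug's score-ranked candidate list (lower is better)."""
    percentiles = []
    for drug, target in true_pairs:
        ranked = sorted(scores[drug], key=scores[drug].get, reverse=True)
        rank = ranked.index(target) + 1          # 1-based rank
        percentiles.append(rank / len(ranked))   # percentile in (0, 1]
    return sum(percentiles) / len(percentiles)

scores = {"drugA": {"t1": 0.9, "t2": 0.5, "t3": 0.1, "t4": 0.05}}
mpr = mean_percentile_rank(scores, {("drugA", "t1")})
```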
Lausberg, Hedda; Sloetjes, Han
2016-09-01
As visual media spread to all domains of public and scientific life, nonverbal behavior is taking its place as an important form of communication alongside the written and spoken word. An objective and reliable method of analysis for hand movement behavior and gesture is therefore currently required in various scientific disciplines, including psychology, medicine, linguistics, anthropology, sociology, and computer science. However, no adequate common methodological standards have been developed thus far. Many behavioral gesture-coding systems lack objectivity and reliability, and automated methods that register specific movement parameters often fail to show validity with regard to psychological and social functions. To address these deficits, we have combined two methods, an elaborated behavioral coding system and an annotation tool for video and audio data. The NEUROGES-ELAN system is an effective and user-friendly research tool for the analysis of hand movement behavior, including gesture, self-touch, shifts, and actions. Since its first publication in 2009 in Behavior Research Methods, the tool has been used in interdisciplinary research projects to analyze a total of 467 individuals from different cultures, including subjects with mental disease and brain damage. Partly on the basis of new insights from these studies, the system has been revised methodologically and conceptually. The article presents the revised version of the system, including a detailed study of reliability. The improved reproducibility of the revised version makes NEUROGES-ELAN a suitable system for basic empirical research into the relation between hand movement behavior and gesture and cognitive, emotional, and interactive processes and for the development of automated movement behavior recognition methods.
Gradus, Jaimie L; Antonsen, Sussie; Svensson, Elisabeth; Lash, Timothy L; Resick, Patricia A; Hansen, Jens Georg
2015-09-01
Longitudinal outcomes following stress or trauma diagnoses are receiving attention, yet population-based studies are few. The aims of the present cohort study were to examine the cumulative incidence of traumatic events and psychiatric diagnoses following diagnoses of severe stress and adjustment disorders categorized using International Classification of Diseases, Tenth Revision, codes and to examine associations of these diagnoses with all-cause mortality and suicide. Data came from a longitudinal cohort of all Danes who received a diagnosis of reaction to severe stress or adjustment disorders (International Classification of Diseases, Tenth Revision, code F43.x) between 1995 and 2011, and they were compared with data from a general-population cohort. Cumulative incidence curves were plotted to examine traumatic experiences and psychiatric diagnoses during the study period. A Cox proportional hazards regression model was used to examine the associations of the disorders with mortality and suicide. Participants with stress diagnoses had a higher incidence of traumatic events and psychiatric diagnoses than did the comparison group. Each disorder was associated with a higher rate of all-cause mortality than that seen in the comparison cohort, and strong associations with suicide were found after adjustment. This study provides a comprehensive assessment of the associations of stress disorders with a variety of outcomes, and we found that stress diagnoses may have long-lasting and potentially severe consequences. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.
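Cumulative incidence curves of the kind plotted above are typically derived from a survival estimator. A minimal Kaplan-Meier sketch with toy data (a generic estimator for illustration, not the registry analysis or the Cox model itself):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve; cumulative incidence is 1 - S(t).
    times: follow-up times; events: 1 = event observed, 0 = censored.
    Events are processed before censorings occurring at the same time."""
    order = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))
    at_risk, s, curve = len(order), 1.0, []
    for t, e in order:
        if e:
            s *= (at_risk - 1) / at_risk  # multiply in this step's survival
            curve.append((t, s))
        at_risk -= 1
    return curve

surv = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```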
Database Resources of the BIG Data Center in 2018.
2018-01-04
The BIG Data Center at Beijing Institute of Genomics (BIG) of the Chinese Academy of Sciences provides free, open access to a suite of database resources in support of worldwide research activities in both academia and industry. With the vast amounts of omics data generated at ever-greater scales and rates, the BIG Data Center is continually expanding, updating and enriching its core database resources through big-data integration and value-added curation, including BioCode (a repository archiving bioinformatics tool codes), BioProject (a biological project library), BioSample (a biological sample library), Genome Sequence Archive (GSA, a data repository for archiving raw sequence reads), Genome Warehouse (GWH, a centralized resource housing genome-scale data), Genome Variation Map (GVM, a public repository of genome variations), Gene Expression Nebulas (GEN, a database of gene expression profiles based on RNA-Seq data), Methylation Bank (MethBank, an integrated databank of DNA methylomes), and Science Wikis (a series of biological knowledge wikis for community annotations). In addition, three featured web services are provided, viz., BIG Search (search as a service; a scalable inter-domain text search engine), BIG SSO (single sign-on as a service; a user access control system to gain access to multiple independent systems with a single ID and password) and Gsub (submission as a service; a unified submission service for all relevant resources). All of these resources are publicly accessible through the home page of the BIG Data Center at http://bigd.big.ac.cn. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Huntley, Stuart; Baggott, Daniel M.; Hamilton, Aaron T.; Tran-Gyamfi, Mary; Yang, Shan; Kim, Joomyeong; Gordon, Laurie; Branscomb, Elbert; Stubbs, Lisa
2006-01-01
Krüppel-type zinc finger (ZNF) motifs are prevalent components of transcription factor proteins in all eukaryotes. KRAB-ZNF proteins, in which a potent repressor domain is attached to a tandem array of DNA-binding zinc-finger motifs, are specific to tetrapod vertebrates and represent the largest class of ZNF proteins in mammals. To define the full repertoire of human KRAB-ZNF proteins, we searched the genome sequence for key motifs and then constructed and manually curated gene models incorporating those sequences. The resulting gene catalog contains 423 KRAB-ZNF protein-coding loci, yielding alternative transcripts that altogether predict at least 742 structurally distinct proteins. Active rounds of segmental duplication, involving single genes or larger regions and including both tandem and distributed duplication events, have driven the expansion of this mammalian gene family. Comparisons between the human genes and ZNF loci mined from the draft mouse, dog, and chimpanzee genomes not only identified 103 KRAB-ZNF genes that are conserved in mammals but also highlighted a substantial level of lineage-specific change; at least 136 KRAB-ZNF coding genes are primate specific, including many recent duplicates. KRAB-ZNF genes are widely expressed and clustered genes are typically not coregulated, indicating that paralogs have evolved to fill roles in many different biological processes. To facilitate further study, we have developed a Web-based public resource with access to gene models, sequences, and other data, including visualization tools to provide genomic context and interaction with other public data sets. PMID:16606702
The ADVANCE Code of Conduct for collaborative vaccine studies.
Kurz, Xavier; Bauchau, Vincent; Mahy, Patrick; Glismann, Steffen; van der Aa, Lieke Maria; Simondon, François
2017-04-04
Lessons learnt from the 2009 (H1N1) flu pandemic highlighted factors limiting the capacity to collect European data on vaccine exposure, safety and effectiveness, including lack of rapid access to available data sources or expertise, difficulties to establish efficient interactions between multiple parties, lack of confidence between private and public sectors, concerns about possible or actual conflicts of interest (or perceptions thereof) and inadequate funding mechanisms. The Innovative Medicines Initiative's Accelerated Development of VAccine benefit-risk Collaboration in Europe (ADVANCE) consortium was established to create an efficient and sustainable infrastructure for rapid and integrated monitoring of post-approval benefit-risk of vaccines, including a code of conduct and governance principles for collaborative studies. The development of the code of conduct was guided by three core and common values (best science, strengthening public health, transparency) and a review of existing guidance and relevant published articles. The ADVANCE Code of Conduct includes 45 recommendations in 10 topics (Scientific integrity, Scientific independence, Transparency, Conflicts of interest, Study protocol, Study report, Publication, Subject privacy, Sharing of study data, Research contract). Each topic includes a definition, a set of recommendations and a list of additional reading. The concept of the study team is introduced as a key component of the ADVANCE Code of Conduct with a core set of roles and responsibilities. It is hoped that adoption of the ADVANCE Code of Conduct by all partners involved in a study will facilitate and speed-up its initiation, design, conduct and reporting. Adoption of the ADVANCE Code of Conduct should be stated in the study protocol, study report and publications and journal editors are encouraged to use it as an indication that good principles of public health, science and transparency were followed throughout the study. Copyright © 2017. Published by Elsevier Ltd.
DNA Multiple Sequence Alignment Guided by Protein Domains: The MSA-PAD 2.0 Method.
Balech, Bachir; Monaco, Alfonso; Perniola, Michele; Santamaria, Monica; Donvito, Giacinto; Vicario, Saverio; Maggi, Giorgio; Pesole, Graziano
2018-01-01
Multiple sequence alignment (MSA) is a fundamental component in many DNA sequence analyses, including metagenomics studies and phylogeny inference. When guided by protein profiles, DNA multiple alignments gain higher precision and robustness. Here we present details of the use of the upgraded version of MSA-PAD (2.0), a DNA multiple sequence alignment framework able to align DNA sequences coding for single/multiple protein domains guided by PFAM or user-defined annotations. MSA-PAD has two alignment strategies, called "Gene" and "Genome," accounting for coding-domain order and genomic rearrangements, respectively. Novel options were added in the present version, where the MSA can be guided by protein profiles provided by the user. This allows MSA-PAD 2.0 to run faster and to add custom protein profiles, sometimes not present in the PFAM database, according to the user's interest. MSA-PAD 2.0 is currently freely available as a Web application at https://recasgateway.cloud.ba.infn.it/.
The Argonaute CSR-1 and its 22G-RNA cofactors are required for holocentric chromosome segregation.
Claycomb, Julie M; Batista, Pedro J; Pang, Ka Ming; Gu, Weifeng; Vasale, Jessica J; van Wolfswinkel, Josien C; Chaves, Daniel A; Shirayama, Masaki; Mitani, Shohei; Ketting, René F; Conte, Darryl; Mello, Craig C
2009-10-02
RNAi-related pathways regulate diverse processes, from developmental timing to transposon silencing. Here, we show that in C. elegans the Argonaute CSR-1, the RNA-dependent RNA polymerase EGO-1, the Dicer-related helicase DRH-3, and the Tudor-domain protein EKL-1 localize to chromosomes and are required for proper chromosome segregation. In the absence of these factors chromosomes fail to align at the metaphase plate and kinetochores do not orient to opposing spindle poles. Surprisingly, the CSR-1-interacting small RNAs (22G-RNAs) are antisense to thousands of germline-expressed protein-coding genes. Nematodes assemble holocentric chromosomes in which continuous kinetochores must span the expressed domains of the genome. We show that CSR-1 interacts with chromatin at target loci but does not downregulate target mRNA or protein levels. Instead, our findings support a model in which CSR-1 complexes target protein-coding domains to promote their proper organization within the holocentric chromosomes of C. elegans.
Compressed domain indexing of losslessly compressed images
NASA Astrophysics Data System (ADS)
Schaefer, Gerald
2001-12-01
Image retrieval and image compression have been pursued separately in the past. Only little research has been done on a synthesis of the two by allowing image retrieval to be performed directly in the compressed domain of images without the need to uncompress them first. In this paper methods for image retrieval in the compressed domain of losslessly compressed images are introduced. While most image compression techniques are lossy, i.e. discard visually less significant information, lossless techniques are still required in fields like medical imaging or in situations where images must not be changed due to legal reasons. The algorithms in this paper are based on predictive coding methods where a pixel is encoded based on the pixel values of its (already encoded) neighborhood. The first method is based on an understanding that predictively coded data is itself indexable and represents a textural description of the image. The second method operates directly on the entropy encoded data by comparing codebooks of images. Experiments show good image retrieval results for both approaches.
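The first method's premise, that prediction residuals themselves describe texture, can be illustrated with the median edge detector (MED) predictor used in JPEG-LS-style lossless coders (a generic sketch, not necessarily the paper's exact predictor):

```python
def med_predict(left, up, up_left):
    """Median edge detector predictor (as in JPEG-LS / LOCO-I)."""
    if up_left >= max(left, up):
        return min(left, up)
    if up_left <= min(left, up):
        return max(left, up)
    return left + up - up_left

def residual_image(img):
    """Prediction residuals; their histogram can serve as a texture
    descriptor for compressed-domain retrieval."""
    h, w = len(img), len(img[0])
    res = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            left = img[y][x - 1] if x else 0
            up = img[y - 1][x] if y else 0
            ul = img[y - 1][x - 1] if x and y else 0
            res[y][x] = img[y][x] - med_predict(left, up, ul)
    return res

img = [[10, 10], [10, 12]]
res = residual_image(img)
```

Flat regions yield near-zero residuals while edges and texture yield larger ones, which is why the residual stream is itself indexable.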
Chang, Hang; Han, Ju; Zhong, Cheng; Snijders, Antoine M.; Mao, Jian-Hua
2017-01-01
The capabilities of (I) learning transferable knowledge across domains; and (II) fine-tuning the pre-learned base knowledge towards tasks with considerably smaller data scale are extremely important. Many of the existing transfer learning techniques are supervised approaches, among which deep learning has the demonstrated power of learning domain transferrable knowledge with large scale network trained on massive amounts of labeled data. However, in many biomedical tasks, both the data and the corresponding label can be very limited, where the unsupervised transfer learning capability is urgently needed. In this paper, we proposed a novel multi-scale convolutional sparse coding (MSCSC) method, that (I) automatically learns filter banks at different scales in a joint fashion with enforced scale-specificity of learned patterns; and (II) provides an unsupervised solution for learning transferable base knowledge and fine-tuning it towards target tasks. Extensive experimental evaluation of MSCSC demonstrates the effectiveness of the proposed MSCSC in both regular and transfer learning tasks in various biomedical domains. PMID:28129148
Modeling of Radiowave Propagation in a Forested Environment
2014-09-01
Distribution is unlimited. Abstract: Propagation models used in wireless communication system design play an important role in overall link performance. Applications in both domains require communication devices and sensors to be operated in forested environments. Various methods have been… Propagation models in a forested environment, in particular…
ERIC Educational Resources Information Center
Ruff, Chloe
2016-01-01
The purpose of this qualitative study was to examine how first-year college students perceive their development of domain identification with, and interest in, their prospective science major during their initial year of college. Four themes emerged from the coding and analysis of interviews with eight first-year science students: Self-Definition…
de Lange, Orlando; Wolf, Christina; Dietze, Jörn; Elsaesser, Janett; Morbitzer, Robert; Lahaye, Thomas
2014-01-01
The tandem repeats of transcription activator-like effectors (TALEs) mediate sequence-specific DNA binding using a simple code. Naturally, TALEs are injected by Xanthomonas bacteria into plant cells to manipulate the host transcriptome. In the laboratory, TALE DNA-binding domains are reprogrammed and used to target a fused functional domain to a genomic locus of choice. Research into the natural diversity of TALE-like proteins may provide resources for the further improvement of current TALE technology. Here we describe TALE-like proteins from the endosymbiotic bacterium Burkholderia rhizoxinica, termed Bat proteins. Bat repeat domains mediate sequence-specific DNA binding with the same code as TALEs, despite less than 40% sequence identity. We show that Bat proteins can be adapted for use as transcription factors and nucleases and that sequence preferences can be reprogrammed. Unlike TALEs, the core repeats of each Bat protein are highly polymorphic. This feature allowed us to explore alternative strategies for the design of custom Bat repeat arrays, providing novel insights into the functional relevance of non-RVD residues. The Bat proteins offer fertile grounds for research into the creation of improved programmable DNA-binding proteins and comparative insights into TALE-like evolution. PMID:24792163
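The "simple code" referred to here is the repeat-variable diresidue (RVD) code: the two hypervariable residues of each repeat specify one base of the target site. A sketch using the four canonical RVD associations (in nature NN also tolerates A, and further RVDs exist):

```python
# Canonical TALE repeat-variable diresidue (RVD) -> DNA base associations.
RVD_CODE = {"NI": "A", "HD": "C", "NN": "G", "NG": "T"}

def target_from_rvds(rvds):
    """Predict the DNA target of a TALE-like repeat array from its RVDs;
    unrecognized RVDs are reported as 'N'."""
    return "".join(RVD_CODE.get(rvd, "N") for rvd in rvds)

site = target_from_rvds(["NI", "HD", "NN", "NG", "HD"])
```

Reprogramming a TALE or Bat array amounts to choosing the repeat order so the decoded string matches the desired genomic site.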
Wong, Alex W K; Lau, Stephen C L; Cella, David; Lai, Jin-Shei; Xie, Guanli; Chen, Lidian; Chan, Chetwyn C H; Heinemann, Allen W
2017-09-01
Quality of Life in Neurological Disorders (Neuro-QoL) is a U.S. National Institutes of Health initiative that produced a set of self-report measures of physical, mental, and social health experienced by adults or children who have a neurological condition or disorder. Our aim was to describe the content of the Neuro-QoL at the item level using the World Health Organization's International Classification of Functioning, Disability and Health (ICF). We assessed the Neuro-QoL for its content coverage of functioning and disability relative to each of the four ICF domains (i.e., body functions, body structures, activities and participation, and environment). We used second-level ICF three-digit codes to classify items into categories within each ICF domain and computed the percentage of categories within each ICF domain that were represented in the Neuro-QoL items. All items of the Neuro-QoL could be mapped to ICF categories at the second-level classification codes. The activities and participation domain and the mental functions category of the body functions domain were the areas most often represented by the Neuro-QoL. The Neuro-QoL provides limited coverage of the environmental factors and body structure domains. Neuro-QoL measures map well to the ICF. The Neuro-QoL-ICF-mapped items provide a blueprint for users to select appropriate measures in ICF-based measurement applications.
The neural career of sensory-motor metaphors.
Desai, Rutvik H; Binder, Jeffrey R; Conant, Lisa L; Mano, Quintino R; Seidenberg, Mark S
2011-09-01
The role of sensory-motor systems in conceptual understanding has been controversial. It has been proposed that many abstract concepts are understood metaphorically through concrete sensory-motor domains such as actions. Using fMRI, we compared neural responses to literal action (Lit; "The daughter grasped the flowers"), metaphoric action (Met; "The public grasped the idea"), and abstract (Abs; "The public understood the idea") sentences of varying familiarity. Both Lit and Met sentences activated the left anterior inferior parietal lobule, an area involved in action planning, with Met sentences also activating a homologous area in the right hemisphere, relative to Abs sentences. Both Met and Abs sentences activated the left superior temporal regions associated with abstract language. Importantly, activation in primary motor and biological motion perception regions was inversely correlated with Lit and Met familiarity. These results support the view that the understanding of metaphoric action retains a link to sensory-motor systems involved in action performance. However, the involvement of sensory-motor systems in metaphor understanding changes through a gradual abstraction process whereby relatively detailed simulations are used for understanding unfamiliar metaphors, and these simulations become less detailed and involve only secondary motor regions as familiarity increases. Consistent with these data, we propose that the anterior inferior parietal lobule serves as an interface between sensory-motor and conceptual systems and plays an important role in both domains. The similarity of abstract and metaphoric sentences in the activation of left superior temporal regions suggests that action metaphor understanding is not completely based on sensory-motor simulations but relies also on abstract lexical-semantic codes.
Gonzalo, Jed D; Ahluwalia, Amarpreet; Hamilton, Maria; Wolf, Heidi; Wolpaw, Daniel R; Thompson, Britta M
2018-02-01
To develop a potential competency framework for faculty development programs aligned with the needs of faculty in academic health centers (AHCs). In 2014 and 2015, the authors interviewed 23 health system leaders and analyzed transcripts using constant comparative analysis and thematic analysis. They coded competencies and curricular concepts into subcategories. Lead investigators reviewed drafts of the categorization themes and subthemes related to gaps in faculty knowledge and skills, collapsed and combined competency domains, and resolved disagreements via discussion. Through analysis, the authors identified four themes. The first was core functional competencies and curricular domains for conceptual learning, including patient-centered care, health care processes, clinical informatics, population and public health, policy and payment, value-based care, and health system improvement. The second was the need for foundational competency domains, including systems thinking, change agency/management, teaming, and leadership. The third theme was paradigm shifts in how academic faculty should approach health care, categorized into four areas: delivery, transformation, provider characteristics and skills, and education. The fourth theme was the need for faculty to be aware of challenges in the culture of AHCs as an influential context for change. This broad competency framework for faculty development programs expands existing curricula by including a comprehensive scope of health systems science content and skills. AHC leaders can use these results to better align faculty education with the real-time needs of their health systems. Future work should focus on optimal prioritization and methods for teaching.
Use of biphase-coded pulses for wideband data storage in time-domain optical memories.
Shen, X A; Kachru, R
1993-06-10
We demonstrate that temporally long laser pulses with appropriate phase modulation can replace either temporally brief or frequency-chirped pulses in a time-domain optical memory to store and retrieve information. A 1.65-µs-long write pulse was biphase modulated according to the 13-bit Barker code for storing multiple bits of optical data into a Pr(3+):YAlO(3) crystal, and the stored information was later recalled faithfully by using a read pulse that was identical to the write pulse. Our results further show that the stored data cannot be retrieved faithfully if mismatched write and read pulses are used. This finding opens up the possibility of designing encrypted optical memories for secure data storage.
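To make the matched-readout idea concrete, here is a small numeric sketch (pure Python, not a model of the photon-echo physics): correlating the 13-bit Barker-coded write sequence with an identical read sequence yields a sharp peak, while a hypothetical mismatched read code does not, mirroring why only a matching read pulse recalls the stored data faithfully.

```python
# 13-bit Barker code used to biphase-modulate the write pulse.
barker13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]

def xcorr_peak(a, b):
    """Maximum absolute cross-correlation over all relative shifts."""
    n = len(a)
    peaks = []
    for shift in range(-n + 1, n):
        s = sum(a[i] * b[i - shift] for i in range(n) if 0 <= i - shift < n)
        peaks.append(abs(s))
    return max(peaks)

matched = xcorr_peak(barker13, barker13)     # identical read code: peak of 13
wrong_code = [1] * 13                        # hypothetical mismatched read code
mismatched = xcorr_peak(barker13, wrong_code)
print(matched, mismatched)                   # 13 vs. a much weaker 5
```

The sharp peak only appears when write and read codes match, which is the classical analogue of the faithful-recall condition reported above.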
Temporal phase mask encrypted optical steganography carried by amplified spontaneous emission noise.
Wu, Ben; Wang, Zhenxing; Shastri, Bhavin J; Chang, Matthew P; Frost, Nicholas A; Prucnal, Paul R
2014-01-13
A temporal phase mask encryption method is proposed and experimentally demonstrated to improve the security of the stealth channel in an optical steganography system. The stealth channel is protected on two levels. On the first level, the data is carried by amplified spontaneous emission (ASE) noise, which cannot be detected in either the time domain or the spectral domain. On the second level, even if an eavesdropper suspects the existence of the stealth channel, each data bit is covered by a fast-changing phase mask. The phase mask code is always combined with the wideband noise from ASE. Without knowing the right phase mask code to recover the stealth data, the eavesdropper can only receive a noise-like signal with randomized phase.
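A toy numeric analogue of the second protection level (made-up ±1 symbols, not the optical system itself): each data symbol is multiplied by a pseudo-random ±1 "phase mask"; reapplying the same shared mask recovers the data exactly, while a wrong mask leaves a noise-like sequence.

```python
import random

random.seed(7)
n = 64
data = [random.choice([-1, 1]) for _ in range(n)]   # stealth data symbols
mask = [random.choice([-1, 1]) for _ in range(n)]   # shared secret phase mask
wrong = [random.choice([-1, 1]) for _ in range(n)]  # eavesdropper's guess

masked = [d * m for d, m in zip(data, mask)]        # transmitted, scrambled
recovered = [s * m for s, m in zip(masked, mask)]   # m * m = 1 undoes the mask
intercepted = [s * w for s, w in zip(masked, wrong)]

print(recovered == data)                            # legitimate receiver succeeds
```

With the wrong mask, `intercepted` is just another random ±1 sequence, carrying no recoverable structure.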
Automatic programming of simulation models
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.
1990-01-01
The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on applying an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Once the problem specification has been defined, an automatic code generator is used to write the simulation code. Two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
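The spec-then-generate workflow can be sketched in miniature (all names and the spec format here are illustrative, not the AMPS interface): a declarative problem specification is turned into executable simulation code by a generator, which is then compiled and run.

```python
# Hypothetical declarative spec: stations in a shop and an arrival rate.
spec = {"stations": ["mill", "lathe"], "arrivals_per_hour": 12}

def generate_sim(spec):
    """Emit source code for a trivial simulation from the spec."""
    lines = ["def simulate(hours):",
             "    jobs = hours * %d" % spec["arrivals_per_hour"]]
    for s in spec["stations"]:
        lines.append("    print('station %s processed', jobs)" % s)
    lines.append("    return jobs")
    return "\n".join(lines)

code = generate_sim(spec)
namespace = {}
exec(code, namespace)            # compile the generated model
jobs = namespace["simulate"](8)  # run an 8-hour simulation
```

The point is the division of labor: the modeler supplies only the specification; the generator writes the simulation code.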
NASA Technical Reports Server (NTRS)
Sharma, Naveen
1992-01-01
In this paper we briefly describe a combined symbolic and numeric approach for solving mathematical models on parallel computers. An experimental software system, PIER, is being developed in Common Lisp to synthesize computationally intensive and domain formulation dependent phases of finite element analysis (FEA) solution methods. Quantities for domain formulation like shape functions, element stiffness matrices, etc., are automatically derived using symbolic mathematical computations. The problem specific information and derived formulae are then used to generate (parallel) numerical code for FEA solution steps. A constructive approach to specify a numerical program design is taken. The code generator compiles application oriented input specifications into (parallel) FORTRAN77 routines with the help of built-in knowledge of the particular problem, numerical solution methods and the target computer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Summers, Michael S
2017-11-08
HPC software for ab-initio, condensed-matter physics, quantum mechanics calculations needs to be built on top of well-tested libraries, some of which address requirements unique to the programming domain. During the development of the DCA++ code that we use in our research, we have developed a collection of libraries that may be of use to other computational scientists working in the same or similar domains. The libraries include: a) a pythonic input-language system, b) tensors whose shape is constructed from generalized dimension objects such as time domains, frequency domains, momentum domains, and vertex domains, and c) linear algebra operations that resolve to BLAS/LAPACK operations when possible. This supports the implementation of Green's functions and operations on them such as are used in condensed matter physics.
Systematic detection of internal symmetry in proteins using CE-Symm.
Myers-Turnbull, Douglas; Bliven, Spencer E; Rose, Peter W; Aziz, Zaid K; Youkharibache, Philippe; Bourne, Philip E; Prlić, Andreas
2014-05-29
Symmetry is an important feature of protein tertiary and quaternary structures that has been associated with protein folding, function, evolution, and stability. Its emergence and ensuing prevalence have been attributed to gene duplications, fusion events, and subsequent evolutionary drift in sequence. This process maintains structural similarity and is further supported by this study. To further investigate the question of how internal symmetry evolved, how symmetry and function are related, and the overall frequency of internal symmetry, we developed an algorithm, CE-Symm, to detect pseudo-symmetry within the tertiary structure of protein chains. Using a large manually curated benchmark of 1007 protein domains, we show that CE-Symm performs significantly better than previous approaches. We use CE-Symm to build a census of symmetry among domain superfamilies in SCOP and note that 18% of all superfamilies are pseudo-symmetric. Our results indicate that more domains are pseudo-symmetric than previously estimated. We establish a number of recurring types of symmetry-function relationships and describe several characteristic cases in detail. With the use of the Enzyme Commission classification, symmetry was found to be enriched in some enzyme classes but depleted in others. CE-Symm thus provides a methodology for a more complete and detailed study of the role of symmetry in tertiary protein structure [availability: CE-Symm can be run from the Web at http://source.rcsb.org/jfatcatserver/symmetry.jsp. Source code and software binaries are also available under the GNU Lesser General Public License (version 2.1) at https://github.com/rcsb/symmetry. An interactive census of domains identified as symmetric by CE-Symm is available from http://source.rcsb.org/jfatcatserver/scopResults.jsp]. Copyright © 2014. Published by Elsevier Ltd.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-01
... parts of the National Board Inspection Code at http://www.nationalboard.org . DATES: The comment period... edition of the National Board Inspection Code for public review at www.nationalboard.org . Both documents...
48 CFR 1301.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Publication and code arrangement. 1301.105-1 Section 1301.105-1 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE GENERAL DEPARTMENT OF COMMERCE ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1301.105-1...
41 CFR 109-26.203 - Activity address codes.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Activity address codes. 109-26.203 Section 109-26.203 Public Contracts and Property Management Federal Property Management Regulations System (Continued) DEPARTMENT OF ENERGY PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 26...
The application of coded excitation technology in medical ultrasonic Doppler imaging
NASA Astrophysics Data System (ADS)
Li, Weifeng; Chen, Xiaodong; Bao, Jing; Yu, Daoyin
2008-03-01
Medical ultrasonic Doppler imaging is one of the most important domains of modern medical imaging technology. Applying coded excitation technology in a medical ultrasonic Doppler imaging system offers higher SNR and deeper penetration depth than a conventional pulse-echo imaging system; it also improves image quality and enhances sensitivity to weak signals. Furthermore, a properly chosen excitation code benefits the received spectrum of the Doppler signal. This paper first surveys the application of coded excitation technology in medical ultrasonic Doppler imaging systems, showing its advantages and promise, and then introduces the principles and theory of coded excitation. Next, we compare several code sequences (including chirp and fake chirp signals, Barker codes, Golay complementary sequences, M-sequences, etc.). Considering mainlobe width, range sidelobe level, signal-to-noise ratio, and Doppler-signal sensitivity, we choose Barker codes as the code sequence. Finally, we design the coded excitation circuit. The results in B-mode imaging and Doppler flow measurement matched our expectations, demonstrating the advantages of applying coded excitation technology in the Digital Medical Ultrasonic Doppler Endoscope Imaging System.
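The sidelobe criterion behind the Barker choice can be checked numerically: the aperiodic autocorrelation of the 13-bit Barker code has a mainlobe of 13 and sidelobes no larger than 1, i.e. a peak sidelobe level near -22.3 dB. This sketch computes those figures directly (it illustrates the pulse-compression property only, not the excitation hardware).

```python
import math

barker13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]
n = len(barker13)

def acorr(code, lag):
    """Aperiodic autocorrelation at a nonnegative lag."""
    return sum(code[i] * code[i + lag] for i in range(len(code) - lag))

mainlobe = acorr(barker13, 0)                                   # 13
peak_sidelobe = max(abs(acorr(barker13, k)) for k in range(1, n))  # 1
sidelobe_db = 20 * math.log10(peak_sidelobe / mainlobe)            # about -22.3 dB
print(mainlobe, peak_sidelobe, round(sidelobe_db, 1))
```

The 13:1 mainlobe-to-sidelobe ratio is what lets a long, low-peak-power coded pulse be compressed into an effectively short pulse after matched filtering.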
Single-shot secure quantum network coding on butterfly network with free public communication
NASA Astrophysics Data System (ADS)
Owari, Masaki; Kato, Go; Hayashi, Masahito
2018-01-01
Quantum network coding on the butterfly network has been studied as a typical example of a quantum multiple-cast network. We propose a secure quantum network code for the butterfly network with free public classical communication in the multiple-unicast setting, under restricted eavesdropper power. This protocol transmits quantum states faithfully when there is no attack. We also show secrecy, with shared randomness as an additional resource, when the eavesdropper wiretaps one of the channels in the butterfly network and also obtains the information sent through public classical communication. Our protocol does not require a verification process, which ensures single-shot security.
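For readers unfamiliar with the butterfly network, the classical version of the coding trick (the quantum protocol above is more involved) can be shown in a few lines: the shared bottleneck link carries only the XOR of the two source bits, yet each sink recovers both messages.

```python
def butterfly(b1, b2):
    """Classical butterfly network: two sources, one shared bottleneck link."""
    bottleneck = b1 ^ b2             # single coded bit through the shared link
    sink1 = (b1, bottleneck ^ b1)    # receives b1 directly, decodes b2
    sink2 = (bottleneck ^ b2, b2)    # receives b2 directly, decodes b1
    return sink1, sink2

# Both sinks obtain both bits for every input combination.
for b1 in (0, 1):
    for b2 in (0, 1):
        s1, s2 = butterfly(b1, b2)
        assert s1 == (b1, b2) and s2 == (b1, b2)
```

Without coding, the bottleneck would have to carry both bits; with it, one coded bit suffices.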
Turi, Bruna Camilo; Codogno, Jamile S; Fernandes, Romulo A; Sui, Xuemei; Lavie, Carl J; Blair, Steven N; Monteiro, Henrique Luiz
2015-11-01
Hypertension is one of the most common noncommunicable diseases worldwide, and physical inactivity is a risk factor predisposing to its occurrence and complications. However, the association between physical inactivity domains and hypertension remains unclear, especially in public healthcare systems. Thus, this study aimed to investigate the association between the aggregation of physical inactivity across different domains and the prevalence of hypertension among users of the Brazilian public health system. The sample comprised 963 participants. Subjects were divided into quartile groups according to 3 different domains of physical activity (occupational; physical exercises; and leisure-time and transportation). Hypertension status was based on physician diagnosis. Physical inactivity in the occupational domain was significantly associated with higher prevalence of hypertension (OR = 1.52 [1.05 to 2.21]). The same pattern occurred for physical inactivity in leisure time (OR = 1.63 [1.11 to 2.39]) and for the aggregation of physical inactivity in all 3 domains (OR = 2.46 [1.14 to 5.32]). In the multivariate-adjusted model, the significant association with hypertension remained for physical inactivity in all 3 domains (OR = 2.57 [1.14 to 5.79]). The results suggest an unequal prevalence of hypertension according to physical inactivity across different domains; increased promotion of physical activity within the healthcare system is needed.
Identification and characterization of a novel zebrafish (Danio rerio) pentraxin-carbonic anhydrase.
Patrikainen, Maarit S; Tolvanen, Martti E E; Aspatwar, Ashok; Barker, Harlan R; Ortutay, Csaba; Jänis, Janne; Laitaoja, Mikko; Hytönen, Vesa P; Azizi, Latifeh; Manandhar, Prajwol; Jáger, Edit; Vullo, Daniela; Kukkurainen, Sampo; Hilvo, Mika; Supuran, Claudiu T; Parkkila, Seppo
2017-01-01
Carbonic anhydrases (CAs) are ubiquitous, essential enzymes which catalyze the conversion of carbon dioxide and water to bicarbonate and H+ ions. Vertebrate genomes generally contain gene loci for 15-21 different CA isoforms, three of which are enzymatically inactive. CA VI is the only secretory protein of the enzymatically active isoforms. We discovered that non-mammalian CA VI contains a C-terminal pentraxin (PTX) domain, a novel combination for both CAs and PTXs. We isolated and sequenced zebrafish (Danio rerio) CA VI cDNA, complete with the sequence coding for the PTX domain, and produced the recombinant CA VI-PTX protein. Enzymatic activity and kinetic parameters were measured with a stopped-flow instrument. Mass spectrometry, analytical gel filtration and dynamic light scattering were used for biophysical characterization. Sequence analyses and Bayesian phylogenetics were used in generating hypotheses of protein structure and CA VI gene evolution. A CA VI-PTX antiserum was produced, and the expression of CA VI protein was studied by immunohistochemistry. A knock-down zebrafish model was constructed, and larvae were observed up to five days post-fertilization (dpf). The expression of ca6 mRNA was quantitated by qRT-PCR at different developmental times in morphant and wild-type larvae and in different adult fish tissues. Finally, the swimming behavior of the morphant fish was compared to that of wild-type fish. The recombinant enzyme has a very high carbonate dehydratase activity. Sequencing confirms a 530-residue protein identical to one of the predicted proteins in the Ensembl database (ensembl.org). The protein is pentameric in solution, as studied by gel filtration and light scattering, presumably joined by the PTX domains. Mass spectrometry confirms the predicted signal peptide cleavage and disulfides, and N-glycosylation in two of the four observed glycosylation motifs.
Molecular modeling of the pentamer is consistent with the modifications observed in mass spectrometry. Phylogenetics and sequence analyses provide a consistent hypothesis of the evolutionary history of domains associated with CA VI in mammals and non-mammals. Briefly, the evidence suggests that ancestral CA VI was a transmembrane protein, the exon coding for the cytoplasmic domain was replaced by one coding for PTX domain, and finally, in the therian lineage, the PTX-coding exon was lost. We knocked down CA VI expression in zebrafish embryos with antisense morpholino oligonucleotides, resulting in phenotype features of decreased buoyancy and swim bladder deflation in 4 dpf larvae. These findings provide novel insights into the evolution, structure, and function of this unique CA form.
Development and validation of a public attitudes toward epilepsy (PATE) scale.
Lim, Kheng-Seang; Wu, Cathie; Choo, Wan-Yuen; Tan, Chong-Tin
2012-06-01
A quantitative scale of public attitudes toward epilepsy is essential to determine the magnitude of social stigma against epilepsy. This study aims to develop and validate a cross-culturally applicable scale of public attitudes toward epilepsy. A set of questions was selected from questionnaires identified in a literature review, following which a panel review determined the final version, consisting of 18 items. A 1-5 Likert scale was used for scoring. Additional questions, related to perception of the productivity of people with epilepsy and to a modified epilepsy stigma scale, were added as part of construct validation. One hundred and thirty respondents were recruited from a heterogeneous population spanning various age groups, ethnicities, and occupational statuses. After item and factor analyses, the final version consisted of 14 items. Psychometric properties of the scale were first determined using factor analysis, which revealed a general and a personal domain, with good internal consistency (Cronbach's coefficients 0.868 and 0.633, respectively). Construct validity was demonstrated. The mean score for the personal domain was higher than that for the general domain (2.72±0.56 and 2.09±0.59, respectively). The mean scores of those with tertiary education were significantly lower for the general domain, but not for the personal domain. Age was positively correlated with the mean scores in the personal domain, but not in the general domain. This is a reliable and valid scale to assess public attitudes toward epilepsy, in both the general and personal domains. Copyright © 2012 Elsevier Inc. All rights reserved.
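The internal-consistency statistic quoted above can be computed directly: Cronbach's alpha is k/(k-1) · (1 - Σ item variances / variance of total scores). A minimal sketch follows; the 1-5 Likert responses are made up for illustration, not the study's data.

```python
def variance(xs):
    """Population variance (the 1/n vs. 1/(n-1) choice cancels in alpha)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: one list of respondent scores per scale item."""
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]        # per-respondent totals
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

items = [  # rows = items, columns = five hypothetical respondents
    [2, 3, 4, 2, 5],
    [3, 3, 4, 2, 4],
    [2, 4, 5, 1, 5],
]
alpha = cronbach_alpha(items)
```

Values near 0.9, like the general domain's 0.868, indicate that the items move together and measure a common construct.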
48 CFR 2301.105-1 - Publication and code ar-rangement.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Publication and code ar-rangement. 2301.105-1 Section 2301.105-1 Federal Acquisition Regulations System SOCIAL SECURITY ADMINISTRATION GENERAL SOCIAL SECURITY ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance 2301.105-1...
41 CFR 101-30.403-2 - Management codes.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Management codes. 101-30.403-2 Section 101-30.403-2 Public Contracts and Property Management Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30...
pycola: N-body COLA method code
NASA Astrophysics Data System (ADS)
Tassev, Svetlin; Eisenstein, Daniel J.; Wandelt, Benjamin D.; Zaldarriaga, Matias
2015-09-01
pycola is a multithreaded Python/Cython N-body code, implementing the Comoving Lagrangian Acceleration (COLA) method in the temporal and spatial domains, which trades accuracy at small scales to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating the large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing. The COLA method achieves its speed by calculating the large-scale dynamics exactly using LPT while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos.
Improvement of Mishchenko's T-matrix code for absorbing particles.
Moroz, Alexander
2005-06-10
The use of Gaussian elimination with backsubstitution for matrix inversion in scattering theories is discussed. Within the framework of the T-matrix method (the state-of-the-art code by Mishchenko is freely available at http://www.giss.nasa.gov/~crmim), it is shown that the domain of applicability of Mishchenko's FORTRAN 77 (F77) code can be substantially expanded in the direction of strongly absorbing particles, where the current code fails to converge. Such an extension is especially important if the code is to be used in nanoplasmonic or nanophotonic applications involving metallic particles. At the same time, convergence can also be achieved for large nonabsorbing particles, in which case the non-Numerical Algorithms Group option of Mishchenko's code diverges. An F77 implementation of Mishchenko's code supplemented with Gaussian elimination with backsubstitution is freely available at http://www.wave-scattering.com.
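For readers unfamiliar with the linear-solver strategy named here, this is a minimal sketch of Gaussian elimination with partial pivoting followed by backsubstitution (in Python rather than the paper's F77, and without the T-matrix machinery): it reduces the augmented matrix to upper-triangular form, then solves from the last row upward.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with backsubstitution."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]        # augmented matrix
    for col in range(n):
        # Partial pivoting: bring the largest remaining entry to the diagonal.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):                     # eliminate below pivot
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):                      # backsubstitution
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

x = solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])
```

Applied column by column to the identity, the same routine yields a matrix inverse, which is the use case discussed in the abstract.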
Flowers, Natalie L
2010-01-01
CodeSlinger is a desktop application that was developed to aid medical professionals in the intertranslation, exploration, and use of biomedical coding schemes. The application was designed to provide a highly intuitive, easy-to-use interface that simplifies a complex business problem: a set of time-consuming, laborious tasks that were regularly performed by a group of medical professionals involving manually searching coding books, searching the Internet, and checking documentation references. A workplace observation session with a target user revealed the details of the current process and a clear understanding of the business goals of the target user group. These goals drove the design of the application's interface, which centers on searches for medical conditions and displays the codes found in the application's database that represent those conditions. The interface also allows the exploration of complex conceptual relationships across multiple coding schemes.
Izzi, Stephanie A; Colantuono, Bonnie J; Sullivan, Kelly; Khare, Parul; Meedel, Thomas H
2013-04-15
Ci-MRF is the sole myogenic regulatory factor (MRF) of the ascidian Ciona intestinalis, an invertebrate chordate. In order to investigate its properties we developed a simple in vivo assay based on misexpressing Ci-MRF in the notochord of Ciona embryos. We used this assay to examine the roles of three structural motifs that are conserved among MRFs: an alanine-threonine (Ala-Thr) dipeptide of the basic domain that is known in vertebrates as the myogenic code, a cysteine/histidine-rich (C/H) domain found just N-terminal to the basic domain, and a carboxy-terminal amphipathic α-helix referred to as Helix III. We show that the Ala-Thr dipeptide is necessary for normal Ci-MRF function, and that while eliminating the C/H domain or Helix III individually has no demonstrable effect on Ci-MRF, simultaneous loss of both motifs significantly reduces its activity. Our studies also indicate that direct interaction between Ci-MRF and an essential E-box of Ciona Troponin I is required for the expression of this muscle-specific gene and that multiple classes of MRF-regulated genes exist in Ciona. These findings are consistent with substantial conservation of MRF-directed myogenesis in chordates and demonstrate for the first time that the Ala/Thr dipeptide of the basic domain of an invertebrate MRF behaves as a myogenic code. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Garcia Cartagena, Edgardo Javier; Santoni, Christian; Ciri, Umberto; Iungo, Giacomo Valerio; Leonardi, Stefano
2015-11-01
A large-scale wind farm operating under realistic atmospheric conditions is studied by coupling meso-scale and micro-scale models. For this purpose, the Weather Research and Forecasting (WRF) model is coupled with an in-house LES solver for wind farms. The code is based on a finite-difference scheme with Runge-Kutta time advancement, a fractional-step method, and the Actuator Disk Model. The WRF model has been configured using seven one-way nested domains, where each child domain has a mesh size one third of its parent domain. A horizontal resolution of 70 m is used in the innermost domain. A section from the smallest and finest nested domain, 7.5 diameters upwind of the wind farm, is used as the inlet boundary condition for the LES code. The wind farm consists of six turbines aligned with the mean wind direction, with streamwise spacing of 10 rotor diameters (D) and spanwise spacing of 2.75D. Three simulations were performed by varying the velocity fluctuations at the inlet: random perturbations, a precursor simulation, and the recycling perturbation method. Results are compared with a simulation of the same wind farm with an ideal uniform wind speed to assess the importance of the time-varying incoming wind velocity. Numerical simulations were performed at TACC (Grant CTS070066). This work was supported by NSF (Grant IIA-1243482 WINDINSPIRE).
Partially Key Distribution with Public Key Cryptosystem Based on Error Control Codes
NASA Astrophysics Data System (ADS)
Tavallaei, Saeed Ebadi; Falahati, Abolfazl
Motivated by the weakening security of public key cryptosystems based on number theory, and by fundamental difficulties such as "key escrow" in Public Key Infrastructure (PKI) and the need for a secure channel in ID-based cryptography, a new key distribution cryptosystem based on Error Control Codes (ECC) is proposed. The idea is realized through a modification of the McEliece cryptosystem. The security of the ECC cryptosystem derives from the NP-completeness of decoding general block codes. Using ECC also provides the capability of generating public keys of variable length, suitable for different applications. As the security of number-theory-based cryptosystems decreases and their key lengths grow, the use of such code-based cryptosystems seems likely to become unavoidable.
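The core McEliece idea, that a ciphertext is a codeword deliberately corrupted by an error only the legitimate decoder can remove, can be illustrated with a toy Hamming(7,4) code. This sketch omits the scrambling and permutation matrices of real McEliece (and a Hamming code is trivially decodable), so it is NOT secure; it only shows the encrypt-by-error mechanism.

```python
import random

# Systematic Hamming(7,4): G = [I4 | P], H = [P^T | I3].
P = [[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]]
G = [[int(i == j) for j in range(4)] + P[i] for i in range(4)]
H = [[P[j][i] for j in range(4)] + [int(i == j) for j in range(3)]
     for i in range(3)]

def encode(m):
    return [sum(m[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

def encrypt(m):
    c = encode(m)
    c[random.randrange(7)] ^= 1        # add a secret single-bit error
    return c

def decrypt(c):
    syn = tuple(sum(H[i][j] * c[j] for j in range(7)) % 2 for i in range(3))
    cols = {tuple(H[i][j] for i in range(3)): j for j in range(7)}
    if syn in cols:                    # nonzero syndrome locates the error bit
        c = c[:]
        c[cols[syn]] ^= 1
    return c[:4]                       # systematic code: message = first 4 bits

random.seed(1)
m = [1, 0, 1, 1]
recovered = decrypt(encrypt(m))
```

In real McEliece the public key hides the code's structure, so an attacker faces the general (NP-hard) decoding problem while the key holder decodes efficiently.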
Code of Federal Regulations, 2010 CFR
2010-07-01
... land withdrawn or reserved from the public domain and determines that it no longer needs this land, what must it do? 102-75.100 Section 102-75.100 Public Contracts and Property Management Federal... it no longer needs this land, what must it do? An agency holding unneeded land withdrawn or reserved...
Stotz, Henrik U; Harvey, Pascoe J; Haddadi, Parham; Mashanova, Alla; Kukol, Andreas; Larkan, Nicholas J; Borhan, M Hossein; Fitt, Bruce D L
2018-01-01
Genes coding for nucleotide-binding leucine-rich repeat (LRR) receptors (NLRs) control resistance against intracellular (cell-penetrating) pathogens. However, evidence for a role of genes coding for proteins with LRR domains in resistance against extracellular (apoplastic) fungal pathogens is limited. Here, the distribution of genes coding for proteins with eLRR domains but lacking kinase domains was determined for the Brassica napus genome. Predictions of signal peptide and transmembrane regions divided these genes into 184 coding for receptor-like proteins (RLPs) and 121 coding for secreted proteins (SPs). Together with previously annotated NLRs, a total of 720 LRR genes were found. Leptosphaeria maculans-induced expression during a compatible interaction with cultivar Topas differed between RLP, SP and NLR gene families; NLR genes were induced relatively late, during the necrotrophic phase of pathogen colonization. Seven RLP, one SP and two NLR genes were found in Rlm1 and Rlm3/Rlm4/Rlm7/Rlm9 loci for resistance against L. maculans on chromosome A07 of B. napus. One NLR gene at the Rlm9 locus was positively selected, as was the RLP gene on chromosome A10 with LepR3 and Rlm2 alleles conferring resistance against L. maculans races with corresponding effectors AvrLm1 and AvrLm2, respectively. Known loci for resistance against L. maculans (extracellular hemi-biotrophic fungus), Sclerotinia sclerotiorum (necrotrophic fungus) and Plasmodiophora brassicae (intracellular, obligate biotrophic protist) were examined for presence of RLPs, SPs and NLRs in these regions. Whereas loci for resistance against P. brassicae were enriched for NLRs, no such signature was observed for the other pathogens. These findings demonstrate involvement of (i) NLR genes in resistance against the intracellular pathogen P. brassicae and a putative NLR gene in Rlm9-mediated resistance against the extracellular pathogen L. maculans.
NASA Astrophysics Data System (ADS)
Zhang, H.; Fang, H.; Yao, H.; Maceira, M.; van der Hilst, R. D.
2014-12-01
Recently, Zhang et al. (2014, Pure and Applied Geophysics) have developed a joint inversion code incorporating body-wave arrival times and surface-wave dispersion data. The joint inversion code was based on the regional-scale version of the double-difference tomography algorithm tomoDD. The surface-wave inversion part uses the propagator matrix solver in the algorithm DISPER80 (Saito, 1988) for forward calculation of dispersion curves from layered velocity models and the related sensitivities. The application of the joint inversion code to the SAFOD site in central California shows that the fault structure is better imaged in the new model, which is able to fit both the body-wave and surface-wave observations adequately. Here we present a new joint inversion method that solves the model in the wavelet domain constrained by sparsity regularization. Compared to the previous method, it has the following advantages: (1) The method is both data- and model-adaptive. The velocity model can be represented by different wavelet coefficients at different scales, which are generally sparse. By constraining the model wavelet coefficients to be sparse, the inversion in the wavelet domain can inherently adapt to the data distribution so that the model has higher spatial resolution in zones of good data coverage. Fang and Zhang (2014, Geophysical Journal International) have shown the superior performance of the wavelet-based double-difference seismic tomography method compared to the conventional method. (2) For the surface-wave inversion, the joint inversion code takes advantage of the recent development of direct inversion of surface-wave dispersion data for 3-D variations of shear-wave velocity without the intermediate step of phase or group velocity maps (Fang et al., 2014, Geophysical Journal International). A fast marching method is used to compute, at each period, surface-wave traveltimes and ray paths between sources and receivers.
We will test the new joint inversion code at the SAFOD site to compare its performance with that of the previous code. We will also select another fault zone, such as the San Jacinto Fault Zone, to better image its structure.
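The sparsity constraint in the wavelet domain can be illustrated with a minimal pure-Python sketch: a single-level orthonormal Haar transform of a 1-D "velocity model", soft thresholding of the detail coefficients, and reconstruction. The values and the threshold here are invented for illustration; the actual method operates on 3-D models with multi-scale wavelets.

```python
import math

def haar_forward(x):
    # One level of the orthonormal Haar transform: averages capture the
    # smooth part of the model, details capture fine-scale structure.
    s = 1.0 / math.sqrt(2.0)
    approx = [(a + b) * s for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) * s for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    s = 1.0 / math.sqrt(2.0)
    x = []
    for a, d in zip(approx, detail):
        x.extend([(a + d) * s, (a - d) * s])
    return x

def soft_threshold(coeffs, lam):
    # Sparsity regularization: shrink coefficients toward zero and
    # zero out those with magnitude below the threshold lam.
    return [math.copysign(max(abs(c) - lam, 0.0), c) for c in coeffs]

# A toy model with a sharp anomaly riding on a smooth trend (km/s).
model = [3.0, 3.0, 3.1, 3.1, 4.5, 4.6, 3.2, 3.2]
approx, detail = haar_forward(model)
recovered = haar_inverse(approx, soft_threshold(detail, 0.05))
```

The smooth parts of the model produce zero detail coefficients, so only the coefficients describing the anomaly survive thresholding: the representation adapts its resolution to where structure (and data coverage) exists.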
Public Domain Microcomputer Software for Forestry.
ERIC Educational Resources Information Center
Martin, Les
A project was conducted to develop a computer forestry/forest products bibliography applicable to high school and community college vocational/technical programs. The project director contacted curriculum clearinghouses, computer companies, and high school and community college instructors in order to obtain listings of public domain programs for…
41 CFR 102-173.35 - Who authorizes domain names?
Code of Federal Regulations, 2012 CFR
2012-01-01
... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false Who authorizes domain names? 102-173.35 Section 102-173.35 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION TELECOMMUNICATIONS 173-INTERNET GOV...
41 CFR 102-173.35 - Who authorizes domain names?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false Who authorizes domain names? 102-173.35 Section 102-173.35 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION TELECOMMUNICATIONS 173-INTERNET GOV...
41 CFR 102-173.35 - Who authorizes domain names?
Code of Federal Regulations, 2014 CFR
2014-01-01
... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false Who authorizes domain names? 102-173.35 Section 102-173.35 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION TELECOMMUNICATIONS 173-INTERNET GOV...
41 CFR 102-173.35 - Who authorizes domain names?
Code of Federal Regulations, 2011 CFR
2011-01-01
... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Who authorizes domain names? 102-173.35 Section 102-173.35 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION TELECOMMUNICATIONS 173-INTERNET GOV...
41 CFR 102-173.35 - Who authorizes domain names?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Who authorizes domain names? 102-173.35 Section 102-173.35 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION TELECOMMUNICATIONS 173-INTERNET GOV...
Green, Nancy
2005-04-01
We developed a Bayesian network coding scheme for annotating biomedical content in layperson-oriented clinical genetics documents. The coding scheme supports the representation of probabilistic and causal relationships among concepts in this domain, at a high enough level of abstraction to capture commonalities among genetic processes and their relationship to health. We are using the coding scheme to annotate a corpus of genetic counseling patient letters as part of the requirements analysis and knowledge acquisition phase of a natural language generation project. This paper describes the coding scheme and presents an evaluation of intercoder reliability for its tag set. In addition to giving examples of use of the coding scheme for analysis of discourse and linguistic features in this genre, we suggest other uses for it in analysis of layperson-oriented text and dialogue in medical communication.
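The abstract reports an evaluation of intercoder reliability for the tag set without naming the statistic here; a common chance-corrected choice for two coders is Cohen's kappa, sketched below with invented tag sequences.

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    # Chance-corrected agreement between two coders over the same items.
    # (Undefined in the degenerate case where expected agreement is 1.)
    n = len(coder1)
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    c1, c2 = Counter(coder1), Counter(coder2)
    expected = sum(c1[t] * c2[t] for t in set(c1) | set(c2)) / (n * n)
    return (observed - expected) / (1 - expected)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance given each coder's tag frequencies.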
Optical LDPC decoders for beyond 100 Gbits/s optical transmission.
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2009-05-01
We present an optical low-density parity-check (LDPC) decoder suitable for implementation above 100 Gbits/s, which provides large coding gains when based on large-girth LDPC codes. We show that a basic building block, the probabilities multiplier circuit, can be implemented using a Mach-Zehnder interferometer, and we propose a corresponding probabilistic-domain sum-product algorithm (SPA). We perform simulations of a fully parallel implementation employing girth-10 LDPC codes and the proposed SPA. The girth-10 LDPC(24015,19212) code of rate 0.8 outperforms the BCH(128,113)×BCH(256,239) turbo-product code of rate 0.82 by 0.91 dB (for binary phase-shift keying at 100 Gbits/s and a bit error rate of 10^-9), and provides a net effective coding gain of 10.09 dB.
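The probability-domain check-node rule that the probabilities-multiplier circuit evaluates can be sketched in software (the paper's contribution is the optical implementation; this is only the arithmetic it realizes). For an even-parity check, the extrinsic probability that bit i is 1, given the other bits' probabilities p_j, is (1 - prod_j (1 - 2 p_j)) / 2.

```python
def check_node_update(p_in):
    # Probability-domain sum-product check-node rule: for each bit,
    # multiply (1 - 2p) over all *other* bits connected to the check.
    out = []
    for i in range(len(p_in)):
        prod = 1.0
        for j, p in enumerate(p_in):
            if j != i:
                prod *= 1.0 - 2.0 * p
        out.append((1.0 - prod) / 2.0)
    return out
```

For example, if one bit on a two-bit check is surely 1 and the other surely 0, the rule says each bit must equal the other to satisfy even parity.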
1999-01-01
Some means currently under investigation include domain-specific languages which are easy to check (e.g., PLAN), proof-carrying code [NL96, Nec97...domain-specific language coupled to an extension system with heavyweight checks. In this way, the frequent (per-packet) dynamic checks are inexpensive...to CISC architectures remains problematic. Typed assembly language [MWCG98] propagates type safety information to the assembly language level, so
Domain Specific Language Support for Exascale. Final Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baden, Scott
The project developed a domain-specific translator to enable legacy MPI source code to tolerate communication delays, which are increasing over time due to technological factors. The translator performs source-to-source translation that incorporates semantic information into the translation process. The output of the translator is a C program that runs as a data-driven program and uses an existing runtime to overlap communication automatically.
2010-01-01
Background Intragenic tandem repeats occur throughout all domains of life and impart functional and structural variability to diverse translation products. Repeat proteins confer distinctive surface phenotypes to many unicellular organisms, including those with minimal genomes such as the wall-less bacterial monoderms, Mollicutes. One such repeat pattern in this clade is distributed in a manner suggesting its exchange by horizontal gene transfer (HGT). Expanding genome sequence databases reveal the pattern in a widening range of bacteria, and recently among eucaryotic microbes. We examined the genomic flux and consequences of the motif by determining its distribution, predicted structural features and association with membrane-targeted proteins. Results Using a refined hidden Markov model, we document a 25-residue protein sequence motif tandemly arrayed in variable-number repeats in ORFs lacking assigned functions. It appears sporadically in unicellular microbes from disparate bacterial and eucaryotic clades, representing diverse lifestyles and ecological niches that include host parasitic, marine and extreme environments. Tracts of the repeats predict a malleable configuration of recurring domains, with conserved hydrophobic residues forming an amphipathic secondary structure in which hydrophilic residues endow extensive sequence variation. Many ORFs with these domains also have membrane-targeting sequences that predict assorted topologies; others may comprise reservoirs of sequence variants. We demonstrate expressed variants among surface lipoproteins that distinguish closely related animal pathogens belonging to a subgroup of the Mollicutes. DNA sequences encoding the tandem domains display dyad symmetry. Moreover, in some taxa the domains occur in ORFs selectively associated with mobile elements. 
These features, a punctate phylogenetic distribution, and different patterns of dispersal in genomes of related taxa, suggest that the repeat may be disseminated by HGT and intra-genomic shuffling. Conclusions We describe novel features of PARCELs (Palindromic Amphipathic Repeat Coding ELements), a set of widely distributed repeat protein domains and coding sequences that were likely acquired through HGT by diverse unicellular microbes, further mobilized and diversified within genomes, and co-opted for expression in the membrane proteome of some taxa. Disseminated by multiple gene-centric vehicles, ORFs harboring these elements enhance accessory gene pools as part of the "mobilome" connecting genomes of various clades, in taxa sharing common niches. PMID:20626840
Universal Ethics Code: Both Possible and Feasible.
ERIC Educational Resources Information Center
Kruckeberg, Dean
1993-01-01
Argues that no insurmountable barriers preclude the development of a binding code within the public relations professional community. Suggests a professional model similar to that used by Certified Public Accountants as more appropriate, because it recognizes that not all the activities of its practitioners can be exclusionary and limited to those…
41 CFR 102-33.375 - What is a FSCAP Criticality Code?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What is a FSCAP Criticality Code? 102-33.375 Section 102-33.375 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION PERSONAL PROPERTY 33-MANAGEMENT OF...
Internal Corrosion Control of Water Supply Systems Code of Practice
This Code of Practice is part of a series of publications by the IWA Specialist Group on Metals and Related Substances in Drinking Water. It complements the following IWA Specialist Group publications: 1. Best Practice Guide on the Control of Lead in Drinking Water 2. Best Prac...
41 CFR 102-36.240 - What are the disposal condition codes?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What are the disposal condition codes? 102-36.240 Section 102-36.240 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION PERSONAL PROPERTY 36-DISPOSITION OF...
The search for person-related information in general practice: a qualitative study.
Schrans, Diego; Avonts, Dirk; Christiaens, Thierry; Willems, Sara; de Smet, Kaat; van Boven, Kees; Boeckxstaens, Pauline; Kühlein, Thomas
2016-02-01
General practice is person-focused. Contextual information influences the clinical decision-making process in primary care. Currently, person-related information (PeRI) is neither recorded in a systematic way nor coded in the electronic medical record (EMR), and is therefore not usable for scientific purposes. Our objective was to search for classes of PeRI that influence the process of care. GPs from nine countries worldwide were asked to write down narrative case histories in which personal factors played a role in decision-making. In an inductive process, the case histories were consecutively coded according to classes of PeRI. The classes found were deductively applied to the following cases and refined until saturation was reached. The classes were then grouped into code-families and further clustered into domains. The inductive analysis of 32 case histories resulted in 33 defined PeRI codes, classifying all person-related information in the cases. The 33 codes were grouped into the following seven mutually exclusive code-families: 'aspects between patient and formal care provider', 'social environment and family', 'functioning/behaviour', 'life history/non-medical experiences', 'personal medical information', 'socio-demographics' and 'work-/employment-related information'. The code-families were clustered into four domains: 'social environment and extended family', 'medicine', 'individual' and 'work and employment'. As PeRI is used in the process of decision-making, it should be part of the EMR. The PeRI classes we identified might form the basis of a new contextual classification, mainly for research purposes. This might help to create evidence of the person-centredness of general practice. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Public health legal preparedness in Indian country.
Bryan, Ralph T; Schaefer, Rebecca McLaughlin; DeBruyn, Lemyra; Stier, Daniel D
2009-04-01
American Indian/Alaska Native tribal governments are sovereign entities with inherent authority to create laws and enact health regulations. Laws are an essential tool for ensuring effective public health responses to emerging threats. To analyze how tribal laws support public health practice in tribal communities, we reviewed tribal legal documentation available through online databases and talked with subject-matter experts in tribal public health law. Of the 70 tribal codes we found, 14 (20%) had no clearly identifiable public health provisions. The public health-related statutes within the remaining codes were rarely well integrated or comprehensive. Our findings provide an evidence base to help tribal leaders strengthen public health legal foundations in tribal communities.
Oya, Eriko; Kato, Hiroaki; Chikashige, Yuji; Tsutsumi, Chihiro; Hiraoka, Yasushi; Murakami, Yota
2013-01-01
Heterochromatin at the pericentromeric repeats in fission yeast is assembled and spread by an RNAi-dependent mechanism, which is coupled with the transcription of non-coding RNA from the repeats by RNA polymerase II. In addition, Rrp6, a component of the nuclear exosome, also contributes to heterochromatin assembly and is coupled with non-coding RNA transcription. The multi-subunit complex Mediator, which directs initiation of RNA polymerase II-dependent transcription, has recently been suggested to function after initiation in processes such as elongation of transcription and splicing. However, the role of Mediator in the regulation of chromatin structure is not well understood. We investigated the role of Mediator in pericentromeric heterochromatin formation and found that deletion of specific subunits of the head domain of Mediator compromised heterochromatin structure. The Mediator head domain was required for Rrp6-dependent heterochromatin nucleation at the pericentromere and for RNAi-dependent spreading of heterochromatin into the neighboring region. In the latter process, Mediator appeared to contribute to efficient processing of siRNA from transcribed non-coding RNA, which was required for efficient spreading of heterochromatin. Furthermore, the head domain directed efficient transcription in heterochromatin. These results reveal a pivotal role for Mediator in multiple steps of transcription-coupled formation of pericentromeric heterochromatin. This observation further extends the role of Mediator to co-transcriptional chromatin regulation.
Noussa-Yao, Joseph; Heudes, Didier; Escudie, Jean-Baptiste; Degoulet, Patrice
2016-01-01
Short-stay MSO (Medicine, Surgery, Obstetrics) hospitalization activities in public hospitals, and in private hospitals providing public services, are funded through charges for the services provided (T2A in French). Coding must be well matched to the severity of the patient's condition to ensure that appropriate funding is provided to the hospital. We propose the use of an autocompletion process and a multidimensional matrix to help physicians improve the expression of information and optimize clinical coding. With this approach, physicians without knowledge of the encoding rules begin from a rough concept, which is gradually refined through semantic proximity and information on associated codes drawn from optimized knowledge bases of diagnosis codes.
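A minimal sketch of the autocompletion idea follows, using a hypothetical mini-dictionary of diagnosis labels and codes and simple word-prefix matching. The real system refines a rough concept by semantic proximity over an optimized knowledge base, not prefix matching, so both the dictionary and the matching rule here are illustrative assumptions.

```python
# Hypothetical mini-dictionary of diagnosis labels to codes; a production
# system would draw on an optimized knowledge base of diagnosis codes.
codes = {
    "pneumonia, bacterial": "J15.9",
    "pneumonia, viral": "J12.9",
    "pneumothorax": "J93.9",
    "anemia, iron deficiency": "D50.9",
}

def autocomplete(fragment, limit=5):
    # Refine a rough concept: return labels containing a word that
    # starts with the typed fragment, alphabetically, capped at limit.
    frag = fragment.lower()
    hits = [(label, code) for label, code in sorted(codes.items())
            if any(w.startswith(frag) for w in label.replace(",", "").split())]
    return hits[:limit]
```

Typing a rough stem such as "pneumo" surfaces both pneumonia variants and pneumothorax, letting the physician narrow toward the code without knowing the encoding rules.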
Error-correcting pairs for a public-key cryptosystem
NASA Astrophysics Data System (ADS)
Pellikaan, Ruud; Márquez-Corbella, Irene
2017-06-01
Code-based Cryptography (CBC) is a powerful and promising alternative for quantum-resistant cryptography. Indeed, together with lattice-based cryptography, multivariate cryptography and hash-based cryptography, it is one of the principal available techniques for post-quantum cryptography. CBC was first introduced by McEliece, who designed one of the most efficient public-key encryption schemes, with exceptionally strong security guarantees and other desirable properties, which still resists attacks based on the Quantum Fourier Transform and Amplitude Amplification. The original proposal, which remains unbroken, was based on binary Goppa codes. Later, several families of codes have been proposed in order to reduce the key size. Some of these alternatives have already been broken. One of the main requirements of a code-based cryptosystem is having a high-performance t-bounded decoding algorithm, which is achieved when the code has a t-error-correcting pair (ECP). Indeed, those McEliece schemes that use GRS codes, BCH, Goppa and algebraic geometry codes are in fact using an error-correcting pair as a secret key. That is, the security of these public-key cryptosystems is based not only on the inherent intractability of bounded distance decoding but also on the assumption that it is difficult to efficiently retrieve an error-correcting pair. In this paper, the class of codes with a t-ECP is proposed for the McEliece cryptosystem. Moreover, we study the hardness of distinguishing arbitrary codes from those having a t-error-correcting pair.
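The McEliece principle, encrypt by adding t random errors to a word of a public code, decrypt by using hidden structure that makes t-bounded decoding easy, can be illustrated with a deliberately insecure toy built on the [7,4] Hamming code (t = 1). This sketch omits the scrambling matrix of the real scheme, and a permuted Hamming code offers no actual security; it only shows the mechanics.

```python
import random

# Systematic generator and parity-check matrices of the binary [7,4]
# Hamming code, which corrects any single bit error (t = 1).
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def encode(m, gen):
    return [sum(m[i] * gen[i][j] for i in range(4)) % 2 for j in range(7)]

def keygen(rng):
    # Secret key: a random column permutation; public key: G with its
    # columns shuffled so the easy decoding structure is hidden.
    perm = list(range(7))
    rng.shuffle(perm)
    g_pub = [[row[perm[j]] for j in range(7)] for row in G]
    return g_pub, perm

def encrypt(m, g_pub, rng):
    # Ciphertext: a public-code word plus one deliberate random error.
    y = encode(m, g_pub)
    y[rng.randrange(7)] ^= 1
    return y

def decrypt(y, perm):
    # Undo the permutation, then syndrome-decode the single error:
    # the syndrome equals the column of H at the error position.
    c = [0] * 7
    for j in range(7):
        c[perm[j]] = y[j]
    syndrome = [sum(H[r][j] * c[j] for j in range(7)) % 2 for r in range(3)]
    if any(syndrome):
        bad = next(j for j in range(7)
                   if [H[r][j] for r in range(3)] == syndrome)
        c[bad] ^= 1
    return c[:4]   # G is systematic, so the message bits come first

rng = random.Random(7)
g_pub, perm = keygen(rng)
message = [1, 0, 1, 1]
recovered = decrypt(encrypt(message, g_pub, rng), perm)
```

The syndrome decoder here plays the role of the t-bounded decoding algorithm that, in the schemes the paper discusses, is supplied by a secret t-error-correcting pair.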
41 CFR 101-27.205 - Shelf-life codes.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Shelf-life codes. 101-27...-Management of Shelf-Life Materials § 101-27.205 Shelf-life codes. Shelf-life items shall be identified by use of a one-digit code to provide for uniform coding of shelf-life materials by all agencies. (a) The...
41 CFR 101-27.205 - Shelf-life codes.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 41 Public Contracts and Property Management 2 2011-07-01 2007-07-01 true Shelf-life codes. 101-27...-Management of Shelf-Life Materials § 101-27.205 Shelf-life codes. Shelf-life items shall be identified by use of a one-digit code to provide for uniform coding of shelf-life materials by all agencies. (a) The...
41 CFR 101-27.205 - Shelf-life codes.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 41 Public Contracts and Property Management 2 2014-07-01 2012-07-01 true Shelf-life codes. 101-27...-Management of Shelf-Life Materials § 101-27.205 Shelf-life codes. Shelf-life items shall be identified by use of a one-digit code to provide for uniform coding of shelf-life materials by all agencies. (a) The...
41 CFR 101-27.205 - Shelf-life codes.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 41 Public Contracts and Property Management 2 2013-07-01 2012-07-01 true Shelf-life codes. 101-27...-Management of Shelf-Life Materials § 101-27.205 Shelf-life codes. Shelf-life items shall be identified by use of a one-digit code to provide for uniform coding of shelf-life materials by all agencies. (a) The...
41 CFR 101-27.205 - Shelf-life codes.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 41 Public Contracts and Property Management 2 2012-07-01 2012-07-01 false Shelf-life codes. 101-27...-Management of Shelf-Life Materials § 101-27.205 Shelf-life codes. Shelf-life items shall be identified by use of a one-digit code to provide for uniform coding of shelf-life materials by all agencies. (a) The...
Illinois, Indiana, and Ohio Magnetic and Gravity Maps and Data: A Website for Distribution of Data
Daniels, David L.; Kucks, Robert P.; Hill, Patricia L.
2008-01-01
This web site gives the results of a USGS project to acquire the best available public-domain aeromagnetic and gravity data in the United States and merge these data into uniform composite grids for each state. The results for the three states, Illinois, Indiana, and Ohio, are presented here on one site. Files of aeromagnetic and gravity grids and images for these states are available for downloading. In Illinois, Indiana, and Ohio, 19 magnetic surveys have been knit together to form a single digital grid and map, and a complete Bouguer gravity anomaly grid and map was generated from 128,227 gravity station measurements in and adjacent to the three states. In addition, a map shows the locations of the aeromagnetic surveys, color-coded by survey flight-line spacing. This project was supported by the Mineral Resource Program of the USGS.
NASA Technical Reports Server (NTRS)
Brusse, Jay
2000-01-01
The Active and Passive Supplier Assessment Programs (ASAP and PSAP) WWW sites provide general information to the electronic parts community regarding the availability of electronic parts. They also provide information to NASA regarding modifications to commonly used procurement specifications and test methods. The ASAP and PSAP WWW sites are ongoing resources produced by Code 562 in support of the NASA HQ funded NASA Electronic Parts and Packaging (NEPP) Program. These WWW sites do not provide information pertaining to patented or proprietary information. All of the information contained in these WWW sites is available through various other public domain resources, such as US Military Qualified Producers Listings (QPLs) and Qualified Manufacturer Listings (QMLs), and industry working groups such as the Electronics Industry Alliance (EIA) and the Space Parts Working Group (SPWG).
PROVAT: a tool for Voronoi tessellation analysis of protein structures and complexes.
Gore, Swanand P; Burke, David F; Blundell, Tom L
2005-08-01
Voronoi tessellation has proved to be a useful tool in protein structure analysis. We have developed PROVAT, a versatile public domain software that enables computation and visualization of Voronoi tessellations of proteins and protein complexes. It is a set of Python scripts that integrate freely available specialized software (Qhull, PyMOL, etc.) into a pipeline. The calculation component of the tool computes the Voronoi tessellation of a given protein system in a way described by a user-supplied XML recipe and stores the resulting neighbourhood information as text files in various styles. The Python pickle file generated in the process is used by the visualization component, a PyMOL plug-in, that offers a GUI to explore the tessellation visually. PROVAT source code can be downloaded from http://raven.bioc.cam.ac.uk/~swanand/Provat1, which also provides a web server for its calculation component, documentation and examples.
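The neighbourhood information PROVAT extracts can be approximated without Qhull by brute force: label points of a sampling grid by their nearest generator, and declare two generators Voronoi neighbours when adjacent samples get different labels. The 2-D "atom" coordinates below are invented for illustration; PROVAT itself works on real 3-D structures.

```python
# Hypothetical 2-D "atom" coordinates (illustration only).
atoms = {"A": (0.0, 0.0), "B": (2.0, 0.0), "C": (1.0, 2.0)}

def nearest(x, y):
    # Owner of a point = generator minimizing squared distance.
    return min(atoms, key=lambda k: (x - atoms[k][0]) ** 2 + (y - atoms[k][1]) ** 2)

# Label a grid of samples covering the generators.
n, lo, hi = 60, -1.0, 3.0
step = (hi - lo) / n
owner = [[nearest(lo + i * step, lo + j * step) for j in range(n + 1)]
         for i in range(n + 1)]

# Two atoms are Voronoi neighbours when their cells touch, i.e. when
# horizontally or vertically adjacent samples have different owners.
neighbours = set()
for i in range(n):
    for j in range(n):
        for a, b in ((owner[i][j], owner[i + 1][j]),
                     (owner[i][j], owner[i][j + 1])):
            if a != b:
                neighbours.add(frozenset((a, b)))
```

With three non-collinear generators every pair of cells shares an edge, so all three pairs appear; production code such as PROVAT computes the exact tessellation via Qhull instead of sampling.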
NASA Astrophysics Data System (ADS)
Landry, Blake J.; Hancock, Matthew J.; Mei, Chiang C.; García, Marcelo H.
2012-09-01
The ability to determine wave heights and phases along a spatial domain is vital to understanding a wide range of littoral processes. The software tool presented here employs established Stokes wave theory and sampling methods to calculate parameters for the incident and reflected components of a field of weakly nonlinear waves, monochromatic at first order in wave slope and propagating in one horizontal dimension. The software calculates wave parameters over an entire wave tank and accounts for reflection, weak nonlinearity, and a free second harmonic. Currently, no publicly available program has such functionality. The included MATLAB®-based open source code has also been compiled for Windows®, Mac® and Linux® operating systems. An additional companion program, VirtualWave, is included to generate virtual wave fields for WaveAR. Together, the programs serve as ideal analysis and teaching tools for laboratory water wave systems.
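For a monochromatic field of superposed incident and reflected waves, the amplitude envelope along the tank determines the reflection coefficient: the envelope oscillates between a_i + a_r and a_i - a_r, so R = (A_max - A_min)/(A_max + A_min) = a_r/a_i. The sketch below synthesizes such a field with invented amplitudes, wavenumber and frequency, and recovers R; it is a linear illustration only and ignores the weak nonlinearity and free second harmonic that WaveAR accounts for.

```python
import math

# Synthetic first-order wave field: incident + reflected monochromatic waves.
a_i, a_r = 1.0, 0.3      # incident and reflected amplitudes (assumed)
k, w = 2.0, 1.5          # wavenumber (rad/m) and angular frequency (rad/s)

def eta(x, t):
    # Surface elevation: incident wave travels +x, reflected wave -x.
    return a_i * math.cos(k * x - w * t) + a_r * math.cos(k * x + w * t)

# Amplitude envelope along the tank: sample each location over one period.
period = 2 * math.pi / w
xs = [i * 0.01 for i in range(700)]   # covers more than one envelope cycle
amps = [max(abs(eta(x, m * period / 200)) for m in range(200)) for x in xs]

a_max, a_min = max(amps), min(amps)
reflection = (a_max - a_min) / (a_max + a_min)   # estimate of a_r / a_i
```

The partial standing wave shows antinodes of amplitude a_i + a_r and nodes of amplitude a_i - a_r; the same envelope idea underlies multi-gauge reflection analysis in laboratory wave tanks.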
Dirty liberals! Reminders of physical cleanliness influence moral and political attitudes.
Helzer, Erik G; Pizarro, David A
2011-04-01
Many moral codes place a special emphasis on bodily purity, and manipulations that directly target bodily purity have been shown to influence a variety of moral judgments. Across two studies, we demonstrated that reminders of physical purity influence specific moral judgments regarding behaviors in the sexual domain as well as broad political attitudes. In Study 1, individuals in a public setting who were given a reminder of physical cleansing reported being more politically conservative than did individuals who were not given such a reminder. In Study 2, individuals reminded of physical cleansing in the laboratory demonstrated harsher moral judgments toward violations of sexual purity and were more likely to report being politically conservative than control participants. Together, these experiments provide further evidence of a deep link between physical purity and moral judgment, and they offer preliminary evidence that manipulations of physical purity can influence general (and putatively stable) political attitudes.
NASA Technical Reports Server (NTRS)
Banks, David C.
1994-01-01
This talk features two simple and useful tools for digital image processing in the UNIX environment: xv and pbmplus. The xv image viewer, which runs under the X window system, reads images in a number of different file formats and writes them out in different formats. The view area supports a pop-up control panel. The 'algorithms' menu lets you blur an image. The xv control panel also activates the color editor, which displays the image's color map (if one exists). The xv image viewer is available through the Internet. The pbmplus package is a set of tools designed to perform image processing from within a UNIX shell. The acronym 'pbm' stands for portable bitmap. Like xv, the pbmplus tools can convert images from and to many different file formats. The source code and manual pages for pbmplus are also available through the Internet. This software is in the public domain.
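The portable bitmap format at the heart of pbmplus is simple enough to produce by hand. Below is a minimal sketch of a plain (ASCII) PBM writer plus the kind of one-step filter that pbmplus tools chain together in shell pipelines; the file name is arbitrary.

```python
def write_pbm(path, bitmap):
    # Plain PBM: magic "P1", then width and height, then rows of
    # pixels where 1 = black and 0 = white.
    h, w = len(bitmap), len(bitmap[0])
    with open(path, "w") as f:
        f.write("P1\n%d %d\n" % (w, h))
        for row in bitmap:
            f.write(" ".join(str(b) for b in row) + "\n")

def invert(bitmap):
    # A one-step filter, analogous to a single pbmplus pipeline stage.
    return [[1 - b for b in row] for row in bitmap]

checker = [[(x + y) % 2 for x in range(4)] for y in range(4)]
write_pbm("checker.pbm", invert(checker))
```

Because the format is plain text, such files can be inspected with any editor and fed straight to pbmplus converters or viewers such as xv.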
Future perspectives - proposal for Oxford Physiome Project.
Oku, Yoshitaka
2010-01-01
The Physiome Project is an effort to understand living creatures using an "analysis by synthesis" strategy, i.e., by reproducing their behaviors. In order to achieve its goal, sharing developed models between different computer languages and application programs, so that they can be incorporated into integrated models, is critical. To date, several XML-based markup languages have been developed for this purpose. However, source code written in XML-based languages is very difficult to read and edit using text editors. An alternative is to use an object-oriented meta-language, which can be translated into different computer languages and transplanted to different application programs. Object-oriented languages are suitable for describing structural organization through hierarchical classes and for taking advantage of statistical properties to reduce the number of parameters while preserving the complexity of behaviors. Using object-oriented languages to describe each element and posting them to a public domain should be the next step in building up integrated models of the respiratory control system.
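A minimal sketch of the object-oriented idea, hierarchical classes describing structural organization, follows; the class names, parameters and values are invented for illustration and do not come from any actual Physiome model.

```python
class Compartment:
    # Base class: shared structure for all model elements, composed
    # hierarchically so aggregates can be computed over the tree.
    def __init__(self, name):
        self.name = name
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def size(self):
        # Recursive aggregate over the hierarchy.
        return 1 + sum(c.size() for c in self.children)

class Neuron(Compartment):
    # Subclass: inherits structure, adds element-specific parameters.
    def __init__(self, name, firing_rate_hz):
        super().__init__(name)
        self.firing_rate_hz = firing_rate_hz

network = Compartment("respiratory_network")
network.add(Neuron("pre-inspiratory", 5.0))
network.add(Neuron("inspiratory", 12.0))
```

Because the structure lives in ordinary class definitions rather than nested XML, it is readable in a text editor and could, in principle, be translated mechanically into other target languages.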
Overpressures in the Uinta Basin, Utah: Analysis using a three-dimensional basin evolution model
NASA Astrophysics Data System (ADS)
McPherson, Brian J. O. L.; Bredehoeft, John D.
2001-04-01
High pore fluid pressures, approaching lithostatic, are observed in the deepest sections of the Uinta basin, Utah. Geologic observations and previous modeling studies suggest that the most likely cause of observed overpressures is hydrocarbon generation. We studied Uinta overpressures by developing and applying a three-dimensional, numerical model of the evolution of the basin. The model was developed from a public domain computer code, with addition of a new mesh generator that builds the basin through time, coupling the structural, thermal, and hydrodynamic evolution. Also included in the model are in situ hydrocarbon generation and multiphase migration. The modeling study affirmed oil generation as an overpressure mechanism, but also elucidated the relative roles of multiphase fluid interaction, oil density and viscosity, and sedimentary compaction. An important result is that overpressures by oil generation create conditions for rock fracturing, and associated fracture permeability may regulate or control the propensity to maintain overpressures.
Som, Nicholas A.; Goodman, Damon H.; Perry, Russell W.; Hardy, Thomas B.
2016-01-01
Previous methods for constructing univariate habitat suitability criteria (HSC) curves have ranged from professional judgement to kernel-smoothed density functions or combinations thereof. We present a new method of generating HSC curves that applies probability density functions as the mathematical representation of the curves. Compared with previous approaches, benefits of our method include (1) estimation of probability density function parameters directly from raw data, (2) quantitative methods for selecting among several candidate probability density functions, and (3) concise methods for expressing estimation uncertainty in the HSC curves. We demonstrate our method with a thorough example using data collected on the depth of water used by juvenile Chinook salmon (Oncorhynchus tshawytscha) in the Klamath River of northern California and southern Oregon. All R code needed to implement our example is provided in the appendix. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
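The two key steps, fitting candidate probability density functions directly to raw data and selecting among them quantitatively, can be sketched as follows. The depth observations are invented, the two candidate densities (normal and exponential, both with closed-form maximum-likelihood estimates) are chosen for simplicity, and AIC is used as the selection criterion; the paper's own analysis is in R with its own candidate set.

```python
import math

# Hypothetical depth-use observations (m); illustration only.
depths = [0.4, 0.5, 0.5, 0.6, 0.7, 0.7, 0.8, 0.9, 1.1, 1.4]

def aic_normal(x):
    # Closed-form MLE for the normal density, then AIC = 2k - 2*loglik.
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n          # MLE variance
    loglik = -0.5 * n * (math.log(2 * math.pi * var) + 1)
    return 2 * 2 - 2 * loglik, (mu, var)

def aic_exponential(x):
    # Closed-form MLE for the exponential density (rate = 1/mean).
    n = len(x)
    lam = n / sum(x)
    loglik = n * math.log(lam) - lam * sum(x)
    return 2 * 1 - 2 * loglik, (lam,)

candidates = {"normal": aic_normal(depths),
              "exponential": aic_exponential(depths)}
best = min(candidates, key=lambda k: candidates[k][0])
```

The fitted density with the lowest AIC becomes the HSC curve; parameter standard errors from the same likelihood fit then express the estimation uncertainty in the curve.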
Coastal Processes: Challenges for Monitoring and Prediction
2009-01-01
The Office of Naval Research Global and the Fondazione Cassa di Risparmio di La Spezia provided financial support for the conference and the special issue.
28 CFR 36.607 - Guidance concerning model codes.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Guidance concerning model codes. 36.607... BY PUBLIC ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.607 Guidance concerning model codes. Upon application by an authorized representative of a...
28 CFR 36.607 - Guidance concerning model codes.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Guidance concerning model codes. 36.607... BY PUBLIC ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.607 Guidance concerning model codes. Upon application by an authorized representative of a...
28 CFR 36.607 - Guidance concerning model codes.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Guidance concerning model codes. 36.607... BY PUBLIC ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.607 Guidance concerning model codes. Upon application by an authorized representative of a...
28 CFR 36.607 - Guidance concerning model codes.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Guidance concerning model codes. 36.607... BY PUBLIC ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.607 Guidance concerning model codes. Upon application by an authorized representative of a...