Adaptive format conversion for scalable video coding
NASA Astrophysics Data System (ADS)
Wan, Wade K.; Lim, Jae S.
2001-12-01
The enhancement layer in many scalable coding algorithms is composed of residual coding information. There is another type of information that can be transmitted instead of (or in addition to) residual coding. Since the encoder has access to the original sequence, it can utilize adaptive format conversion (AFC) to generate the enhancement layer and transmit the different format conversion methods as enhancement data. This paper investigates the use of adaptive format conversion information as enhancement data in scalable video coding. Experimental results are shown for a wide range of base layer qualities and enhancement bitrates to determine when AFC can improve video scalability. Since the parameters needed for AFC are small compared to residual coding, AFC can provide video scalability at low enhancement layer bitrates that are not possible with residual coding. AFC can also be combined with residual coding to improve video scalability at higher enhancement layer bitrates. Adaptive format conversion has not been studied in detail, but many scalable applications may benefit from it. One application for which AFC is well suited is the migration path for digital television, where AFC can provide immediate video scalability as well as assist future migrations.
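A minimal sketch of the AFC idea described above, assuming toy per-block conversion methods (a copy and a vertical average standing in for real deinterlacing or upsampling filters): the encoder compares each converted block against the original and transmits only the winning method index as enhancement data.

```python
import numpy as np

def afc_enhancement_layer(original, base_layer, block=8):
    """Per block, pick the format-conversion method that best matches the
    original frame; only the method indices are sent as enhancement data."""
    # Two toy conversion methods standing in for real deinterlacing filters.
    methods = [
        lambda b: b,                                  # method 0: copy
        lambda b: (b + np.roll(b, 1, axis=0)) / 2.0,  # method 1: vertical average
    ]
    h, w = original.shape
    indices = []
    for y in range(0, h, block):
        for x in range(0, w, block):
            orig = original[y:y+block, x:x+block]
            base = base_layer[y:y+block, x:x+block]
            errs = [np.sum((m(base) - orig) ** 2) for m in methods]
            indices.append(int(np.argmin(errs)))      # a few bits per block
    return indices

frame = np.random.rand(32, 32)
recon = frame + 0.1 * np.random.randn(32, 32)         # stand-in base layer
print(afc_enhancement_layer(frame, recon)[:8])
```

Because only the method indices are transmitted, the enhancement bitrate is a small fraction of what residual coding would need, matching the abstract's low-bitrate argument.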
A unified approach to the study of temporal, correlational, and rate coding.
Panzeri, S; Schultz, S R
2001-06-01
We demonstrate that the information contained in the spike occurrence times of a population of neurons can be broken up into a series of terms, each reflecting something about potential coding mechanisms. This is possible in the coding regime in which few spikes are emitted in the relevant time window. This approach allows us to study the additional information contributed by spike timing beyond that present in the spike counts and to examine the contributions to the whole information of different statistical properties of spike trains, such as firing rates and correlation functions. It thus forms the basis for a new quantitative procedure for analyzing simultaneous multiple neuron recordings and provides theoretical constraints on neural coding strategies. We find a transition between two coding regimes, depending on the size of the relevant observation timescale. For time windows shorter than the timescale of the stimulus-induced response fluctuations, there exists a spike count coding phase, in which the purely temporal information is of third order in time. For time windows much longer than the characteristic timescale, there can be additional timing information of first order, leading to a temporal coding phase in which timing information may affect the instantaneous information rate. In this new framework, we study the relative contributions of the dynamic firing rate and correlation variables to the full temporal information, the interaction of signal and noise correlations in temporal coding, synergy between spikes and between cells, and the effect of refractoriness. We illustrate the utility of the technique by analyzing a few cells from the rat barrel cortex.
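As a loose illustration of the base quantity being decomposed above (not the series expansion itself), here is a plug-in estimate of the information carried by spike counts in a short window; the two-stimulus Poisson setup is invented for the demo.

```python
import numpy as np
from collections import Counter

def mutual_information(stimuli, counts):
    """Plug-in estimate of I(stimulus; spike count) in bits."""
    n = len(stimuli)
    p_joint = Counter(zip(stimuli, counts))
    p_s = Counter(stimuli)
    p_c = Counter(counts)
    info = 0.0
    for (s, c), n_sc in p_joint.items():
        p = n_sc / n
        info += p * np.log2(p / ((p_s[s] / n) * (p_c[c] / n)))
    return info

rng = np.random.default_rng(0)
stims = rng.integers(0, 2, 2000)          # two stimuli, equiprobable
counts = rng.poisson(1.0 + 2.0 * stims)   # mean count depends on stimulus
print(f"I = {mutual_information(stims, counts):.3f} bits")
```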
Venepalli, Neeta K; Qamruzzaman, Yusuf; Li, Jianrong John; Lussier, Yves A; Boyd, Andrew D
2014-03-01
To quantify coding ambiguity in International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to ICD-10-CM mappings for hematology-oncology diagnoses within an Illinois Medicaid database and an academic cancer center database (University of Illinois Cancer Center [UICC]), with the goal of anticipating challenges during the ICD-10-CM transition. One data set of ICD-9-CM diagnosis codes came from the 2010 Illinois Department of Medicaid, filtered for diagnoses generated by hematology-oncology providers. The other data set of ICD-9-CM diagnosis codes came from UICC. Using a translational methodology via the Motif Web portal ICD-9-CM conversion tool, ICD-9-CM to ICD-10-CM code conversions were graphically mapped and evaluated for clinical loss of information. The transition to ICD-10-CM led to significant information loss, affecting 8% of total Medicaid codes and 1% of UICC codes; 39 ICD-9-CM codes with information loss accounted for 2.9% of total Medicaid reimbursements and 5.3% of UICC billing charges. Prior work stated that hematology-oncology would be the least affected medical specialty; however, information loss affecting 5% of billing costs could evaporate the operating margin of a practice. By identifying codes at risk for complex transitions, the analytic tools described can be replicated by oncology practices to forecast areas requiring additional training and resource allocation. In summary, complex transitions and diagnosis codes associated with information loss within clinical oncology require additional attention during the transition to ICD-10-CM.
Side information in coded aperture compressive spectral imaging
NASA Astrophysics Data System (ADS)
Galvis, Laura; Arguello, Henry; Lau, Daniel; Arce, Gonzalo R.
2017-02-01
Coded aperture compressive spectral imagers sense a three-dimensional cube by using two-dimensional projections of the coded and spectrally dispersed source. These imaging systems often rely on focal plane array (FPA) detectors, spatial light modulators (SLMs), digital micromirror devices (DMDs), and dispersive elements. Using DMDs to implement the coded apertures facilitates the capture of multiple projections, each admitting a different coded aperture pattern. The DMD makes it possible not only to collect a sufficient number of measurements for spectrally rich or spatially detailed scenes, but also to design the spatial structure of the coded apertures to maximize the information content of the compressive measurements. Although sparsity is the only signal characteristic usually assumed for reconstruction in compressive sensing, other forms of prior knowledge, such as side information, have been included as a way to improve the quality of the reconstructions. This paper presents the coded aperture design in a compressive spectral imager with side information in the form of RGB images of the scene. The use of RGB images as side information in the compressive sensing architecture has two main advantages: the RGB image is used not only to improve the reconstruction quality but also to optimally design the coded apertures for the sensing process. The coded aperture design is based on the RGB scene, so the coded aperture structure exploits key features such as scene edges. Real reconstructions from noisy compressed measurements demonstrate the benefit of the designed coded apertures in addition to the improvement in reconstruction quality obtained from the side information.
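A sketch of the edge-driven design idea, under stated assumptions: the paper's actual design rule is more elaborate, but thresholding a gradient map of the RGB side image to decide which aperture elements stay open captures the gist. The open_fraction parameter is hypothetical.

```python
import numpy as np

def design_coded_aperture(rgb, open_fraction=0.5):
    """Derive a binary coded-aperture pattern from an RGB side image:
    open the aperture preferentially where the scene has strong edges."""
    gray = rgb.mean(axis=2)
    gy, gx = np.gradient(gray)
    edges = np.hypot(gx, gy)
    # Keep the strongest-gradient locations open (1), block the rest (0).
    thresh = np.quantile(edges, 1.0 - open_fraction)
    return (edges >= thresh).astype(np.uint8)

rgb = np.random.rand(64, 64, 3)          # stand-in for the RGB side image
aperture = design_coded_aperture(rgb)
print(aperture.mean())                   # fraction of open elements
```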
NASA Technical Reports Server (NTRS)
Woo, Simon S.; Cheng, Michael K.
2011-01-01
The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is also obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, an improved UEP and low-decoding-latency performance for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks, each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust-Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniformly at random from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach where code symbols are generated by selecting information symbols from the entire message block including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusually high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data. This hybrid approach decides not only "how to encode" but also "what to encode" to achieve UEP. Another advantage of the priority encoding process is that the majority of high-priority data can be decoded sooner, since only a small number of code symbols are required to reconstruct high-priority data. This approach increases the likelihood that high-priority data is decoded before low-priority data. The Prioritized LT code scheme achieves an improvement in high-priority data decoding performance as well as overall information recovery without penalizing the decoding of low-priority data, assuming high-priority data is no more than half of a message block. The cost is in the additional complexity required in the encoder. If extra computational resources are available at the transmitter, image, voice, and video transmission quality in terrestrial and space communications can benefit from accurate use of redundancy in protecting data with varying priorities.
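A condensed sketch of the encoding procedure described above. The Robust-Soliton parameters (c, delta), the low-degree threshold of 3, and byte-valued information symbols are illustrative assumptions, not values from the article.

```python
import math
import random

def robust_soliton(k, c=0.1, delta=0.5):
    """Robust-Soliton degree distribution over degrees 1..k."""
    s = c * math.log(k / delta) * math.sqrt(k)
    rho = [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    tau = [s / (k * d) if d < k / s else 0.0 for d in range(1, k + 1)]
    pivot = int(round(k / s))
    if 1 <= pivot <= k:
        tau[pivot - 1] = s * math.log(s / delta) / k
    z = sum(r + t for r, t in zip(rho, tau))
    return [(r + t) / z for r, t in zip(rho, tau)]

def prioritized_lt_symbol(message, n_high, dist, rng, low_degree=3):
    """One code symbol: low-degree symbols select only high-priority data."""
    k = len(message)
    degree = rng.choices(range(1, k + 1), weights=dist)[0]
    pool = list(range(n_high)) if degree <= low_degree else list(range(k))
    chosen = rng.sample(pool, min(degree, len(pool)))
    code = 0
    for i in chosen:
        code ^= message[i]        # XOR the selected information symbols
    return chosen, code

rng = random.Random(1)
msg = [rng.randrange(256) for _ in range(16)]  # 16 one-byte information symbols
dist = robust_soliton(len(msg))
print(prioritized_lt_symbol(msg, n_high=4, dist=dist, rng=rng))
```

The decoder is unchanged from standard LT peeling, which is exactly the "no changes at all at the decoder" property claimed above.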
Computer-Based Learning of Spelling Skills in Children with and without Dyslexia
ERIC Educational Resources Information Center
Kast, Monika; Baschera, Gian-Marco; Gross, Markus; Jancke, Lutz; Meyer, Martin
2011-01-01
Our spelling training software recodes words into multisensory representations comprising visual and auditory codes. These codes represent information about letters and syllables of a word. An enhanced version, developed for this study, contains an additional phonological code and an improved word selection controller relying on a phoneme-based…
Kim, Yeong Gug; Woo, Eunju
2016-07-01
The objectives of this study are to apply the technology acceptance model (TAM), with the addition of perceived information, to individuals' behavioral intention to use the QR code for the food traceability system; and to determine the moderating effects of food involvement on the relationship between perceived information and perceived usefulness. Results from a survey of 420 respondents are analyzed using structural equation modeling. The study findings reveal that the extended TAM has a satisfactory fit to the data and that the underlying dimensions have a significant effect on consumers' intention to use the QR code for the food traceability system. In addition, food involvement plays a significant moderating function in the relationship between perceived information and perceived usefulness. The implications of this study for future research are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes
NASA Astrophysics Data System (ADS)
Harrington, James William
Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g., factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present a local classical processing scheme for correcting errors on toric codes, which demonstrates that quantum information can be maintained in two dimensions by purely local (quantum and classical) resources.
UNICOS Kernel Internals Application Development
NASA Technical Reports Server (NTRS)
Caredo, Nicholas; Craw, James M. (Technical Monitor)
1995-01-01
An understanding of UNICOS kernel internals is valuable information. However, having the knowledge is only half the value; the second half comes with knowing how to use this information and apply it to the development of tools. The kernel contains vast amounts of useful information that can be utilized. This paper discusses the intricacies of developing utilities that utilize kernel information. In addition, algorithms, logic, and code are discussed for accessing kernel information. Code segments are provided that demonstrate how to locate and read kernel structures. Types of applications that can utilize kernel information are also discussed.
JEA's Code of Ethics for Advisers; and Sites for Additional Ethics Information.
ERIC Educational Resources Information Center
Bowen, John
1997-01-01
Lists general principles that media advisers should follow, the 12 points agreed upon as the Journalism Education Association's (JEA) Code of Ethics for Advisers, and a list of Web sites that deal with journalism ethics. (PA)
Users manual for the improved NASA Lewis ice accretion code LEWICE 1.6
NASA Technical Reports Server (NTRS)
Wright, William B.
1995-01-01
This report is intended as an update/replacement for NASA CR 185129, 'User's Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE)', and as an update to NASA CR 195387, 'Update to the NASA Lewis Ice Accretion Code LEWICE'. In addition to describing the changes specifically made for this version, information from previous manuals will be duplicated so that the user will not need three manuals to use this code.
Child Injury Deaths: Comparing Prevention Information from Two Coding Systems
Schnitzer, Patricia G.; Ewigman, Bernard G.
2006-01-01
Objectives: The International Classification of Disease (ICD) external cause of injury E-codes do not sufficiently identify injury circumstances amenable to prevention. The researchers developed an alternative classification system (B-codes) that incorporates behavioral and environmental factors for use in childhood injury research, and compared the two coding systems in this paper. Methods: All fatal injuries among children less than age five that occurred between January 1, 1992, and December 31, 1994, were classified using both B-codes and E-codes. Results: E-codes identified the most common causes of injury death: homicide (24%), fires (21%), motor vehicle incidents (21%), drowning (10%), and suffocation (9%). The B-codes further revealed that homicides (51%) resulted from the child being shaken or struck by another person; many fire deaths (42%) resulted from children playing with matches or lighters; drownings (46%) usually occurred in natural bodies of water; and most suffocation deaths (68%) occurred in unsafe sleeping arrangements. Conclusions: B-codes identify additional information with specific relevance for prevention of childhood injuries. PMID:15944169
NASA Astrophysics Data System (ADS)
Wang, Zhi-peng; Zhang, Shuai; Liu, Hong-zhao; Qin, Yi
2014-12-01
Based on a phase retrieval algorithm and the QR code, a new optical encryption technology that needs to record only one intensity distribution is proposed. In this encryption process, the QR code is first generated from the information to be encrypted; the generated QR code is then placed in the input plane of a 4-f system and subjected to double random phase encryption. Because only one intensity distribution in the output plane is recorded as the ciphertext, the encryption process is greatly simplified. In the decryption process, the corresponding QR code is retrieved using a phase retrieval algorithm. A priori information about the QR code is used as a support constraint in the input plane, which helps solve the stagnation problem. The original information can be recovered without distortion by scanning the QR code. The encryption process can be implemented either optically or digitally, and the decryption process uses a digital method. In addition, the security of the proposed optical encryption technology is analyzed. Theoretical analysis and computer simulations show that this optical encryption system is invulnerable to various attacks and suitable for harsh transmission conditions.
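A minimal numpy sketch of the double random phase encoding step in a simulated 4-f system, assuming a random binary stand-in for the QR code; the phase-retrieval decryption loop with the QR support constraint is left out.

```python
import numpy as np

rng = np.random.default_rng(0)
qr = rng.integers(0, 2, (64, 64)).astype(float)   # stand-in for a QR code image

# Double random phase encoding in a 4-f system: random phase masks at the
# input and Fourier planes; only the output intensity is kept as ciphertext.
phase1 = np.exp(2j * np.pi * rng.random(qr.shape))
phase2 = np.exp(2j * np.pi * rng.random(qr.shape))
field = np.fft.ifft2(np.fft.fft2(qr * phase1) * phase2)
ciphertext = np.abs(field) ** 2                   # recorded intensity only

# Decryption must recover the discarded phase, e.g. with a phase-retrieval
# loop that enforces the binary QR support in the input plane (not shown).
print(ciphertext.shape, ciphertext.dtype)
```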
Visual pattern image sequence coding
NASA Technical Reports Server (NTRS)
Silsbee, Peter; Bovik, Alan C.; Chen, Dapang
1990-01-01
The visual pattern image coding (VPIC) configurable digital image-coding process is capable of coding with visual fidelity comparable to the best available techniques, at compressions which (at 30-40:1) exceed all other technologies. These capabilities are associated with unprecedented coding efficiencies; coding and decoding operations are entirely linear with respect to image size and entail computation that is 1-2 orders of magnitude faster than any previous high-compression technique. The visual pattern image sequence coding considered here exploits all the advantages of static VPIC while also reducing information along an additional, temporal dimension, to achieve unprecedented image sequence coding performance.
Optical encryption and QR codes: secure and noise-free information retrieval.
Barrera, John Fredy; Mira, Alejandro; Torroba, Roberto
2013-03-11
We introduce for the first time the concept of an information "container" used before a standard optical encrypting procedure. The "container" selected is a QR code, which offers the main advantage of being tolerant to pollutant speckle noise. Besides, the QR code can be read by smartphones, a massively used device. Additionally, the QR code adds another secure step to the encrypting benefits the optical methods provide. The QR code is generated by means of worldwide freely available software. The concept development proves that the speckle noise polluting the outcomes of normal optical encrypting procedures can be avoided, making the adoption of these techniques more attractive. Actual smartphone-collected results are shown to validate our proposal.
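Generating such a QR "container" is indeed a one-liner with freely available software; this sketch assumes the third-party Python qrcode package (pip install qrcode[pil]) and uses its highest error-correction level, which is what provides the tolerance to speckle noise.

```python
import qrcode  # assumption: the third-party "qrcode" package is installed

qr = qrcode.QRCode(
    version=None,                                        # smallest size that fits
    error_correction=qrcode.constants.ERROR_CORRECT_H,   # ~30% damage tolerance
)
qr.add_data("plaintext to protect before optical encryption")
qr.make(fit=True)
img = qr.make_image()
img.save("container.png")  # this image becomes the input to the encryption step
```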
Convolutional encoding of self-dual codes
NASA Technical Reports Server (NTRS)
Solomon, G.
1994-01-01
There exist almost complete convolutional encodings of self-dual codes, i.e., block codes of rate 1/2 with weights w ≡ 0 (mod 4). The codes are of length 8m with the convolutional portion of length 8m-2 and the nonsystematic information of length 4m-1. The last two bits are parity checks on the two (4m-1) length parity sequences. The final information bit complements one of the extended parity sequences of length 4m. Solomon and van Tilborg have developed algorithms to generate these for the Quadratic Residue (QR) Codes of lengths 48 and beyond. For these codes and reasonable constraint lengths, there are sequential decodings for both hard and soft decisions. There are also possible Viterbi-type decodings that may be simple, as in a convolutional encoding/decoding of the extended Golay Code. In addition, the previously found constraint length K = 9 for the QR (48, 24;12) Code is lowered here to K = 8.
40 CFR 94.5 - Reference materials.
Code of Federal Regulations, 2012 CFR
2012-07-01
... available for inspection at the National Archives and Records Administration (NARA). For information on the.../code_of_federal_regulations/ibr_locations.html. In addition, these materials are available from the...-7611, or at http://www.imo.org. (1) Resolution 2—Technical Code on Control of Emission of Nitrogen...
40 CFR 94.5 - Reference materials.
Code of Federal Regulations, 2014 CFR
2014-07-01
... available for inspection at the National Archives and Records Administration (NARA). For information on the.../code_of_federal_regulations/ibr_locations.html. In addition, these materials are available from the...-7611, or at http://www.imo.org. (1) Resolution 2—Technical Code on Control of Emission of Nitrogen...
40 CFR 94.5 - Reference materials.
Code of Federal Regulations, 2013 CFR
2013-07-01
... available for inspection at the National Archives and Records Administration (NARA). For information on the.../code_of_federal_regulations/ibr_locations.html. In addition, these materials are available from the...-7611, or at http://www.imo.org. (1) Resolution 2—Technical Code on Control of Emission of Nitrogen...
Pritoni, Marco; Ford, Rebecca; Karlin, Beth; Sanguinetti, Angela
2018-02-01
Policymakers worldwide are currently discussing whether to include home energy management (HEM) products in their portfolio of technologies to reduce carbon emissions and improve grid reliability. However, very little data is available about these products. Here we present the results of an extensive review of 308 HEM products available on the US market in 2015-2016. We gathered these data from publicly available sources such as vendor websites, online marketplaces and other vendor documents. A coding guide was developed iteratively during the data collection and utilized to classify the devices. Each product was coded based on 96 distinct attributes, grouped into 11 categories: Identifying information, Product components, Hardware, Communication, Software, Information - feedback, Information - feedforward, Control, Utility interaction, Additional benefits and Usability. The codes describe product features and functionalities, user interaction and interoperability with other devices. A mix of binary attributes and more descriptive codes allows the data to be sorted and grouped without losing important qualitative information. The information is stored in a large spreadsheet included with this article, along with an explanatory coding guide. This dataset is analyzed and described in a research article entitled "Categories and functionality of smart home technology for energy management" (Ford et al., 2017) [1].
Research on pre-processing of QR Code
NASA Astrophysics Data System (ADS)
Sun, Haixing; Xia, Haojie; Dong, Ning
2013-10-01
QR codes encode many kinds of information because of their advantages: large storage capacity, high reliability, ultra-high-speed reading in all directions, small printing size, highly efficient representation of Chinese characters, etc. In order to obtain a clearer binarized image from a complex background and improve the recognition rate of QR codes, this paper researches pre-processing methods for the QR code (Quick Response Code) and shows algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by modifying Sauvola's adaptive text-recognition (binarization) method. Additionally, a QR code extraction step that adapts to different image sizes and a flexible image correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
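A sketch of the Sauvola-style adaptive binarization step, assuming scikit-image's implementation (threshold_sauvola, available since skimage 0.14); the window size and k value are typical defaults, not values from the paper.

```python
import numpy as np
from skimage.filters import threshold_sauvola  # assumption: scikit-image installed

def binarize_qr(gray, window_size=25, k=0.2):
    """Adaptive Sauvola binarization: the threshold varies with the local
    mean and standard deviation, which copes with uneven illumination and
    complex backgrounds better than a single global threshold."""
    t = threshold_sauvola(gray, window_size=window_size, k=k)
    return (gray > t).astype(np.uint8)

image = np.random.rand(128, 128)   # stand-in for a captured QR photo
print(binarize_qr(image).mean())   # fraction of white pixels
```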
Reduction of PAPR in coded OFDM using fast Reed-Solomon codes over prime Galois fields
NASA Astrophysics Data System (ADS)
Motazedi, Mohammad Reza; Dianat, Reza
2017-02-01
In this work, two new techniques using Reed-Solomon (RS) codes over GF(257) and GF(65537) are proposed for peak-to-average power ratio (PAPR) reduction in coded orthogonal frequency division multiplexing (OFDM) systems. The lengths of these codes are well matched to the length of OFDM frames. Over these fields, the block lengths of the codes are powers of two, and we fully exploit radix-2 fast Fourier transform algorithms. Multiplications and additions are simple modulus operations. These codes provide desirable randomness with a small perturbation in the information symbols, which is essential for the generation of different statistically independent candidates. Our simulations show that the PAPR reduction ability of RS codes is the same as that of conventional selected mapping (SLM), but in contrast to SLM, error correction capability is also obtained. Moreover, the second proposed technique does not require the transmission of side information. To the best of our knowledge, this is the first work using RS codes for PAPR reduction in single-input single-output systems.
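For orientation, a sketch of the SLM baseline that the RS technique is compared against: generate phase-scrambled candidates and keep the one with the lowest PAPR. Here the candidates come from random quaternary phase sequences; in the paper they are derived from RS codewords, which additionally buys error correction.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a time-domain signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def slm_select(symbols, n_candidates=8, rng=None):
    """SLM-style PAPR reduction: scramble the frequency-domain symbols with
    random phase sequences and keep the lowest-PAPR candidate."""
    rng = rng or np.random.default_rng()
    best, best_papr = None, np.inf
    for _ in range(n_candidates):
        phases = np.exp(1j * rng.choice([0, np.pi/2, np.pi, 3*np.pi/2],
                                        symbols.size))
        candidate = np.fft.ifft(symbols * phases)
        if (p := papr_db(candidate)) < best_papr:
            best, best_papr = candidate, p
    return best, best_papr

data = np.exp(1j * np.pi / 2 * np.random.default_rng(0).integers(0, 4, 256))  # QPSK
print(f"selected PAPR = {slm_select(data)[1]:.2f} dB")
```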
HANFORD FACILITY ANNUAL DANGEROUS WASTE REPORT CY2003 [SEC 1 & 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
FREEMAN, D.A.
2004-02-17
The Hanford Facility Annual Dangerous Waste Report (ADWR) is prepared to meet the requirements of Washington Administrative Code Sections 173-303-220, Generator Reporting, and 173-303-390, Facility Reporting. In addition, the ADWR is required to meet Hanford Facility RCRA Permit Condition I.E.22, Annual Reporting. The ADWR provides summary information on dangerous waste generation and management activities for the Calendar Year for the Hanford Facility EPA ID number assigned to the Department of Energy for RCRA regulated waste, as well as Washington State only designated waste and radioactive mixed waste. The Solid Waste Information and Tracking System (SWITS) database is utilized to collect and compile the large array of data needed for preparation of this report. Information includes details of waste generated on the Hanford Facility, waste generated offsite and sent to Hanford for management, and other waste management activities conducted at Hanford, including treatment, storage, and disposal. Report details consist of waste descriptions and weights, waste codes and designations, and waste handling codes. In addition, for waste shipped to Hanford for treatment and/or disposal, information on manifest numbers, the waste transporter, the waste receiving facility, and the original waste generators are included. In addition to paper copies, the report is also transmitted electronically to a web site maintained by the Washington State Department of Ecology.
Quick reproduction of blast-wave flow-field properties of nuclear, TNT, and ANFO explosions
NASA Astrophysics Data System (ADS)
Groth, C. P. T.
1986-04-01
In many instances, extensive blast-wave flow-field properties are required in gasdynamics research studies of blast-wave loading and structure response, and in evaluating the effects of explosions on their environment. This report provides a very useful computer code, which can be used in conjunction with the DNA Nuclear Blast Standard subroutines and code, to quickly reconstruct complete and fairly accurate blast-wave data for almost any free-air (spherical) and surface-burst (hemispherical) nuclear, trinitrotoluene (TNT), or ammonium nitrate-fuel oil (ANFO) explosion. This code is capable of computing all of the main flow properties as functions of radius and time, as well as providing additional information regarding air viscosity, reflected shock-wave properties, and the initial decay of the flow properties just behind the shock front. Both spatial and temporal distributions of the major blast-wave flow properties are also made readily available. Finally, provisions are also included in the code to provide additional information regarding the peak or shock-front flow properties over a range of radii, for a specific explosion of interest.
Experimental QR code optical encryption: noise-free data recovering.
Barrera, John Fredy; Mira-Agudelo, Alejandro; Torroba, Roberto
2014-05-15
We report, to our knowledge for the first time, the experimental implementation of a quick response (QR) code as a "container" in an optical encryption system. A joint transform correlator architecture in an interferometric configuration is chosen as the experimental scheme. As the implementation is not possible in a single step, a multiplexing procedure is applied to encrypt the QR code of the original information. Once the QR code is correctly decrypted, the speckle noise present in the recovered QR code is eliminated by a simple digital procedure. Finally, the original information is retrieved completely free of any kind of degradation after reading the QR code. Additionally, we propose and implement a new protocol in which the reception of the encrypted QR code and its decryption, the digital block processing, and the reading of the decrypted QR code are performed employing only one device (smartphone, tablet, or computer). The overall method produces an outcome attractive enough to make the adoption of the technique a plausible option. Experimental results are presented to demonstrate the practicality of the proposed security system.
Information retrieval based on single-pixel optical imaging with quick-response code
NASA Astrophysics Data System (ADS)
Xiao, Yin; Chen, Wen
2018-04-01
The quick-response (QR) code technique is combined with ghost imaging (GI) to recover original information with high quality. An image is first transformed into a QR code. The QR code is then treated as the input image in the input plane of a ghost imaging setup. After the measurements, the traditional ghost-imaging correlation algorithm is utilized to reconstruct a low-quality image (in QR code form). With this low-quality image as an initial guess, a Gerchberg-Saxton-like algorithm is used to improve its contrast, which is essentially a post-processing step. Taking advantage of the high error correction capability of the QR code, the original information can be recovered with high quality. Compared to the previous method, our method can obtain a high-quality image with comparatively fewer measurements, which means that the time-consuming post-processing procedure can be avoided to some extent. In addition, for conventional ghost imaging, the larger the image size is, the more measurements are needed. For our method, however, images of different sizes can be converted into QR codes of the same small size by using a QR generator. Hence, for larger images, the time required to recover the original information with high quality is dramatically reduced. Our method also makes it easy to recover a color image in a ghost imaging setup, because it is not necessary to divide the color image into three channels and recover them separately.
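A minimal sketch of the traditional GI correlation reconstruction used as the initial guess, assuming random speckle patterns and a binary stand-in object; the Gerchberg-Saxton-like contrast refinement and the QR decoding step are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
obj = (rng.random((32, 32)) > 0.5).astype(float)   # stand-in QR-code object

# Ghost imaging: project random speckle patterns, record only the total
# transmitted intensity (the bucket signal), then correlate.
n_meas = 4000
patterns = rng.random((n_meas, 32, 32))
bucket = np.einsum('nij,ij->n', patterns, obj)

# Traditional GI correlation: G = <I*B> - <I><B>, a low-quality estimate
# that the Gerchberg-Saxton-like refinement would then clean up (not shown).
recon = (np.einsum('n,nij->ij', bucket, patterns) / n_meas
         - bucket.mean() * patterns.mean(axis=0))
print(np.corrcoef(recon.ravel(), obj.ravel())[0, 1])   # reconstruction quality
```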
Zhang, Yunfang; Zhang, Xudong; Shi, Junchao; Tuorto, Francesca; Li, Xin; Liu, Yusheng; Liebers, Reinhard; Zhang, Liwen; Qu, Yongcun; Qian, Jingjing; Pahima, Maya; Liu, Ying; Yan, Menghong; Cao, Zhonghong; Lei, Xiaohua; Cao, Yujing; Peng, Hongying; Liu, Shichao; Wang, Yue; Zheng, Huili; Woolsey, Rebekah; Quilici, David; Zhai, Qiwei; Li, Lei; Zhou, Tong; Yan, Wei; Lyko, Frank; Zhang, Ying; Zhou, Qi; Duan, Enkui; Chen, Qi
2018-05-01
The discovery of RNAs (for example, messenger RNAs, non-coding RNAs) in sperm has opened the possibility that sperm may function by delivering additional paternal information aside from solely providing the DNA [1]. Increasing evidence now suggests that sperm small non-coding RNAs (sncRNAs) can mediate intergenerational transmission of paternally acquired phenotypes, including mental stress [2,3] and metabolic disorders [4-6]. How sperm sncRNAs encode paternal information remains unclear, but the mechanism may involve RNA modifications. Here we show that deletion of a mouse tRNA methyltransferase, DNMT2, abolished sperm sncRNA-mediated transmission of high-fat-diet-induced metabolic disorders to offspring. Dnmt2 deletion prevented the elevation of RNA modifications (m5C, m2G) in sperm 30-40 nt RNA fractions that is induced by a high-fat diet. Also, Dnmt2 deletion altered the sperm small RNA expression profile, including levels of tRNA-derived small RNAs and rRNA-derived small RNAs, which might be essential in composing a sperm RNA 'coding signature' that is needed for paternal epigenetic memory. Finally, we show that Dnmt2-mediated m5C contributes to the secondary structure and biological properties of sncRNAs, implicating sperm RNA modifications as an additional layer of paternal hereditary information.
Variational learning and bits-back coding: an information-theoretic view to Bayesian learning.
Honkela, Antti; Valpola, Harri
2004-07-01
The bits-back coding, first introduced by Wallace in 1990 and later by Hinton and van Camp in 1993, provides an interesting link between Bayesian learning and information-theoretic minimum-description-length (MDL) learning approaches. Bits-back coding allows the cost function used in the variational Bayesian method called ensemble learning to be interpreted as a code length, in addition to the Bayesian view of it as the misfit of the posterior approximation and a lower bound on model evidence. Combining these two viewpoints provides interesting insights into the learning process and the functions of different parts of the model. In this paper, the problem of variational Bayesian learning of hierarchical latent variable models is used to demonstrate the benefits of the two views. The code-length interpretation provides new views of many parts of the problem, such as model comparison and pruning, and helps explain many phenomena occurring in learning.
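A toy numeric illustration of the code-length view, assuming a one-dimensional Gaussian model p(x|z) = N(x; z, sigma^2) with a Gaussian prior and posterior approximation: the variational cost (expected reconstruction term plus KL divergence), converted to bits, is the expected net bits-back code length.

```python
import numpy as np

def gaussian_kl(mq, sq, mp, sp):
    """KL(q||p) in nats for univariate Gaussians q=N(mq,sq^2), p=N(mp,sp^2)."""
    return np.log(sp / sq) + (sq**2 + (mq - mp)**2) / (2 * sp**2) - 0.5

def bits_back_length(x, mq, sq, mp=0.0, sp=1.0, noise=0.1):
    """Variational cost  E_q[-log p(x|z)] + KL(q||p),  reported in bits.
    Under bits-back coding this is the expected net code length for x."""
    rng = np.random.default_rng(0)
    z = rng.normal(mq, sq, 10_000)                 # Monte Carlo samples from q(z)
    nll = 0.5 * np.log(2 * np.pi * noise**2) + (x - z)**2 / (2 * noise**2)
    nats = nll.mean() + gaussian_kl(mq, sq, mp, sp)
    return nats / np.log(2)

print(f"{bits_back_length(x=0.7, mq=0.7, sq=0.1):.2f} bits")
```

Tightening q toward the true posterior lowers this code length, which is the MDL reading of why variational learning improves the bound.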
HANFORD FACILITY ANNUAL DANGEROUS WASTE REPORT CY2005
DOE Office of Scientific and Technical Information (OSTI.GOV)
SKOLRUD, J.O.
2006-02-15
The Hanford Facility Annual Dangerous Waste Report (ADWR) is prepared to meet the requirements of Washington Administrative Code Sections 173-303-220, Generator Reporting, and 173-303-390, Facility Reporting. In addition, the ADWR is required to meet Hanford Facility RCRA Permit Condition I.E.22, Annual Reporting. The ADWR provides summary information on dangerous waste generation and management activities for the Calendar Year for the Hanford Facility EPA ID number assigned to the Department of Energy for RCRA regulated waste, as well as Washington State only designated waste and radioactive mixed waste. An electronic database is utilized to collect and compile the large array of data needed for preparation of this report. Information includes details of waste generated on the Hanford Facility, waste generated offsite and sent to Hanford for management, and other waste management activities conducted at Hanford, including treatment, storage, and disposal. Report details consist of waste descriptions and weights, waste codes and designations, and waste handling codes. In addition, for waste shipped to Hanford for treatment and/or disposal, information on manifest numbers, the waste transporter, the waste receiving facility, and the original waste generators are included. In addition to paper copies, the report is also transmitted electronically to a web site maintained by the Washington State Department of Ecology.
High-Content Optical Codes for Protecting Rapid Diagnostic Tests from Counterfeiting.
Gökçe, Onur; Mercandetti, Cristina; Delamarche, Emmanuel
2018-06-19
Warnings and reports on counterfeit diagnostic devices are released several times a year by regulators and public health agencies. Unfortunately, mishandling, altering, and counterfeiting point-of-care diagnostics (POCDs) and rapid diagnostic tests (RDTs) is lucrative, relatively simple and can lead to devastating consequences. Here, we demonstrate how to implement optical security codes in silicon- and nitrocellulose-based flow paths for device authentication using a smartphone. The codes are created by inkjet spotting inks directly on nitrocellulose or on micropillars. Codes containing up to 32 elements per mm² and 8 colors can encode as many as 10⁴⁵ combinations. Codes on silicon micropillars can be erased by setting a continuous flow path across the entire array of code elements or for nitrocellulose by simply wicking a liquid across the code. Static or labile code elements can further be formed on nitrocellulose to create a hidden code using poly(ethylene glycol) (PEG) or glycerol additives to the inks. More advanced codes having a specific deletion sequence can also be created in silicon microfluidic devices using an array of passive routing nodes, which activate in a particular, programmable sequence. Such codes are simple to fabricate, easy to view, and efficient in coding information; they can be ideally used in combination with information on a package to protect diagnostic devices from counterfeiting.
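The quoted capacity is simple combinatorics: combinations = colors^elements. The 50-element array below is our assumption, chosen because it reproduces the 10⁴⁵ figure; the abstract itself only fixes the density (32 elements per mm²) and the palette (8 colors).

```python
import math

elements, colors = 50, 8          # assumed array size; 8-color palette from the text
combinations = colors ** elements
print(f"{combinations:.3g} combinations")           # ~1.43e+45
print(f"{elements * math.log2(colors):.0f} bits")   # 150 bits of coding capacity
```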
48 CFR 204.7204 - Maintenance of the CAGE file.
Code of Federal Regulations, 2014 CFR
2014-10-01
... electronic equivalent, to— DLA Logistics Information Service, DLIS-SBB, Federal Center, 74 Washington Avenue... Maintenance of the CAGE file. (a) DLA Logistics Information Service will accept written requests for changes...) Additional guidance for maintaining CAGE codes is in Volume 7 of DoD 4100.39-M, Federal Logistics Information...
48 CFR 204.7204 - Maintenance of the CAGE file.
Code of Federal Regulations, 2011 CFR
2011-10-01
... electronic equivalent, to— DLA Logistics Information Service, DLIS-SBB, Federal Center, 74 Washington Avenue... Maintenance of the CAGE file. (a) DLA Logistics Information Service will accept written requests for changes...) Additional guidance for maintaining CAGE codes is in Volume 7 of DoD 4100.39-M, Federal Logistics Information...
48 CFR 204.7204 - Maintenance of the CAGE file.
Code of Federal Regulations, 2013 CFR
2013-10-01
... electronic equivalent, to— DLA Logistics Information Service, DLIS-SBB, Federal Center, 74 Washington Avenue... Maintenance of the CAGE file. (a) DLA Logistics Information Service will accept written requests for changes...) Additional guidance for maintaining CAGE codes is in Volume 7 of DoD 4100.39-M, Federal Logistics Information...
48 CFR 204.7204 - Maintenance of the CAGE file.
Code of Federal Regulations, 2012 CFR
2012-10-01
... electronic equivalent, to— DLA Logistics Information Service, DLIS-SBB, Federal Center, 74 Washington Avenue... Maintenance of the CAGE file. (a) DLA Logistics Information Service will accept written requests for changes...) Additional guidance for maintaining CAGE codes is in Volume 7 of DoD 4100.39-M, Federal Logistics Information...
Designing and maintaining an effective chargemaster.
Abbey, D C
2001-03-01
The chargemaster is the central repository of charges and associated coding information used to develop claims. But this simple description belies the chargemaster's true complexity. The chargemaster's role in the coding process differs from department to department, and not all codes provided on a claim form are necessarily included in the chargemaster, as codes for complex services may need to be developed and reviewed by coding staff. In addition, with the rise of managed care, the chargemaster increasingly is being used to track utilization of supplies and services. To ensure that the chargemaster performs all of its functions effectively, hospitals should appoint a chargemaster coordinator, supported by a chargemaster review team, to oversee the design and maintenance of the chargemaster. Important design issues that should be considered include the principle of "form follows function," static versus dynamic coding, how modifiers should be treated, how charges should be developed, how to incorporate physician fee schedules into the chargemaster, the interface between the chargemaster and cost reports, and how to include statistical information for tracking utilization.
Pattern analysis of fraud case in Taiwan, China and Indonesia
NASA Astrophysics Data System (ADS)
Kusumo, A. H.; Chi, C.-F.; Dewi, R. S.
2017-11-01
The current study analyzed 125 successful fraud cases that occurred in Taiwan, China, and Indonesia from 2008 to 2012, as published in English online newspapers. Each case report was coded in terms of the scam principle, the information media (information exchange between fraudsters and victim), the money media (media used by fraudsters to obtain unauthorized financial benefit) and other additional information judged to be relevant. The Chi-square Automatic Interaction Detector (CHAID) was applied to the coded data on information media, scam principle and money media to find a subset of predictors that might yield meaningful classifications. A series of flow diagrams was constructed based on the CHAID results to illustrate the flow of information (scams) travelling from information media to money media.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-04
... use of statistical compilations of data under section 7216 of the Internal Revenue Code (Code) by a... preparation business, including identification of additional limited circumstances when a tax return preparer... tax return business under Sec. 301.7216-2(n); disclose and use statistical compilations of data...
Role of non-coding RNAs in non-aging-related neurological disorders.
Vieira, A S; Dogini, D B; Lopes-Cendes, I
2018-06-11
Protein-coding sequences represent only 2% of the human genome. Recent advances have demonstrated that a significant portion of the genome is actively transcribed as non-coding RNA molecules. These non-coding RNAs are emerging as key players in the regulation of biological processes and act as "fine-tuners" of gene expression. Neurological disorders are caused by a wide range of genetic mutations, epigenetic and environmental factors, and the exact pathophysiology of many of these conditions is still unknown. It is currently recognized that dysregulation in the expression of non-coding RNAs is present in many neurological disorders and may be relevant in the mechanisms leading to disease. In addition, circulating non-coding RNAs are emerging as potential biomarkers with great potential impact on clinical practice. In this review, we discuss mainly the role of microRNAs and long non-coding RNAs in several neurological disorders, such as epilepsy, Huntington disease, fragile X-associated ataxia, spinocerebellar ataxias, amyotrophic lateral sclerosis (ALS), and pain. In addition, we give information about the conditions in which microRNAs have been demonstrated to be potential biomarkers, such as epilepsy, pain, and ALS.
The neuronal encoding of information in the brain.
Rolls, Edmund T; Treves, Alessandro
2011-11-01
We describe the results of quantitative information theoretic analyses of neural encoding, particularly in the primate visual, olfactory, taste, hippocampal, and orbitofrontal cortex. Most of the information turns out to be encoded by the firing rates of the neurons, that is by the number of spikes in a short time window. This has been shown to be a robust code, for the firing rate representations of different neurons are close to independent for small populations of neurons. Moreover, the information can be read fast from such encoding, in as little as 20 ms. In quantitative information theoretic studies, only a little additional information is available in temporal encoding involving stimulus-dependent synchronization of different neurons, or the timing of spikes within the spike train of a single neuron. Feature binding appears to be solved by feature combination neurons rather than by temporal synchrony. The code is sparse distributed, with the spike firing rate distributions close to exponential or gamma. A feature of the code is that it can be read by neurons that take a synaptically weighted sum of their inputs. This dot product decoding is biologically plausible. Understanding the neural code is fundamental to understanding not only how the cortex represents, but also processes, information. Copyright © 2011 Elsevier Ltd. All rights reserved.
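A sketch of the biologically plausible dot-product readout mentioned above, under invented assumptions (roughly exponential rate tuning, Poisson counts in a short window): decoding is just a synaptically weighted sum of input rates followed by an argmax.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_stimuli = 40, 4

# Sparse, roughly exponential firing-rate tuning for each stimulus.
tuning = rng.exponential(5.0, (n_stimuli, n_neurons))

def dot_product_decode(rates, templates):
    """Decode by a synaptically weighted sum: pick the stimulus whose
    mean-rate template has the largest dot product with the response."""
    return int(np.argmax(templates @ rates))

true = 2
rates = rng.poisson(tuning[true])   # spike counts in a short time window
print(dot_product_decode(rates, tuning) == true)
```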
2013-01-01
Background: Primary care databases are a major source of data for epidemiological and health services research. However, most studies are based on coded information, ignoring information stored in free text. Using the early presentation of rheumatoid arthritis (RA) as an exemplar, our objective was to estimate the extent of data hidden within free text, using a keyword search. Methods: We examined the electronic health records (EHRs) of 6,387 patients from the UK, aged 30 years and older, with a first coded diagnosis of RA between 2005 and 2008. We listed indicators for RA which were present in coded format and ran keyword searches for similar information held in free text. The frequencies of indicator code groups and keywords from one year before to 14 days after RA diagnosis were compared, and temporal relationships examined. Results: One or more keywords for RA were found in the free text in 29% of patients prior to the RA diagnostic code. Keywords for inflammatory arthritis diagnoses were present for 14% of patients, whereas only 11% had a diagnostic code. Codes for synovitis were found in 3% of patients, but keywords were identified in an additional 17%. In 13% of patients there was evidence of a positive rheumatoid factor test in text only, uncoded. No gender differences were found. Keywords generally occurred close in time to the coded diagnosis of rheumatoid arthritis. They were often found under codes indicating letters and communications. Conclusions: Potential cases may be missed or wrongly dated when coded data alone are used to identify patients with RA, as diagnostic suspicions are frequently confined to text. The use of EHRs to create disease registers or assess quality of care will be misleading if free text information is not taken into account. Methods to facilitate the automated processing of text need to be developed and implemented. PMID:23964710
NASA Technical Reports Server (NTRS)
Abbott, T. S.; Moen, G. C.; Person, L. H., Jr.; Keyser, G. L., Jr.; Yenni, K. R.; Garren, J. F., Jr.
1980-01-01
Traffic symbology was encoded to provide additional information concerning the traffic, which was displayed on the pilot's electronic horizontal situation indicators (EHSI). A research airplane representing an advanced operational environment was used to assess the benefit of coded traffic symbology in a realistic work-load environment. Traffic scenarios, involving both conflict-free and conflict situations, were employed. Subjective pilot commentary was obtained through the use of a questionnaire and extensive pilot debriefings. These results grouped conveniently under two categories: display factors and task performance. A major item under the display factor category was the problem of display clutter. The primary contributors to clutter were the use of large map-scale factors, the use of traffic data blocks, and the presentation of more than a few airplanes. In terms of task performance, the cockpit-displayed traffic information was found to provide excellent overall situation awareness. Additionally, mile separation prescribed during these tests.
Hanford Facility Annual Dangerous Waste Report Calendar Year 2002
DOE Office of Scientific and Technical Information (OSTI.GOV)
FREEMAN, D.A.
2003-02-01
Hanford CY 2002 dangerous waste generation and management forms. The Hanford Facility Annual Dangerous Waste Report (ADWR) is prepared to meet the requirements of Washington Administrative Code Sections 173-303-220, Generator Reporting, and 173-303-390, Facility Reporting. In addition, the ADWR is required to meet Hanford Facility RCRA Permit Condition I.E.22, Annual Reporting. The ADWR provides summary information on dangerous waste generation and management activities for the Calendar Year for the Hanford Facility EPA ID number assigned to the Department of Energy for RCRA regulated waste, as well as Washington State only designated waste and radioactive mixed waste. The Solid Waste Information and Tracking System (SWITS) database is utilized to collect and compile the large array of data needed for preparation of this report. Information includes details of waste generated on the Hanford Facility, waste generated offsite and sent to Hanford for management, and other waste management activities conducted at Hanford, including treatment, storage, and disposal. Report details consist of waste descriptions and weights, waste codes and designations, and waste handling codes. In addition, for waste shipped to Hanford for treatment and/or disposal, information on manifest numbers, the waste transporter, the waste receiving facility, and the original waste generators are included. In addition to paper copies, electronic copies of the report are also transmitted to the regulatory agency.
Flexible patient information search and retrieval framework: pilot implementation
NASA Astrophysics Data System (ADS)
Erdal, Selnur; Catalyurek, Umit V.; Saltz, Joel; Kamal, Jyoti; Gurcan, Metin N.
2007-03-01
Medical centers collect and store significant amounts of valuable data pertaining to patients' visits in the form of medical free text. In addition, standardized diagnosis codes (International Classification of Diseases, Ninth Revision, Clinical Modification: ICD9-CM) related to those dictated reports are usually available. In this work, we have created a framework in which image searches can be initiated through a combination of free-text reports and ICD9 codes. This framework enables more comprehensive searches of existing large sets of patient data in a systematic way. The free-text search is enriched by the computer-aided inclusion of additional search terms drawn from a thesaurus. This enriched search allows users to access a larger set of relevant results from a patient-centric PACS in a simpler way. Such a framework is therefore of particular use in tasks such as gathering images for desired patient populations, building disease models, and so on. As the motivating application of our framework, we implemented a search engine. This search engine processed two years of patient data from the OSU Medical Center's Information Warehouse and identified lung nodule location information using a combination of UMLS Meta-Thesaurus-enhanced text report searches and ICD9 code searches on discharged patients. Five different queries with various ICD9 codes involving lung cancer were carried out on 172552 cases. Each search was completed in under a minute on average per ICD9 code, and the inclusion of the UMLS thesaurus increased the number of relevant cases by 45% on average.
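A schematic of the combined query logic, with a hypothetical record layout (the icd9/report field names and the example records are ours, not from the framework; ICD-9 162.x covers malignant neoplasm of the lung):

```python
def search_cases(records, icd9_prefixes, keywords):
    """Select patient records by ICD9 code prefix, then require a
    thesaurus-expanded keyword to appear in the free-text report."""
    hits = []
    for rec in records:
        code_ok = any(code.startswith(p) for code in rec["icd9"]
                      for p in icd9_prefixes)
        text_ok = any(kw in rec["report"].lower() for kw in keywords)
        if code_ok and text_ok:
            hits.append(rec["id"])
    return hits

records = [
    {"id": 1, "icd9": ["162.9"], "report": "Nodule in the right upper lobe."},
    {"id": 2, "icd9": ["786.6"], "report": "No pulmonary nodule identified."},
]
# "nodule" plus a thesaurus-derived synonym phrase:
print(search_cases(records, ["162"], ["nodule", "pulmonary mass"]))
```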
Lincoln, A; Sorock, G; Courtney, T; Wellman, H; Smith, G; Amoroso, P
2004-01-01
Objective: To determine whether narrative text in safety reports contains sufficient information regarding contributing factors and precipitating mechanisms to prioritize occupational back injury prevention strategies. Design, setting, subjects, and main outcome measures: Nine essential data elements were identified in narratives and coded sections of safety reports for each of 94 cases of back injuries to United States Army truck drivers reported to the United States Army Safety Center between 1987 and 1997. The essential elements of each case were used to reconstruct standardized event sequences. A taxonomy of the event sequences was then developed to identify common hazard scenarios and opportunities for primary interventions. Results: Coded data typically only identified five data elements (broad activity, task, event/exposure, nature of injury, and outcomes) while narratives provided additional elements (contributing factor, precipitating mechanism, primary source) essential for developing our taxonomy. Three hazard scenarios were associated with back injuries among Army truck drivers accounting for 83% of cases: struck by/against events during motor vehicle crashes; falls resulting from slips/trips or loss of balance; and overexertion from lifting activities. Conclusions: Coded data from safety investigations lacked sufficient information to thoroughly characterize the injury event. However, the combination of existing narrative text (similar to that collected by many injury surveillance systems) and coded data enabled us to develop a more complete taxonomy of injury event characteristics and identify common hazard scenarios. This study demonstrates that narrative text can provide the additional information on contributing factors and precipitating mechanisms needed to target prevention strategies. PMID:15314055
Spatiotemporal coding of inputs for a system of globally coupled phase oscillators
NASA Astrophysics Data System (ADS)
Wordsworth, John; Ashwin, Peter
2008-12-01
We investigate the spatiotemporal coding of low-amplitude inputs to a simple system of globally coupled phase oscillators with coupling function g(ϕ)=-sin(ϕ+α)+r sin(2ϕ+β) that has robust heteroclinic cycles (slow switching between cluster states). The inputs correspond to detunings of the oscillators. It was recently noted that globally coupled phase oscillators can encode their frequencies in the form of spatiotemporal codes of a sequence of cluster states [P. Ashwin, G. Orosz, J. Wordsworth, and S. Townley, SIAM J. Appl. Dyn. Syst. 6, 728 (2007)]. Concentrating on the case of N=5 oscillators, we show in detail how the spatiotemporal coding can be used to resolve all of the information that relates the individual inputs to each other, provided that a long enough time series is considered. We investigate robustness to the addition of noise and find a remarkable stability, especially of the temporal coding, even for noise of a magnitude comparable to the inputs.
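A minimal simulation sketch of the system described, using the abstract's coupling function; the parameter values (α, β, r), the Euler step, and the detuning vector are illustrative assumptions, not values from the paper.

```python
import numpy as np

def g(phi, alpha=1.8, beta=-2.0, r=0.2):
    """Coupling function from the abstract: g(x) = -sin(x+alpha) + r*sin(2x+beta)."""
    return -np.sin(phi + alpha) + r * np.sin(2 * phi + beta)

def simulate(omega, t_end=200.0, dt=0.01):
    """Euler integration of N globally coupled phase oscillators:
    dphi_i/dt = omega_i + (1/N) * sum_j g(phi_i - phi_j)."""
    n = len(omega)
    phi = 2 * np.pi * np.random.default_rng(0).random(n)
    traj = []
    for _ in range(int(t_end / dt)):
        coupling = g(phi[:, None] - phi[None, :]).mean(axis=1)
        phi = (phi + dt * (omega + coupling)) % (2 * np.pi)
        traj.append(phi.copy())
    return np.array(traj)

detuning = 1e-3 * np.array([0.3, -0.1, 0.5, 0.0, -0.4])  # the low-amplitude inputs
traj = simulate(1.0 + detuning)
print(traj.shape)  # the sequence of cluster states visited encodes the detunings
```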
Exposure calculation code module for reactor core analysis: BURNER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.; Cunningham, G.W.
1979-02-01
The code module BURNER for nuclear reactor exposure calculations is presented. The computer requirements are shown, as are the reference data and interface data file requirements, and the programmed equations and procedure of calculation are described. The operating history of a reactor is followed over the period between solutions of the space-energy neutronics problem. The end-of-period nuclide concentrations are determined given the necessary information. A steady-state, continuous fueling model is treated in addition to the usual fixed fuel model. The control options provide flexibility to select among an unusually wide variety of programmed procedures. The code also provides the user the option to make a number of auxiliary calculations and print such information as the local gamma source, cumulative exposure, and a fine-scale power density distribution in a selected zone. The code is used locally in a system for computation which contains the VENTURE diffusion theory neutronics code and other modules.
Stimulus information contaminates summation tests of independent neural representations of features
NASA Technical Reports Server (NTRS)
Shimozaki, Steven S.; Eckstein, Miguel P.; Abbey, Craig K.
2002-01-01
Many models of visual processing assume that visual information is analyzed into separable and independent neural codes, or features. A common psychophysical test of independent features is known as a summation study, which measures performance in a detection, discrimination, or visual search task as the number of proposed features increases. Improvement in human performance with increasing number of available features is typically attributed to the summation, or combination, of information across independent neural coding of the features. In many instances, however, increasing the number of available features also increases the stimulus information in the task, as assessed by an optimal observer that does not include the independent neural codes. In a visual search task with spatial frequency and orientation as the component features, a particular set of stimuli were chosen so that all searches had equivalent stimulus information, regardless of the number of features. In this case, human performance did not improve with increasing number of features, implying that the improvement observed with additional features may be due to stimulus information and not the combination across independent features.
Adaptive software-defined coded modulation for ultra-high-speed optical transport
NASA Astrophysics Data System (ADS)
Djordjevic, Ivan B.; Zhang, Yequn
2013-10-01
In optically-routed networks, different wavelength channels carrying the traffic to different destinations can have quite different optical signal-to-noise ratios (OSNRs), and the signal is differently impacted by various channel impairments. Regardless of the data destination, an optical transport system (OTS) must provide the target bit-error rate (BER) performance. To provide the target BER regardless of the data destination, we adjust the forward error correction (FEC) strength. Depending on the information obtained from the monitoring channels, we select the code rate matching the OSNR range into which the current channel OSNR falls. To avoid frame synchronization issues, we keep the codeword length fixed, independent of the FEC code being employed. The common denominator is the employment of quasi-cyclic (QC-) LDPC codes in FEC. For high-speed implementation, low-complexity LDPC decoding algorithms are needed, and some of them will be described in this invited paper. Instead of conventional QAM based modulation schemes, we employ the signal constellations obtained by the optimum signal constellation design (OSCD) algorithm. To improve the spectral efficiency, we perform simultaneous rate adaptation and signal constellation size selection so that the product (number of bits per symbol) × (code rate) is closest to the channel capacity. Further, we describe the advantages of using 4D signaling instead of polarization-division multiplexed (PDM) QAM, by using 4D MAP detection, combined with LDPC coding, in a turbo equalization fashion. Finally, to solve the problems related to the limited bandwidth of information infrastructure, high energy consumption, and heterogeneity of optical networks, we describe an adaptive energy-efficient hybrid coded-modulation scheme, which in addition to amplitude, phase, and polarization state employs the spatial modes as additional basis functions for multidimensional coded-modulation.
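The rate-selection rule described above, matching bits per symbol × code rate to the capacity implied by the monitored OSNR, can be sketched in a few lines. The candidate mode table and the direct use of SNR for capacity below are illustrative assumptions, not values from the paper.

```python
import math

# Candidate (bits per symbol, code rate) pairs; an assumed illustrative table.
CANDIDATES = [(2, 0.8), (2, 0.9), (3, 0.8), (4, 0.75), (4, 0.9), (6, 0.8)]

def capacity_bits_per_symbol(snr_linear):
    # Shannon capacity of a complex AWGN channel, per symbol
    return math.log2(1.0 + snr_linear)

def select_mode(snr_db):
    cap = capacity_bits_per_symbol(10 ** (snr_db / 10))
    feasible = [(b, r) for b, r in CANDIDATES if b * r <= cap]
    if not feasible:
        # channel too poor for any mode: fall back to the most robust one
        return min(CANDIDATES, key=lambda br: br[0] * br[1])
    # pick the feasible mode whose spectral efficiency is closest to capacity
    return max(feasible, key=lambda br: br[0] * br[1])

print(select_mode(12.0))  # e.g. the mode chosen for a channel at 12 dB
```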
User Manual for the NASA Glenn Ice Accretion Code LEWICE. Version 2.2.2
NASA Technical Reports Server (NTRS)
Wright, William B.
2002-01-01
A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 2.2.2 of this code, which is called LEWICE. This version differs from release 2.0 due to the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air applications. An extensive effort was also undertaken to compare the results against the database of electrothermal results which have been generated in the NASA Glenn Icing Research Tunnel (IRT) as was performed for the validation effort for version 2.0. This report will primarily describe the features of the software related to the use of the program. Appendix A of this report has been included to list some of the inner workings of the software or the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.
Validation of asthma recording in the Clinical Practice Research Datalink (CPRD)
Morales, Daniel R; Mullerova, Hana; Smeeth, Liam; Douglas, Ian J; Quint, Jennifer K
2017-01-01
Objectives The optimal method of identifying people with asthma from electronic health records in primary care is not known. The aim of this study is to determine the positive predictive value (PPV) of different algorithms using clinical codes and prescription data to identify people with asthma in the United Kingdom Clinical Practice Research Datalink (CPRD). Methods 684 participants registered with a general practitioner (GP) practice contributing to CPRD between 1 December 2013 and 30 November 2015 were selected according to one of eight predefined potential asthma identification algorithms. A questionnaire was sent to the GPs to confirm asthma status and provide additional information to support an asthma diagnosis. Two study physicians independently reviewed and adjudicated the questionnaires and additional information to form a gold standard for asthma diagnosis. The PPV was calculated for each algorithm. Results 684 questionnaires were sent, of which 494 (72%) were returned and 475 (69%) were complete and analysed. All five algorithms including a specific Read code indicating asthma or a non-specific Read code accompanied by additional conditions performed well. The PPV for asthma diagnosis using only a specific asthma code was 86.4% (95% CI 77.4% to 95.4%). Extra information on asthma medication prescription (PPV 83.3%), evidence of reversibility testing (PPV 86.0%) or a combination of all three selection criteria (PPV 86.4%) did not result in a higher PPV. The algorithm using non-specific asthma codes, information on reversibility testing and respiratory medication use scored highest (PPV 90.7%, 95% CI 82.8% to 98.7%), but had a much lower identifiable population. Algorithms based on asthma symptom codes had low PPVs (43.1% to 57.8%). Conclusions People with asthma can be accurately identified from UK primary care records using specific Read codes. The inclusion of spirometry or asthma medications in the algorithm did not clearly improve accuracy. Ethics and dissemination The protocol for this research was approved by the Independent Scientific Advisory Committee (ISAC) for MHRA Database Research (protocol number 15_257) and the approved protocol was made available to the journal and reviewers during peer review. Generic ethical approval for observational research using the CPRD with approval from ISAC has been granted by a Health Research Authority Research Ethics Committee (East Midlands—Derby, REC reference number 05/MRE04/87). The results will be submitted for publication and will be disseminated through research conferences and peer-reviewed journals. PMID:28801439
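For readers replicating this kind of validation, a PPV and its 95% CI follow directly from the confirmed and selected counts. A minimal sketch using a normal approximation; the counts below are hypothetical, chosen only to land near the quoted 86.4% figure.

```python
import math

def ppv_with_ci(true_pos, selected, z=1.96):
    """PPV = confirmed cases / selected cases, with a Wald 95% CI."""
    p = true_pos / selected
    se = math.sqrt(p * (1 - p) / selected)
    return p, (p - z * se, p + z * se)

ppv, (lo, hi) = ppv_with_ci(true_pos=51, selected=59)  # hypothetical counts
print(f"PPV {ppv:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```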
76 FR 59024 - Federal Government Participation in the Automated Clearing House
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-23
... Act of 2008, account-related information could be shared only for certain types of benefit ... international payment transactions using a new Standard Entry Class Code and to include certain information in ... Control (OFAC). In addition, the rule requires financial institutions to provide limited account-related ...
Coding and Quantization in Communications and Microeconomics
ERIC Educational Resources Information Center
Xu, Yun
2013-01-01
Since information theory was developed by Claude E. Shannon, in addition to its primary role in communications and networking, it has broadened to find applications in many other areas of science and technology, such as microeconomics, statistics, and neuroscience. This thesis investigates the application of information theoretic viewpoints to two…
Comparison of procedure coding systems for level 1 and 2 hospitals in South Africa.
Montewa, Lebogang; Hanmer, Lyn; Reagon, Gavin
2013-01-01
The ability of three procedure coding systems to reflect the procedure concepts extracted from patient records from six hospitals was compared, in order to inform decision making about a procedure coding standard for South Africa. A convenience sample of 126 procedure concepts was extracted from patient records at three level 1 hospitals and three level 2 hospitals. Each procedure concept was coded using ICPC-2, ICD-9-CM, and CCSA-2001. The extent to which each assigned code actually reflected the procedure concept was evaluated (from 'no match' to 'complete match'). For the study sample, CCSA-2001 was found to reflect the procedure concepts most completely, followed by ICD-9-CM and then ICPC-2. In practice, decision making about procedure coding standards would depend on multiple factors in addition to coding accuracy.
Structured Low-Density Parity-Check Codes with Bandwidth Efficient Modulation
NASA Technical Reports Server (NTRS)
Cheng, Michael K.; Divsalar, Dariush; Duy, Stephanie
2009-01-01
In this work, we study the performance of structured Low-Density Parity-Check (LDPC) Codes together with bandwidth efficient modulations. We consider protograph-based LDPC codes that facilitate high-speed hardware implementations and have minimum distances that grow linearly with block sizes. We cover various higher-order modulations such as 8-PSK, 16-APSK, and 16-QAM. During demodulation, a demapper transforms the received in-phase and quadrature samples into reliability information that feeds the binary LDPC decoder. We will compare various low-complexity demappers and provide simulation results for assorted coded-modulation combinations on the additive white Gaussian noise and independent Rayleigh fading channels.
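The demapper step described above can be sketched compactly. Below is a minimal max-log demapper for 8-PSK under AWGN; the bit labeling, noise variance, and LLR sign convention are illustrative assumptions, not the configurations tested in the paper.

```python
import numpy as np

LABELS = np.arange(8)                            # assumed 3-bit labels 0..7
CONST = np.exp(1j * 2 * np.pi * LABELS / 8)      # unit-energy 8-PSK constellation

def maxlog_llrs(y, noise_var):
    """Turn one received I/Q sample into per-bit LLRs for the binary LDPC decoder."""
    d2 = np.abs(y - CONST) ** 2 / noise_var      # scaled squared distances to each point
    llrs = []
    for bit in range(3):
        mask = (LABELS >> bit) & 1
        # max-log approximation; positive LLR favors bit = 0 under this convention
        llrs.append(d2[mask == 1].min() - d2[mask == 0].min())
    return llrs

print(maxlog_llrs(0.9 + 0.1j, noise_var=0.2))
```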
[The QR code in society, economy and medicine--fields of application, options and chances].
Flaig, Benno; Parzeller, Markus
2011-01-01
2D codes like the QR Code ("Quick Response") are becoming more and more common in society and medicine. The application spectrum and benefits in medicine and other fields are described. 2D codes can be created free of charge on any computer with internet access without any previous knowledge. The codes can be easily used in publications, presentations, on business cards and posters. Editors choose between contact details, text or a hyperlink as information behind the code. At expert conferences, linkage by QR Code allows the audience to download presentations and posters quickly. The documents obtained can then be saved, printed, processed etc. Fast access to stored data in the internet makes it possible to integrate additional and explanatory multilingual videos into medical posters. In this context, a combination of different technologies (printed handout, QR Code and screen) may be reasonable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This document contains the State Building Energy Codes Status prepared by Pacific Northwest National Laboratory for the U.S. Department of Energy under Contract DE-AC06-76RL01830 and dated September 1996. The U.S. Department of Energy's Office of Codes and Standards has developed this document to provide an information resource for individuals interested in energy efficiency of buildings and the relevant building energy codes in each state and U.S. territory. This is considered to be an evolving document and will be updated twice a year. In addition, special state updates will be issued as warranted.
Side-information-dependent correlation channel estimation in hash-based distributed video coding.
Deligiannis, Nikos; Barbarien, Joeri; Jacobs, Marc; Munteanu, Adrian; Skodras, Athanassios; Schelkens, Peter
2012-04-01
In the context of low-cost video encoding, distributed video coding (DVC) has recently emerged as a potential candidate for uplink-oriented applications. This paper builds on a concept of correlation channel (CC) modeling, which expresses the correlation noise as being statistically dependent on the side information (SI). Compared with classical side-information-independent (SII) noise modeling adopted in current DVC solutions, it is theoretically proven that side-information-dependent (SID) modeling improves the Wyner-Ziv coding performance. Anchored in this finding, this paper proposes a novel algorithm for online estimation of the SID CC parameters based on already decoded information. The proposed algorithm enables bit-plane-by-bit-plane successive refinement of the channel estimation leading to progressively improved accuracy. Additionally, the proposed algorithm is included in a novel DVC architecture that employs a competitive hash-based motion estimation technique to generate high-quality SI at the decoder. Experimental results corroborate our theoretical gains and validate the accuracy of the channel estimation algorithm. The performance assessment of the proposed architecture shows remarkable and consistent coding gains over a germane group of state-of-the-art distributed and standard video codecs, even under strenuous conditions, i.e., large groups of pictures and highly irregular motion content.
Population coding in sparsely connected networks of noisy neurons.
Tripp, Bryan P; Orchard, Jeff
2012-01-01
This study examines the relationship between population coding and spatial connection statistics in networks of noisy neurons. Encoding of sensory information in the neocortex is thought to require coordinated neural populations, because individual cortical neurons respond to a wide range of stimuli, and exhibit highly variable spiking in response to repeated stimuli. Population coding is rooted in network structure, because cortical neurons receive information only from other neurons, and because the information they encode must be decoded by other neurons, if it is to affect behavior. However, population coding theory has often ignored network structure, or assumed discrete, fully connected populations (in contrast with the sparsely connected, continuous sheet of the cortex). In this study, we modeled a sheet of cortical neurons with sparse, primarily local connections, and found that a network with this structure could encode multiple internal state variables with high signal-to-noise ratio. However, we were unable to create high-fidelity networks by instantiating connections at random according to spatial connection probabilities. In our models, high-fidelity networks required additional structure, with higher cluster factors and correlations between the inputs to nearby neurons.
Knowledge management: Role of the Radiation Safety Information Computational Center (RSICC)
NASA Astrophysics Data System (ADS)
Valentine, Timothy
2017-09-01
The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.
Bar code, good for industry and trade--how does it benefit the dentist?
Oehlmann, H
2001-10-01
Every dentist who attentively follows the change in product labelling can easily see that the HIBC bar code is on the increase. In fact, according to information from FIDE/VDDI and ADE/BVD, the dental industry and trade are firmly resolved to apply the HIBC bar code to all products used internationally in dental practices. Why? Indeed, at first it looks like extra expense to additionally print a bar code on the packages. Good reasons can only lie in advantages which manufacturers and the trade expect from the HIBC bar code. Indications in dental technician circles are that the HIBC bar code is coming. If there are advantages, what are these, and can the dentist also profit from them? What does HIBC bar code mean and what items of interest does it include? What does bar code cost and does only one code exist? This is explained briefly, concentrating on the benefits bar code can bring for different users.
Nang, Roberto N; Monahan, Felicia; Diehl, Glendon B; French, Daniel
2015-04-01
Many institutions collect reports in databases to make important lessons-learned available to their members. The Uniformed Services University of the Health Sciences collaborated with the Peacekeeping and Stability Operations Institute to conduct a descriptive and qualitative analysis of global health engagements (GHEs) contained in the Stability Operations Lessons Learned and Information Management System (SOLLIMS). This study used a summative qualitative content analysis approach involving six steps: (1) a comprehensive search; (2) a two-stage reading and screening process to identify first-hand, health-related records; (3) qualitative and quantitative data analysis using MAXQDA, a software program; (4) a word cloud to illustrate word frequencies and interrelationships; (5) coding of individual themes and validation of the coding scheme; and (6) identification of relationships in the data and overarching lessons-learned. The individual codes with the greatest number of coded text segments included: planning, personnel, interorganizational coordination, communication/information sharing, and resources/supplies. When compared to the Department of Defense's (DoD's) evolving GHE principles and capabilities, the SOLLIMS coding scheme appeared to align well with the list of GHE capabilities developed by the Department of Defense Global Health Working Group. The results of this study will inform practitioners of global health and encourage additional qualitative analysis of other lessons-learned databases. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
NASA Technical Reports Server (NTRS)
Rice, R. F.
1974-01-01
End-to-end system considerations involving channel coding and data compression are reported which could drastically improve the efficiency in communicating pictorial information from future planetary spacecraft. In addition to presenting new and potentially significant system considerations, this report attempts to fill a need for a comprehensive tutorial which makes much of this very subject accessible to readers whose disciplines lie outside of communication theory.
Architectural Coatings: National Volatile Organic Compounds Emission Standards
Read about the section 183(e) rule for volatile organic compounds for architectural coatings. Read the rule summary and history, and find the Code of Federal Regulations text and additional documents, including compliance information.
Hanford facility dangerous waste permit application, general information portion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hays, C.B.
1998-05-19
The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needed by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in this report).
Least Reliable Bits Coding (LRBC) for high data rate satellite communications
NASA Technical Reports Server (NTRS)
Vanderaar, Mark; Wagner, Paul; Budinger, James
1992-01-01
An analysis and discussion of a bandwidth efficient multi-level/multi-stage block coded modulation technique called Least Reliable Bits Coding (LRBC) is presented. LRBC uses simple multi-level component codes that provide increased error protection on increasingly unreliable modulated bits in order to maintain an overall high code rate that increases spectral efficiency. Further, soft-decision multi-stage decoding is used to make decisions on unprotected bits through corrections made on more protected bits. Using analytical expressions and tight performance bounds it is shown that LRBC can achieve increased spectral efficiency and maintain equivalent or better power efficiency compared to that of Binary Phase Shift Keying (BPSK). Bit error rates (BER) vs. channel bit energy with Additive White Gaussian Noise (AWGN) are given for a set of LRB Reed-Solomon (RS) encoded 8PSK modulation formats with an ensemble rate of 8/9. All formats exhibit a spectral efficiency of 2.67 = (log2(8))(8/9) information bps/Hz. Bit by bit coded and uncoded error probabilities with soft-decision information are determined. These are traded off with code rate to determine parameters that achieve good performance. The relative simplicity of Galois field algebra vs. the Viterbi algorithm and the availability of high speed commercial Very Large Scale Integration (VLSI) for block codes indicates that LRBC using block codes is a desirable method for high data rate implementations.
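The quoted spectral efficiency follows directly from the figures in the abstract; a one-line check:

```python
from math import log2

bits_per_symbol = log2(8)   # 8PSK carries 3 coded bits per symbol
ensemble_rate = 8 / 9       # overall code rate across the bit levels
print(bits_per_symbol * ensemble_rate)   # -> 2.666... information bps/Hz
```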
ERIC Educational Resources Information Center
McCabe, Paul C.; Marshall, Debra J.
2006-01-01
The correspondence between direct observation and informant ratings of preschool children with specific language impairment (SLI) was investigated. Preschoolers with and without SLI were observed during free play using the "Social Interactive Coding System" (SICS; Rice, Sell, & Hadley, 1990). In addition, teachers and parents…
Oil and gas field code master list 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This is the thirteenth annual edition of the Energy Information Administration's (EIA) Oil and Gas Field Code Master List. It reflects data collected through October 1994 and provides standardized field name spellings and codes for all identified oil and/or gas fields in the United States. The master field name spellings and codes are to be used by respondents when filing the following Department of Energy (DOE) forms: Form EIA-23, "Annual Survey of Domestic Oil and Gas Reserves," filed by oil and gas well operators (field codes are required from larger operators only); Forms FERC 8 and EIA-191, "Underground Gas Storage Report," filed by natural gas producers and distributors who operate underground natural gas storage facilities. Other Federal and State government agencies, as well as industry, use the EIA Oil and Gas Field Code Master List as the standard for field identification. A machine-readable version of the Oil and Gas Field Code Master List is available from the National Technical Information Service, 5285 Port Royal Road, Springfield, Virginia 22161, (703) 487-4650. In order for the Master List to be useful, it must be accurate and remain current. To accomplish this, EIA constantly reviews and revises this list. The EIA welcomes all comments, corrections, and additions to the Master List. All such information should be given to the EIA Field Code Coordinator at (214) 953-1858. EIA gratefully acknowledges the assistance provided by numerous State organizations and trade associations in verifying the existence of fields and their official nomenclature.
Operational rate-distortion performance for joint source and channel coding of images.
Ruf, M J; Modestino, J W
1999-01-01
This paper describes a methodology for evaluating the operational rate-distortion behavior of combined source and channel coding schemes with particular application to images. In particular, we demonstrate use of the operational rate-distortion function to obtain the optimum tradeoff between source coding accuracy and channel error protection under the constraint of a fixed transmission bandwidth for the investigated transmission schemes. Furthermore, we develop information-theoretic bounds on performance for specific source and channel coding systems and demonstrate that our combined source-channel coding methodology applied to different schemes results in operational rate-distortion performance which closely approaches these theoretical limits. We concentrate specifically on a wavelet-based subband source coding scheme and the use of binary rate-compatible punctured convolutional (RCPC) codes for transmission over the additive white Gaussian noise (AWGN) channel. Explicit results for real-world images demonstrate the efficacy of this approach.
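The tradeoff described above can be caricatured as a small search over operating points. The sketch below assumes a fixed channel-bit budget, an exponential rate-distortion model for the source, and a stand-in residual-error curve for the channel code; all three models are placeholders for illustration, not the paper's measured curves.

```python
BUDGET = 1.0  # fixed transmission budget, channel bits per pixel

def residual_error(code_rate, snr_db):
    # Stand-in for tabulated RCPC performance: lower-rate (stronger) codes do better.
    return min(1.0, 10 ** (-5 * snr_db * (1.05 - code_rate)))

def expected_distortion(source_rate, code_rate, snr_db):
    d_source = 2 ** (-2 * source_rate)          # assumed exponential R-D model
    d_channel = 50.0 * residual_error(code_rate, snr_db)  # assumed error penalty
    return d_source + d_channel

def best_operating_point(snr_db, code_rates=(1/3, 1/2, 2/3, 3/4, 8/9)):
    # Fixed bandwidth: spending more bits on protection leaves fewer for the source.
    points = [(BUDGET * r, r) for r in code_rates]
    return min(points, key=lambda p: expected_distortion(p[0], p[1], snr_db))

print(best_operating_point(snr_db=4.0))  # (source rate, channel code rate)
```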
A Categorization of Dynamic Analyzers
NASA Technical Reports Server (NTRS)
Lujan, Michelle R.
1997-01-01
Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers in contrast are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input/output data.
Green, Nancy
2005-04-01
We developed a Bayesian network coding scheme for annotating biomedical content in layperson-oriented clinical genetics documents. The coding scheme supports the representation of probabilistic and causal relationships among concepts in this domain, at a high enough level of abstraction to capture commonalities among genetic processes and their relationship to health. We are using the coding scheme to annotate a corpus of genetic counseling patient letters as part of the requirements analysis and knowledge acquisition phase of a natural language generation project. This paper describes the coding scheme and presents an evaluation of intercoder reliability for its tag set. In addition to giving examples of use of the coding scheme for analysis of discourse and linguistic features in this genre, we suggest other uses for it in analysis of layperson-oriented text and dialogue in medical communication.
Tsopra, Rosy; Peckham, Daniel; Beirne, Paul; Rodger, Kirsty; Callister, Matthew; White, Helen; Jais, Jean-Philippe; Ghosh, Dipansu; Whitaker, Paul; Clifton, Ian J; Wyatt, Jeremy C
2018-07-01
Coding of diagnoses is important for patient care, hospital management and research. However coding accuracy is often poor and may reflect methods of coding. This study investigates the impact of three alternative coding methods on the inaccuracy of diagnosis codes and hospital reimbursement. Comparisons of coding inaccuracy were made between a list of coded diagnoses obtained by a coder using (i) the discharge summary alone, (ii) case notes and discharge summary, and (iii) discharge summary with the addition of medical input. For each method, inaccuracy was determined for the primary, secondary diagnoses, Healthcare Resource Group (HRG) and estimated hospital reimbursement. These data were then compared with a gold standard derived by a consultant and coder. 107 consecutive patient discharges were analysed. Inaccuracy of diagnosis codes was highest when a coder used the discharge summary alone, and decreased significantly when the coder used the case notes (70% vs 58% respectively, p < 0.0001) or coded from the discharge summary with medical support (70% vs 60% respectively, p < 0.0001). When compared with the gold standard, the percentage of incorrect HRGs was 42% for discharge summary alone, 31% for coding with case notes, and 35% for coding with medical support. The three coding methods resulted in an annual estimated loss of hospital remuneration of between £1.8 M and £16.5 M. The accuracy of diagnosis codes and percentage of correct HRGs improved when coders used either case notes or medical support in addition to the discharge summary. Further emphasis needs to be placed on improving the standard of information recorded in discharge summaries. Copyright © 2018 Elsevier B.V. All rights reserved.
Clay Ceramics Manufacturing: National Emission Standards for Hazardous Air Pollutants (NESHAP)
Learn about the NESHAP regulation for clay ceramic manufacturing by reading the rule summary, rule history, the Code of Federal Regulations, and additional resources such as fact sheets and background information documents.
U.S. Seismic Design Maps Web Application
NASA Astrophysics Data System (ADS)
Martinez, E.; Fee, J.
2015-12-01
The application computes earthquake ground motion design parameters compatible with the International Building Code and other seismic design provisions. It is the primary method for design engineers to obtain ground motion parameters for multiple building codes across the country. When designing new buildings and other structures, engineers around the country use the application. Users specify the design code of interest, location, and other parameters to obtain necessary ground motion information consisting of a high-level executive summary as well as detailed information including maps, data, and graphs. Results are formatted such that they can be directly included in a final engineering report. In addition to single-site analysis, the application supports a batch mode for simultaneous consideration of multiple locations. Finally, an application programming interface (API) is available which allows other application developers to integrate this application's results into larger applications for additional processing. Development on the application has proceeded in an iterative manner working with engineers through email, meetings, and workshops. Each iteration provided new features, improved performance, and usability enhancements. This development approach positioned the application to be integral to the structural design process and is now used to produce over 1800 reports daily. Recent efforts have enhanced the application to be a data-driven, mobile-first, responsive web application. Development is ongoing, and source code has recently been published into the open-source community on GitHub. Open-sourcing the code facilitates improved incorporation of user feedback to add new features ensuring the application's continued success.
Abuagla, Ayat; Badr, Elsheikh
2016-06-30
The WHO Global Code of Practice on the International Recruitment of Health Personnel (hereafter the WHO Code) was adopted by the World Health Assembly in 2010 as a voluntary instrument to address challenges of health worker migration worldwide. To ascertain its relevance and effectiveness, the implementation of the WHO Code needs to be assessed based on country experience; hence, this case study on Sudan. This qualitative study depended mainly on documentary sources in addition to key informant interviews. The authors' experience has informed the analysis. Migration of Sudanese health workers represents a major health system challenge. Over half of Sudanese physicians practice abroad, and new trends show the involvement of other professions and increased feminization. Traditional destinations include Gulf States, especially Saudi Arabia, and Libya, as well as the United Kingdom and the Republic of Ireland. Low salaries, poor work environment, and a lack of adequate professional development are the leading push factors. Massive emigration of skilled health workers has jeopardized coverage and quality of healthcare and health professional education. Poor evidence, lack of a national policy, and active recruitment in addition to labour market problems were barriers to effective migration management in Sudan. The response of destination countries in relation to cooperative arrangements with Sudan as a source country has always been suboptimal, demonstrating less attention to solidarity and ethical dimensions. The WHO Code boosted Sudan's efforts to address health worker migration and health workforce development in general. Improving migration evidence, fostering a national dialogue, and promoting bilateral agreements in addition to catalysing health worker retention strategies are some of the benefits accrued. There are, however, limitations in publicity of the WHO Code and its incorporation into national laws and regulatory frameworks for ethical recruitment. The outlook is bleak for Sudan unless the country designs and implements a robust national policy for migration management and unless prospects for source-destination country collaboration improve within a more sound version of the WHO Code. The WHO Code catalysed some vital steps in managing migration and strengthening the national health workforce in Sudan. Nevertheless, the country has not utilized the full potential of this instrument. Revisions of the WHO Code would benefit much from lessons of its application in the context of developing countries such as Sudan.
Coding tools investigation for next generation video coding based on HEVC
NASA Astrophysics Data System (ADS)
Chen, Jianle; Chen, Ying; Karczewicz, Marta; Li, Xiang; Liu, Hongbin; Zhang, Li; Zhao, Xin
2015-09-01
The new state-of-the-art video coding standard, H.265/HEVC, was finalized in 2013 and achieves roughly 50% bit rate saving compared to its predecessor, H.264/MPEG-4 AVC. This paper provides evidence that there is still potential for further coding efficiency improvements. A brief overview of HEVC is first given in the paper. Then, our improvements on each main module of HEVC are presented. For instance, the recursive quadtree block structure is extended to support larger coding units and transform units. The motion information prediction scheme is improved by advanced temporal motion vector prediction, which inherits the motion information of each small block within a large block from a temporal reference picture. Cross component prediction with a linear prediction model improves intra prediction, and overlapped block motion compensation improves the efficiency of inter prediction. Furthermore, coding of both intra and inter prediction residual is improved by an adaptive multiple transform technique. Finally, in addition to the deblocking filter and SAO, an adaptive loop filter is applied to further enhance the reconstructed picture quality. This paper describes the above-mentioned techniques in detail and evaluates their coding performance benefits based on the common test condition used during HEVC development. The simulation results show that significant performance improvement over the HEVC standard can be achieved, especially for high resolution video materials.
Theory of Coding Informational Simulation.
1981-04-06
reach the value of several thousands; single-progression representation of this value is little attractive due to the unwieldiness. Here we approached a...the moment when the contents of the location counter must be changed to the larger or smaller side. The value and direction of the change are assigned by the...the register of transition is formed by the algebraic addition of the contents of the location counter and the value of a change in the code of the latter (step
NASA Technical Reports Server (NTRS)
Bade, W. L.; Yos, J. M.
1975-01-01
The present, third volume of the final report is a programmer's manual for the code. It provides a listing of the FORTRAN 4 source program; a complete glossary of FORTRAN symbols; a discussion of the purpose and method of operation of each subroutine (including mathematical analyses of special algorithms); and a discussion of the operation of the code on IBM/360 and UNIVAC 1108 systems, including required control cards and the overlay structure used to accommodate the code to the limited core size of the 1108. In addition, similar information is provided to document the programming of the NOZFIT code, which is employed to set up nozzle profile curvefits for use in NATA.
Shannon Entropy of the Canonical Genetic Code
NASA Astrophysics Data System (ADS)
Nemzer, Louis
The probability that a non-synonymous point mutation in DNA will adversely affect the functionality of the resultant protein is greatly reduced if the substitution is conservative. In that case, the amino acid coded by the mutated codon has similar physico-chemical properties to the original. Many simplified alphabets, which group the 20 common amino acids into families, have been proposed. To evaluate these schema objectively, we introduce a novel, quantitative method based on the inherent redundancy in the canonical genetic code. By calculating the Shannon information entropy carried by 1- or 2-bit messages, groupings that best leverage the robustness of the code are identified. The relative importance of properties related to protein folding - like hydropathy and size - and function, including side-chain acidity, can also be estimated. In addition, this approach allows us to quantify the average information value of nucleotide codon positions, and explore the physiological basis for distinguishing between transition and transversion mutations. Supported by NSU PFRDG Grant #335347.
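The entropy computation described above is straightforward to reproduce. The sketch below assumes equiprobable sense codons and one illustrative two-family grouping (hydrophobic versus other); both are assumptions for demonstration, not the author's schema, which weighs groupings against the code's redundancy more carefully.

```python
from collections import Counter
from math import log2

BASES = "TCAG"
# Standard-code amino acids for the 64 codons ordered TTT, TTC, ..., GGG ("*" = stop)
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
codon_aa = {a + b + c: AA[16 * i + 4 * j + k]
            for i, a in enumerate(BASES)
            for j, b in enumerate(BASES)
            for k, c in enumerate(BASES)}
sense = [aa for aa in codon_aa.values() if aa != "*"]   # the 61 sense codons

def entropy(symbols):
    """Shannon entropy H = -sum p*log2(p) of the empirical symbol distribution."""
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in Counter(symbols).values())

# Assumed illustrative grouping: hydrophobic residues vs. all others (a 1-bit message)
HYDROPHOBIC = set("AVLIMFWC")
grouped = ["H" if aa in HYDROPHOBIC else "O" for aa in sense]
print(f"full 20-letter alphabet: {entropy(sense):.3f} bits/codon")
print(f"2-family alphabet:       {entropy(grouped):.3f} bits/codon")
```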
A Study on Architecture of Malicious Code Blocking Scheme with White List in Smartphone Environment
NASA Astrophysics Data System (ADS)
Lee, Kijeong; Tolentino, Randy S.; Park, Gil-Cheol; Kim, Yong-Tae
Recently, interest in and demand for mobile communications have grown quickly because of the increasing prevalence of smartphones around the world. Existing feature phones have largely been replaced by smartphones; with the explosive growth of Internet use on smartphones and of e-commerce and Internet banking transactions, protecting personal information has become increasingly important. Smartphone antivirus products have therefore been developed and launched to prevent infection by malicious code or viruses. In this paper, we propose a new scheme to protect the smartphone from malicious codes and malicious applications, which are elements of security threats in the mobile environment, and to prevent information leakage resulting from malicious code infection. The proposed scheme is based on a white list of smartphone applications: only authorized applications may be installed, preventing the installation of malicious and untrusted mobile applications which could infect the applications and programs of smartphones.
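A minimal sketch of the white-list check at the core of such a scheme: installation is default-deny, and an application is admitted only if its package hash matches the authorized entry. The package name and hash value below are hypothetical.

```python
import hashlib

WHITE_LIST = {
    # package name -> expected SHA-256 of the authorized build (hypothetical values)
    "com.example.banking": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_install_allowed(package_name: str, apk_path: str) -> bool:
    expected = WHITE_LIST.get(package_name)
    if expected is None:
        return False                      # default-deny: unknown apps are blocked
    with open(apk_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest == expected             # allow only the exact authorized build
```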
Decoding a Decision Process in the Neuronal Population of Dorsal Premotor Cortex.
Rossi-Pool, Román; Zainos, Antonio; Alvarez, Manuel; Zizumbo, Jerónimo; Vergara, José; Romo, Ranulfo
2017-12-20
When trained monkeys discriminate the temporal structure of two sequential vibrotactile stimuli, dorsal premotor cortex (DPC) showed high heterogeneity among its neuronal responses. Notably, DPC neurons coded stimulus patterns as broader categories and signaled them during working memory, comparison, and postponed decision periods. Here, we show that such population activity can be condensed into two major coding components: one that persistently represented in working memory both the first stimulus identity and the postponed informed choice and another that transiently coded the initial sensory information and the result of the comparison between the two stimuli. Additionally, we identified relevant signals that coded the timing of task events. These temporal and task-parameter readouts were shown to be strongly linked to the monkeys' behavior when contrasted to those obtained in a non-demanding cognitive control task and during error trials. These signals, hidden in the heterogeneity, were prominently represented by the DPC population response. Copyright © 2017 Elsevier Inc. All rights reserved.
Coherent state coding approaches the capacity of non-Gaussian bosonic channels
NASA Astrophysics Data System (ADS)
Huber, Stefan; König, Robert
2018-05-01
The additivity problem asks if the use of entanglement can boost the information-carrying capacity of a given channel beyond what is achievable by coding with simple product states only. This has recently been shown not to be the case for phase-insensitive one-mode Gaussian channels, but remains unresolved in general. Here we consider two general classes of bosonic noise channels, which include phase-insensitive Gaussian channels as special cases: these are attenuators with general, potentially non-Gaussian environment states and classical noise channels with general probabilistic noise. We show that additivity violations, if existent, are rather minor for all these channels: the maximal gain in classical capacity is bounded by a constant independent of the input energy. Our proof shows that coding by simple classical modulation of coherent states is close to optimal.
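For orientation, the additivity question can be stated compactly in standard notation, where C_χ is the Holevo capacity and S is the von Neumann entropy; this is the textbook formulation, not notation taken from the paper.

```latex
% Holevo capacity of a channel \Phi over single uses:
C_{\chi}(\Phi) = \max_{\{p_i,\,\rho_i\}}
  \Bigl[ S\bigl(\Phi(\bar\rho)\bigr) - \sum_i p_i\, S\bigl(\Phi(\rho_i)\bigr) \Bigr],
\qquad \bar\rho = \sum_i p_i \rho_i .

% The additivity question: does entangled coding over n uses ever help?
C_{\chi}\bigl(\Phi^{\otimes n}\bigr) \overset{?}{=} n\, C_{\chi}(\Phi)
\quad \text{for all } n .

% If equality holds, the classical capacity reduces to the single-use quantity:
C(\Phi) = \lim_{n\to\infty} \tfrac{1}{n}\, C_{\chi}\bigl(\Phi^{\otimes n}\bigr) = C_{\chi}(\Phi).
```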
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 17 2014-07-01 2014-07-01 false What are the additional requirements... knowledge and belief after I have taken reasonable and appropriate steps to verify the accuracy thereof. I... States Code, section 1001, the penalty for furnishing false, incomplete or misleading information in this...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 16 2010-07-01 2010-07-01 false What are the additional requirements... knowledge and belief after I have taken reasonable and appropriate steps to verify the accuracy thereof. I... States Code, section 1001, the penalty for furnishing false, incomplete or misleading information in this...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 17 2012-07-01 2012-07-01 false What are the additional requirements... knowledge and belief after I have taken reasonable and appropriate steps to verify the accuracy thereof. I... States Code, section 1001, the penalty for furnishing false, incomplete or misleading information in this...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 16 2011-07-01 2011-07-01 false What are the additional requirements... knowledge and belief after I have taken reasonable and appropriate steps to verify the accuracy thereof. I... States Code, section 1001, the penalty for furnishing false, incomplete or misleading information in this...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 17 2013-07-01 2013-07-01 false What are the additional requirements... knowledge and belief after I have taken reasonable and appropriate steps to verify the accuracy thereof. I... States Code, section 1001, the penalty for furnishing false, incomplete or misleading information in this...
JPEG 2000 Encoding with Perceptual Distortion Control
NASA Technical Reports Server (NTRS)
Watson, Andrew B.; Liu, Zhen; Karam, Lina J.
2008-01-01
An alternative approach has been devised for encoding image data in compliance with JPEG 2000, the most recent still-image data-compression standard of the Joint Photographic Experts Group. Heretofore, JPEG 2000 encoding has been implemented by several related schemes classified as rate-based distortion-minimization encoding. In each of these schemes, the end user specifies a desired bit rate and the encoding algorithm strives to attain that rate while minimizing a mean squared error (MSE). While rate-based distortion minimization is appropriate for transmitting data over a limited-bandwidth channel, it is not the best approach for applications in which the perceptual quality of reconstructed images is a major consideration. A better approach for such applications is the present alternative one, denoted perceptual distortion control, in which the encoding algorithm strives to compress data to the lowest bit rate that yields at least a specified level of perceptual image quality. Some additional background information on JPEG 2000 is prerequisite to a meaningful summary of JPEG encoding with perceptual distortion control. The JPEG 2000 encoding process includes two subprocesses known as tier-1 and tier-2 coding. In order to minimize the MSE for the desired bit rate, a rate-distortion-optimization subprocess is introduced between the tier-1 and tier-2 subprocesses. In tier-1 coding, each coding block is independently bit-plane coded from the most-significant-bit (MSB) plane to the least-significant-bit (LSB) plane, using three coding passes (except for the MSB plane, which is coded using only one "clean up" coding pass). For M bit planes, this subprocess involves a total number of (3M - 2) coding passes. An embedded bit stream is then generated for each coding block. Information on the reduction in distortion and the increase in the bit rate associated with each coding pass is collected. This information is then used in a rate-control procedure to determine the contribution of each coding block to the output compressed bit stream.
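The control loop implied by perceptual distortion control can be sketched as a bisection over bit rate, assuming quality is monotone nondecreasing in rate. The encoder and quality metric below are stand-ins for demonstration, not the JPEG 2000 coder or a specific perceptual model.

```python
def min_rate_for_quality(encode, quality, image, target, lo=0.05, hi=4.0, tol=0.01):
    """Lowest bit rate (bits/pixel) whose reconstruction meets the quality target."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if quality(image, encode(image, mid)) >= target:
            hi = mid    # target met: try spending fewer bits
        else:
            lo = mid    # target missed: spend more bits
    return hi

def demo_encode(img, bpp):      # stand-in: "reconstruction" is just the rate itself
    return bpp

def demo_quality(img, recon):   # stand-in: quality rises monotonically with rate
    return 1 - 2 ** (-4 * recon)

print(min_rate_for_quality(demo_encode, demo_quality, None, target=0.95))
```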
ERIC Educational Resources Information Center
Gilstrap, Donald L.
1998-01-01
Explains how to build World Wide Web home pages using frames-based HTML so that librarians can manage Web-based information and improve their home pages. Provides descriptions and 15 examples for writing frames-HTML code, including advanced concepts and additional techniques for home-page design. (Author/LRW)
Meijs, Celeste; Hurks, Petra P M; Wassenberg, Renske; Feron, Frans J M; Jolles, Jelle
2016-01-01
This study examines inter-individual differences in how presentation modality affects verbal learning performance. Children aged 5 to 16 performed a verbal learning test within one of three presentation modalities: pictorial, auditory, or textual. The results indicated that a beneficial effect of pictures exists over auditory and textual presentation modalities and that this effect increases with age. However, this effect is only found if the information to be learned is presented once (or at most twice) and only in children above the age of 7. The results may be explained in terms of single or dual coding of information in which the phonological loop is involved. Development of the (sub)vocal rehearsal system in the phonological loop is believed to be a gradual process that begins developing around the age of 7. The developmental trajectories are similar for boys and girls. Additionally, auditory information and textual information both seemed to be processed in a similar manner, namely without labeling or recoding, leading to single coding. In contrast, pictures are assumed to be processed by the dual coding of both the visual information and a (verbal) labeling of the pictures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adrian Miron; Joshua Valentine; John Christenson
2009-10-01
The current state of the art in nuclear fuel cycle (NFC) modeling is an eclectic mixture of codes with various levels of applicability, flexibility, and availability. In support of advanced fuel cycle systems analyses, especially those by the Advanced Fuel Cycle Initiative (AFCI), the University of Cincinnati, in collaboration with Idaho State University, carried out a detailed review of the existing codes describing various aspects of the nuclear fuel cycle and identified the research and development needs required for a comprehensive model of the global nuclear energy infrastructure and the associated nuclear fuel cycles. Relevant information obtained on the NFC codes was compiled into a relational database that allows easy access to the various codes' properties. Additionally, the research analyzed the gaps in the NFC computer codes with respect to their potential integration into programs that perform comprehensive NFC analysis.
38 CFR 61.15 - Obtaining additional information and awarding capital grants.
Code of Federal Regulations, 2011 CFR
2011-07-01
... establishing compliance with local and state zoning codes; (7) Documentation in the form of one set of design... ensure compliance with the provisions of the National Environmental Policy Act (42 U.S.C. 4321 et seq...
38 CFR 61.15 - Obtaining additional information and awarding capital grants.
Code of Federal Regulations, 2010 CFR
2010-07-01
... establishing compliance with local and state zoning codes; (7) Documentation in the form of one set of design... ensure compliance with the provisions of the National Environmental Policy Act (42 U.S.C. 4321 et seq...
Learn about the NESHAP regulation for brick and structural clay products by reading the rule summary, rule history, the Code of Federal Regulations, and additional resources such as fact sheets and background information documents.
HowTo - Easy use of global unique identifier
NASA Astrophysics Data System (ADS)
Czerniak, A.; Fleischer, D.; Schirnick, C.
2013-12-01
The GEOMAR sample and core repository covers several thousand samples and cores collected over the last decades. In the current project, we bring this collection up to the new generation by tagging every sample and core with a unique identifier, in our case the International Geo Sample Number (IGSN). This work is done with our digital-ink and handwriting-recognition implementation: the smart-pen technology saves time and resources when recording the information on every sample or core. The recording procedure involves several systematic steps: 1. Gather all information about the core or sample, such as cruise number and responsible person. 2. Tag it with a unique identifier, in our case a QR code. 3. Write down the location of the sample or core. After the information is transmitted from the smart pen (currently via USB, although wireless is also an option) into our server infrastructure, the linking to other information begins. Once linked in our Virtual Research Environment (VRE) with the unique identifier (IGSN), a sample or core can be located, and the QR code links back from the core or sample to the IGSN and to additional scientific information. The QR code carries all the important information, and it is simple to produce thousands of them.
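For illustration, tagging a sample with a QR code that resolves its identifier takes only a few lines with the third-party Python package qrcode (assumed installed); the identifier value and resolver URL below are hypothetical examples, not GEOMAR's actual workflow.

```python
import qrcode  # third-party package: pip install qrcode[pil]

igsn = "IEXXX0001"                              # hypothetical sample identifier
img = qrcode.make(f"https://igsn.org/{igsn}")   # encode a resolvable link to the sample
img.save(f"{igsn}.png")                         # label image to print and attach
```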
End-to-end imaging information rate advantages of various alternative communication systems
NASA Technical Reports Server (NTRS)
Rice, R. F.
1982-01-01
The efficiency of various deep space communication systems which are required to transmit both imaging and a typically error-sensitive class of data called general science and engineering (gse) is compared. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an advanced imaging communication system (AICS) which exhibits the rather significant advantages of sophisticated data compression coupled with powerful yet practical channel coding. For example, under certain conditions the improved AICS efficiency could provide as much as two orders of magnitude increase in imaging information rate compared to a single channel uncoded, uncompressed system while maintaining the same gse data rate in both systems. Additional details describing AICS compression and coding concepts as well as efforts to apply them are provided in support of the system analysis.
Single-shot secure quantum network coding on butterfly network with free public communication
NASA Astrophysics Data System (ADS)
Owari, Masaki; Kato, Go; Hayashi, Masahito
2018-01-01
Quantum network coding on the butterfly network has been studied as a typical example of a quantum multiple-cast network. We propose a secure quantum network code for the butterfly network with free public classical communication in the multiple unicast setting under restricted eavesdropper's power. This protocol certainly transmits quantum states when there is no attack. We also show secrecy, with shared randomness as an additional resource, when the eavesdropper wiretaps one of the channels in the butterfly network and also obtains the information sent through public classical communication. Our protocol does not require a verification process, which ensures single-shot security.
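The classical idea underlying butterfly-network coding, on which the quantum protocol builds, fits in a few lines: the bottleneck edge carries the XOR of the two source bits, and each sink combines it with the bit it receives directly. This sketch is the standard classical example, not the quantum protocol itself.

```python
def butterfly(b1: int, b2: int):
    """Both sinks recover both source bits over unit-capacity edges."""
    coded = b1 ^ b2                 # the bottleneck edge carries the XOR
    sink1 = (b1, coded ^ b1)        # sink 1 hears b1 directly, decodes b2
    sink2 = (coded ^ b2, b2)        # sink 2 hears b2 directly, decodes b1
    return sink1, sink2

for b1 in (0, 1):
    for b2 in (0, 1):
        assert butterfly(b1, b2) == ((b1, b2), (b1, b2))
```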
More About Vector Adaptive/Predictive Coding Of Speech
NASA Technical Reports Server (NTRS)
Jedrey, Thomas C.; Gersho, Allen
1992-01-01
Report presents additional information about digital speech-encoding and -decoding system described in "Vector Adaptive/Predictive Encoding of Speech" (NPO-17230). Summarizes development of vector adaptive/predictive coding (VAPC) system and describes basic functions of algorithm. Describes refinements introduced enabling receiver to cope with errors. VAPC algorithm implemented in integrated-circuit coding/decoding processors (codecs). VAPC and other codecs tested under variety of operating conditions. Tests designed to reveal effects of various background quiet and noisy environments and of poor telephone equipment. VAPC found competitive with and, in some respects, superior to other 4.8-kb/s codecs and other codecs of similar complexity.
User Manual for the NASA Glenn Ice Accretion Code LEWICE: Version 2.0
NASA Technical Reports Server (NTRS)
Wright, William B.
1999-01-01
A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 2.0 of this code, which is called LEWICE. This version differs from previous releases due to its robustness and its ability to reproduce results accurately for different spacing and time step criteria across computing platforms. It also differs in the extensive effort undertaken to compare the results against the database of ice shapes which have been generated in the NASA Glenn Icing Research Tunnel (IRT). This report will only describe the features of the code related to the use of the program. The report will not describe the inner workings of the code or the physical models used. This information is available in the form of several unpublished documents which will be collectively referred to as a Programmer's Manual for LEWICE in this report. This report is intended as an update/replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.
An empirical analysis of journal policy effectiveness for computational reproducibility.
Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun
2018-03-13
A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author remission of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.
76 FR 79656 - Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-22
... buildings, parks and memorials. Draft agendas and additional information regarding the Commission are available on our Web site: www.cfa.gov . Inquiries regarding the agenda and requests to submit written or... Filed 12-21-11; 8:45 am] BILLING CODE 6330-01-M ...
78 FR 51263 - Small Business Size Standards: Waiver of the Nonmanufacturer Rule
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-20
... the Privacy and Use Notice/User Notice at www.Regulations.gov and you do not want that information... and Budget in the NAICS Manual. 13 CFR 121.1202(d). In addition, SBA uses Product Service Codes (PSCs...
Design implications for task-specific search utilities for retrieval and re-engineering of code
NASA Astrophysics Data System (ADS)
Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif
2017-05-01
The importance of information retrieval systems is unquestionable in modern society, and both individuals and enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context, such as the development language, technology framework, goal of the project, project complexity and the developer's domain expertise. They also impose an additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not widely used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed to support their work-related activities. To investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback systems driven by the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded within the standard Google search engine and Microsoft IntelliSense for the retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation achieved promising initial results for the precision and recall performance of the system.
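The retention-action idea lends itself to a simple illustration. Below is a hypothetical sketch (not the authors' implementation) of folding observed retention actions, such as copying a snippet, into an implicit relevance score used to re-rank code search results; the action names and weights are invented for illustration.

```python
# Illustrative action weights; a real system would learn these from usage data.
ACTION_WEIGHTS = {"copy": 3.0, "bookmark": 2.0, "dwell_60s": 1.5, "click": 0.5}

def update_score(scores: dict, snippet_id: str, action: str) -> None:
    """Accumulate implicit feedback for one observed retention action."""
    scores[snippet_id] = scores.get(snippet_id, 0.0) + ACTION_WEIGHTS.get(action, 0.0)

def rerank(scores: dict, candidates: list) -> list:
    """Re-rank candidate snippets by accumulated implicit feedback."""
    return sorted(candidates, key=lambda s: scores.get(s, 0.0), reverse=True)

scores = {}
update_score(scores, "snippetA", "copy")
update_score(scores, "snippetB", "click")
print(rerank(scores, ["snippetA", "snippetB"]))   # snippetA ranks first
```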
Sub-block motion derivation for merge mode in HEVC
NASA Astrophysics Data System (ADS)
Chien, Wei-Jung; Chen, Ying; Chen, Jianle; Zhang, Li; Karczewicz, Marta; Li, Xiang
2016-09-01
The new state-of-the-art video coding standard, H.265/HEVC, was finalized in 2013 and achieves roughly 50% bit rate savings compared to its predecessor, H.264/MPEG-4 AVC. In this paper, two additional merge candidates, the advanced temporal motion vector predictor and the spatial-temporal motion vector predictor, are developed to improve the motion information prediction scheme within the HEVC structure. The proposed method allows each Prediction Unit (PU) to fetch multiple sets of motion information from multiple blocks smaller than the current PU. By splitting a large PU into sub-PUs and filling in motion information for all the sub-PUs of the large PU, the signaling cost of motion information can be reduced. This paper describes the above-mentioned techniques in detail and evaluates their coding performance benefits under the common test conditions used during HEVC development. Simulation results show that a 2.4% performance improvement over HEVC can be achieved.
Improving performance of DS-CDMA systems using chaotic complex Bernoulli spreading codes
NASA Astrophysics Data System (ADS)
Farzan Sabahi, Mohammad; Dehghanfard, Ali
2014-12-01
The most important goal of a spread-spectrum communication system is to protect communication signals against interference and against exploitation of information by unintended listeners. In fact, low probability of detection and low probability of intercept are two important parameters for increasing the performance of the system. In Direct Sequence Code Division Multiple Access (DS-CDMA) systems, these properties are achieved by multiplying the data by spreading sequences. Chaotic sequences, with their particular properties, have numerous applications in constructing spreading codes. The use of a one-dimensional Bernoulli chaotic sequence as a spreading code has been proposed previously in the literature. The main feature of this sequence is its negative auto-correlation at lag 1, which, with proper design, increases the efficiency of communication systems based on these codes. Employing complex chaotic sequences as spreading sequences has also been discussed in several papers. In this paper, the use of two-dimensional Bernoulli chaotic sequences is proposed as spreading codes. The performance of multi-user synchronous and asynchronous DS-CDMA systems is evaluated by applying these sequences over Additive White Gaussian Noise (AWGN) and fading channels. Simulation results indicate improved performance in comparison with conventional spreading codes such as Gold codes, as well as with similar complex chaotic spreading sequences. Like the one-dimensional Bernoulli chaotic sequences, the proposed sequences also have negative auto-correlation. Besides, construction of complex sequences with lower average cross-correlation is possible with the proposed method.
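As an illustration of the kind of sequence involved, the sketch below generates a spreading code from a one-dimensional Bernoulli-type shift map and estimates its lag-1 autocorrelation empirically. It is a minimal sketch, not the paper's two-dimensional construction; the map parameter, seed, and binarization threshold are assumptions made for illustration.

```python
import numpy as np

def bernoulli_sequence(x0: float, n: int, beta: float = 1.9999) -> np.ndarray:
    """Iterate the Bernoulli shift map x <- beta*x mod 1.

    beta is kept slightly below 2 because the exact doubling map collapses
    to zero in finite-precision arithmetic after ~50 iterations.
    """
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = (beta * x) % 1.0
        xs[i] = x
    return xs

def to_chips(xs: np.ndarray) -> np.ndarray:
    """Binarize the chaotic orbit into a +/-1 spreading code."""
    return np.where(xs >= 0.5, 1.0, -1.0)

chips = to_chips(bernoulli_sequence(x0=0.137, n=4096))
spread_bit = +1 * chips                      # one data bit spread over the chips
lag1 = np.mean(chips[:-1] * chips[1:])       # empirical lag-1 autocorrelation
print(f"lag-1 autocorrelation: {lag1:+.3f}")
```

A receiver correlating with the same chip sequence despreads the data; the auto- and cross-correlation profiles of the chip family determine the multi-user interference.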
Cooper, P David; Smart, David R
2017-06-01
Recent Australian attempts to facilitate disinvestment in healthcare, by identifying instances of 'inappropriate' care from large Government datasets, are subject to significant methodological flaws. Amongst other criticisms has been the fact that the Government datasets utilized for this purpose correlate poorly with datasets collected by relevant professional bodies. Government data derive from official hospital coding, collected retrospectively by clerical personnel, whilst professional body data derive from unit-specific databases, collected contemporaneously with care by clinical personnel. Assessment of accuracy of official hospital coding data for hyperbaric services in a tertiary referral hospital. All official hyperbaric-relevant coding data submitted to the relevant Australian Government agencies by the Royal Hobart Hospital, Tasmania, Australia for financial year 2010-2011 were reviewed and compared against actual hyperbaric unit activity as determined by reference to original source documents. Hospital coding data contained one or more errors in diagnoses and/or procedures in 70% of patients treated with hyperbaric oxygen that year. Multiple discrete error types were identified, including (but not limited to): missing patients; missing treatments; 'additional' treatments; 'additional' patients; incorrect procedure codes and incorrect diagnostic codes. Incidental observations of errors in surgical, anaesthetic and intensive care coding within this cohort suggest that the problems are not restricted to the specialty of hyperbaric medicine alone. Publications from other centres indicate that these problems are not unique to this institution or State. Current Government datasets are irretrievably compromised and not fit for purpose. Attempting to inform the healthcare policy debate by reference to these datasets is inappropriate. Urgent clinical engagement with hospital coding departments is warranted.
Zelingher, Julian; Ash, Nachman
2013-05-01
The Israeli healthcare system has undergone major processes for the adoption of health information technologies (HIT), and enjoys high levels of utilization in hospital and ambulatory care. Coding is an essential infrastructure component of HIT, and its purpose is to represent data in a simplified and common format, enhancing its manipulation by digital systems. Proper coding of data enables efficient identification, storage, retrieval and communication of data. Utilization of uniform coding systems by different organizations enables data interoperability between them, facilitating communication and integrating data elements originating in different information systems from various organizations. Current needs in Israel for health data coding include recording and reporting of diagnoses for hospitalized patients, outpatients and visitors to the Emergency Department, coding of procedures and operations, coding of pathology findings, reporting of discharge diagnoses and causes of death, billing codes, organizational data warehouses and national registries. New national projects for clinical data integration, obligatory reporting of quality indicators and new Ministry of Health (MOH) requirements for HIT necessitate a high level of interoperability that can be achieved only through the adoption of uniform coding. Additional pressures were introduced by the USA decision to stop the maintenance of the ICD-9-CM codes that are also used by Israeli healthcare, and the adoption of ICD-10-CM and ICD-10-PCS as the main coding systems for billing purposes. The USA has also mandated utilization of SNOMED-CT as the coding terminology for the Electronic Health Record problem list, and for reporting quality indicators to the CMS. Hence, the Israeli MOH has recently decided that discharge diagnoses will be reported using ICD-10-CM codes, and SNOMED-CT will be used to code the clinical information in the EHR. We reviewed the characteristics, strengths and weaknesses of these two coding systems. In summary, the adoption of ICD-10-CM is in line with the USA decision to abandon ICD-9-CM, and the Israeli healthcare system could benefit from USA healthcare efforts in this direction. The large content of SNOMED-CT and its sophisticated hierarchical data structure will enable advanced clinical decision support and quality improvement applications.
Open Source Subtitle Editor Software Study for Section 508 Close Caption Applications
NASA Technical Reports Server (NTRS)
Murphy, F. Brandon
2013-01-01
This paper focuses on a specific item within the NASA Electronic Information Accessibility Policy: multimedia presentations shall have synchronized captions, thus making information accessible to persons with hearing impairment. Synchronized captions assist a person with a hearing or cognitive disability in accessing the same information as everyone else. This paper covers the research and implementation of closed caption (CC, or subtitle) support for video multimedia. The goal of this research is to identify the best available open-source (free) software for achieving the synchronized caption requirement and realizing savings, while meeting the security requirements for Government information integrity and assurance. CC and subtitling are processes that display text within a video to provide additional or interpretive information for those who may need it or those who choose it. Closed captions typically show the transcription of the audio portion of a program (video) as it occurs (either verbatim or in edited form), sometimes including non-speech elements (such as sound effects). The transcript can be provided by a third-party source or can be extracted word for word from the video. This feature can be made available for videos in two forms: soft-coded or hard-coded. Soft-coded is the optional version of CC, which the viewer can choose to turn on or off. Most of the time, when using the soft-coded option, the transcript is also provided to the viewer alongside the video. This option is subject to compromise, since the transcript is merely a text file that can be changed by anyone who has access to it. With this option the integrity of the CC is at the mercy of the user. Hard-coded CC is a more permanent form of CC: a hard-coded CC transcript is embedded within a video, without the option of removal.
Code Lavender: Cultivating Intentional Acts of Kindness in Response to Stressful Work Situations.
Davidson, Judy E; Graham, Patricia; Montross-Thomas, Lori; Norcross, William; Zerbi, Giovanna
Providing healthcare can be stressful. Left unchecked, clinicians may experience decreased compassion and increased burnout or secondary traumatic stress. Code Lavender is designed to increase acts of kindness after stressful workplace events occur. To test the feasibility of providing Code Lavender. After stressful events in the workplace, staff will provide, receive, and recommend Code Lavender to others. The provision of Code Lavender will improve Professional Quality of Life Scale (ProQoL) scores, general job satisfaction, and feeling cared for in the workplace. Pilot program testing and evaluation. Staff and physicians on four hospital units were informed of the availability of the Code Lavender kit, which includes words of comfort, chocolate, lavender essential oil, and employee health referral information. Feasibility data and ProQoL scores were collected at baseline and at three months. At baseline, 48% (n = 164) reported a stressful event at work in the last three months. Post-intervention, 51% reported experiencing a stressful workplace event, with 32% receiving a Code Lavender kit from their co-workers as a result (n = 83). Of those who received the Code Lavender intervention, 100% found it helpful and 84% would recommend it to others. No significant changes were demonstrated before and after the intervention in ProQoL scores or job satisfaction; however, the emotion of feeling cared for improved. The results warrant continuation and further dissemination of Code Lavender. Investigators have received requests to expand the program, implying positive reception of the intervention. Additional interventions are needed to overcome workplace stressors. A more intense peer support program is being tested. Copyright © 2017. Published by Elsevier Inc.
Verbal-spatial and visuospatial coding of power-space interactions.
Dai, Qiang; Zhu, Lei
2018-05-10
A power-space interaction, the phenomenon that people respond faster to powerful words when they are placed higher in the visual field and faster to powerless words when they are placed lower in the visual field, has been found repeatedly. The dominant explanation of this power-space interaction is that it results from a tight correspondence between the representation of power and visual space (i.e., a visuospatial coding account). In the present study, we demonstrated that the interaction between power and space can also be based on verbal-spatial coding, in the absence of any vertical spatial information. Additionally, verbal-spatial coding was dominant in driving the power-space interaction when verbal space was contrasted with visual space. Copyright © 2018 Elsevier Inc. All rights reserved.
Hemispheric processing asymmetries: implications for memory.
Funnell, M G; Corballis, P M; Gazzaniga, M S
2001-01-01
Recent research has demonstrated that memory for words elicits left hemisphere activation, faces right hemisphere activation, and nameable objects bilateral activation. This pattern of results was attributed to dual coding of information, with the left hemisphere employing a verbal code and the right a nonverbal code. Nameable objects can be encoded either verbally or nonverbally and this accounts for their bilateral activation. We investigated this hypothesis in a callosotomy patient. Consistent with dual coding, the left hemisphere was superior to the right in memory for words, whereas the right was superior for faces. Contrary to prediction, performance on nameable pictures was not equivalent in the two hemispheres, but rather resulted in a right hemisphere superiority. In addition, memory for pictures was significantly better than for either words or faces. These findings suggest that the dual code hypothesis is an oversimplification of the processing capabilities of the two hemispheres.
Iterative Code-Aided ML Phase Estimation and Phase Ambiguity Resolution
NASA Astrophysics Data System (ADS)
Wymeersch, Henk; Moeneclaey, Marc
2005-12-01
As many coded systems operate at very low signal-to-noise ratios, synchronization becomes a very difficult task. In many cases, conventional algorithms will either require long training sequences or result in large BER degradations. By exploiting code properties, these problems can be avoided. In this contribution, we present several iterative maximum-likelihood (ML) algorithms for joint carrier phase estimation and ambiguity resolution. These algorithms operate on coded signals by accepting soft information from the MAP decoder. Issues of convergence and initialization are addressed in detail. Simulation results are presented for turbo codes, and are compared to performance results of conventional algorithms. Performance comparisons are carried out in terms of BER performance and mean square estimation error (MSEE). We show that the proposed algorithm reduces the MSEE and, more importantly, the BER degradation. Additionally, phase ambiguity resolution can be performed without resorting to a pilot sequence, thus improving the spectral efficiency.
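The following sketch illustrates the flavor of such code-aided estimation in a deliberately simplified setting: BPSK over AWGN, with a tanh soft demapper standing in for soft information from the MAP decoder, and no ambiguity resolution. The ML phase estimate given soft symbol decisions is the angle of the correlation between the received samples and those decisions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, snr_lin, true_phase = 512, 2.0, 0.4           # 0.4 rad carrier offset
bits = rng.integers(0, 2, n)
syms = 1.0 - 2.0 * bits                          # BPSK symbols +/-1
noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2 * snr_lin)
r = syms * np.exp(1j * true_phase) + noise       # rotated, noisy observation

phase = 0.0
for _ in range(5):                               # EM-style iterations
    # Soft symbol estimates after derotation (decoder-feedback stand-in).
    soft = np.tanh(2 * snr_lin * np.real(r * np.exp(-1j * phase)))
    # ML phase estimate given the current soft symbols.
    phase = np.angle(np.sum(r * soft))
print(f"estimated phase: {phase:.3f} rad (true: {true_phase} rad)")
```

In a real receiver the tanh step is replaced by extrinsic soft outputs from the MAP decoder, which is what makes the estimate code-aided.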
The multidimensional Self-Adaptive Grid code, SAGE, version 2
NASA Technical Reports Server (NTRS)
Davies, Carol B.; Venkatapathy, Ethiraj
1995-01-01
This new report on Version 2 of the SAGE code includes all the information in the original publication plus all upgrades and changes to the SAGE code since that time. The two most significant upgrades are the inclusion of a finite-volume option and the ability to adapt and manipulate zonal-matching multiple-grid files. In addition, the original SAGE code has been upgraded to Version 1.1 and includes all options mentioned in this report, with the exception of the multiple grid option and its associated features. Since Version 2 is a larger and more complex code, it is suggested (but not required) that Version 1.1 be used for single-grid applications. This document contains all the information required to run both versions of SAGE. The formulation of the adaption method is described in the first section of this document. The second section is presented in the form of a user guide that explains the input and execution of the code. The third section provides many examples. Successful application of the SAGE code in both two and three dimensions for the solution of various flow problems has proven the code to be robust, portable, and simple to use. Although the basic formulation follows the method of Nakahashi and Deiwert, many modifications have been made to facilitate the use of the self-adaptive grid method for complex grid structures. Modifications to the method and the simple but extensive input options make this a flexible and user-friendly code. The SAGE code can accommodate two-dimensional and three-dimensional, finite-difference and finite-volume, single grid, and zonal-matching multiple grid flow problems.
Moderate Deviation Analysis for Classical Communication over Quantum Channels
NASA Astrophysics Data System (ADS)
Chubb, Christopher T.; Tan, Vincent Y. F.; Tomamichel, Marco
2017-11-01
We analyse families of codes for classical data transmission over quantum channels that have both a vanishing probability of error and a code rate approaching capacity as the code length increases. To characterise the fundamental tradeoff between decoding error, code rate and code length for such codes we introduce a quantum generalisation of the moderate deviation analysis proposed by Altuğ and Wagner as well as Polyanskiy and Verdú. We derive such a tradeoff for classical-quantum (as well as image-additive) channels in terms of the channel capacity and the channel dispersion, giving further evidence that the latter quantity characterises the necessary backoff from capacity when transmitting finite blocks of classical data. To derive these results we also study asymmetric binary quantum hypothesis testing in the moderate deviations regime. Due to the central importance of the latter task, we expect that our techniques will find further applications in the analysis of other quantum information processing tasks.
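In schematic form, the classical moderate deviation tradeoff (generalized to classical-quantum channels in this paper) reads as follows; the precise regularity conditions are given in the paper.

```latex
% For a sequence a_n -> 0 with n a_n^2 -> infinity, the best codes at rates
% approaching capacity C from below satisfy
\[
  R_n = C - a_n, \qquad
  \lim_{n \to \infty} -\frac{1}{n a_n^2} \ln \varepsilon_n = \frac{1}{2V},
\]
% where \varepsilon_n is the optimal error probability at blocklength n and
% V is the channel dispersion.
```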
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schnack, Dalton D.
Final technical report for research performed by Dr. Thomas G. Jenkins in collaboration with Professor Dalton D. Schnack on SciDAC Cooperative Agreement: Center for Wave Interactions with Magnetohydrodynamics, DE-FC02-06ER54899, for the period 8/15/06 - 8/14/11. This report centers on the slow MHD physics campaign work performed by Dr. Jenkins while at UW-Madison and then at Tech-X Corporation. To make progress on the problem of how RF-induced currents affect magnetic island evolution in toroidal plasmas, a set of research approaches is outlined. Three approaches can be pursued in parallel: (1) analytically prescribe an additional term in Ohm's law to model the effect of localized ECCD current drive; (2) introduce an additional evolution equation for the Ohm's law source term, establishing an RF source 'box' where information from the RF code couples to the fluid evolution; and (3) carry out a more rigorous analytic calculation treating the additional RF terms in a closure problem. These approaches rely on reinvigorating the computational modeling efforts for resistive and neoclassical tearing modes with present-day versions of the numerical tools. For the RF community, the relevant action item is that RF ray tracing codes need to be modified so that general three-dimensional spatial information can be obtained. Further, interface efforts between the two codes require work, as does an assessment of the numerical stability properties of the procedures to be used.
Augmenting Traditional Static Analysis With Commonly Available Metadata
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Devin
Developers and security analysts have been using static analysis for a long time to analyze programs for defects and vulnerabilities, with some success. Generally a static analysis tool is run on the source code for a given program, flagging areas of code that need to be further inspected by a human analyst. These areas may be obvious bugs like potential buffer overflows, information leakage flaws, or the use of uninitialized variables. These tools tend to work fairly well - every year they find many important bugs. These tools are more impressive considering the fact that they only examine the source code, which may be very complex. Now consider the amount of data available that these tools do not analyze. There are many pieces of information that would prove invaluable for finding bugs in code, such as a history of bug reports, a history of all changes to the code, information about committers, etc. By leveraging all this additional data, it is possible to find more bugs with less user interaction, as well as track useful metrics such as the number and type of defects injected by each committer. This dissertation provides a method for leveraging development metadata to find bugs that would otherwise be difficult to find using standard static analysis tools. We showcase two case studies that demonstrate the ability to find 0-day vulnerabilities in large and small software projects by finding new vulnerabilities in the cpython and Roundup open source projects.
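A toy sketch of the general idea, assuming a git repository as the metadata source: weight each static-analysis finding by how often its file has changed, so findings in high-churn code surface first. The weighting scheme and finding format are illustrative assumptions, not the dissertation's actual model.

```python
import subprocess
from collections import Counter

def churn_by_file(repo: str) -> Counter:
    """Count commits touching each file, via `git log --name-only`."""
    out = subprocess.run(
        ["git", "-C", repo, "log", "--name-only", "--pretty=format:"],
        capture_output=True, text=True, check=True).stdout
    return Counter(line for line in out.splitlines() if line.strip())

def prioritize(findings: list, churn: Counter) -> list:
    """Rank findings by severity scaled by the churn of the flagged file."""
    return sorted(findings,
                  key=lambda f: f["severity"] * (1 + churn[f["file"]]),
                  reverse=True)

findings = [{"file": "parser.c", "severity": 3, "msg": "possible buffer overflow"},
            {"file": "util.c", "severity": 3, "msg": "uninitialized variable"}]
example_churn = Counter({"parser.c": 42, "util.c": 3})  # stand-in for churn_by_file(".")
print(prioritize(findings, example_churn))
```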
Quality of online information on type 2 diabetes: a cross-sectional study.
Weymann, Nina; Härter, Martin; Dirmaier, Jörg
2015-12-01
Evidence-based health information is a prerequisite for patients with type 2 diabetes to engage in self-management and to make informed medical decisions. The Internet is an important source of health information. In the present study, we systematically assessed formal quality, quality of decision support and usability of German and English language websites on type 2 diabetes. The search term 'type 2 diabetes' was entered in the two most popular search engines. Descriptive data on website quality are presented. Additionally, associations between website quality and affiliation (commercial vs. non-commercial), presence of the HON code quality seal and website traffic were explored. Forty-six websites were included. Most websites provided basic information necessary for decision-making, while only one website also provided decision support. Websites with a HON code had significantly better formal quality than websites without HON code. We found a highly significant correlation between usability and website traffic and a significant correlation between formal quality and website traffic. Most websites do not provide sufficient information to support patients in medical decision-making. Our finding that usability and website traffic are tightly associated is consistent with previous research indicating that design is the most important cue for users assessing website credibility. © The Author (2014). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shapiro, A.; Huria, H.C.; Cho, K.W.
1991-12-01
VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules, are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.
Dittrich, Peter
2018-02-01
The organic code concept and its operationalization by molecular codes have been introduced to study the semiotic nature of living systems. This contribution develops further the idea that the semantic capacity of a physical medium can be measured by assessing its ability to implement a code as a contingent mapping. For demonstration and evaluation, the approach is applied to a formal medium: elementary cellular automata (ECA). The semantic capacity is measured by counting the number of ways codes can be implemented. Additionally, a link to information theory is established by using multivariate mutual information to quantify contingency. It is shown how ECAs differ in their semantic capacities, how this is related to various ECA classifications, and how this depends on how a meaning is defined. Interestingly, if the meaning should persist for a certain while, the highest semantic capacity is found in CAs with apparently simple behavior, i.e., the fixed-point and two-cycle class. Synergy as a predictor of a CA's ability to implement codes can only be used if contexts implementing codes are common. For large context spaces with sparse coding contexts, synergy is a weak predictor. Concluding, the approach presented here can distinguish CA-like systems with respect to their ability to implement contingent mappings. Applying this to physical systems appears straightforward and might lead to a novel physical property indicating how suitable a physical medium is for implementing a semiotic system. Copyright © 2017 Elsevier B.V. All rights reserved.
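For reference, the formal medium studied here is straightforward to reproduce: an elementary cellular automaton updates each cell from its three-cell neighborhood via an 8-entry rule table (standard Wolfram numbering). A minimal sketch of the update follows; the code-counting analysis itself is not reproduced here.

```python
def eca_step(cells: list, rule: int) -> list:
    """One synchronous ECA update with periodic boundary conditions."""
    n = len(cells)
    table = [(rule >> k) & 1 for k in range(8)]   # bit k = output for neighborhood value k
    return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]

state = [0] * 15 + [1] + [0] * 15
for _ in range(5):                                # a few steps of rule 110
    state = eca_step(state, 110)
print("".join(map(str, state)))
```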
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, D. J., Jr.
1986-01-01
High rate concatenated coding systems with trellis inner codes and Reed-Solomon (RS) outer codes for application in satellite communication systems are considered. Two types of inner codes are studied: high rate punctured binary convolutional codes, which result in overall effective information rates between 1/2 and 1 bit per channel use; and bandwidth efficient signal space trellis codes, which can achieve overall effective information rates greater than 1 bit per channel use. Channel capacity calculations with and without side information were performed for the concatenated coding system. Two concatenated coding schemes are investigated. In Scheme 1, the inner code is decoded with the Viterbi algorithm and the outer RS code performs error correction only (decoding without side information). In Scheme 2, the inner code is decoded with a modified Viterbi algorithm which produces reliability information along with the decoded output. In this algorithm, path metrics are used to estimate the entire information sequence, while branch metrics are used to provide the reliability information on the decoded sequence. This information is used to erase unreliable bits in the decoded output. An errors-and-erasures RS decoder is then used for the outer code. These two schemes are proposed for use on NASA satellite channels. Results indicate that high system reliability can be achieved with little or no bandwidth expansion.
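The erasure-declaration step of Scheme 2 can be pictured with a small sketch: decoded bits whose reliability falls below a threshold are replaced by erasure marks before errors-and-erasures RS decoding (an RS code with minimum distance d corrects e errors and f erasures whenever 2e + f < d). The threshold and data below are illustrative, and the RS decoder itself is not implemented.

```python
ERASURE = None  # marker understood by a downstream errors-and-erasures decoder

def declare_erasures(bits, reliabilities, threshold=0.7):
    """Erase low-reliability bits instead of passing hard decisions through."""
    return [b if rel >= threshold else ERASURE
            for b, rel in zip(bits, reliabilities)]

decoded       = [1, 0, 1, 1, 0]
reliabilities = [0.9, 0.4, 0.8, 0.95, 0.6]
print(declare_erasures(decoded, reliabilities))   # [1, None, 1, 1, None]
```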
Ensemble coding of face identity is not independent of the coding of individual identity.
Neumann, Markus F; Ng, Ryan; Rhodes, Gillian; Palermo, Romina
2018-06-01
Information about a group of similar objects can be summarized into a compressed code, known as ensemble coding. Ensemble coding of simple stimuli (e.g., groups of circles) can occur in the absence of detailed exemplar coding, suggesting dissociable processes. Here, we investigate whether a dissociation would still be apparent when coding facial identity, where individual exemplar information is much more important. We examined whether ensemble coding can occur when exemplar coding is difficult, as a result of large sets or short viewing times, or whether the two types of coding are positively associated. We found a positive association, whereby both ensemble and exemplar coding were reduced for larger groups and shorter viewing times. There was no evidence for ensemble coding in the absence of exemplar coding. At longer presentation times, there was an unexpected dissociation, where exemplar coding increased yet ensemble coding decreased, suggesting that robust information about face identity might suppress ensemble coding. Thus, for face identity, we did not find the classic dissociation (access to ensemble information in the absence of detailed exemplar information) that has been used to support claims of distinct mechanisms for ensemble and exemplar coding.
1987-08-01
Copies are available from the National Technical Information Service, Springfield, VA 22161. The first exploratory research step was to determine the breadth and depth of the construction schedule analysis domain. This step defined... Additional information regarding this research: O'Connor, Michael J., Jesus M. De La Garza, and C. William Ibbs, "An Expert System for Construction Schedule Analysis."
Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim
2003-01-01
With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach.
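A minimal sketch of what such a document-oriented representation might look like, built with Python's standard XML tooling; the element and attribute names are illustrative assumptions, not the authors' actual schema.

```python
import xml.etree.ElementTree as ET

# Hierarchy mirrors the classification: chapter > block > category,
# with per-language labels to support multilingual versions.
chapter = ET.Element("chapter", code="IX", title="Diseases of the circulatory system")
block = ET.SubElement(chapter, "block", code="I20-I25")
category = ET.SubElement(block, "category", code="I21")
ET.SubElement(category, "label", lang="en").text = "Acute myocardial infarction"
ET.SubElement(category, "label", lang="de").text = "Akuter Myokardinfarkt"

print(ET.tostring(chapter, encoding="unicode"))
```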
Study of information transfer optimization for communication satellites
NASA Technical Reports Server (NTRS)
Odenwalder, J. P.; Viterbi, A. J.; Jacobs, I. M.; Heller, J. A.
1973-01-01
Results are presented from a study of source coding, modulation/channel coding, and systems techniques for application to teleconferencing over high data rate digital communication satellite links. Simultaneous transmission of video, voice, data, and/or graphics is possible in various teleconferencing modes, and one-way, two-way, and broadcast modes are considered. A satellite channel model including filters, a limiter, a TWT, detectors, and an optimized equalizer is treated in detail. A complete analysis is presented for one set of system assumptions which exclude nonlinear gain and phase distortion in the TWT. Modulation, demodulation, and channel coding are considered, based on an additive white Gaussian noise channel model which is an idealization of an equalized channel. Source coding, with emphasis on video data compression, is reviewed, and the experimental facility utilized to test promising techniques is fully described.
On Asymptotically Good Ramp Secret Sharing Schemes
NASA Astrophysics Data System (ADS)
Geil, Olav; Martin, Stefano; Martínez-Peñas, Umberto; Matsumoto, Ryutaroh; Ruano, Diego
Asymptotically good sequences of linear ramp secret sharing schemes have been intensively studied by Cramer et al. in terms of sequences of pairs of nested algebraic geometric codes. In those works the focus is on full privacy and full reconstruction. In this paper we analyze additional parameters describing the asymptotic behavior of partial information leakage and possibly also partial reconstruction giving a more complete picture of the access structure for sequences of linear ramp secret sharing schemes. Our study involves a detailed treatment of the (relative) generalized Hamming weights of the considered codes.
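For reference, the relative generalized Hamming weight of nested linear codes C_2 ⊊ C_1, the quantity behind these leakage and reconstruction parameters, is the smallest support of a j-dimensional subcode of C_1 that meets C_2 trivially:

```latex
\[
  M_j(C_1, C_2) \;=\; \min \bigl\{\, \lvert \operatorname{supp}(D) \rvert \;:\;
  D \subseteq C_1 \text{ a linear subspace},\ \dim D = j,\ D \cap C_2 = \{0\} \,\bigr\}.
\]
```

Roughly speaking, these weights control how many shares an adversary needs before partial information about the secret leaks, and how many shares suffice for partial reconstruction.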
Creation of the Naturalistic Engagement in Secondary Tasks (NEST) distracted driving dataset.
Owens, Justin M; Angell, Linda; Hankey, Jonathan M; Foley, James; Ebe, Kazutoshi
2015-09-01
Distracted driving has become a topic of critical importance to driving safety research over the past several decades. Naturalistic driving data offer a unique opportunity to study how drivers engage with secondary tasks in real-world driving; however, the complexities involved with identifying and coding relevant epochs of naturalistic data have limited its accessibility to the general research community. This project was developed to help address this problem by creating an accessible dataset of driver behavior and situational factors observed during distraction-related safety-critical events and baseline driving epochs, using the Strategic Highway Research Program 2 (SHRP2) naturalistic dataset. The new NEST (Naturalistic Engagement in Secondary Tasks) dataset was created using crashes and near-crashes from the SHRP2 dataset that were identified as including secondary task engagement as a potential contributing factor. Data coding included frame-by-frame video analysis of secondary task and hands-on-wheel activity, as well as summary event information. In addition, information about each secondary task engagement within the trip prior to the crash/near-crash was coded at a higher level. Data were also coded for four baseline epochs and trips per safety-critical event. 1,180 events and baseline epochs were coded, and a dataset was constructed. The project team is currently working to determine the most useful way to allow broad public access to the dataset. We anticipate that the NEST dataset will be extraordinarily useful in allowing qualified researchers access to timely, real-world data concerning how drivers interact with secondary tasks during safety-critical events and baseline driving. The coded dataset developed for this project will allow future researchers to have access to detailed data on driver secondary task engagement in the real world. It will be useful for standalone research, as well as for integration with additional SHRP2 data to enable the conduct of more complex research. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
Thomson, Dana R; Shitole, Shrutika; Shitole, Tejal; Sawant, Kiran; Subbaraman, Ramnath; Bloom, David E; Patil-Deshmukh, Anita
2014-01-01
We devised and implemented an innovative Location-Based Household Coding System (LBHCS) appropriate to a densely populated informal settlement in Mumbai, India. LBHCS codes were designed to double as unique household identifiers and as walking directions; when an entire community is enumerated, LBHCS codes can be used to identify the number of households located per road (or lane) segment. LBHCS was used in community-wide biometric, mental health, diarrheal disease, and water poverty studies. It also facilitated targeted health interventions by a research team of youth from Mumbai, including intensive door-to-door education of residents, targeted follow-up meetings, and a full census. In addition, LBHCS permitted rapid and low-cost preparation of GIS mapping of all households in the slum, and spatial summation and spatial analysis of survey data. LBHCS was an effective, easy-to-use, affordable approach to household enumeration and re-identification in a densely populated informal settlement where alternative satellite imagery and GPS technologies could not be used.
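As a purely hypothetical illustration of codes that double as walking directions (the abstract does not specify the actual LBHCS format), one could compose road segment, lane, and household count into a single identifier:

```python
def lbhcs_code(road_segment: int, lane: int, household: int) -> str:
    """Compose a human-readable code such as 'R04-L02-H17'."""
    return f"R{road_segment:02d}-L{lane:02d}-H{household:02d}"

# Read as: road segment 4, second lane off that segment, 17th household along it.
print(lbhcs_code(4, 2, 17))
```

Counting the codes that share a road/lane prefix then yields the number of households per segment, as described above.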
Holographic Labeling And Reading Machine For Authentication And Security Applications
Weber, David C.; Trolinger, James D.
1999-07-06
A holographic security label and an automated reading machine for marking and subsequently authenticating any object, such as an identification badge, a pass, a ticket, a manufactured part, or a package, are described. The security label is extremely difficult to copy or even to read by unauthorized persons. The system comprises a holographic security label that has been created with a coded reference wave, whose specification can be kept secret. The label contains information that can be extracted only with the coded reference wave, which is derived from a holographic key; this restricts access to the information to the possessor of the key. A reading machine accesses the information contained in the label and compares it with data stored in the machine through the application of a joint transform correlator, which is also equipped with a reference hologram that adds additional security to the procedure.
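The comparison step can be pictured numerically: the reader correlates the pattern recovered from the label with stored reference data and looks for a sharp correlation peak. The FFT-based sketch below is a numerical stand-in for the optical joint transform correlator, not a model of the holographic optics.

```python
import numpy as np

def correlation_peak(ref: np.ndarray, probe: np.ndarray) -> float:
    """Normalized peak of the circular cross-correlation of two patterns."""
    a = ref - ref.mean()
    b = probe - probe.mean()
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    return corr.max() / (np.linalg.norm(a) * np.linalg.norm(b))

rng = np.random.default_rng(7)
stored = rng.random((64, 64))                    # enrolled label pattern
print(f"authentic: {correlation_peak(stored, stored):.3f}")               # ~1.0
print(f"forged:    {correlation_peak(stored, rng.random((64, 64))):.3f}")  # near 0
```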
Tucker Edmonds, Brownsyne; McKenzie, Fatima; Panoch, Janet E; White, Douglas B; Barnato, Amber E
2016-07-01
Relatively little is known about neonatologists' roles in helping families navigate the difficult decision to attempt or withhold resuscitation for a neonate delivering at the threshold of viability. Therefore, we aimed to describe the "decision-making role" of neonatologists in simulated periviable counseling sessions. We conducted a qualitative content analysis of audio-recorded simulation encounters and post-encounter debriefing interviews collected as part of a single-center simulation study of neonatologists' resuscitation counseling practices in the face of ruptured membranes at 23 weeks gestation. We trained standardized patients to request a recommendation if the physician presented multiple treatment options. We coded each encounter for communication behaviors, applying an adapted, previously developed coding scheme to classify physicians into four decision-making roles (informative, facilitative, collaborative, or directive). We also coded post-simulation debriefing interviews for responses to the open-ended prompt: "During this encounter, what did you feel was your role in the management decision-making process?" Fifteen neonatologists (33% of the division) participated in the study; audio-recorded debriefing interviews were available for 13. We observed 9 (60%) take an informative role, providing medical information only; 2 (13%) take a facilitative role, additionally eliciting the patient's values; 3 (20%) take a collaborative role, additionally engaging the patient in deliberation and providing a recommendation; and 1 (7%) take a directive role, making a treatment decision independent of the patient. Almost all (10/13, 77%) of the neonatologists described their intended role as informative. Neonatologists did not routinely elicit preferences, engage in deliberation, or provide treatment recommendations, even in response to requests for recommendations. These findings suggest there may be a gap between policy recommendations calling for shared decision making and actual clinical practice.
49 CFR 172.201 - Preparation and retention of shipping papers.
Code of Federal Regulations, 2013 CFR
2013-10-01
... shipping description may not contain any code or abbreviation. (4) A shipping paper may contain additional... definition. (i) When the information applicable to the consignment is provided under this requirement the... hazardous waste, the shipping paper copy must be retained for three years after the material is accepted by...
76 FR 55882 - Procurement List; Addition
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-09
... Americans to work, earn income and be productive members of society. The AbilityOne Program, which the... Administration, Fort Worth, TX. Coverage: A-List for the Total Government Requirement as aggregated by the... Information Management). [FR Doc. 2011-23107 Filed 9-8-11; 8:45 am] BILLING CODE 6353-01-P ...
Quality, not just quantity: Lessons learned from HIV testing in Salvador, Brazil
MacCarthy, Sarah; Rasanathan, Jennifer J. K.; Dourado, Ines; Gruskin, Sofia
2015-01-01
Studies have demonstrated that an early HIV diagnosis is a critical first step toward continued engagement in care. We examined HIV testing experiences in Salvador, Brazil, to understand how a focus on quality services can inform service provision more generally in the post–2015 global health agenda. Seventeen semi-structured interviews were conducted with HIV-positive pregnant women in Salvador, a large urban centre of northeast Brazil. Interviews were transcribed, translated, and coded for analysis. Deductive codes confirmed factors identified in the literature review. Inductive codes highlighted new factors emerging from the initial coding. ‘Quality’ was defined according to global and national guidelines as HIV testing with informed and voluntary consent, counselling, and confidentiality (3Cs). No pregnant woman experienced all elements of the 3Cs. Three women did not experience any informed and voluntary consent, counselling, or confidentiality. Few women provided consent overall and none received pre-test counselling. Post-test counselling and confidentiality of services were more consistently provided. This study suggests that testing in Salvador—the third-largest city in the country—is not of the quality called for by global and national guidelines, despite the fact that HIV testing is being routinely provided for HIV-positive pregnant women in Brazil. Going forward, additional clarity around the 3Cs is necessary to improve how the quality, not just the quantity, of HIV services is measured. PMID:24881693
Patient health record on a smart card.
Naszlady, A; Naszlady, J
1998-02-01
A validated health questionnaire has been used for the documentation of patients' histories (826 items) and of the findings from physical examination (591 items) in our clinical ward for 25 years. This computerized patient record has been completed in EUCLIDES code (CEN TC/251) for laboratory tests and with ATC and EAN code listings for the names of the drugs permanently required by the patient. In addition, emergency data were included on an EEPROM chip card with a 24 kb capacity. The program is written in the FOX-PRO language. A group of 5000 chronically ill in-patients received these cards, which contain their health data. For security reasons the contents of the smart card are only accessible via a doctor's PIN-coded key card. The personalization of each card was carried out in our health center, and the depersonalized alphanumeric data were collected for further statistical evaluation. This information served as a basis for a real needs assessment of health care and for the calculation of its cost. Combined with an optical card, a completely paperless electronic patient record system has been developed, containing all three information carriers in medicine: texts, curves and pictures.
Finite-block-length analysis in classical and quantum information theory.
Hayashi, Masahito
2017-01-01
Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects.
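A representative finite-block-length result in the classical setting is the normal approximation to the maximal code size M*(n, ε) achievable at blocklength n and error probability ε (due to Polyanskiy, Poor, and Verdú; analogous second-order expansions exist for classical-quantum channels):

```latex
\[
  \log M^*(n, \varepsilon) \;=\; nC \;-\; \sqrt{nV}\, Q^{-1}(\varepsilon) \;+\; O(\log n),
\]
% C: capacity; V: channel dispersion; Q^{-1}: inverse Gaussian tail function.
```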
GPU Computing in Bayesian Inference of Realized Stochastic Volatility Model
NASA Astrophysics Data System (ADS)
Takaishi, Tetsuya
2015-01-01
The realized stochastic volatility (RSV) model, which utilizes realized volatility as additional information, has been proposed for inferring the volatility of financial time series. We consider Bayesian inference of the RSV model by the Hybrid Monte Carlo (HMC) algorithm. The HMC algorithm can be parallelized and thus performed on the GPU for speedup. The GPU code is developed with CUDA Fortran. We compare the computational time of the HMC algorithm on a GPU (GTX 760) and a CPU (Intel i7-4770, 3.4 GHz) and find that the GPU can be up to 17 times faster than the CPU. We also code the program with OpenACC and find that appropriate coding can achieve a speedup similar to that of CUDA Fortran.
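For orientation, one HMC update has the following shape: momentum refresh, leapfrog integration, and a Metropolis accept/reject test. This is a generic sketch with a toy Gaussian target standing in for the RSV posterior; it is not the paper's CUDA Fortran code, where the corresponding loop runs over the volatility variables on the GPU.

```python
import numpy as np

def hmc_step(theta, log_post, grad_log_post, eps=0.05, n_leap=20,
             rng=np.random.default_rng()):
    """One HMC trajectory: leapfrog integration + Metropolis accept/reject."""
    p = rng.normal(size=theta.shape)                  # momentum refresh
    theta_new, p_new = theta.copy(), p.copy()
    p_new += 0.5 * eps * grad_log_post(theta_new)     # initial half-step
    for _ in range(n_leap - 1):
        theta_new += eps * p_new
        p_new += eps * grad_log_post(theta_new)
    theta_new += eps * p_new
    p_new += 0.5 * eps * grad_log_post(theta_new)     # final half-step
    # Accept with probability min(1, exp(-dH)), H = -log_post + kinetic energy.
    dH = (0.5 * p_new @ p_new - log_post(theta_new)) - (0.5 * p @ p - log_post(theta))
    return theta_new if np.log(rng.uniform()) < -dH else theta

# Toy target: a standard normal stands in for the RSV posterior.
log_post = lambda th: -0.5 * th @ th
grad_log_post = lambda th: -th
theta, samples = np.ones(2), []
for _ in range(2000):
    theta = hmc_step(theta, log_post, grad_log_post)
    samples.append(theta.copy())
print(np.mean(samples, axis=0), np.var(samples, axis=0))  # ~[0, 0] and ~[1, 1]
```

The speedup reported in the paper comes from evaluating such gradient and update steps over many volatility variables in parallel.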
The FORTRAN static source code analyzer program (SAP) user's guide, revision 1
NASA Technical Reports Server (NTRS)
Decker, W.; Taylor, W.; Eslinger, S.
1982-01-01
The FORTRAN Static Source Code Analyzer Program (SAP) User's Guide (Revision 1) is presented. SAP is a software tool designed to assist Software Engineering Laboratory (SEL) personnel in conducting studies of FORTRAN programs. SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. This document is a revision of the previous SAP user's guide, Computer Sciences Corporation document CSC/TM-78/6045. SAP Revision 1 is the result of program modifications to provide several new reports, additional complexity analysis, and recognition of all statements described in the FORTRAN 77 standard. This document provides instructions for operating SAP and contains information useful in interpreting SAP output.
Sanchez, Robersy; Grau, Ricardo
2005-09-01
A Boolean structure of the genetic code where Boolean deductions have biological and physicochemical meanings was discussed in a previous paper. Now, from these Boolean deductions, we propose to define the value of amino acid information in order to treat the genetic information system as a communication system and to introduce the semantic content of information ignored by conventional information theory. In this proposal, the value of amino acid information is proportional to the molecular weight of the amino acid, with a proportionality constant of about 1.96 x 10^25 bits per kg. In addition, for the experimental estimation of the minimum energy dissipation in genetic logic operations, we present two postulates: (1) the energy Ei (i = 1, 2, ..., 20) of amino acids in the messages conveyed by proteins is proportional to the value of information, and (2) amino acids are distributed according to their energy Ei, so the amino acid population in proteins follows a Boltzmann distribution. Specifically, in the genetic message carried by the DNA of the genomes of living organisms, we found that the minimum energy dissipation in genetic logic operations was close to kT ln(2) joules per bit.
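The proposal can be read off numerically: multiply an amino acid's molecular mass in kg by the stated constant of about 1.96 x 10^25 bits per kg. The per-molecule reading via the atomic mass unit below is our interpretation for illustration; the molecular weights are standard values.

```python
K_BITS_PER_KG = 1.96e25      # proportionality constant quoted in the abstract
AMU_TO_KG = 1.66054e-27      # atomic mass unit in kg

molecular_weight_amu = {"Gly": 75.07, "Ala": 89.09, "Trp": 204.23}

for aa, mw in molecular_weight_amu.items():
    bits = K_BITS_PER_KG * mw * AMU_TO_KG
    print(f"{aa}: {bits:.2f} bits")   # roughly 2.4 (Gly) to 6.6 (Trp) bits
```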
Self-organisation of symbolic information
NASA Astrophysics Data System (ADS)
Feistel, R.
2017-01-01
Information is encountered in two different forms: native form, as arbitrary physical structures, or symbolic form, as coded sequences of letters or the like. The self-organised emergence of symbolic information from structural information is referred to as a ritualisation transition. Occurring at some stage in evolutionary history, ritualisation transitions have in common that after the crossover, arbitrary symbols are issued and recognised by information-processing devices, by transmitters and receivers in the sense of Shannon's communication theory. Symbolic information-processing systems exhibit the fundamental code symmetry whose key features, such as largely lossless copying or persistence under hostile conditions, may elucidate the reasons for the repeated successful occurrence of ritualisation phenomena in evolutionary history. Examples of ritualisation are briefly reviewed, such as the origin of life, the appearance of human languages, the establishment of emergent social categories such as money, and the development of digital computers. In addition to their role as carriers of symbolic information, symbols are physical structures which also represent structural information. For a thermodynamic description of symbols and their arrangements, it appears reasonable to distinguish between Boltzmann entropy, Clausius entropy and Pauling entropy. Thermodynamic properties of symbols imply that their lifetimes are limited by the 2nd law.
Technology Readiness Assessment (TRA) Deskbook
2005-05-01
... information such as expiration date and lot number. DoD will probably be in a position to use a commercially proven technology with an inherently low... federal regulations, DoD was able to gain approval of pyridostigmine bromide for prophylaxis against the lethal effects of the soman nerve agent... www.fda.gov/cber/ H.4 ADDITIONAL INFORMATION: Federal Food, Drug, and Cosmetic (FD&C) Act; United States Code, Title 21 - Food and Drugs (21USC...
Fadare, Joseph O; Porteri, Corinna
2010-03-01
Informed consent is a basic requirement for the conduct of ethical research involving human subjects. Currently, the Helsinki Declaration of the World Medical Association and the International Ethical Guidelines for Biomedical Research of the Council for International Organizations of Medical Sciences (CIOMS) are widely accepted as international codes regulating human subject research and the informed consent sections of these documents are quite important. Debates on the applicability of these guidelines in different socio-cultural settings are ongoing and many workers have advocated the need for national or regional guidelines. Nigeria, a developing country, has recently adopted its national guideline regulating human subject research: the National Health Research Ethics Committee (NHREC) code. A content analysis of the three guidelines was done to see if the Nigerian guidelines confer any additional protection for research subjects. The concept of a Community Advisory Committee in the Nigerian guideline is a novel one that emphasizes research as a community burden and should promote a form of "research friendship" to foster the welfare of research participants. There is also the need for a regular update of the NHREC code so as to address some issues that were not considered in its current version.
Label consistent K-SVD: learning a discriminative dictionary for recognition.
Jiang, Zhuolin; Lin, Zhe; Davis, Larry S
2013-11-01
A label consistent K-SVD (LC-KSVD) algorithm to learn a discriminative dictionary for sparse coding is presented. In addition to using class labels of training data, we also associate label information with each dictionary item (columns of the dictionary matrix) to enforce discriminability in sparse codes during the dictionary learning process. More specifically, we introduce a new label consistency constraint called "discriminative sparse-code error" and combine it with the reconstruction error and the classification error to form a unified objective function. The optimal solution is efficiently obtained using the K-SVD algorithm. Our algorithm learns a single overcomplete dictionary and an optimal linear classifier jointly. An incremental dictionary learning algorithm is also presented for situations with limited memory resources. It yields dictionaries such that feature points with the same class labels have similar sparse codes. Experimental results demonstrate that our algorithm outperforms many recently proposed sparse-coding techniques for face, action, scene, and object category recognition under the same learning conditions.
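As a rough illustration of the unified objective described above, the sketch below evaluates its three terms: reconstruction error, discriminative sparse-code error, and classification error. The matrix symbols (Q for the ideal discriminative codes, A for the linear transform, H for the label matrix, W for the classifier) follow common presentations of LC-KSVD, but the exact notation is an assumption.

```python
import numpy as np

def lc_ksvd_objective(Y, D, X, Q, A, H, W, alpha, beta):
    """Evaluate the unified LC-KSVD objective (sketch):
    reconstruction error + discriminative sparse-code error + classification error.
    Y: training signals, D: dictionary, X: sparse codes,
    Q: ideal discriminative codes, A: linear transform,
    H: class-label matrix, W: linear classifier.
    alpha and beta weight the label-consistency and classification terms.
    """
    reconstruction = np.linalg.norm(Y - D @ X, "fro") ** 2
    label_consistency = alpha * np.linalg.norm(Q - A @ X, "fro") ** 2
    classification = beta * np.linalg.norm(H - W @ X, "fro") ** 2
    return reconstruction + label_consistency + classification
```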
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-30
...] FDA's Public Database of Products With Orphan-Drug Designation: Replacing Non-Informative Code Names... replaced non- informative code names with descriptive identifiers on its public database of products that... on our public database with non-informative code names. After careful consideration of this matter...
Lyon, Aaron R; Lewis, Cara C; Melvin, Abigail; Boyd, Meredith; Nicodimos, Semret; Liu, Freda F; Jungbluth, Nathaniel
2016-09-22
Health information technologies (HIT) have become nearly ubiquitous in the contemporary healthcare landscape, but information about HIT development, functionality, and implementation readiness is frequently siloed. Theory-driven methods of compiling, evaluating, and integrating information from the academic and commercial sectors are necessary to guide stakeholder decision-making surrounding HIT adoption and to develop pragmatic HIT research agendas. This article presents the Health Information Technologies-Academic and Commercial Evaluation (HIT-ACE) methodology, a structured, theory-driven method for compiling and evaluating information from multiple sectors. As an example demonstration of the methodology, we apply HIT-ACE to mental and behavioral health measurement feedback systems (MFS). MFS are a specific class of HIT that support the implementation of routine outcome monitoring, an evidence-based practice. HIT-ACE is guided by theories and frameworks related to user-centered design and implementation science. The methodology involves four phases: (1) coding academic and commercial materials, (2) developer/purveyor interviews, (3) linking putative implementation mechanisms to HIT capabilities, and (4) experimental testing of capabilities and mechanisms. In the current demonstration, phase 1 included a systematic process to identify MFS in mental and behavioral health using academic literature and commercial websites. Using user-centered design, implementation science, and feedback frameworks, the HIT-ACE coding system was developed, piloted, and used to review each identified system for the presence of 38 capabilities and 18 additional characteristics via a consensus coding process. Bibliometric data were also collected to examine the representation of the systems in the scientific literature. As an example, results are presented for the application of HIT-ACE phase 1 to MFS, wherein 49 separate MFS were identified, reflecting a diverse array of characteristics and capabilities. Preliminary findings demonstrate the utility of HIT-ACE to represent the scope and diversity of a given class of HIT beyond what can be identified in the academic literature. Phase 2 data collection is expected to confirm and expand the information presented, and phases 3 and 4 will provide more nuanced information about the impact of specific HIT capabilities. In all, HIT-ACE is expected to support adoption decisions and additional HIT development and implementation research.
Travnik, Jaden B; Pilarski, Patrick M
2017-07-01
Prosthetic devices have advanced in their capabilities and in the number and type of sensors included in their design. As the space of sensorimotor data available to a conventional or machine learning prosthetic control system increases in dimensionality and complexity, it becomes increasingly important that this data be represented in a useful and computationally efficient way. Well-structured sensory data allows prosthetic control systems to make informed, appropriate control decisions. In this study, we explore the impact that increased sensorimotor information has on current machine learning prosthetic control approaches. Specifically, we examine the effect that high-dimensional sensory data has on the computation time and prediction performance of a true-online temporal-difference learning prediction method as embedded within a resource-limited upper-limb prosthesis control system. We present results comparing tile coding, the dominant linear representation for real-time prosthetic machine learning, with a newly proposed modification to Kanerva coding that we call selective Kanerva coding. In addition to showing promising results for selective Kanerva coding, our results confirm potential limitations to tile coding as the number of sensory input dimensions increases. To our knowledge, this study is the first to explicitly examine representations for real-time machine learning prosthetic devices in general terms. This work therefore provides an important step towards forming an efficient prosthesis-eye view of the world, wherein prompt and accurate representations of high-dimensional data may be provided to machine learning control systems within artificial limbs and other assistive rehabilitation technologies.
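A minimal sketch of the selective variant described above, assuming its defining change is to activate a fixed number of the nearest prototypes rather than all prototypes within a fixed radius, as in classic Kanerva coding; the prototype count, dimensionality, and activation number below are illustrative.

```python
import numpy as np

def selective_kanerva_features(state, prototypes, c):
    """Selective Kanerva coding (sketch): activate exactly the c prototypes
    closest to the current state. Classic Kanerva coding instead activates
    every prototype within a fixed radius, so the number of active features
    varies; selecting the c nearest keeps it constant.
    state: (d,) sensor vector; prototypes: (n, d) random points; c: int.
    Returns a binary feature vector of length n with exactly c ones.
    """
    dists = np.linalg.norm(prototypes - state, axis=1)
    active = np.argsort(dists)[:c]            # indices of the c nearest prototypes
    features = np.zeros(len(prototypes))
    features[active] = 1.0
    return features

# Usage: the binary feature vector feeds a linear temporal-difference learner.
rng = np.random.default_rng(0)
protos = rng.uniform(0, 1, size=(500, 8))     # 500 prototypes in an 8-D sensor space
x = selective_kanerva_features(rng.uniform(0, 1, 8), protos, c=25)
```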
NASA Astrophysics Data System (ADS)
Jansen, S. D.
1981-09-01
The ORBES region consists of all of Kentucky, most of West Virginia, substantial parts of Illinois, Indiana, and Ohio, and southwestern Pennsylvania. The inventory lists installed electrical generating capacity in commercial service as of December 1, 1976, and scheduled capacity additions and removals between 1977 and 1986 in the six ORBES states (Illinois, Indiana, Kentucky, Ohio, Pennsylvania, and West Virginia). The following information is included for each electrical generating unit: unit ID code, company index, whether point or industrial ownership, plant name, whether inside or outside the ORBES region, FIPS county code, type of unit, size in megawatts, type of megawatt rating, status of unit, date of commercial operation, scheduled retirement date, primary fuel, alternate fuel, type of cooling, source of cooling water, and source of information.
Trellis coded modulation for 4800-9600 bps transmission over a fading mobile satellite channel
NASA Technical Reports Server (NTRS)
Divsalar, D.; Simon, M. K.
1986-01-01
The combination of trellis coding and multiple phase-shift-keyed (MPSK) signalling with the addition of asymmetry to the signal set is discussed with regard to its suitability as a modulation/coding scheme for the fading mobile satellite channel. For MPSK, introducing nonuniformity (asymmetry) into the spacing between signal points in the constellation yields a further improvement in performance over that achievable with trellis coded symmetric MPSK, without increasing average or peak power or changing the bandwidth constraints imposed on the system. Whereas previous contributions have considered the performance of trellis coded modulation transmitted over an additive white Gaussian noise (AWGN) channel, the emphasis in this paper is on the performance of trellis coded MPSK in the fading environment. The results are obtained using a combination of analysis and simulation. It is assumed that the effect of the fading on the phase of the received signal is fully compensated for, either by tracking it with some form of phase-locked loop or with pilot tone calibration techniques. Thus, the results reflect only the degradation due to the effect of the fading on the amplitude of the received signal. We also consider only the case where interleaving/deinterleaving is employed to further combat the fading; this allows for considerable simplification of the analysis and is of great practical interest. Finally, the impact of the availability of channel state information on average bit error probability performance is assessed.
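The following sketch illustrates what signal-set asymmetry means for 8-PSK: alternate constellation points are rotated so that the spacing between neighbours alternates, while the unit radius keeps average and peak power fixed. The asymmetry value here is illustrative; in the paper it is optimized jointly with the trellis code.

```python
import numpy as np

def asymmetric_8psk(delta):
    """Build an 8-PSK constellation with pairwise asymmetry (sketch).
    Even-indexed points are rotated by -delta/2 and odd-indexed points
    by +delta/2, so neighbour spacing alternates between pi/4 + delta
    and pi/4 - delta. All points stay on the unit circle, so average
    and peak power are unchanged.
    """
    base = 2 * np.pi * np.arange(8) / 8
    offsets = np.where(np.arange(8) % 2 == 0, -delta / 2, delta / 2)
    return np.exp(1j * (base + offsets))

symmetric = asymmetric_8psk(0.0)
asymmetric = asymmetric_8psk(0.2)   # 0.2 rad of asymmetry, an illustrative value
```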
NASA Technical Reports Server (NTRS)
Cheng, Michael K.; Lyubarev, Mark; Nakashima, Michael A.; Andrews, Kenneth S.; Lee, Dennis
2008-01-01
Low-density parity-check (LDPC) codes are the state-of-the-art in forward error correction (FEC) technology that exhibits capacity-approaching performance. The Jet Propulsion Laboratory (JPL) has designed a family of LDPC codes that are similar in structure and therefore lead to a single decoder implementation. The Accumulate-Repeat-4-Jagged-Accumulate (AR4JA) code design offers a family of codes with rates 1/2, 2/3, 4/5 and lengths of 1024, 4096, and 16384 information bits. Performance is less than one dB from capacity for all combinations. Integrating a stand-alone LDPC decoder with a commercial-off-the-shelf (COTS) receiver faces additional challenges beyond those of building a single receiver-decoder unit from scratch. In this work, we outline the issues and show that these additional challenges can be overcome by simple solutions. To demonstrate that an LDPC decoder can be made to work seamlessly with a COTS receiver, we interface an AR4JA LDPC decoder developed on a field-programmable gate array (FPGA) with a modern high data rate receiver and measure the combined receiver-decoder performance. Through optimizations that include an improved frame synchronizer and different soft-symbol scaling algorithms, we show that a combined implementation loss of less than one dB is possible and therefore most of the coding gain evident in theory can also be obtained in practice. Our techniques can benefit any modem that utilizes an advanced FEC code.
Limited distortion in LSB steganography
NASA Astrophysics Data System (ADS)
Kim, Younhee; Duric, Zoran; Richards, Dana
2006-02-01
It is well known that all information hiding methods that modify the least significant bits introduce distortions into the cover objects. Those distortions have been utilized by steganalysis algorithms to detect that the objects had been modified. It has been proposed that only coefficients whose modification does not introduce large distortions should be used for embedding. In this paper we propose an efficient algorithm for information hiding in the LSBs of JPEG coefficients. Our algorithm uses parity coding to choose the coefficients whose modifications introduce minimal additional distortion. We derive the expected value of the additional distortion as a function of the message length and the probability distribution of the JPEG quantization errors of cover images. Our experiments show close agreement between the theoretical prediction and the actual additional distortion.
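A minimal sketch of parity coding as described above: one message bit is carried by the parity of a group of coefficient LSBs, and when the parity must change, the coefficient that is cheapest to modify is flipped. Using a per-coefficient cost derived from the quantization error is an assumption for illustration.

```python
import numpy as np

def embed_bit_parity(coeffs, costs, bit):
    """Embed one message bit in a group of JPEG coefficients via parity
    coding (sketch). The bit equals the XOR (parity) of the group's LSBs;
    if it already matches, nothing changes. Otherwise, flip the LSB of the
    coefficient whose modification adds the least distortion.
    """
    parity = int(np.bitwise_and(coeffs, 1).sum() % 2)
    if parity != bit:
        i = int(np.argmin(costs))   # coefficient cheapest to modify
        coeffs[i] ^= 1              # flip its least significant bit
    return coeffs

group = np.array([12, 7, 33, 18], dtype=np.int64)
costs = np.array([0.41, 0.05, 0.33, 0.27])   # hypothetical per-coefficient costs
group = embed_bit_parity(group, costs, bit=1)  # flips coefficient 7 -> 6
```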
High rate concatenated coding systems using bandwidth efficient trellis inner codes
NASA Technical Reports Server (NTRS)
Deng, Robert H.; Costello, Daniel J., Jr.
1989-01-01
High-rate concatenated coding systems with bandwidth-efficient trellis inner codes and Reed-Solomon (RS) outer codes are investigated for application in high-speed satellite communication systems. Two concatenated coding schemes are proposed. In one, the inner code is decoded with soft-decision Viterbi decoding, and the outer RS code performs error-correction-only decoding (decoding without side information). In the other, the inner code is decoded with a modified Viterbi algorithm, which produces reliability information along with the decoded output. In this algorithm, path metrics are used to estimate the entire information sequence, whereas branch metrics are used to provide reliability information on the decoded sequence. This information is used to erase unreliable bits in the decoded output. An errors-and-erasures RS decoder is then used for the outer code. The two schemes have been proposed for high-speed data communication on NASA satellite channels. The rates considered are at least double those used in current NASA systems, and the results indicate that high system reliability can still be achieved.
The Simple Video Coder: A free tool for efficiently coding social video data.
Barto, Daniel; Bird, Clark W; Hamilton, Derek A; Fink, Brandi C
2017-08-01
Videotaping of experimental sessions is a common practice across many disciplines of psychology, ranging from clinical therapy, to developmental science, to animal research. Audio-visual data are a rich source of information that can be easily recorded; however, analysis of the recordings presents a major obstacle to project completion. Coding behavior is time-consuming and often requires ad-hoc training of a student coder. In addition, existing software is either prohibitively expensive or cumbersome, which leaves researchers with inadequate tools to quickly process video data. We offer the Simple Video Coder: free, open-source software for behavior coding that is flexible in accommodating different experimental designs, is intuitive for students to use, and produces outcome measures of event timing, frequency, and duration. Finally, the software also offers extraction tools to splice video into coded segments suitable for training future human coders or for use as input for pattern classification algorithms.
QR code for medical information uses.
Fontelo, Paul; Liu, Fang; Ducut, Erick G
2008-11-06
We developed online QR code tools and simulated and tested QR code applications for medical information uses, including scanning QR code labels, URLs, and authentication. Our results show possible applications for QR code in medicine.
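A minimal sketch of generating such labels, using the open-source Python `qrcode` package; the package choice, the label format, and the URL are assumptions for illustration, since the paper's own tools were web-based.

```python
# Requires: pip install qrcode[pil]
import qrcode

# Encode a fictitious medication label (format is an illustrative assumption).
label = "RX:123456;DRUG:exampledrug;LOT:A1B2;EXP:2026-01"
img = qrcode.make(label)
img.save("medication_label.png")

# Encode a URL to a patient information handout.
url_img = qrcode.make("https://medlineplus.gov/")
url_img.save("handout_link.png")
```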
Schmitz, Matthew; Forst, Linda
2016-02-15
Inclusion of information about a patient's work, industry, and occupation in the electronic health record (EHR) could facilitate occupational health surveillance, better health outcomes, prevention activities, and identification of workers' compensation cases. The US National Institute for Occupational Safety and Health (NIOSH) has developed an autocoding system for "industry" and "occupation" based on 1990 Bureau of Census codes; its effectiveness requires evaluation in conjunction with promoting the mandatory addition of these variables to the EHR. The objective of the study was to evaluate the intercoder reliability of NIOSH's Industry and Occupation Computerized Coding System (NIOCCS) when applied to data collected in a community survey conducted under the Affordable Care Act, and to determine the proportion of records that are autocoded using NIOCCS. Standard Occupational Classification (SOC) codes are used by several federal agencies in databases that capture demographic, employment, and health information to harmonize variables related to work activities among these data sources. A total of 359 industry and occupation responses were hand coded by 2 investigators, who came to a consensus on every code. The same variables were autocoded using NIOCCS at the high and moderate criteria level. Kappa was .84 both for agreement between hand coders and for the hand-coder consensus codes versus NIOCCS high-confidence codes on the first 2 digits of the SOC code. For 4 digits, NIOCCS coding versus investigator coding ranged from kappa=.56 to .70. In this study, NIOCCS was able to autocode 31%-36% of entered variables at the "high confidence" level and 49%-58% at the "medium confidence" level. Autocoding (production) rates are somewhat lower than those reported by NIOSH. Agreement between manually coded and autocoded data is "substantial" at the 2-digit level, but only "fair" to "good" at the 4-digit level. This work serves as a baseline for performance of NIOCCS by investigators in the field. Further field testing will clarify NIOCCS effectiveness in terms of ability to assign codes and coding accuracy, and will clarify its value as inclusion of these occupational variables in the EHR is promoted.
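For reference, the kappa reported in studies like this is typically Cohen's kappa, computed from observed agreement and chance agreement; the sketch below shows the calculation on a few hypothetical two-digit SOC prefixes.

```python
from collections import Counter

def cohens_kappa(pairs):
    """Compute Cohen's kappa for two coders from (code_a, code_b) pairs:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the agreement expected by chance from each coder's
    marginal code frequencies.
    """
    n = len(pairs)
    p_o = sum(a == b for a, b in pairs) / n
    marg_a = Counter(a for a, _ in pairs)
    marg_b = Counter(b for _, b in pairs)
    p_e = sum(marg_a[c] * marg_b.get(c, 0) for c in marg_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Two-digit SOC prefixes from two hypothetical coders (illustrative data):
print(cohens_kappa([("29", "29"), ("29", "31"), ("47", "47"), ("53", "53")]))
# -> 0.666..., i.e. "substantial" agreement on this toy sample
```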
Planning for Pre-Exascale Platform Environment (Fiscal Year 2015 Level 2 Milestone 5216)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springmeyer, R.; Lang, M.; Noe, J.
This Plan for ASC Pre-Exascale Platform Environments document constitutes the deliverable for the fiscal year 2015 (FY15) Advanced Simulation and Computing (ASC) Program Level 2 milestone Planning for Pre-Exascale Platform Environment. It acknowledges and quantifies challenges and recognized gaps for moving the ASC Program towards effective use of exascale platforms and recommends strategies to address these gaps. This document also presents an update to the concerns, strategies, and plans presented in the FY08 predecessor document that dealt with the upcoming (at the time) petascale high performance computing (HPC) platforms. With the looming push towards exascale systems, a review of the earlier document was appropriate in light of the myriad architectural choices currently under consideration. The ASC Program believes the platforms to be fielded in the 2020s will be fundamentally different systems that stress ASC’s ability to modify codes to take full advantage of new or unique features. In addition, the scale of components will increase the difficulty of maintaining an error-free system, thus driving new approaches to resilience and error detection/correction. The code revamps of the past, from serial- to vector-centric code to distributed memory to threaded implementations, will be revisited as codes adapt to a new message passing interface (MPI) plus “x” or more advanced and dynamic programming models based on architectural specifics. Development efforts are already underway in some cases, and more difficult or uncertain aspects of the new architectures will require research and analysis that may inform future directions for program choices. In addition, the potential diversity of system architectures may require parallel if not duplicative efforts to analyze and modify environments, codes, subsystems, libraries, debugging tools, and performance analysis techniques as well as exploring new monitoring methodologies. It is difficult if not impossible to selectively eliminate some of these activities until more information is available through simulations of potential architectures, analysis of systems designs, and informed study of commodity technologies that will be the constituent parts of future platforms.
SNODOG Glossary: Part 1, Introduction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, C.R.
The SNODOG Glossary is used by the DOE-supported life-span beagle studies to describe medical observations in a standardized format. It is an adaptation of the human medical glossary, SNOMED, which lists 107,165 terms. Each of the five laboratories, Argonne National Laboratory, the Inhalation Toxicology Research Institute, the Pacific Northwest Laboratory, the University of California at Davis, and the University of Utah, has selected an appropriate subset from the published SNOMED glossary and added beagle and research-specific terms. The National Radiobiology Archives is the coordinator of these enhancements, and periodically distributes SNODOG to the respective laboratories. Information donated by Colorado State University and Oak Ridge National Laboratory has been related to SNODOG and is available in a standardized format. This document is designed for the database manager and the scientist who will be managing or coding medical observations. It is also designed for the scientist analyzing coded information. The document includes: an overview of the NRA and the SNODOG glossary, a discussion of hardware requirements, a review of the SNODOG code structure and printed lists of the 4,770 terms which have been used at least once. Instructions for obtaining electronic copies of the glossary and for nominating additional terms are provided. This document describes the origins and structure of the SNODOG codes, explains code usage at each participating institution, and presents a usage frequency tabulation of the terms for neoplasia. A diskette or magnetic tape containing 15,641 SNODOG codes and translations is available on request.
Computer-based learning of spelling skills in children with and without dyslexia.
Kast, Monika; Baschera, Gian-Marco; Gross, Markus; Jäncke, Lutz; Meyer, Martin
2011-12-01
Our spelling training software recodes words into multisensory representations comprising visual and auditory codes. These codes represent information about letters and syllables of a word. An enhanced version, developed for this study, contains an additional phonological code and an improved word selection controller relying on a phoneme-based student model. We investigated the spelling behavior of children by means of learning curves based on log-file data of the previous and the enhanced software version. First, we compared the learning progress of children with dyslexia working either with the previous software (n = 28) or the adapted version (n = 37). Second, we investigated the spelling behavior of children with dyslexia (n = 37) and matched children without dyslexia (n = 25). To gain deeper insight into which factors are relevant for acquiring spelling skills, we analyzed the influence of cognitive abilities, such as attention functions and verbal memory skills, on the learning behavior. All investigations of the learning process are based on learning curve analyses of the collected log-file data. The results showed that children with dyslexia benefited significantly from the additional phonological cue and the corresponding phoneme-based student model. Indeed, children with dyslexia improved their spelling skills to the same extent as children without dyslexia and were able to memorize phoneme-to-grapheme correspondences when given the correct support and adequate training. In addition, children with low attention functions benefited from the structured learning environment. Generally, our data showed that memory sources are supportive cognitive functions for acquiring spelling skills and for using the information cues of a multi-modal learning environment.
Identification of common, unique and polymorphic microsatellites among 73 cyanobacterial genomes.
Kabra, Ritika; Kapil, Aditi; Attarwala, Kherunnisa; Rai, Piyush Kant; Shanker, Asheesh
2016-04-01
Microsatellites, also known as Simple Sequence Repeats, are short tandem repeats of 1-6 nucleotides. These repeats are found in coding as well as non-coding regions of both prokaryotic and eukaryotic genomes and play a significant role in the study of gene regulation, genetic mapping, DNA fingerprinting and evolutionary studies. The availability of 73 complete genome sequences of cyanobacteria enabled us to mine and statistically analyze microsatellites in these genomes. The cyanobacterial microsatellites identified through bioinformatics analysis were stored in a user-friendly database named CyanoSat, an efficient data representation and query system designed using ASP.net. The information in CyanoSat comprises perfect, imperfect and compound microsatellites found in coding, non-coding and coding-non-coding regions. Moreover, it contains PCR primers with 200-nucleotide-long flanking regions. The mined cyanobacterial microsatellites can be freely accessed at www.compubio.in/CyanoSat/home.aspx. In addition, 82 polymorphic, 13,866 unique and 2390 common microsatellites were detected. These microsatellites will be useful in strain identification and genetic diversity studies of cyanobacteria.
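Mining perfect microsatellites of this kind reduces to finding tandem repeats of 1-6 nt motifs, which the sketch below does with a regular expression; the minimum repeat count is an illustrative threshold, not the one used to build CyanoSat.

```python
import re

def find_microsatellites(seq, min_repeats=3):
    """Scan a DNA sequence for perfect tandem repeats of 1-6 nt motifs
    (sketch). Returns (start position, motif, copy number) tuples.
    """
    # Group 2 is the motif (non-greedy, so the shortest unit wins);
    # \2{k,} requires at least min_repeats total copies.
    pattern = re.compile(r"(([ACGT]{1,6}?)\2{" + str(min_repeats - 1) + r",})")
    hits = []
    for m in pattern.finditer(seq):
        repeat, motif = m.group(1), m.group(2)
        hits.append((m.start(), motif, len(repeat) // len(motif)))
    return hits

print(find_microsatellites("GCATATATATGCAAAAAC"))
# -> [(2, 'AT', 4), (12, 'A', 5)]
```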
NASA Astrophysics Data System (ADS)
Kurceren, Ragip; Modestino, James W.
1998-12-01
The use of forward error-control (FEC) coding, possibly in conjunction with ARQ techniques, has emerged as a promising approach for video transport over ATM networks for cell-loss recovery and/or bit error correction, such as might be required for wireless links. Although FEC provides cell-loss recovery capabilities, it also introduces transmission overhead which can itself cause additional cell losses. A methodology is described to maximize the number of video sources multiplexed at a given quality of service (QoS), measured in terms of decoded cell loss probability, using interlaced FEC codes. The transport channel is modelled as a block interference channel (BIC) and the multiplexer as a single-server, deterministic-service, finite-buffer queue supporting N users. Based upon an information-theoretic characterization of the BIC and large deviation bounds on the buffer overflow probability, the described methodology provides theoretically achievable upper limits on the number of sources multiplexed. Performance of specific coding techniques using interlaced nonbinary Reed-Solomon (RS) codes and binary rate-compatible punctured convolutional (RCPC) codes is illustrated.
Diversity-optimal power loading for intensity modulated MIMO optical wireless communications.
Zhang, Yan-Yu; Yu, Hong-Yi; Zhang, Jian-Kang; Zhu, Yi-Jun
2016-04-18
In this paper, we consider the design of a space code for an intensity modulated direct detection multi-input-multi-output optical wireless communication (IM/DD MIMO-OWC) system, in which channel coefficients are independent and non-identically log-normal distributed, with variances and means known at the transmitter and channel state information available at the receiver. Utilizing the existing space code design criterion for IM/DD MIMO-OWC with a maximum likelihood (ML) detector, we design a diversity-optimal space code (DOSC) that maximizes both large-scale and small-scale diversity gains and prove that the spatial repetition code (RC) with a diversity-optimized power allocation is diversity-optimal among all the high-dimensional nonnegative space code schemes under a commonly used optical power constraint. In addition, we show that one of the significant advantages of the DOSC is to allow low-complexity ML detection. Simulation results indicate that in high signal-to-noise ratio (SNR) regimes, our proposed DOSC significantly outperforms RC, which is the best space code currently available for such systems.
VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system. Version 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shapiro, A.; Huria, H.C.; Cho, K.W.
1991-12-01
VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules, are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.
Lowet, Eric; Roberts, Mark; Hadjipapas, Avgis; Peter, Alina; van der Eerden, Jan; De Weerd, Peter
2015-02-01
Fine-scale temporal organization of cortical activity in the gamma range (∼25-80Hz) may play a significant role in information processing, for example by neural grouping ('binding') and phase coding. Recent experimental studies have shown that the precise frequency of gamma oscillations varies with input drive (e.g. visual contrast) and that it can differ among nearby cortical locations. This has challenged theories assuming widespread gamma synchronization at a fixed common frequency. In the present study, we investigated which principles govern gamma synchronization in the presence of input-dependent frequency modulations and whether they are detrimental for meaningful input-dependent gamma-mediated temporal organization. To this aim, we constructed a biophysically realistic excitatory-inhibitory network able to express different oscillation frequencies at nearby spatial locations. Similarly to cortical networks, the model was topographically organized with spatially local connectivity and spatially-varying input drive. We analyzed gamma synchronization with respect to phase-locking, phase-relations and frequency differences, and quantified the stimulus-related information represented by gamma phase and frequency. By stepwise simplification of our models, we found that the gamma-mediated temporal organization could be reduced to basic synchronization principles of weakly coupled oscillators, where input drive determines the intrinsic (natural) frequency of oscillators. The gamma phase-locking, the precise phase relation and the emergent (measurable) frequencies were determined by two principal factors: the detuning (intrinsic frequency difference, i.e. local input difference) and the coupling strength. In addition to frequency coding, gamma phase contained complementary stimulus information. Crucially, the phase code reflected input differences, but not the absolute input level. This property of relative input-to-phase conversion, contrasting with latency codes or slower oscillation phase codes, may resolve conflicting experimental observations on gamma phase coding. Our modeling results offer clear testable experimental predictions. We conclude that input-dependency of gamma frequencies could be essential rather than detrimental for meaningful gamma-mediated temporal organization of cortical activity.
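The reduction to weakly coupled oscillators can be made concrete with two Kuramoto-style phase oscillators in which input drive sets the intrinsic frequency: locking occurs roughly when the detuning is below twice the coupling strength, and the locked phase difference grows with the detuning, consistent with the relative input-to-phase conversion described above. The frequencies and coupling below are illustrative, not fitted to the network model in the paper.

```python
import numpy as np

def simulate_phase_difference(omega1, omega2, K, dt=1e-4, T=2.0):
    """Two weakly coupled phase oscillators (Kuramoto sketch):
        d(theta_i)/dt = omega_i + K * sin(theta_j - theta_i).
    Phase-locking occurs roughly when the detuning |omega1 - omega2|
    is below 2K; otherwise the phases drift past each other.
    Returns the phase-difference trace theta2 - theta1.
    """
    steps = int(T / dt)
    th1, th2 = 0.0, 0.0
    diff = np.empty(steps)
    for t in range(steps):
        d = th2 - th1
        th1 += dt * (omega1 + K * np.sin(d))
        th2 += dt * (omega2 + K * np.sin(-d))
        diff[t] = th2 - th1
    return diff

# Gamma-band example: intrinsic frequencies of 55 and 60 Hz (a detuning from
# unequal input drives) with coupling strong enough to lock (2K > detuning).
trace = simulate_phase_difference(2*np.pi*55, 2*np.pi*60, K=2*np.pi*5)
print(f"locked phase difference ~ {trace[-1]:.3f} rad")
```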
Efficient Polar Coding of Quantum Information
NASA Astrophysics Data System (ADS)
Renes, Joseph M.; Dupuis, Frédéric; Renner, Renato
2012-08-01
Polar coding, introduced in 2008 by Arıkan, is the first (very) efficiently encodable and decodable coding scheme whose information transmission rate provably achieves the Shannon bound for classical discrete memoryless channels in the asymptotic limit of large block sizes. Here, we study the use of polar codes for the transmission of quantum information. Focusing on the case of qubit Pauli channels and qubit erasure channels, we use classical polar codes to construct a coding scheme that asymptotically achieves a net transmission rate equal to the coherent information using efficient encoding and decoding operations and code construction. Our codes generally require preshared entanglement between sender and receiver, but for channels with a sufficiently low noise level we demonstrate that the rate of preshared entanglement required is zero.
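The classical ingredient underlying this construction is channel polarization, which for the binary erasure channel has a closed-form recursion; the sketch below shows how the synthetic channels split towards noiseless and useless, with the fraction of good channels approaching capacity as the recursion deepens.

```python
def polarize_bec(eps, n_levels):
    """Channel polarization for the binary erasure channel (sketch).
    Starting from erasure probability eps, one polarization step maps
    a channel with erasure rate e to a 'worse' channel (2e - e^2) and
    a 'better' one (e^2); after n levels the 2**n synthetic channels
    polarize towards erasure rates of 0 or 1.
    """
    channels = [eps]
    for _ in range(n_levels):
        channels = [z for e in channels for z in (2*e - e*e, e*e)]
    return channels

chans = polarize_bec(0.5, 10)                 # 1024 synthetic channels
good = sum(e < 1e-3 for e in chans) / len(chans)
print(f"fraction of near-noiseless channels: {good:.2f}")
# Approaches the BEC(0.5) capacity of 0.5 as n_levels grows.
```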
The effect of multiple internal representations on context-rich instruction
NASA Astrophysics Data System (ADS)
Lasry, Nathaniel; Aulls, Mark W.
2007-11-01
We discuss n-coding, a theoretical model of multiple internal mental representations. The n-coding construct is developed from a review of cognitive and imaging data that demonstrates the independence of information processed along different modalities such as verbal, visual, kinesthetic, logico-mathematic, and social modalities. A study testing the effectiveness of the n-coding construct in classrooms is presented. Four sections differing in the level of n-coding opportunities were compared. Besides a traditional-instruction section used as a control group, each of the remaining three sections was given context-rich problems, which differed by the level of n-coding opportunities designed into their laboratory environment. To measure the effectiveness of the construct, problem-solving skills were assessed as conceptual learning using the Force Concept Inventory. We also developed several new measures that take students' confidence in concepts into account. Our results show that the n-coding construct is useful in designing context-rich environments and can be used to increase learning gains in problem solving, conceptual knowledge, and concept confidence. Specifically, when using props in designing context-rich problems, we find n-coding to be a useful construct in guiding which additional dimensions need to be attended to.
Quantum image coding with a reference-frame-independent scheme
NASA Astrophysics Data System (ADS)
Chapeau-Blondeau, François; Belin, Etienne
2016-07-01
For binary images, or bit planes of non-binary images, we investigate the possibility of a quantum coding decodable by a receiver in the absence of reference frames shared with the emitter. Direct image coding with one qubit per pixel and non-aligned frames leads to decoding errors equivalent to a quantum bit-flip noise increasing with the misalignment. We show the feasibility of frame-invariant coding by using for each pixel a qubit pair prepared in one of two controlled entangled states. With just one common axis shared between the emitter and receiver, exact decoding for each pixel can be obtained by means of two two-outcome projective measurements operating separately on each qubit of the pair. With strictly no alignment information between the emitter and receiver, exact decoding can be obtained by means of a two-outcome projective measurement operating jointly on the qubit pair. In addition, the frame-invariant coding is shown much more resistant to quantum bit-flip noise compared to the direct non-invariant coding. For a cost per pixel of two (entangled) qubits instead of one, complete frame-invariant image coding and enhanced noise resistance are thus obtained.
Bachman, Peter; Reichenberg, Abraham; Rice, Patrick; Woolsey, Mary; Chaves, Olga; Martinez, David; Maples, Natalie; Velligan, Dawn I; Glahn, David C
2010-05-01
Cognitive processing inefficiency, often measured using digit symbol coding tasks, is a putative vulnerability marker for schizophrenia and a reliable indicator of illness severity and functional outcome. Indeed, performance on the digit symbol coding task may be the most severe neuropsychological deficit patients with schizophrenia display at the group level. Yet, little is known about the contributions of simpler cognitive processes to coding performance in schizophrenia (e.g. decision making, visual scanning, relational memory, motor ability). We developed an experimental behavioral task, based on a computerized digit symbol coding task, which allows the manipulation of demands placed on visual scanning efficiency and relational memory while holding decisional and motor requirements constant. Although patients (n=85) were impaired on all aspects of the task when compared to demographically matched healthy comparison subjects (n=30), they showed a particularly striking failure to benefit from the presence of predictable target information. These findings are consistent with predicted impairments in cognitive processing speed due to schizophrenia patients' well-known memory impairment, suggesting that this mnemonic deficit may have consequences for critical aspects of information processing that are traditionally considered quite separate from the memory domain. Future investigation into the mechanisms underlying the wide-ranging consequences of mnemonic deficits in schizophrenia should provide additional insight.
21 CFR 172.215 - Coumarone-indene resin.
Code of Federal Regulations, 2010 CFR
2010-04-01
... examined at the National Archives and Records Administration (NARA). For information on the availability of this material at NARA, call 202-741-6030, or go to: http://www.archives.gov/federal_register/code_of... fresh-weight basis. (d) To assure safe use of the additive: (1) The label of the market package or any...
The reliability of cause-of-death coding in The Netherlands.
Harteloh, Peter; de Bruin, Kim; Kardaun, Jan
2010-08-01
Cause-of-death statistics are a major source of information for epidemiological research or policy decisions. Information on the reliability of these statistics is important for interpreting trends in time or differences between populations. Variations in coding the underlying cause of death could hinder the attribution of observed differences to determinants of health. Therefore, we studied the reliability of cause-of-death statistics in The Netherlands. We performed a double coding study. Death certificates from the month of May 2005 were coded again in 2007. Each death certificate was coded manually by four coders. Reliability was measured by calculating agreement between coders (intercoder agreement) and by calculating the consistency of each individual coder in time (intracoder agreement). Our analysis covered 10,833 death certificates. The intercoder agreement of four coders on the underlying cause of death was 78%. In 2.2% of the cases coders agreed on a change of the code assigned in 2005. The (mean) intracoder agreement of four coders was 89%. Agreement was associated with the specificity of the ICD-10 code (chapter, three digits, four digits), the age of the deceased, the number of coders and the number of diseases reported on the death certificate. The reliability of cause-of-death statistics turned out to be high (>90%) for major causes of death such as cancers and acute myocardial infarction. For chronic diseases, such as diabetes and renal insufficiency, reliability was low (<70%). The reliability of cause-of-death statistics varies by ICD-10 code/chapter. A statistical office should provide coders with (additional) rules for coding diseases with a low reliability and evaluate these rules regularly. Users of cause-of-death statistics should exercise caution when interpreting causes of death with a low reliability. Studies of reliability should take into account the number of coders involved and the number of codes on a death certificate.
USSR Space Life Sciences Digest. Index to issues 1-4
NASA Technical Reports Server (NTRS)
Teeter, R.; Hooke, L. R.
1986-01-01
This document is an index to issues 1 to 4 of the USSR Space Life Sciences Digest and is arranged in three sections. In section 1, abstracts from the first four issues are grouped according to subject; please note the four letter codes in the upper right hand corner of the pages. Section 2 lists the categories according to which digest entries are grouped and cites additional entries relevant to that category by four letter code and entry number in section 1. Refer to section 1 for titles and other pertinent information. Key words are indexed in section 3.
Massot, Corentin; Chacron, Maurice J.
2011-01-01
Understanding how sensory neurons transmit information about relevant stimuli remains a major goal in neuroscience. Of particular relevance are the roles of neural variability and spike timing in neural coding. Peripheral vestibular afferents display differential variability that is correlated with the importance of spike timing; regular afferents display little variability and use a timing code to transmit information about sensory input. Irregular afferents, conversely, display greater variability and instead use a rate code. We studied how central neurons within the vestibular nuclei integrate information from both afferent classes by recording from a group of neurons termed vestibular only (VO) that are known to make contributions to vestibulospinal reflexes and project to higher-order centers. We found that, although individual central neurons had sensitivities that were greater than or equal to those of individual afferents, they transmitted less information. In addition, their velocity detection thresholds were significantly greater than those of individual afferents. This is because VO neurons display greater variability, which is detrimental to information transmission and signal detection. Combining activities from multiple VO neurons increased information transmission. However, the information rates were still much lower than those of equivalent afferent populations. Furthermore, combining responses from multiple VO neurons led to lower velocity detection threshold values approaching those measured from behavior (∼2.5 vs. 0.5–1°/s). Our results suggest that the detailed time course of vestibular stimuli encoded by afferents is not transmitted by VO neurons. Instead, they suggest that higher vestibular pathways must integrate information from central vestibular neuron populations to give rise to behaviorally observed detection thresholds.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2010-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, William L.
2013-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent recompilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the Fun3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2011-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the Fun3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2009-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multiphysics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
Leadership Class Configuration Interaction Code - Status and Opportunities
NASA Astrophysics Data System (ADS)
Vary, James
2011-10-01
With support from SciDAC-UNEDF (www.unedf.org) nuclear theorists have developed and are continuously improving a Leadership Class Configuration Interaction Code (LCCI) for forefront nuclear structure calculations. The aim of this project is to make state-of-the-art nuclear structure tools available to the entire community of researchers including graduate students. The project includes codes such as NuShellX, MFDn and BIGSTICK that run on a range of computers from laptops to leadership-class supercomputers. Codes, scripts, test cases and documentation have been assembled, are under continuous development and are scheduled for release to the entire research community in November 2011. A covering script that accesses the appropriate code and supporting files is under development. In addition, a Data Base Management System (DBMS) that records key information from large production runs and archived results of those runs has been developed (http://nuclear.physics.iastate.edu/info/) and will be released. Following an outline of the project, the code structure, capabilities, the DBMS and current efforts, I will suggest a path forward that would benefit greatly from a significant partnership between researchers who use the codes, code developers and the National Nuclear Data efforts. This research is supported in part by DOE under grant DE-FG02-87ER40371 and grant DE-FC02-09ER41582 (SciDAC-UNEDF).
Augmented reality warnings in vehicles: Effects of modality and specificity on effectiveness.
Schwarz, Felix; Fastenmeier, Wolfgang
2017-04-01
In the future, vehicles will be able to warn drivers of hidden dangers before they are visible. Specific warning information about these hazards could improve drivers' reactions and the warning effectiveness, but could also impair them, for example, by additional cognitive-processing costs. In a driving simulator study with 88 participants, we investigated the effects of modality (auditory vs. visual) and specificity (low vs. high) on warning effectiveness. For the specific warnings, we used augmented reality as an advanced technology to display the additional auditory or visual warning information. Part one of the study concentrates on the effectiveness of necessary warnings and part two on the drivers' compliance despite false alarms. For the first warning scenario, we found several positive main effects of specificity. However, subsequent effects of specificity were moderated by the modality of the warnings. The specific visual warnings were observed to have advantages over the three other warning designs concerning gaze and braking reaction times, passing speeds and collision rates. Besides the true alarms, braking reaction times as well as subjective evaluation after these warnings were still improved despite false alarms. The specific auditory warnings were revealed to have only a few advantages, but also several disadvantages. The results further indicate that the exact coding of additional information, beyond its mere amount and modality, plays an important role. Moreover, the observed advantages of the specific visual warnings highlight the potential benefit of augmented reality coding to improve future collision warnings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atamturktur, Sez; Unal, Cetin; Hemez, Francois
The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide the decision makers in the allocation of Nuclear Energy’s resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of a core reactor cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework developed takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the source of biases and uncertainties, whether they stem from the constituents or the coupling interface by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi- scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model. Within this framework, the project team has focused on optimizing resource allocation for improving numerical models through further code development and experimentation. Related to further code development, we have developed a code prioritization index (CPI) for coupled numerical models. CPI is implemented to effectively improve the predictive capability of the coupled model by increasing the sophistication of constituent codes. In relation to designing new experiments, we investigated the information gained by the addition of each new experiment used for calibration and bias correction of a simulation model. Additionally, the variability of ‘information gain’ through the design domain has been investigated in order to identify the experiment settings where maximum information gain occurs and thus guide the experimenters in the selection of the experiment settings. This idea was extended to evaluate how the information gain from each experiment can be improved by intelligently selecting the experiments, leading to the development of the Batch Sequential Design (BSD) technique. Additionally, we evaluated the importance of sufficiently exploring the domain of applicability in experiment-based validation of high-consequence modeling and simulation by developing a new metric to quantify coverage. This metric has also been incorporated into the design of new experiments. Finally, we have proposed a data-aware calibration approach for the calibration of numerical models.
This new method considers the complexity of a numerical model (the number of parameters to be calibrated, parameter uncertainty, and form of the model) and seeks to identify the number of experiments necessary to calibrate the model based on the level of sophistication of the physics. The final component in the project team's work to improve model calibration and validation methods is the incorporation of robustness to non-probabilistic uncertainty in the input parameters. This is an improvement to model validation and uncertainty quantification extending beyond the originally proposed scope of the project. We have introduced a new metric for incorporating the concept of robustness into experiment-based validation of numerical models. This project has supported the graduation of two Ph.D. students (Kendra Van Buren and Josh Hegenderfer) and two M.S. students (Matthew Egeberg and Parker Shields). One of the doctoral students is now working in the nuclear engineering field and the other is a post-doctoral fellow at the Los Alamos National Laboratory. Additionally, two more Ph.D. students (Garrison Stevens and Tunc Kulaksiz) who are working towards graduation have been supported by this project.
Improved Iterative Decoding of Network-Channel Codes for Multiple-Access Relay Channel.
Majumder, Saikat; Verma, Shrish
2015-01-01
Cooperative communication using relay nodes is one of the most effective means of exploiting space diversity for low-cost nodes in wireless networks. In cooperative communication, users, besides communicating their own information, also relay the information of other users. In this paper we investigate a scheme where cooperation is achieved using a common relay node which performs network coding to provide space diversity for two information nodes transmitting to a base station. We propose a scheme which uses a Reed-Solomon error-correcting code for encoding the information bits at the user nodes and a convolutional code as the network code, instead of XOR-based network coding. Based on this encoder, we propose iterative soft decoding of the joint network-channel code by treating it as a concatenated Reed-Solomon convolutional code. Simulation results show significant improvement in performance compared to an existing scheme based on compound codes.
Ledford, Christy J W; Willett, Kristen L; Kreps, Gary L
2012-01-01
For 10 years, the National Network for Immunization Information (NNii) has pursued its goal to "provide the public, health professionals, policy makers, and the media with up-to-date, scientifically valid information related to immunizations to help them understand the issues and to make informed decisions." This investigation provides a critical evaluation of the strategic communication planning and implementation of NNii from conception to present day. The study uses a case study methodology, developing a systematic analysis of organizational documents, the media environment, and in-depth interviews by applying Weick's model of organizing as an interpretive framework. Iterative data analysis included open coding, axial coding, and thematic saturation. Themes were compared with phases of strategic communication and the study's propositions. Major themes identified included the organization's informative nature, funding credibility, nonbranding, reflective evaluation, collaborative partnerships, and media strategy. NNii meets the requirements of requisite variety, nonsummativity, and organizational flexibility proposed by Weick's model of organizing. However, a lack of systematic evaluation of organization goals prevents it from adapting communication tactics and strategies. In addition, the authors recommend that NNii, while maintaining its informative nature, adopt persuasive strategies to attract and retain the attention of its target audiences.
Experimental demonstration of entanglement-assisted coding using a two-mode squeezed vacuum state
NASA Astrophysics Data System (ADS)
Mizuno, Jun; Wakui, Kentaro; Furusawa, Akira; Sasaki, Masahide
2005-01-01
We have experimentally realized the scheme initially proposed as quantum dense coding with continuous variables.
NASA Astrophysics Data System (ADS)
Couvreur, A.
2009-05-01
The theory of algebraic-geometric codes was developed in the early 1980s following a paper by V. D. Goppa. Given a smooth projective algebraic curve X over a finite field, there are two different constructions of error-correcting codes. The first one, called "functional", uses some rational functions on X, and the second one, called "differential", involves some rational 1-forms on this curve. Hundreds of papers are devoted to the study of such codes. In addition, a generalization of the functional construction for algebraic varieties of arbitrary dimension was given by Y. Manin in an article of 1984. A few papers about such codes have been published, but nothing has been done concerning a generalization of the differential construction to the higher-dimensional case. In this thesis, we propose a differential construction of codes on algebraic surfaces. Afterwards, we study the properties of these codes and particularly their relations with functional codes. A rather surprising fact is that a major difference with the case of curves appears. Indeed, whereas in the case of curves a differential code is always the orthogonal of a functional one, this assertion generally fails for surfaces. This last observation motivates the study of codes which are the orthogonal of some functional code on a surface. We prove that, under some condition on the surface, these codes can be realized as sums of differential codes. Moreover, we show that some answers to some open problems "à la Bertini" could give very interesting information on the parameters of these codes.
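For orientation, the two classical constructions on a curve, which the thesis generalizes, can be stated compactly. With D = P_1 + ... + P_n a sum of distinct rational points and G a divisor whose support is disjoint from the P_i, the standard definitions are

\[ C_L(D,G) = \{ (f(P_1), \ldots, f(P_n)) \mid f \in L(G) \} \subseteq \mathbb{F}_q^n, \qquad C_\Omega(D,G) = \{ (\mathrm{res}_{P_1}(\omega), \ldots, \mathrm{res}_{P_n}(\omega)) \mid \omega \in \Omega(G - D) \}, \]

and on a curve \( C_\Omega(D,G) = C_L(D,G)^{\perp} \); this is precisely the duality that, as the abstract notes, fails in general on surfaces.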
How Confounder Strength Can Affect Allocation of Resources in Electronic Health Records.
Lynch, Kristine E; Whitcomb, Brian W; DuVall, Scott L
2018-01-01
When electronic health record (EHR) data are used, multiple approaches may be available for measuring the same variable, introducing potentially confounding factors. While additional information may be gleaned and residual confounding reduced through resource-intensive assessment methods such as natural language processing (NLP), whether the added benefits offset the added cost of the additional resources is not straightforward. We evaluated the implications of misclassification of a confounder when using EHRs. Using a combination of simulations and real data surrounding hospital readmission, we considered smoking as a potential confounder. We compared ICD-9 diagnostic code assignment, which is an easily available measure but has the possibility of substantial misclassification of smoking status, with NLP, a method of determining smoking status that is more expensive and time-consuming than ICD-9 code assignment but has less potential for misclassification. Classification of smoking status with NLP consistently produced less residual confounding than the use of ICD-9 codes; however, when minimal confounding was present, differences between the approaches were small. When considerable confounding is present, investing in a superior measurement tool becomes advantageous.
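A toy simulation makes the mechanism concrete. The sketch below is ours, not the authors' code, and all parameter values are hypothetical: it misclassifies a binary smoking confounder with ICD-9-like (low) sensitivity and shows residual confounding biasing the adjusted exposure odds ratio away from its true null value.

```python
# Sketch: residual confounding from a misclassified binary confounder.
# Not the authors' code; sensitivities, prevalences, and effects are assumed.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50_000
smoker = rng.random(n) < 0.3                          # true confounder
exposure = rng.random(n) < np.where(smoker, 0.5, 0.2) # exposure depends on smoking
p_outcome = 1 / (1 + np.exp(-(-2 + 0.0 * exposure + 1.0 * smoker)))
outcome = rng.random(n) < p_outcome                   # true exposure effect is null

# ICD-9-like measurement: low sensitivity, high specificity (assumed values)
sens, spec = 0.4, 0.95
smoker_icd9 = np.where(smoker, rng.random(n) < sens, rng.random(n) > spec)

def adjusted_or(confounder):
    """Odds ratio for exposure from a logistic model adjusted for `confounder`."""
    X = sm.add_constant(np.column_stack([exposure, confounder]).astype(float))
    fit = sm.Logit(outcome.astype(float), X).fit(disp=0)
    return np.exp(fit.params[1])

print("OR adjusted for true smoking:", round(adjusted_or(smoker), 2))      # ~1.0
print("OR adjusted for ICD-9 proxy :", round(adjusted_or(smoker_icd9), 2)) # biased > 1
```

With little confounding (weaken the smoker effects above), the two adjusted estimates converge, which is the paper's point about when the costlier NLP measurement pays off.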
Methods of evaluating the effects of coding on SAR data
NASA Technical Reports Server (NTRS)
Dutkiewicz, Melanie; Cumming, Ian
1993-01-01
It is recognized that mean square error (MSE) is not a sufficient criterion for determining the acceptability of an image reconstructed from data that have been compressed and decompressed using an encoding algorithm. In the case of Synthetic Aperture Radar (SAR) data, it is also deemed to be insufficient to display the reconstructed image (and perhaps error image) alongside the original and make a (subjective) judgment as to the quality of the reconstructed data. In this paper we suggest a number of additional evaluation criteria which we feel should be included as evaluation metrics in SAR data encoding experiments. These criteria have been specifically chosen to provide a means of ensuring that the important information in the SAR data is preserved. The paper also presents the results of an investigation into the effects of coding on SAR data fidelity when the coding is applied in (1) the signal data domain, and (2) the image domain. An analysis of the results highlights the shortcomings of the MSE criterion, and shows which of the suggested additional criteria have been found to be most important.
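For reference, the criterion under criticism is the per-sample error power: for an N-sample image x and its reconstruction \( \hat{x} \),

\[ \mathrm{MSE} = \frac{1}{N} \sum_{i=1}^{N} (x_i - \hat{x}_i)^2, \]

a single number that says nothing about whether the errors fall on speckle, point targets, or the phase information on which SAR processing depends; hence the paper's call for additional, SAR-specific criteria.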
A topological system for delineation and codification of the Earth's river basins
Verdin, K.L.; Verdin, J.P.
1999-01-01
A comprehensive reference system for the Earth's river basins is proposed as a support to river basin management, global change research, and the pursuit of sustainable development. A natural system for delineation and codification of basins is presented which is based upon topographic control and the topology of the river network. These characteristics make the system well suited for implementation and use with digital elevation models (DEMs) and geographic information systems. A demonstration of these traits is made with the 30-arcsecond GTOPO30 DEM for North America. The system has additional appeal owing to its economy of digits and the topological information that they carry. This is illustrated through presentation of comparisons with USGS hydrologic unit codes and demonstration of the use of code numbers to reveal dependence or independence of water use activities within a basin.
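The claim that the digits themselves carry topology can be illustrated with two small predicates. This is a sketch under assumed Pfafstetter-style conventions (odd digits for main-stem interbasins, even digits for tributaries; the function names are ours):

```python
# Sketch of topological queries enabled by digit-based basin codes.
# Conventions assumed: code nesting by prefix; at each level, odd digits
# are main-stem interbasins and even digits are tributaries.
def contains(basin: str, sub: str) -> bool:
    """A basin contains every basin whose code extends its own."""
    return sub.startswith(basin) and sub != basin

def is_downstream(a: str, b: str) -> bool:
    """True if water leaving basin `b` flows through basin `a`.

    Rule: at the first digit where the codes differ, `a`'s digit is
    smaller, and every remaining digit of `a` is odd (i.e., `a` lies
    on the main stem below `b`'s outlet).
    """
    i = 0
    while i < min(len(a), len(b)) and a[i] == b[i]:
        i += 1                           # strip the common prefix
    tail = a[i:]
    if not tail or not b[i:]:
        return False                     # identical or nested codes
    return tail[0] < b[i] and all(int(d) % 2 == 1 for d in tail)

assert contains("8", "86")
assert is_downstream("85", "86")         # interbasin 85 lies below sub-basin 86
assert not is_downstream("84", "86")     # tributary 84 never receives that flow
```

Comparisons like these, done purely on code strings, are what let water-use dependence within a basin be read off without consulting the map itself.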
Evaluation of liquefaction potential for building code
NASA Astrophysics Data System (ADS)
Nunziata, C.; De Nisco, G.; Panza, G. F.
2008-07-01
The standard approach for the evaluation of liquefaction susceptibility is based on the estimation of a safety factor between the cyclic shear resistance to liquefaction and the earthquake-induced shear stress. Recently, an updated procedure based on shear-wave velocities (Vs) has been proposed which could be more easily applied. These methods have been applied at La Plaja beach of Catania, which experienced liquefaction because of the 1693 earthquake. The detailed geotechnical and Vs information and the realistic ground motion computed for the 1693 event allowed us to compare the two approaches. The successful application of the Vs procedure, slightly modified to fit historical and safety-factor information, encourages the development of a guide for liquefaction potential analysis based on well-defined Vs profiles to be included in the Italian seismic code, even though additional field verification is needed.
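The safety-factor formulation referred to is the standard simplified procedure (stated here in generic geotechnical notation, not necessarily this paper's):

\[ FS = \frac{CRR}{CSR}, \qquad CSR = 0.65\, \frac{a_{max}}{g}\, \frac{\sigma_v}{\sigma'_v}\, r_d, \]

where CRR is the cyclic resistance ratio, CSR the earthquake-induced cyclic stress ratio, \( a_{max} \) the peak ground acceleration, \( \sigma_v \) and \( \sigma'_v \) the total and effective vertical stresses, and \( r_d \) a depth-reduction factor; liquefaction is predicted when FS < 1. The Vs-based variant replaces the penetration-based CRR with a resistance curve expressed in terms of overburden-corrected shear-wave velocity.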
Matney, Susan; Bakken, Suzanne; Huff, Stanley M
2003-01-01
In recent years, the Logical Observation Identifiers, Names, and Codes (LOINC) Database has been expanded to include assessment items of relevance to nursing and in 2002 met the criteria for "recognition" by the American Nurses Association. Assessment measures in LOINC include those related to vital signs, obstetric measurements, clinical assessment scales, assessments from standardized nursing terminologies, and research instruments. In order for LOINC to be of greater use in implementing information systems that support nursing practice, additional content is needed. Moreover, those implementing systems for nursing practice must be aware of the manner in which LOINC codes for assessments can be appropriately linked with other aspects of the nursing process such as diagnoses and interventions. Such linkages are necessary to document nursing contributions to healthcare outcomes within the context of a multidisciplinary care environment and to facilitate building of nursing knowledge from clinical practice. The purposes of this paper are to provide an overview of the LOINC database, to describe examples of assessments of relevance to nursing contained in LOINC, and to illustrate linkages of LOINC assessments with other nursing concepts.
Effective declutter of complex flight displays using stereoptic 3-D cueing
NASA Technical Reports Server (NTRS)
Parrish, Russell V.; Williams, Steven P.; Nold, Dean E.
1994-01-01
The application of stereo technology to new, integrated pictorial display formats has been effective in situational awareness enhancements, and stereo has been postulated to be effective for the declutter of complex informational displays. This paper reports a full-factorial workstation experiment performed to verify the potential benefits of stereo cueing for the declutter function in a simulated tracking task. The experimental symbology was designed similar to that of a conventional flight director, although the format was an intentionally confused presentation that resulted in a very cluttered dynamic display. The subject's task was to use a hand controller to keep a tracking symbol, an 'X', on top of a target symbol, another X, which was being randomly driven. In the basic tracking task, both the target symbol and the tracking symbol were presented as red X's. The presence of color coding was used to provide some declutter, thus making the task more reasonable to perform. For this condition, the target symbol was coded red, and the tracking symbol was coded blue. Noise conditions, or additional clutter, were provided by the inclusion of randomly moving, differently colored X symbols. Stereo depth, which was hypothesized to declutter the display, was utilized by placing any noise in a plane in front of the display monitor, the tracking symbol at screen depth, and the target symbol behind the screen. The results from analyzing the performances of eight subjects revealed that the stereo presentation effectively offsets the cluttering effects of both the noise and the absence of color coding. The potential of stereo cueing to declutter complex informational displays has therefore been verified; this ability to declutter is an additional benefit from the application of stereoptic cueing to pictorial flight displays.
Industrial Facility Combustion Energy Use
McMillan, Colin
2016-08-01
Facility-level industrial combustion energy use is calculated from greenhouse gas emissions data reported by large emitters (>25,000 metric tons CO2e per year) under the U.S. EPA's Greenhouse Gas Reporting Program (GHGRP, https://www.epa.gov/ghgreporting). The calculation applies EPA default emissions factors to reported fuel use by fuel type. Additional facility information is included with calculated combustion energy values, such as industry type (six-digit NAICS code), location (lat, long, zip code, county, and state), combustion unit type, and combustion unit name. Further identification of combustion energy use is provided by calculating energy end use (e.g., conventional boiler use, co-generation/CHP use, process heating, other facility support) by manufacturing NAICS code. Manufacturing facilities are matched by their NAICS code and reported fuel type with the proportion of combustion fuel energy for each end use category identified in the 2010 Energy Information Administration Manufacturing Energy Consumption Survey (MECS, http://www.eia.gov/consumption/manufacturing/data/2010/). MECS data are adjusted to account for data that were withheld or whose end use was unspecified following the procedure described in Fox, Don B., Daniel Sutter, and Jefferson W. Tester. 2011. The Thermal Spectrum of Low-Temperature Energy Use in the United States, NY: Cornell Energy Institute.
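The central calculation is a back-calculation from reported emissions to fuel energy. A minimal sketch follows (our illustration; the emission-factor values shown are representative EPA defaults, and the real dataset applies the appropriate factor per reporting unit and fuel type):

```python
# Sketch: back out combustion energy from reported CO2 by fuel type.
# Factor values are representative defaults (kg CO2 per MMBtu), shown
# for illustration only.
EF_KG_CO2_PER_MMBTU = {
    "Natural Gas": 53.06,
    "Bituminous Coal": 93.28,
    "Distillate Fuel Oil No. 2": 73.96,
}

def combustion_energy_mmbtu(co2_metric_tons: float, fuel: str) -> float:
    """Fuel energy use implied by reported combustion CO2 emissions."""
    kg_co2 = co2_metric_tons * 1000.0
    return kg_co2 / EF_KG_CO2_PER_MMBTU[fuel]

# e.g., a facility reporting 30,000 t CO2 from natural gas combustion:
print(f"{combustion_energy_mmbtu(30_000, 'Natural Gas'):,.0f} MMBtu")
```

The end-use split is then a second step: the facility's energy total is apportioned across boiler, CHP, process-heating, and support categories using the MECS proportions for its NAICS code.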
A comparison of the effects of a secondary task and lorazepam on cognitive performance.
File, S E
1992-01-01
In order to test whether the lorazepam-induced impairments in a variety of cognitive tasks were similar to those of divided attention, the effects of lorazepam (2.5 mg) in healthy volunteers were compared with the effects of requiring subjects to perform an additional task (detecting silences superimposed onto classical music). Neither treatment impaired implicit memory or judgements of frequency. Both treatments impaired performance in tests of speed, lorazepam having the greatest effect on number cancellation and the additional task having the greatest effect on simple reaction time. Both treatments impaired performance in a coding task, in a test of explicit episodic memory and in judgements of recency (indicating impaired coding of contextual information). Lorazepam significantly reduced performance in a word completion task, but this was unimpaired in the group performing the additional task. In general, the pattern of results suggests that there are similarities between the effects of divided attention and lorazepam treatment, and that lorazepam-induced cognitive impairments are not restricted to explicit tests of episodic memory.
Woolgar, Alexandra; Williams, Mark A; Rich, Anina N
2015-04-01
Selective attention is fundamental for human activity, but the details of its neural implementation remain elusive. One influential theory, the adaptive coding hypothesis (Duncan, 2001, An adaptive coding model of neural function in prefrontal cortex, Nature Reviews Neuroscience 2:820-829), proposes that single neurons in certain frontal and parietal regions dynamically adjust their responses to selectively encode relevant information. This selective representation may in turn support selective processing in more specialized brain regions such as the visual cortices. Here, we use multi-voxel decoding of functional magnetic resonance images to demonstrate selective representation of attended--and not distractor--objects in frontal, parietal, and visual cortices. In addition, we highlight a critical role for task demands in determining which brain regions exhibit selective coding. Strikingly, representation of attended objects in frontoparietal cortex was highest under conditions of high perceptual demand, when stimuli were hard to perceive and coding in early visual cortex was weak. Coding in early visual cortex varied as a function of attention and perceptual demand, while coding in higher visual areas was sensitive to the allocation of attention but robust to changes in perceptual difficulty. Consistent with high-profile reports, peripherally presented objects could also be decoded from activity at the occipital pole, a region which corresponds to the fovea. Our results emphasize the flexibility of frontoparietal and visual systems. They support the hypothesis that attention enhances the multi-voxel representation of information in the brain, and suggest that the engagement of this attentional mechanism depends critically on current task demands. Copyright © 2015 Elsevier Inc. All rights reserved.
Energy efficient rateless codes for high speed data transfer over free space optical channels
NASA Astrophysics Data System (ADS)
Prakash, Geetha; Kulkarni, Muralidhar; Acharya, U. S.
2015-03-01
Terrestrial Free Space Optical (FSO) links transmit information by using the atmosphere (free space) as a medium. In this paper, we have investigated the use of Luby Transform (LT) codes as a means to mitigate the effects of data corruption induced by an imperfect channel, which usually takes the form of lost or corrupted packets. LT codes, which are a class of Fountain codes, can be used independent of the channel rate, and as many code words as required can be generated to recover all the message bits irrespective of the channel performance. Achieving error-free high data rates with limited energy resources is possible with FSO systems if error correction codes with minimal overheads on the power can be used. We also employ a combination of Binary Phase Shift Keying (BPSK) with provision for modification of the threshold and optimized LT codes with belief propagation for decoding. These techniques provide additional protection even under strong turbulence regimes. Automatic Repeat Request (ARQ) is another method of improving link reliability. Performance of ARQ is limited by the number of retransmissions and the corresponding time delay. We show through theoretical computations and simulations that LT codes consume less energy per bit. We validate the feasibility of using energy-efficient LT codes over ARQ for FSO links to be used in optical wireless sensor networks within the eye safety limits.
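A minimal LT encoder conveys the rateless idea. The sketch below uses the textbook ideal soliton degree distribution; the paper's optimized distributions, BPSK modulation, and belief-propagation decoding are not reproduced here:

```python
# Sketch of LT (rateless) encoding: each output symbol XORs a random
# subset of source blocks, and the stream can run for as long as the
# channel demands. Illustrative only.
import random

def ideal_soliton(k):
    """rho(1) = 1/k, rho(d) = 1/(d(d-1)) for d = 2..k."""
    return [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]

def lt_encode(blocks, rng=random.Random(0)):
    """Yield an endless stream of (index-set, XOR-of-blocks) symbols."""
    k = len(blocks)
    weights = ideal_soliton(k)
    degrees = list(range(1, k + 1))
    while True:
        d = rng.choices(degrees, weights=weights)[0]  # sample a degree
        idx = rng.sample(range(k), d)                 # pick d source blocks
        sym = 0
        for i in idx:
            sym ^= blocks[i]
        yield frozenset(idx), sym

message = [0b1011, 0b0110, 0b1110, 0b0001]            # 4 source blocks
stream = lt_encode(message)
for _ in range(3):
    print(next(stream))
```

Because any sufficiently large subset of output symbols suffices to decode, lost packets cost extra symbols rather than retransmission round trips, which is where the energy advantage over ARQ comes from.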
Impact of jammer side information on the performance of anti-jam systems
NASA Astrophysics Data System (ADS)
Lim, Samuel
1992-03-01
The Chernoff bound parameter, D, provides a performance measure for all coded communication systems. D can be used to determine upper-bounds on bit error probabilities (BEPs) of Viterbi decoded convolutional codes. The impact on BEP bounds of channel measurements that provide additional side information can also be evaluated with D. This memo documents the results of a Chernoff bound parameter evaluation in optimum partial-band noise jamming (OPBNJ) for both BPSK and DPSK modulation schemes. Hard and soft quantized receivers, with and without jammer side information (JSI), were examined. The results of this analysis indicate that JSI does improve decoding performance. However, a knowledge of jammer presence alone achieves a performance level comparable to soft decision decoding with perfect JSI. Furthermore, performance degradation due to the lack of JSI can be compensated for by increasing the number of levels of quantization. Therefore, an anti-jam system without JSI can be made to perform almost as well as a system with JSI.
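The bound in question is the standard transfer-function (union) bound for Viterbi decoding, sketched here in generic notation rather than the memo's:

\[ P_b \le \frac{1}{k} \sum_{d = d_{free}}^{\infty} B_d\, D^d, \]

where \( B_d \) is the total information-bit weight of error events at Hamming distance d, k the number of input bits per trellis branch, and D the Chernoff (Bhattacharyya) parameter of the channel. Jammer side information enters by changing the D achievable at a given jammer duty cycle, which is why a single parameter suffices to compare receivers with and without JSI.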
49 CFR 383.153 - Information on the CLP and CDL documents and applications.
Code of Federal Regulations, 2011 CFR
2011-10-01
... materials endorsements; (vi) S for school bus; and (vii) At the discretion of the State, additional codes... on the front or back of the CDL document. (10) The restriction(s) placed on the driver from operating... front or back of the CDL document. (b) Commercial Learner's Permit. (1) A CLP must not contain a...
49 CFR 383.153 - Information on the CLP and CDL documents and applications.
Code of Federal Regulations, 2012 CFR
2012-10-01
... materials endorsements; (vi) S for school bus; and (vii) At the discretion of the State, additional codes... on the front or back of the CDL document. (10) The restriction(s) placed on the driver from operating... front or back of the CDL document. (b) Commercial Learner's Permit. (1) A CLP must not contain a...
Defense Logistics Agency Disposition Services Afghanistan Disposal Process Needed Improvement
2013-11-08
audit, and management was proactive in correcting the deficiencies we identified. DLA DS eliminated backlogs, identified and corrected system ... problems, provided additional system training, corrected coding errors, added personnel to key positions, addressed scale issues, submitted debit ... Service Automated Information System to the Reutilization Business Integration (RBI) solution. The implementation of RBI in Afghanistan occurred in
Predicting couple therapy outcomes based on speech acoustic features
Nasir, Md; Baucom, Brian Robert; Narayanan, Shrikanth
2017-01-01
Automated assessment and prediction of marital outcome in couples therapy is a challenging task but promises to be a potentially useful tool for clinical psychologists. Computational approaches for inferring therapy outcomes using observable behavioral information obtained from conversations between spouses offer objective means for understanding relationship dynamics. In this work, we explore whether the acoustics of the spoken interactions of clinically distressed spouses provide information towards assessment of therapy outcomes. The therapy outcome prediction task in this work includes detecting whether there was a relationship improvement or not (posed as a binary classification) as well as discerning varying levels of improvement or decline in the relationship status (posed as a multiclass recognition task). We use each interlocutor’s acoustic speech signal characteristics such as vocal intonation and intensity, both independently and in relation to one another, as cues for predicting the therapy outcome. We also compare prediction performance with one obtained via standardized behavioral codes characterizing the relationship dynamics provided by human experts as features for automated classification. Our experiments, using data from a longitudinal clinical study of couples in distressed relations, showed that predictions of relationship outcomes obtained directly from vocal acoustics are comparable or superior to those obtained using human-rated behavioral codes as prediction features. In addition, combining direct signal-derived features with manually coded behavioral features improved the prediction performance in most cases, indicating the complementarity of relevant information captured by humans and machine algorithms. Additionally, considering the vocal properties of the interlocutors in relation to one another, rather than in isolation, showed to be important for improving the automatic prediction. This finding supports the notion that behavioral outcome, like many other behavioral aspects, is closely related to the dynamics and mutual influence of the interlocutors during their interaction and their resulting behavioral patterns. PMID:28934302
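As an informal sketch of such a pipeline, assuming Python with librosa and scikit-learn (the feature choices, names, and thresholds are ours, not the study's):

```python
# Sketch: summarize a spouse's vocal intonation (pitch) and intensity,
# then classify improvement vs. no improvement. Illustrative only; the
# study's actual feature set and models are richer.
import numpy as np
import librosa
from sklearn.svm import SVC

def vocal_features(wav_path):
    y, sr = librosa.load(wav_path, sr=16000)
    f0, voiced, _ = librosa.pyin(y, fmin=60, fmax=400, sr=sr)  # intonation track
    rms = librosa.feature.rms(y=y)[0]                          # intensity track
    f0 = f0[~np.isnan(f0)]                                     # keep voiced frames
    return np.array([f0.mean(), f0.std(), rms.mean(), rms.std()])

# X: one feature row per session (both spouses' features concatenated,
# plus cross-speaker differences); y: 1 = improved, 0 = not, from the study.
# clf = SVC(kernel="rbf").fit(X_train, y_train)
```

The cross-speaker difference features stand in for the "in relation to one another" cues the abstract highlights as important.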
NASA Astrophysics Data System (ADS)
Markman, A.; Javidi, B.
2016-06-01
Quick-response (QR) codes are barcodes that can store information such as numeric data and hyperlinks. The QR code can be scanned using a QR code reader, such as those built into smartphone devices, revealing the information stored in the code. Moreover, the QR code is robust to noise, rotation, and illumination when scanning due to the error correction built into the QR code design. Integral imaging is an imaging technique used to generate a three-dimensional (3D) scene by combining the information from two-dimensional (2D) elemental images (EIs), each with a different perspective of a scene. Transferring these 2D images in a secure manner can be difficult. In this work, we review two methods to store and encrypt EIs in multiple QR codes. The first method uses run-length encoding with Huffman coding and the double-random-phase encryption (DRPE) to compress and encrypt an EI. This information is then stored in a QR code. An alternative compression scheme is to perform photon-counting on the EI prior to compression. Photon-counting is a non-linear transformation of data that creates redundant information, thus improving image compression. The compressed data is encrypted using the DRPE. Once information is stored in the QR codes, it is scanned using a smartphone device. The scanned information is decompressed and decrypted and an EI is recovered. Once all EIs have been recovered, a 3D optical reconstruction is generated.
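The DRPE step is compact enough to sketch with NumPy (our illustration; the compression, photon-counting, and QR-encoding stages are omitted, and the two random phase masks act as the keys):

```python
# Sketch of double-random-phase encryption (DRPE) of an elemental image.
# The two phase masks phi1, phi2 are the secret keys.
import numpy as np

rng = np.random.default_rng(7)
img = rng.random((64, 64))                          # stand-in elemental image

phi1 = np.exp(2j * np.pi * rng.random(img.shape))   # input-plane mask
phi2 = np.exp(2j * np.pi * rng.random(img.shape))   # Fourier-plane mask

def drpe_encrypt(f):
    return np.fft.ifft2(np.fft.fft2(f * phi1) * phi2)

def drpe_decrypt(c):
    return np.fft.ifft2(np.fft.fft2(c) * np.conj(phi2)) * np.conj(phi1)

cipher = drpe_encrypt(img)                          # complex, noise-like field
recovered = drpe_decrypt(cipher).real
print(np.allclose(recovered, img))                  # True
```

The ciphertext is a complex-valued, noise-like field; it is this data (after compression) that gets packed into the QR codes for transfer.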
LACEwING: A New Moving Group Analysis Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riedel, Adric R.; Blunt, Sarah C.; Faherty, Jacqueline K.
We present a new nearby young moving group (NYMG) kinematic membership analysis code, LocAting Constituent mEmbers In Nearby Groups (LACEwING), a new Catalog of Suspected Nearby Young Stars, a new list of bona fide members of moving groups, and a kinematic traceback code. LACEwING is a convergence-style algorithm with carefully vetted membership statistics based on a large numerical simulation of the Solar Neighborhood. Given spatial and kinematic information on stars, LACEwING calculates membership probabilities in 13 NYMGs and three open clusters within 100 pc. In addition to describing the inputs, methods, and products of the code, we provide comparisons of LACEwING to other popular kinematic moving group membership identification codes. As a proof of concept, we use LACEwING to reconsider the membership of 930 stellar systems in the Solar Neighborhood (within 100 pc) that have reported measurable lithium equivalent widths. We quantify the evidence in support of a population of young stars not attached to any NYMGs, which is a possible sign of new as-yet-undiscovered groups or of a field population of young stars.
Henneberg, M.F.; Strause, J.L.
2002-01-01
This report presents the instructions required to use the Scour Critical Bridge Indicator (SCBI) Code and Scour Assessment Rating (SAR) calculator developed by the Pennsylvania Department of Transportation (PennDOT) and the U.S. Geological Survey to identify Pennsylvania bridges with excessive scour conditions or a high potential for scour. Use of the calculator will enable PennDOT bridge personnel to quickly calculate these scour indices if site conditions change, new bridges are constructed, or new information needs to be included. Both indices are calculated for a bridge simultaneously because they must be used together to be interpreted accurately. The SCBI Code and SAR calculator program is run by a World Wide Web browser from a remote computer. The user can 1) add additional scenarios for bridges in the SCBI Code and SAR calculator database or 2) enter data for new bridges and run the program to calculate the SCBI Code and calculate the SAR. The calculator program allows the user to print the results and to save multiple scenarios for a bridge.
Public sentiment and discourse about Zika virus on Instagram.
Seltzer, E K; Horst-Martz, E; Lu, M; Merchant, R M
2017-09-01
Social media have strongly influenced the awareness and perceptions of public health emergencies, and a considerable amount of social media content is now shared through images, rather than text alone. This content can impact preparedness and response due to the popularity and real-time nature of social media platforms. We sought to explore how the image-sharing platform Instagram is used for information dissemination and conversation during the current Zika outbreak. This was a retrospective review of publicly posted images about Zika on Instagram. Using the keyword '#zika' we identified 500 images posted on Instagram from May to August 2016. Images were coded by three reviewers and contextual information was collected for each image about sentiment, image type, content, audience, geography, reliability, and engagement. Of 500 images tagged with #zika, 342 (68%) contained content actually related to Zika. Of the 342 Zika-specific images, 299 were coded as 'health' and 193 were coded 'public interest'. Some images had multiple 'health' and 'public interest' codes. Health images tagged with #zika were primarily related to transmission (43%, 129/299) and prevention (48%, 145/299). Transmission-related posts were more often about mosquito-human transmission (73%, 94/129) than human-human transmission (27%, 35/129). Mosquito bite prevention posts outnumbered safe sex prevention posts: (84%, 122/145) and (16%, 23/145), respectively. Images with a target audience were primarily aimed at women (95%, 36/38). Many posts (60%, 61/101) included misleading, incomplete, or unclear information about the virus. Additionally, many images expressed fear and negative sentiment (51%, 79/156). Instagram can be used to characterize public sentiment and highlight areas of focus for public health, such as correcting misleading or incomplete information or expanding messages to reach diverse audiences. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
2016-01-01
Background Inclusion of information about a patient’s work, industry, and occupation, in the electronic health record (EHR) could facilitate occupational health surveillance, better health outcomes, prevention activities, and identification of workers’ compensation cases. The US National Institute for Occupational Safety and Health (NIOSH) has developed an autocoding system for “industry” and “occupation” based on 1990 Bureau of Census codes; its effectiveness requires evaluation in conjunction with promoting the mandatory addition of these variables to the EHR. Objective The objective of the study was to evaluate the intercoder reliability of NIOSH’s Industry and Occupation Computerized Coding System (NIOCCS) when applied to data collected in a community survey conducted under the Affordable Care Act; to determine the proportion of records that are autocoded using NIOCCS. Methods Standard Occupational Classification (SOC) codes are used by several federal agencies in databases that capture demographic, employment, and health information to harmonize variables related to work activities among these data sources. There are 359 industry and occupation responses that were hand coded by 2 investigators, who came to a consensus on every code. The same variables were autocoded using NIOCCS at the high and moderate criteria level. Results Kappa was .84 for agreement between hand coders and between the hand coder consensus code versus NIOCCS high confidence level codes for the first 2 digits of the SOC code. For 4 digits, NIOCCS coding versus investigator coding ranged from kappa=.56 to .70. In this study, NIOCCS was able to achieve production rates (ie, to autocode) 31%-36% of entered variables at the “high confidence” level and 49%-58% at the “medium confidence” level. Autocoding (production) rates are somewhat lower than those reported by NIOSH. Agreement between manually coded and autocoded data are “substantial” at the 2-digit level, but only “fair” to “good” at the 4-digit level. Conclusions This work serves as a baseline for performance of NIOCCS by investigators in the field. Further field testing will clarify NIOCCS effectiveness in terms of ability to assign codes and coding accuracy and will clarify its value as inclusion of these occupational variables in the EHR is promoted. PMID:26878932
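The agreement statistic used throughout is Cohen's kappa, which corrects raw agreement for chance:

\[ \kappa = \frac{p_o - p_e}{1 - p_e}, \]

where \( p_o \) is the observed proportion of agreement between coders and \( p_e \) the agreement expected by chance from the marginal code frequencies. The paper's reading of .84 as "substantial" and .56 to .70 as "fair" to "good" follows the conventional benchmark ranges for kappa.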
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gharibyan, N.
In order to fully characterize the NIF neutron spectrum, SAND-II-SNL software was requested/received from the Radiation Safety Information Computational Center. The software is designed to determine the neutron energy spectrum through analysis of experimental activation data. However, given that the source code was developed on a Sparcstation 10, it is not compatible with current versions of FORTRAN. Accounts have been established through the Lawrence Livermore National Laboratory's High Performance Computing in order to access different compilers for FORTRAN (e.g., pgf77, pgf90). Additionally, several of the subroutines included in the SAND-II-SNL package have required debugging efforts to allow for proper compiling of the code.
Meta-Modeling: A Knowledge-Based Approach to Facilitating Model Construction and Reuse
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Dungan, Jennifer L.
1997-01-01
In this paper, we introduce a new modeling approach called meta-modeling and illustrate its practical applicability to the construction of physically-based ecosystem process models. As a critical adjunct to modeling codes, meta-modeling requires explicit specification of certain background information related to the construction and conceptual underpinnings of a model. This information formalizes the heretofore tacit relationship between the mathematical modeling code and the underlying real-world phenomena being investigated, and gives insight into the process by which the model was constructed. We show how the explicit availability of such information can make models more understandable and reusable and less subject to misinterpretation. In particular, background information enables potential users to better interpret an implemented ecosystem model without direct assistance from the model author. Additionally, we show how the discipline involved in specifying background information leads to improved management of model complexity and fewer implementation errors. We illustrate the meta-modeling approach in the context of the Scientists' Intelligent Graphical Modeling Assistant (SIGMA), a new model construction environment. As the user constructs a model using SIGMA, the system adds appropriate background information that ties the executable model to the underlying physical phenomena under investigation. Not only does this information improve the understandability of the final model, it also serves to reduce the overall time and programming expertise necessary to initially build and subsequently modify models. Furthermore, SIGMA's use of background knowledge helps eliminate coding errors resulting from scientific and dimensional inconsistencies that are otherwise difficult to avoid when building complex models. As a demonstration of SIGMA's utility, the system was used to reimplement and extend a well-known forest ecosystem dynamics model: Forest-BGC.
Distributed polar-coded OFDM based on Plotkin's construction for half duplex wireless communication
NASA Astrophysics Data System (ADS)
Umar, Rahim; Yang, Fengfan; Mughal, Shoaib; Xu, HongJun
2018-07-01
A Plotkin-based polar-coded orthogonal frequency division multiplexing (P-PC-OFDM) scheme is proposed and its bit error rate (BER) performance over additive white Gaussian noise (AWGN), frequency-selective Rayleigh, Rician and Nakagami-m fading channels has been evaluated. The considered Plotkin construction possesses a parallel split in its structure, which motivated us to extend the proposed P-PC-OFDM scheme to a coded cooperative scenario. As the relay's effective collaboration has always been pivotal in the design of cooperative communication, an efficient selection criterion for choosing the information bits has been incorporated at the relay node. To assess the BER performance of the proposed cooperative scheme, we have also upgraded the conventional polar-coded cooperative scheme in the context of OFDM as an appropriate benchmark. The Monte Carlo simulation results revealed that the proposed Plotkin-based polar-coded cooperative OFDM scheme convincingly outperforms the conventional polar-coded cooperative OFDM scheme by 0.5–0.6 dB over the AWGN channel. This prominent gain in BER performance is made possible by the bit-selection criteria and the joint successive cancellation decoding adopted at the relay and the destination nodes, respectively. Furthermore, the proposed coded cooperative schemes outperform their corresponding non-cooperative schemes by a gain of 1 dB under identical conditions.
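The structural point is compact enough to state. Plotkin's construction combines two length-N/2 codewords u and v as

\[ (u, v) \mapsto (u \mid u \oplus v), \]

which is exactly one level of the polar transform, whose generator is the n-fold Kronecker power of the kernel

\[ F = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}, \qquad G_N = F^{\otimes n} \]

(up to the usual bit-reversal permutation). The parallel u / u+v split is what the abstract calls the parallel structure, and it is what allows the encoding work to be divided between the source and relay nodes.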
A motion compensation technique using sliced blocks and its application to hybrid video coding
NASA Astrophysics Data System (ADS)
Kondo, Satoshi; Sasai, Hisao
2005-07-01
This paper proposes a new motion compensation method using "sliced blocks" in DCT-based hybrid video coding. In H.264/MPEG-4 Advanced Video Coding, a brand-new international video coding standard, motion compensation can be performed by splitting macroblocks into multiple square or rectangular regions. In the proposed method, on the other hand, macroblocks or sub-macroblocks are divided into two regions (sliced blocks) by an arbitrary line segment. The result is that the shapes of the segmented regions are not limited to squares or rectangles, allowing the shapes of the segmented regions to better match the boundaries between moving objects. Thus, the proposed method can improve the performance of the motion compensation. In addition, adaptive prediction of the shape according to the region shape of the surrounding macroblocks can reduce the overhead of describing shape information in the bitstream. The proposed method also has the advantage that conventional coding techniques such as mode decision using rate-distortion optimization can be utilized, since coding processes such as frequency transform and quantization are performed on a macroblock basis, similar to the conventional coding methods. The proposed method is implemented in an H.264-based P-picture codec and an improvement in bit rate of 5% is confirmed in comparison with H.264.
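The geometric core of the method, a block split in two by an arbitrary line, is easy to sketch. The mask construction below is our illustration, not the paper's implementation; each region would then be motion-compensated with its own motion vector:

```python
# Sketch: partition a square block into two "sliced block" regions by the
# line through points p0 and p1 (sign of the 2D cross product).
import numpy as np

def sliced_block_masks(size, p0, p1):
    ys, xs = np.mgrid[0:size, 0:size]
    (x0, y0), (x1, y1) = p0, p1
    side = (x1 - x0) * (ys - y0) - (y1 - y0) * (xs - x0)
    region_a = side >= 0
    return region_a, ~region_a

a, b = sliced_block_masks(16, (0, 4), (15, 11))  # diagonal split of a 16x16 MB
# motion-compensate each region from its own reference (pseudocode):
#   pred[a] = ref_shifted_by_mv_a[a]; pred[b] = ref_shifted_by_mv_b[b]
print(a.sum(), b.sum())                          # pixels per region
```

Because the transform and quantization still run on the whole macroblock, only the line parameters (or a prediction residual for them) need to be signalled, which keeps the shape overhead small.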
Biro, Suzanne; Williamson, Tyler; Leggett, Jannet Ann; Barber, David; Morkem, Rachael; Moore, Kieran; Belanger, Paul; Mosley, Brian; Janssen, Ian
2016-03-11
Electronic medical records (EMRs) used in primary care contain a breadth of data that can be used in public health research. Patient data from EMRs could be linked with other data sources, such as a postal code linkage with Census data, to obtain additional information on environmental determinants of health. While promising, successful linkage of primary care EMRs with geographic measures has been limited due to ethics review board concerns. This study tested the feasibility of extracting full postal code from primary care EMRs and linking this with area-level measures of the environment to demonstrate how such a linkage could be used to examine the determinants of disease. The association between obesity and area-level deprivation was used as an example to illustrate inequalities of obesity in adults. The analysis included EMRs of 7153 patients aged 20 years and older who visited a single, primary care site in 2011. Extracted patient information included demographics (date of birth, sex, postal code) and weight status (height, weight). Information extraction and management procedures were designed to mitigate the risk of individual re-identification when extracting full postal code from source EMRs. Based on patients' postal codes, area-based deprivation indexes were created using the smallest area unit used in Canadian censuses. Descriptive statistics and socioeconomic disparity summary measures of linked census and adult patients were calculated. The data extraction of full postal code met technological requirements for rendering health information extracted from local EMRs into anonymized data. The prevalence of obesity was 31.6%. There was variation in obesity between deprivation quintiles; adults in the most deprived areas were 35% more likely to be obese compared with adults in the least deprived areas (Chi-Square = 20.24(1), p < 0.0001). Maps depicting spatial representation of regional deprivation and obesity were created to highlight high-risk areas. An area-based socio-economic measure was linked with EMR-derived objective measures of height and weight to show a positive association between area-level deprivation and obesity. The linked dataset demonstrates a promising model for assessing health disparities and ecological factors associated with the development of chronic diseases, with far-reaching implications for informing public health and primary health care interventions and services.
[Medical records, DRG and intensive care patients].
Aardal, Sidsel; Berge, Kjersti; Breivik, Kjell; Flaatten, Hans K
2005-04-07
In order to control the quality of the medical report after a hospital stay with regard to the stay in the intensive care unit (ICU), and to check for correct DRG grouping, this study of 428 patients treated in our ICU in 2003 was conducted. All ICU patients from 2003 were found in our database, which includes specific ICD-10 diagnoses and specific ICU procedures. The medical record summarising the hospital stay (epicrisis) was retrieved for each patient from the hospital's electronic patient files and checked for correct information regarding the ICU stay. DRG groups for each patient were retrieved from the hospital's administrative database. All stays were re-coded, with all information about the ICU stay included. The new DRG codes were compared with the old ones, and the difference in DRG points computed. The description of the stay in the ICU was missing or very insufficient in 46% of the records. In the DRG control we found that an additional 347.37 DRG points (18.4% of the original sum of all DRG points) were missing, corresponding to a loss to the hospital of 6.2 million NOK. In addition we discovered missing codes for tracheostomy corresponding to 2.8 million NOK, giving a total loss of 9 million NOK. This study confirms that an adequate description of the stay in the ICU is missing in a large number of medical records. This also leads to incorrect DRG grouping of many patients and significant financial losses to the hospital.
Team interaction during surgery: a systematic review of communication coding schemes.
Tiferes, Judith; Bisantz, Ann M; Guru, Khurshid A
2015-05-15
Communication problems have been systematically linked to human errors in surgery and a deep understanding of the underlying processes is essential. Although a number of tools exist to assess nontechnical skills, methods to study communication and other team-related processes are far from being standardized, making comparisons challenging. We conducted a systematic review to analyze methods used to study events in the operating room (OR) and to develop a synthesized coding scheme for OR team communication. Six electronic databases were accessed to search for articles that collected individual events during surgery and included detailed coding schemes. Additional articles were added based on cross-referencing. That collection was then classified based on type of events collected, environment type (real or simulated), number of procedures, type of surgical task, team characteristics, method of data collection, and coding scheme characteristics. All dimensions within each coding scheme were grouped based on emergent content similarity. Categories drawn from articles, which focused on communication events, were further analyzed and synthesized into one common coding scheme. A total of 34 of 949 articles met the inclusion criteria. The methodological characteristics and coding dimensions of the articles were summarized. A priori coding was used in nine studies. The synthesized coding scheme for OR communication included six dimensions as follows: information flow, period, statement type, topic, communication breakdown, and effects of communication breakdown. The coding scheme provides a standardized coding method for OR communication, which can be used to develop a priori codes for future studies especially in comparative effectiveness research. Copyright © 2015 Elsevier Inc. All rights reserved.
Todo, A S; Hiromoto, G; Turner, J E; Hamm, R N; Wright, H A
1982-12-01
Previous calculations of the initial energies of electrons produced in water irradiated by photons are extended to 1 GeV by including pair and triplet production. Calculations were performed with the Monte Carlo computer code PHOEL-3, which replaces the earlier code, PHOEL-2. Tables of initial electron energies are presented for single interactions of monoenergetic photons at a number of energies from 10 keV to 1 GeV. These tables can be used to compute kerma in water irradiated by photons with arbitrary energy spectra to 1 GeV. In addition, separate tables of Compton- and pair-electron spectra are given over this energy range. The code PHOEL-3 is available from the Radiation Shielding Information Center, Oak Ridge National Laboratory, Oak Ridge, TN 37830.
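The kerma computation these tables support is the usual fluence-weighted integral (stated in generic radiological-physics notation, not the paper's):

\[ K = \int \Phi_E(E)\, E\, \frac{\mu_{tr}(E)}{\rho}\, dE, \]

where \( \Phi_E \) is the photon fluence differential in energy and \( \mu_{tr}/\rho \) the mass energy-transfer coefficient; the tabulated initial-electron spectra supply exactly the energy-transferred-per-interaction information that this coefficient summarizes.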
NASA Astrophysics Data System (ADS)
Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.
2010-11-01
As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat-hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability that is complementary to both the Plasma State and CPOs. By making use of attributes within the netcdf and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be simultaneously interoperable to several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used for multiple codes.
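A minimal sketch of the attribute-based idea, assuming Python with h5py (the attribute names and standard tags below are illustrative, not an adopted convention): the same file can carry tags for several standards at once.

```python
# Sketch: self-describing HDF5 output via attributes, tagged as
# conforming to more than one interoperability standard simultaneously.
import h5py
import numpy as np

with h5py.File("equilibrium.h5", "w") as f:
    dset = f.create_dataset("Te", data=np.linspace(2.0, 0.1, 64))
    # metadata rides along with the data as attributes:
    dset.attrs["units"] = "keV"
    dset.attrs["coordinates"] = "rho_tor_norm"
    # hypothetical tags declaring which conventions this file satisfies:
    f.attrs["conforms_to"] = np.array(["PlasmaState-1.0", "CPO-4.08b"], dtype="S")
```

A downstream tool (a synthetic diagnostic, say) then keys off the attributes rather than off a fixed file layout, which is what makes the approach informal but multi-standard.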
(I Can't Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research.
van Rijnsoever, Frank J
2017-01-01
I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: "random chance," which is based on probability sampling, "minimal information," which yields at least one new code per sampling step, and "maximum information," which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimum information scenario.
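A stripped-down toy version of the random-chance scenario (our sketch, not the paper's simulation code) shows the core loop: keep drawing information sources until every code in the population has been observed at least once.

```python
# Sketch: minimum sample size to reach theoretical saturation under the
# "random chance" scenario. Population parameters are illustrative.
import numpy as np

def sample_size_to_saturation(p, rng):
    """Draw sources until all codes are observed; return the sample size."""
    k = len(p)
    seen = np.zeros(k, dtype=bool)
    n = 0
    while not seen.all():
        seen |= rng.random(k) < p      # codes this source happens to reveal
        n += 1
    return n

rng = np.random.default_rng(1)
p = np.full(20, 0.10)                  # 20 codes, mean observation prob 0.10
runs = [sample_size_to_saturation(p, rng) for _ in range(500)]
print(np.mean(runs))                   # average required sample size
```

Raising the mean of p shrinks the required sample far faster than shrinking the number of codes does, which qualitatively mirrors the paper's finding that saturation depends mainly on the mean probability of observing codes.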
NASA Technical Reports Server (NTRS)
Rajpal, Sandeep; Rhee, DoJun; Lin, Shu
1997-01-01
In this paper, we use a previously proposed construction technique to construct multidimensional trellis coded modulation (TCM) codes for both the additive white Gaussian noise (AWGN) and the fading channels. Analytical performance bounds and simulation results show that these codes perform very well and achieve significant coding gains over uncoded reference modulation systems. In addition, the proposed technique can be used to construct codes which have a performance/decoding complexity advantage over the codes listed in the literature.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Charley; Kamboj, Sunita; Wang, Cheng
2015-09-01
This handbook is an update of the 1993 version of the Data Collection Handbook and the Radionuclide Transfer Factors Report to support modeling the impact of radioactive material in soil. Many new parameters have been added to the RESRAD Family of Codes, and new measurement methodologies are available. A detailed review of available parameter databases was conducted in preparation of this new handbook. This handbook is a companion document to the user manuals when using the RESRAD (onsite) and RESRAD-OFFSITE codes. It can also be used for the RESRAD-BUILD code because some of the building-related parameters are included in this handbook. The RESRAD (onsite) code has been developed for implementing U.S. Department of Energy Residual Radioactive Material Guidelines. Hydrogeological, meteorological, geochemical, geometrical (size, area, depth), crops and livestock, human intake, source characteristic, and building characteristic parameters are used in the RESRAD (onsite) code. The RESRAD-OFFSITE code is an extension of the RESRAD (onsite) code and can also model the transport of radionuclides to locations outside the footprint of the primary contamination. This handbook discusses parameter definitions, typical ranges, variations, and measurement methodologies. It also provides references for sources of additional information. Although this handbook was developed primarily to support the application of the RESRAD Family of Codes, the discussions and values are valid for use with other pathway analysis models and codes.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-26
... Engineering Command, Southwest; Attn: Code EV21.AK; 1220 Pacific Highway, Building 1, 5th Floor; San Diego, CA... available for public review at the following Web site: http://www.cnic.navy.mil/cnrsw . In addition, paper... language informational materials will be made available on the Web site: http://www.cnic.navy.mil/cnrsw and...
Exploring Natural Pedagogy in Play with Preschoolers: Cues Parents Use and Relations among Them
ERIC Educational Resources Information Center
Sage, Kara; Baldwin, Dare
2012-01-01
Recent developmental work demonstrates a range of effects of pedagogical cues on childhood learning. The present work investigates natural pedagogy in informal parent-child play. Preschool-aged children participated in free play and a toy task with a parent in addition to a toy task with an experimenter. Sessions were extensively coded for use of…
Code of Federal Regulations, 2011 CFR
2011-07-01
... building code that have been incorporated to limit destruction of records. The report should make specific... Association, and any testing or modeling or other sources used in the design. (b) NARA action. (1) NARA will... determination. Before any consultation, NARA may ask the agency for additional clarifying information. NARA will...
Code of Federal Regulations, 2013 CFR
2013-07-01
... building code that have been incorporated to limit destruction of records. The report should make specific... Association, and any testing or modeling or other sources used in the design. (b) NARA action. (1) NARA will... determination. Before any consultation, NARA may ask the agency for additional clarifying information. NARA will...
Code of Federal Regulations, 2014 CFR
2014-07-01
... building code that have been incorporated to limit destruction of records. The report should make specific... Association, and any testing or modeling or other sources used in the design. (b) NARA action. (1) NARA will... determination. Before any consultation, NARA may ask the agency for additional clarifying information. NARA will...
Code of Federal Regulations, 2012 CFR
2012-07-01
... building code that have been incorporated to limit destruction of records. The report should make specific... Association, and any testing or modeling or other sources used in the design. (b) NARA action. (1) NARA will... determination. Before any consultation, NARA may ask the agency for additional clarifying information. NARA will...
Geographic Information Systems using CODES linked data (Crash outcome data evaluation system)
DOT National Transportation Integrated Search
2001-04-01
This report presents information about geographic information systems (GIS) and CODES linked data. Section one provides an overview of a GIS and the benefits of linking to CODES. Section two outlines the basic issues relative to the types of map data...
Burger, F; Walgenbach, M; Göbel, P; Parbs, S; Neugebauer, E
2017-04-01
Background: We investigated and evaluated the cost effectiveness of coding by health care economists in a centre for orthopaedics and trauma surgery in Germany, by quantifying and comparing the financial efficiency of physicians with basic knowledge of the DRG system with the results of healthcare economists with in-depth knowledge (M.Sc.). In addition, a hospital survey was performed to establish how DRG coding is being performed and the identity of the persons involved. Material and Methods: In a prospective and controlled study, 200 in-patients were coded by a healthcare economist (study group). Prior to that, the same cases were coded by physicians with basic training in the DRG system, who made up the control group. All cases were picked randomly and blinded, without informing the physicians coding the controls, in order to avoid any Hawthorne effect. We evaluated and measured the effective weighting within the G-DRG, the DRG returns per patient, the overall DRG return, and the additional time needed. For the survey, questionnaires were sent to 1200 German hospitals. The completed questionnaires were analysed using a statistical program. Results: The difference in returns per patient between the controls and the study group was significant (2472 ± 337 €; p < 0.05); the overall return was raised by 494,500 €. The mean additional time needed was 11.32 ± 0.8 min per case, resulting in an increase in proceeds of 218 ± 38 € per minute. 2.5% of all cases had to be devalued by the health economist after the initial coding by the control group. Returned questionnaires from 60 hospitals were evaluated. The median number of DRG case reports was 1277 (2500-62,300). Coding was performed in 69% of cases by doctors, in 19% by skilled specialists for DRG coding, and in 8% jointly. Overall satisfaction with the DRG was described by 61% of respondents as good or excellent. Conclusion: Our prospective and controlled study quantifies the cost efficiency of health economists in a centre of orthopaedics and trauma surgery in Germany for the first time. We provide some initial evidence that health economists can enhance the CMI, the resulting DRG return per patient, as well as the overall DRG return. Data from the survey show that in many hospitals there is great reluctance to leave the coding to specialists only. Georg Thieme Verlag KG Stuttgart · New York.
Mechanism on brain information processing: Energy coding
NASA Astrophysics Data System (ADS)
Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa
2006-09-01
According to experimental results showing that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, the authors present a new scientific theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Because the energy coding model reveals mechanisms of brain information processing based upon known biophysical properties, it can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, the authors estimate that the theory has very important consequences for quantitative research on cognitive function.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takeoka, Masahiro; Fujiwara, Mikio; Mizuno, Jun
2004-05-01
Quantum-information theory predicts that when the transmission resource is doubled in quantum channels, the amount of information transmitted can be increased more than twice by quantum-channel coding techniques, whereas the increase is at most twice in classical information theory. This remarkable feature, the superadditive quantum-coding gain, can be implemented by appropriate choices of code words and corresponding quantum decoding, which requires a collective quantum measurement. Recently, an experimental demonstration was reported [M. Fujiwara et al., Phys. Rev. Lett. 90, 167906 (2003)]. The purpose of this paper is to describe our experiment in detail. In particular, a design strategy for quantum-collective decoding in physical quantum circuits is emphasized. We also address the practical implication of the gain on communication performance by introducing the quantum-classical hybrid coding scheme. We show how the superadditive quantum-coding gain, even in a small code length, can boost the communication performance of conventional coding techniques.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-10
... limited to: Crop production (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This listing is not intended to be... Environmental protection, Agricultural commodities, Feed additives, Food additives, Pesticides and pests...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-30
... limited to: Crop production (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This listing is not intended to be... commodities, Feed additives, Food additives, Pesticides and pests, Reporting and recordkeeping requirements...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-07
...: Crop production (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This listing is not intended to be exhaustive, but..., Agricultural commodities, Feed additives, Food additives, Pesticides and pests, Reporting and recordkeeping...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-08
...: Crop production (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This listing is not intended to be exhaustive, but... CFR Part 174 Environmental protection, Agricultural commodities, Feed additives, Food additives...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
... limited to: Crop production (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This listing is not intended to be... commodities, Feed additives, Food additives, Pesticides and pests, Reporting and recordkeeping requirements...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peck, T; Sparkman, D; Storch, N
''The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices VI.I'' document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate time lines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations. It will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information to the LLNL ASCI software management and development staff from the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. Additionally, the Overview provides steps for using the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. For definitions of terminology and acronyms, refer to the Glossary and Acronyms sections of the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I''.
Interactive QR code beautification with full background image embedding
NASA Astrophysics Data System (ADS)
Lin, Lijian; Wu, Song; Liu, Sijiang; Jiang, Bo
2017-06-01
QR (Quick Response) code is a kind of two-dimensional barcode that was first developed in the automotive industry. Nowadays, QR codes are widely used in commercial applications like product promotion, mobile payment, product information management, etc. Traditional QR codes that follow the international standard are reliable and fast to decode, but lack the aesthetic appeal needed to convey visual information to customers. In this work, we present a novel interactive method to generate aesthetic QR codes. Given the information to be encoded and an image to be used as the full QR code background, our method accepts a user's interactive strokes as hints to remove undesired parts of QR code modules, relying on the QR code error correction mechanism and background color thresholds. Compared to previous approaches, our method follows the intention of the QR code designer and can therefore achieve results that are more pleasing to users, while keeping high machine readability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony, Stephen
The Sandia hyperspectral upper-bound spectrum algorithm (hyper-UBS) is a cosmic ray despiking algorithm for hyperspectral data sets. When naturally occurring, high-energy (gigaelectronvolt) cosmic rays impact the earth's atmosphere, they create an avalanche of secondary particles which register as a large, positive spike on any spectroscopic detector they hit. Cosmic ray spikes are therefore an unavoidable spectroscopic contaminant which can interfere with subsequent analysis. A variety of cosmic ray despiking algorithms already exist and can potentially be applied to hyperspectral data matrices, most notably the upper-bound spectrum data matrices (UBS-DM) algorithm by Dongmao Zhang and Dor Ben-Amotz, which served as the basis for the hyper-UBS algorithm. However, the existing algorithms either cannot be applied to hyperspectral data, require information that is not always available, introduce undesired spectral bias, or have otherwise limited effectiveness for some experimentally relevant conditions. Hyper-UBS is more effective at removing a wider variety of cosmic ray spikes from hyperspectral data without introducing undesired spectral bias. In addition to the core algorithm, the Sandia hyper-UBS software package includes additional source code useful in evaluating the effectiveness of the hyper-UBS algorithm. The accompanying source code includes code to generate simulated hyperspectral data contaminated by cosmic ray spikes, several existing despiking algorithms, and code to evaluate the performance of the despiking algorithms on simulated data.
Long non-coding RNA expression profile in cervical cancer tissues
Zhu, Hua; Chen, Xiangjian; Hu, Yan; Shi, Zhengzheng; Zhou, Qing; Zheng, Jingjie; Wang, Yifeng
2017-01-01
Cervical cancer (CC), one of the most common types of cancer in the female population, presents an enormous challenge in diagnosis and treatment. Long non-coding (lnc)RNAs, non-coding (nc)RNAs with length >200 nucleotides, have been identified to be associated with multiple types of cancer, including CC. This class of nc transcripts serves an important role in tumor suppression and oncogenic signaling pathways. In the present study, the microarray method was used to obtain the expression profiles of lncRNAs and protein-coding mRNAs and to compare the expression of lncRNAs between CC tissues and corresponding adjacent non-cancerous tissues in order to screen potential lncRNAs for associations with CC. Overall, 3,356 lncRNAs with significantly different expression patterns in CC tissues compared with adjacent non-cancerous tissues were identified, of which 1,857 were upregulated. These differentially expressed lncRNAs were additionally classified into 5 subgroups. Reverse transcription quantitative polymerase chain reactions were performed to validate the expression patterns of 5 randomly selected lncRNAs, and 2 lncRNAs were identified to have significantly different expression in CC samples compared with adjacent non-cancerous tissues. This finding suggests that those lncRNAs with different expression may serve important roles in the development of CC, and the expression data may provide information for additional study on the involvement of lncRNAs in CC. PMID:28789353
Evidence of translation efficiency adaptation of the coding regions of the bacteriophage lambda.
Goz, Eli; Mioduser, Oriah; Diament, Alon; Tuller, Tamir
2017-08-01
Deciphering the way gene expression regulatory aspects are encoded in viral genomes is a challenging mission with ramifications for all biomedical disciplines. Here, we aimed to understand how evolution shapes the bacteriophage lambda genes by performing a high-resolution analysis of ribosomal profiling data and gene expression related synonymous/silent information encoded in bacteriophage coding regions. We demonstrated evidence of selection for distinct compositions of synonymous codons in early and late viral genes related to the adaptation of translation efficiency to different bacteriophage developmental stages. Specifically, we showed that evolution of viral coding regions is driven, among other factors, by selection for codons with higher decoding rates; during the initial/progressive stages of infection the decoding rates in early/late genes were found to be superior to those in late/early genes, respectively. Moreover, we argued that selection for translation efficiency could be partially explained by adaptation to the Escherichia coli tRNA pool and the fact that it can change during the bacteriophage life cycle. An analysis of additional aspects related to the expression of viral genes, such as mRNA folding and more complex/longer regulatory signals in the coding regions, is also reported. The reported conclusions are likely to be relevant to additional viruses as well. © The Author 2017. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
1984-06-29
effort that requires hard copy documentation. As a result, there are generally numerous delays in providing current quality information. In the FoF...process have had fixed controls or were based on "hard-coded" information. A template, for example, is hard-coded information defining the shape of a...represents soft-coded control information. (Although manual handling of punch tapes still possesses some of the limitations of "hard-coded" controls
(I Can’t Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research
2017-01-01
I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: “random chance,” which is based on probability sampling, “minimal information,” which yields at least one new code per sampling step, and “maximum information,” which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximum information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimal information scenario. PMID:28746358
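The "random chance" scenario lends itself to a compact simulation. The sketch below is a minimal illustration under stated assumptions, not the author's original code: each source holds each code independently with a fixed probability, and we count how many randomly drawn sources are needed before every code has been observed once.

```python
import random

def sample_size_to_saturation(n_codes=30, n_sources=1000, p_code=0.1, seed=1):
    """Simulate 'random chance' sampling: draw sources at random until
    every code in the population has been observed at least once."""
    rng = random.Random(seed)
    # Each information source holds each code independently with probability p_code.
    sources = [{c for c in range(n_codes) if rng.random() < p_code}
               for _ in range(n_sources)]
    observed, n_sampled = set(), 0
    for idx in rng.sample(range(n_sources), n_sources):
        n_sampled += 1
        observed |= sources[idx]
        if len(observed) == n_codes:  # theoretical saturation reached
            return n_sampled
    return None  # saturation not reached with this population

print(sample_size_to_saturation())
```

Repeating the call over many seeds and populations gives the distribution of minimum sample sizes that the article's simulations summarize.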
Ultra-narrow bandwidth voice coding
Holzrichter, John F [Berkeley, CA; Ng, Lawrence C [Danville, CA
2007-01-09
A system of removing excess information from a human speech signal and coding the remaining signal information, transmitting the coded signal, and reconstructing the coded signal. The system uses one or more EM wave sensors and one or more acoustic microphones to determine at least one characteristic of the human speech signal.
History of DCAS 1961. Volume V. Origins of the USAF Space Program 1945-1956
1961-01-01
[Scanned DTIC document-processing sheet; text largely illegible.] Prepared under the provisions of Air Force Regulation 21-1 and Air Force Systems Command Supplement No. 1. ...information as it appears in the narrative. It is to be hoped that additional information bearing on the formative years of the space program will appear as a
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-22
... production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532... . List of Subjects Environmental protection, Agricultural commodities, Feed additives, Food additives...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-27
... (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). B... Part 180 Environmental protection, Agricultural commodities, Feed additives, Food additives, Pesticides...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-12
... (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). If you have any questions regarding the applicability of this action to a... commodities, Feed additives, Food additives, Pesticides and pests, Reporting and recordkeeping requirements...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-11
... (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This listing is not intended to be exhaustive, but rather provides a guide for..., Agricultural commodities, Feed additives, Food additives, Pesticides and pests, Reporting and recordkeeping...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-21
... (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This listing is not intended to be exhaustive, but rather provides a guide for... commodities, Feed additives, Food additives, Pesticides and pests, Reporting and recordkeeping requirements...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-07
... (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). If you have any questions regarding the applicability of this action to a... Subjects Environmental protection, Agricultural commodities, Feed additives, Food additives, Pesticides and...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-01
... (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This listing is not intended to be exhaustive, but rather provides a guide for... Environmental protection, Agricultural commodities, Feed additives, Food additives, Pesticides and pests...
ERIC Educational Resources Information Center
Ekdahl, Anna-Lena; Venkat, Hamsa; Runesson, Ulla
2016-01-01
In this article, we present a coding framework based on simultaneity and connections. The coding focuses on microlevel attention to three aspects of simultaneity and connections: between representations, within examples, and between examples. Criteria for coding that we viewed as mathematically important within part-whole additive relations…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-13
... (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This listing is not intended to be exhaustive, but rather provides a guide for... Environmental protection, Agricultural commodities, Feed additives, Food additives, Pesticides and pests...
Hao, Kun; Jin, Zhigang; Shen, Haifeng; Wang, Ying
2015-05-28
Efficient routing protocols for data packet delivery are crucial to underwater sensor networks (UWSNs). However, communication in UWSNs is a challenging task because of the characteristics of the acoustic channel. Network coding is a promising technique for efficient data packet delivery thanks to the broadcast nature of acoustic channels and the relatively high computation capabilities of the sensor nodes. In this work, we present GPNC, a novel geographic routing protocol for UWSNs that incorporates partial network coding to encode data packets and uses sensor nodes' location information to greedily forward data packets to sink nodes. GPNC can effectively reduce network delays and the retransmissions of redundant packets that cause additional network energy consumption. Simulation results show that GPNC can significantly improve network throughput and packet delivery ratio, while reducing energy consumption and network latency when compared with other routing protocols.
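GPNC's coding layer is not published as source code; the sketch below only illustrates the generation-based GF(2) (XOR) packet combining that partial network coding builds on, with hypothetical packet contents. A receiver decodes once it has collected a full-rank set of coefficient vectors.

```python
import random

def encode_generation(packets, n_coded, seed=0):
    """XOR-code a generation of equal-length packets (GF(2) network coding).
    Returns (coefficient_vector, coded_payload) pairs."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_coded):
        # Pick a random non-empty subset of the generation to combine.
        coeffs = [rng.randint(0, 1) for _ in packets]
        if not any(coeffs):
            coeffs[rng.randrange(len(packets))] = 1
        payload = bytes(len(packets[0]))
        for c, p in zip(coeffs, packets):
            if c:
                payload = bytes(a ^ b for a, b in zip(payload, p))
        out.append((coeffs, payload))
    return out

for coeffs, payload in encode_generation(
        [b"hello-01", b"hello-02", b"hello-03"], n_coded=5):
    print(coeffs, payload)
```

Because any sufficiently large subset of coded packets suffices for decoding, lost transmissions need not be retransmitted individually, which is the source of the energy savings the abstract describes.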
Relativistic quantum cryptography
NASA Astrophysics Data System (ADS)
Molotkov, S. N.; Nazin, S. S.
2003-07-01
The problem of unconditional security of quantum cryptography (i.e. security that is guaranteed by the fundamental laws of nature rather than by technical limitations) is one of the central points in quantum information theory. We propose a relativistic quantum cryptosystem and prove its unconditional security against any eavesdropping attempts. Relativistic causality arguments allow us to demonstrate the security of the system in a simple way. Since the proposed protocol does not employ collective measurements and quantum codes, the cryptosystem can be experimentally realized with the present state of the art in fiber optics technologies. The proposed cryptosystem employs only individual measurements and classical codes and, in addition, the key distribution problem allows the choice of the state encoding scheme to be postponed until after the states are already received, instead of choosing it before sending the states into the communication channel (i.e. to employ a sort of "antedate" coding).
A study on multiresolution lossless video coding using inter/intra frame adaptive prediction
NASA Astrophysics Data System (ADS)
Nakachi, Takayuki; Sawabe, Tomoko; Fujii, Tetsuro
2003-06-01
Lossless video coding is required in the fields of archiving and editing digital cinema or digital broadcasting contents. This paper combines a discrete wavelet transform and adaptive inter/intra-frame prediction in the wavelet transform domain to create multiresolution lossless video coding. The multiresolution structure offered by the wavelet transform facilitates interchange among several video source formats such as Super High Definition (SHD) images, HDTV, SDTV, and mobile applications. Adaptive inter/intra-frame prediction is an extension of JPEG-LS, a state-of-the-art lossless still image compression standard. Based on the image statistics of the wavelet transform domains in successive frames, inter/intra-frame adaptive prediction is applied to the appropriate wavelet transform domain. This adaptation offers superior compression performance, achieved with low computational cost and no additional side information. Experiments on digital cinema test sequences confirm the effectiveness of the proposed algorithm.
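As a rough illustration of the inter/intra decision, the sketch below (our own illustration, not the authors' implementation) applies a one-level Haar transform to each frame and, per subband, keeps whichever of a temporal (inter) or spatial (intra) residual has less energy. The single mode flag per subband stands in for the paper's low side-information property; the intra predictor here is a plain left-neighbour difference rather than the JPEG-LS predictor used in the paper.

```python
import numpy as np

def haar2d(x):
    """One-level 2-D Haar split into LL, HL, LH, HH subbands."""
    a = (x[0::2, :] + x[1::2, :]) / 2   # vertical average
    d = (x[0::2, :] - x[1::2, :]) / 2   # vertical difference
    return {"LL": (a[:, 0::2] + a[:, 1::2]) / 2,
            "HL": (a[:, 0::2] - a[:, 1::2]) / 2,
            "LH": (d[:, 0::2] + d[:, 1::2]) / 2,
            "HH": (d[:, 0::2] - d[:, 1::2]) / 2}

def adaptive_residuals(prev_frame, cur_frame):
    """Per subband, keep the smaller of an inter-frame residual (same subband
    of the previous frame) or an intra-frame residual (left-neighbour
    prediction); the 1-bit mode flag is the only side information."""
    prev_sb, cur_sb = haar2d(prev_frame), haar2d(cur_frame)
    chosen = {}
    for name, cur in cur_sb.items():
        inter = cur - prev_sb[name]
        intra = cur.copy()
        intra[:, 1:] -= cur[:, :-1]      # predict from the left coefficient
        mode = "inter" if (inter**2).sum() < (intra**2).sum() else "intra"
        chosen[name] = (mode, inter if mode == "inter" else intra)
    return chosen

rng = np.random.default_rng(0)
f0 = rng.integers(0, 256, (8, 8)).astype(float)
f1 = f0 + rng.normal(0, 2.0, (8, 8))     # temporally correlated next frame
for name, (mode, res) in adaptive_residuals(f0, f1).items():
    print(name, mode, f"residual energy = {(res**2).sum():.1f}")
```

For strongly correlated frames the inter mode wins in most subbands, while scene changes flip the decision to intra, which is the adaptation the paper exploits.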
Revisiting place and temporal theories of pitch
2014-01-01
The nature of pitch and its neural coding have been studied for over a century. A popular debate has revolved around the question of whether pitch is coded via “place” cues in the cochlea, or via timing cues in the auditory nerve. In the most recent incarnation of this debate, the role of temporal fine structure has been emphasized in conveying important pitch and speech information, particularly because the lack of temporal fine structure coding in cochlear implants might explain some of the difficulties faced by cochlear implant users in perceiving music and pitch contours in speech. In addition, some studies have postulated that hearing-impaired listeners may have a specific deficit related to processing temporal fine structure. This article reviews some of the recent literature surrounding the debate, and argues that much of the recent evidence suggesting the importance of temporal fine structure processing can also be accounted for using spectral (place) or temporal-envelope cues. PMID:25364292
The Fukushima Daiichi Accident Study Information Portal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shawn St. Germain; Curtis Smith; David Schwieder
This paper presents a description of The Fukushima Daiichi Accident Study Information Portal. The Information Portal was created by the Idaho National Laboratory as part of a joint NRC and DOE project to assess the severe accident modeling capability of the MELCOR analysis code. The Fukushima Daiichi Accident Study Information Portal was created to collect, store, retrieve and validate information and data for use in reconstructing the Fukushima Daiichi accident. In addition to supporting the MELCOR simulations, the Portal will be the main DOE repository for all data, studies and reports related to the accident at the Fukushima Daiichi nuclear power station. The data is stored in a secured (password protected and encrypted) repository that is searchable and accessible to researchers at diverse locations.
ERIC Educational Resources Information Center
VanBiervliet, Alan
A project to develop and evaluate a bar code reader system as a self-directed information and instructional aid for handicapped nonreaders is described. The bar code technology involves passing a light sensitive pen or laser over a printed code with bars which correspond to coded numbers. A system would consist of a portable device which could…
ICD Social Codes: An Underutilized Resource for Tracking Social Needs.
Torres, Jacqueline M; Lawlor, John; Colvin, Jeffrey D; Sills, Marion R; Bettenhausen, Jessica L; Davidson, Amber; Cutler, Gretchen J; Hall, Matt; Gottlieb, Laura M
2017-09-01
Social determinants of health (SDH) data collected in health care settings could have important applications for clinical decision-making, population health strategies, and the design of performance-based incentives and penalties. One source for cataloging SDH data is the International Statistical Classification of Diseases and Related Health Problems (ICD). Our objective was to explore how SDH are captured with ICD Ninth Revision SDH V codes in a national inpatient discharge database. Data come from the 2013 Healthcare Cost and Utilization Project (HCUP) National Inpatient Sample, a national stratified sample of discharges from 4363 hospitals from 44 US states. We estimate the rate of ICD-9 SDH V code utilization overall and by patient demographics and payer categories. We additionally estimate the rate of SDH V code utilization for: (a) the 5 most common reasons for hospitalization; and (b) the 5 conditions with the highest rates of SDH V code utilization. Fewer than 2% of overall discharges in the National Inpatient Sample were assigned an SDH V code. There were statistically significant differences in the rate of overall SDH V code utilization by age categories, race/ethnicity, sex, and payer (all P<0.001). Nevertheless, SDH V codes were assigned to <7% of discharges in any demographic or payer subgroup. SDH V code utilization was highest for major diagnostic categories related to mental health and alcohol/substance use-related discharges. SDH V codes are infrequently utilized in inpatient settings for discharges other than those related to mental health and alcohol/substance use. Utilization incentives will likely need to be developed to realize the potential benefits of cataloging SDH information.
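Computing a V-code utilization rate of this kind reduces to flagging discharges whose diagnosis list contains a qualifying code. A minimal pandas sketch with made-up data follows; the column names and the V60-V69 prefix range are illustrative assumptions, not the HCUP schema or the study's exact code set.

```python
import pandas as pd

# Hypothetical discharge-level data; names and codes are illustrative only.
df = pd.DataFrame({
    "discharge_id": [1, 2, 3, 4, 5],
    "payer": ["Medicaid", "Private", "Medicaid", "Medicare", "Private"],
    "dx_codes": [["311", "V60.0"], ["410.9"], ["303.9", "V15.81"],
                 ["486"], ["296.2"]],
})

# ICD-9-CM V60-V69 cover socioeconomic and psychosocial circumstances.
SDH_PREFIXES = tuple(f"V6{i}" for i in range(10))

df["has_sdh_vcode"] = df["dx_codes"].apply(
    lambda codes: any(c.startswith(SDH_PREFIXES) for c in codes))

print(df["has_sdh_vcode"].mean())                   # overall utilization rate
print(df.groupby("payer")["has_sdh_vcode"].mean())  # rate by payer category
```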
Minucci, Angelo; Moradkhani, Kamran; Hwang, Ming Jing; Zuppi, Cecilia; Giardina, Bruno; Capoluongo, Ettore
2012-03-15
In the present paper we have updated the G6PD mutations database, including all of the most recently discovered G6PD genetic variants. We note that the latest database was published by Vulliamy et al. [1], who analytically reported 140 G6PD mutations; along with Vulliamy's database, there are two main sites, http://202.120.189.88/mutdb/ and www.LOVD.nl/MR, where almost all G6PD mutations can be found. Compared to the previous mutation reports, in our paper we have included some additional information for each mutation, such as: the secondary structure and the position in the enzyme's 3D structure affected by the mutation, the creation or abolition of a restriction site (with the enzyme involved), and the conservation score associated with each amino acid position. The mutations reported in the present table have been divided according to the gene region involved (coding and non-coding), and mutations affecting the coding region into: single, multiple (at least two bases involved) and deletions. We note that for the listed mutations reported in italics, the literature does not provide all the biochemical or bio-molecular information or the research data. Finally, for the "old" mutations, we tried to verify features previously reported and, when subsequently modified, we updated the specific information using the latest literature data. Copyright © 2012 Elsevier Inc. All rights reserved.
Bialkova, Svetlana; Grunert, Klaus G; Juhl, Hans Jørn; Wasowicz-Kirylo, Grazyna; Stysko-Kunkowska, Malgorzata; van Trijp, Hans C M
2014-05-01
In two eye-tracking studies, we explored whether and how attention to nutrition information mediates consumers' choice. Consumers had to select either the healthiest option or a product of their preference within an assortment. On each product a particular label (Choices logo, monochrome GDA label, or color-coded GDA label) communicated the product's nutrient profile. In study 1, participants had to select from 4 products differentiated, in addition to the nutrition information, by flavor (strawberry, muesli, apple, chocolate; varied within participants) and brand (local vs. global, varied between participants). Study 2 further explored brand effects within participants, and thus only 2 flavors (strawberry, chocolate) were presented within an assortment. The actual choice made, response time and eye movements were recorded. Respondents fixated longer and more often on products with color-coded GDA labels than on products with monochrome GDA labels or the Choices logo. A health goal resulted in longer and more frequent fixations in comparison to a preference goal. Products with color-coded and monochrome GDA labels had the highest likelihood of being chosen, and this effect was related to the attention-getting property of the label (irrespective of brand and flavor effects). The product fixated on most had the highest likelihood of being chosen. These results suggest that attention mediates the effect of nutrition labels on choice. Copyright © 2014 Elsevier Ltd. All rights reserved.
Coherent diffractive imaging using randomly coded masks
Seaberg, Matthew H.; d'Aspremont, Alexandre; Turner, Joshua J.
2015-12-07
We experimentally demonstrate an extension to coherent diffractive imaging that encodes additional information through the use of a series of randomly coded masks, removing the need for typical object-domain constraints while guaranteeing a unique solution to the phase retrieval problem. Phase retrieval is performed using a numerical convex relaxation routine known as “PhaseCut,” an iterative algorithm known for its stability and for its ability to find the global solution, which can be found efficiently and which is robust to noise. The experiment is performed using a laser diode at 532.2 nm, enabling rapid prototyping for future X-ray synchrotron and even free electron laser experiments.
Personalized Clinical Diagnosis in Data Bases for Treatment Support in Phthisiology.
Lugovkina, T K; Skornyakov, S N; Golubev, D N; Egorov, E A; Medvinsky, I D
2016-01-01
Decision-making is a key event in clinical practice. Software products that combine clinical decision support models in an electronic database with the fixed decision points and treatment results of real clinical practice are highly relevant instruments for improving phthisiological practice, and may be especially useful in severe cases caused by resistant strains of Mycobacterium tuberculosis. The methodology for gathering and structuring useful information (critical clinical signals for decisions) is described. Additional coding of clinical diagnosis characteristics was implemented to give a numeric reflection of each personal situation. The created methodology for systematizing and coding clinical events allowed us to improve the clinical decision models for better clinical results.
Coherent diffractive imaging using randomly coded masks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seaberg, Matthew H., E-mail: seaberg@slac.stanford.edu; Linac Coherent Light Source, SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025; D'Aspremont, Alexandre
2015-12-07
We experimentally demonstrate an extension to coherent diffractive imaging that encodes additional information through the use of a series of randomly coded masks, removing the need for typical object-domain constraints while guaranteeing a unique solution to the phase retrieval problem. Phase retrieval is performed using a numerical convex relaxation routine known as “PhaseCut,” an iterative algorithm known for its stability and for its ability to find the global solution, which can be found efficiently and which is robust to noise. The experiment is performed using a laser diode at 532.2 nm, enabling rapid prototyping for future X-ray synchrotron and even free electron laser experiments.
The design of wavefront coded imaging system
NASA Astrophysics Data System (ADS)
Lan, Shun; Cen, Zhaofeng; Li, Xiaotong
2016-10-01
Wavefront coding is a new method to extend the depth of field, combining optical design and signal processing. Using the optical design software ZEMAX, we designed a practical wavefront coded imaging system based on a conventional Cooke triplet. Unlike a conventional optical system, the wavefront of this new system is modulated by a specially designed phase mask, which makes the point spread function (PSF) of the optical system insensitive to defocus. As a result, a series of nearly identical blurred images is obtained at the image plane. In addition, the optical transfer function (OTF) of the wavefront coded imaging system is independent of focus: it is nearly constant with misfocus and has no regions of zeros, so all object information can be completely recovered through digital filtering at different defocus positions. The focus invariance of the MTF is selected as the merit function in this design, and the coefficients of the phase mask are set as optimization variables. Compared to a conventional optical system, the wavefront coded imaging system obtains better quality images over a range of object distances. Some deficiencies appear in the restored images due to the influence of the digital filtering algorithm, which are also analyzed in this paper. The depth of field of the designed wavefront coded imaging system is about 28 times larger than that of the initial optical system, while keeping higher optical power and resolution at the image plane.
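The defocus-invariance of the MTF under a cubic phase mask can be checked numerically. The sketch below is a generic illustration of the principle, not the ZEMAX design from the paper: it builds a circular pupil with a defocus term and an optional cubic phase term, and compares how a low-frequency MTF slice degrades with defocus in the plain and coded cases. The mask strength alpha is an illustrative value.

```python
import numpy as np

def mtf_slice(defocus_w20, alpha=0.0, n=256):
    """1-D slice of the MTF for a circular pupil with defocus (w20, in waves)
    and an optional cubic phase mask of strength alpha (waves at pupil edge)."""
    x = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(x, x)
    pupil = (X**2 + Y**2 <= 1).astype(complex)
    phase = 2 * np.pi * (defocus_w20 * (X**2 + Y**2) + alpha * (X**3 + Y**3))
    pupil *= np.exp(1j * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2
    otf = np.abs(np.fft.fft2(psf))
    mtf = otf / otf[0, 0]
    return mtf[0, :n // 8]          # low-frequency slice along one axis

for w20 in (0.0, 1.0, 2.0):
    plain = mtf_slice(w20, alpha=0.0)
    coded = mtf_slice(w20, alpha=20.0)
    print(f"w20={w20}: plain MTF min={plain.min():.3f}, coded={coded.min():.3f}")
```

The plain system's MTF collapses toward zero as defocus grows, while the coded system's MTF is lower but stays roughly constant and null-free, which is what makes single-filter restoration possible.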
Utility of QR codes in biological collections
Diazgranados, Mauricio; Funk, Vicki A.
2013-01-01
Abstract The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers’ electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections. PMID:24198709
Utility of QR codes in biological collections.
Diazgranados, Mauricio; Funk, Vicki A
2013-01-01
The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers' electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections.
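Generating such a specimen label is a one-liner with common tooling. The sketch below uses the third-party Python `qrcode` package (our choice, not something the paper prescribes) and a hypothetical voucher URI.

```python
import qrcode  # third-party: pip install qrcode[pil]

# Encode a (hypothetical) voucher URI so a phone scan resolves to the
# specimen's electronic record.
uri = "http://collections.example.org/specimen/USNM-1234567"

# ERROR_CORRECT_Q tolerates ~25% symbol damage, useful on physical labels.
img = qrcode.make(uri, error_correction=qrcode.constants.ERROR_CORRECT_Q)
img.save("USNM-1234567_qr.png")
```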
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-09
... code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code... . List of Subjects Environmental protection, Agricultural commodities, Feed additives, Food additives...
Xenomicrobiology: a roadmap for genetic code engineering.
Acevedo-Rocha, Carlos G; Budisa, Nediljko
2016-09-01
Biology is an analytical and informational science that is becoming increasingly dependent on chemical synthesis. One example is the high-throughput and low-cost synthesis of DNA, which is a foundation for the research field of synthetic biology (SB). The aim of SB is to provide biotechnological solutions to health, energy and environmental issues, as well as to unsustainable manufacturing processes, within the frame of naturally existing chemical building blocks. Xenobiology (XB) goes a step further by implementing non-natural building blocks in living cells. In this context, genetic code engineering enables the re-design of genes/genomes and proteins/proteomes with non-canonical nucleic acids (XNAs) and amino acids (ncAAs). Besides studying information flow and evolutionary innovation in living systems, XB allows the development of new-to-nature therapeutic proteins/peptides, new biocatalysts for potential applications in synthetic organic chemistry, and biocontainment strategies for enhanced biosafety. In this perspective, we provide a brief history and evolution of the genetic code in the context of XB. We then discuss the latest efforts and challenges ahead for engineering the genetic code, with a focus on substitutions and additions of ncAAs as well as standard amino acid reductions. Finally, we present a roadmap for the directed evolution of artificial microbes for emancipating rare sense codons that could be used to introduce novel building blocks. The development of such xenomicroorganisms endowed with a 'genetic firewall' will also allow us to study and understand the relation between code evolution and horizontal gene transfer. © 2016 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
MHD Advanced Power Train Phase I, Final Report, Volume 7
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. R. Jones
This appendix provides additional data in support of the MHD/Steam Power Plant Analyses reported in Volume 5. The data are in the form of 3PA/SUMARY computer code printouts. The order of presentation in all four cases is as follows: (1) Overall Performance; (2) Component/Subsystem Information; (3) Plant Cost Accounts Summary; and (4) Plant Costing Details and Cost of Electricity.
Among biomacromolecules, RNA is the most versatile, and it plays indispensable roles in almost all aspects of biology. For example, in addition to serving as mRNAs coding for proteins, RNAs regulate gene expression (controlling where, when, and how efficiently a gene gets expressed), participate in RNA processing, encode the genetic information of some viruses, and serve as
Rep. Issa, Darrell E. [R-CA-49
2009-07-09
Senate - 10/19/2009: Committee on Homeland Security and Governmental Affairs referred to Subcommittee on Federal Financial Management, Government Information, Federal Services, and International Security. Status: Passed House.
Validation of CFD/Heat Transfer Software for Turbine Blade Analysis
NASA Technical Reports Server (NTRS)
Kiefer, Walter D.
2004-01-01
I am an intern in the Turbine Branch of the Turbomachinery and Propulsion Systems Division. The division is primarily concerned with experimental and computational methods of calculating heat transfer effects on turbine blades during operation in jet engines and land-based power systems. These include modeling flow in internal cooling passages and film cooling, as well as calculating heat flux and peak temperatures to ensure safe and efficient operation. The branch is research-oriented, emphasizing the development of tools that may be used by gas turbine designers in industry. The branch has been developing a computational fluid dynamics (CFD) and heat transfer code called GlennHT to achieve the computational end of this analysis. The code was originally written in FORTRAN 77 and run on Silicon Graphics machines. However, the code has been rewritten and compiled in FORTRAN 90 to take advantage of more modern computer memory systems. In addition, the branch has made a switch in system architectures from SGIs to Linux PCs. The newly modified code therefore needs to be tested and validated. This is the primary goal of my internship. To validate the GlennHT code, it must be run using benchmark fluid mechanics and heat transfer test cases for which there are either analytical solutions or widely accepted experimental data. From the solutions generated by the code, comparisons can be made to the correct solutions to establish the accuracy of the code. To design and create these test cases, many steps and programs must be used. Before a test case can be run, pre-processing steps must be accomplished. These include generating a grid to describe the geometry, using a software package called GridPro. Various files required by the GlennHT code must also be created, including a boundary condition file, a file for multi-processor computing, and a file to describe problem and algorithm parameters. A good deal of this internship will be spent becoming familiar with these programs and the structure of the GlennHT code. Additional information is included in the original extended abstract.
Summary statistics in the attentional blink.
McNair, Nicolas A; Goodbourn, Patrick T; Shone, Lauren T; Harris, Irina M
2017-01-01
We used the attentional blink (AB) paradigm to investigate the processing stage at which the extraction of summary statistics from visual stimuli ("ensemble coding") occurs. Experiment 1 examined whether ensemble coding requires attentional engagement with the items in the ensemble. Participants performed two sequential tasks on each trial: gender discrimination of a single face (T1) and estimating the average emotional expression of an ensemble of four faces (or of a single face, as a control condition) as T2. Ensemble coding was affected by the AB when the tasks were separated by a short temporal lag. In Experiment 2, the order of the tasks was reversed to test whether ensemble coding requires more working-memory resources, and therefore induces a larger AB, than estimating the expression of a single face. Each condition produced an AB of similar magnitude in the subsequent gender-discrimination T2 task. Experiment 3 additionally investigated whether the previous results were due to participants adopting a subsampling strategy during the ensemble-coding task. Contrary to this explanation, we found different patterns of performance in the ensemble-coding condition and in a condition in which participants were instructed to focus on only a single face within an ensemble. Taken together, these findings suggest that ensemble coding emerges automatically as a result of the deployment of attentional resources across the ensemble of stimuli, prior to information being consolidated in working memory.
Wang, Xiaogang; Chen, Wen; Chen, Xudong
2015-03-09
In this paper, we develop a new optical information authentication system based on compressed double-random-phase-encoded images and quick-response (QR) codes, where the parameters of the optical lightwave are used as keys for optical decryption and the QR code is a key for verification. An input image attached with a QR code is first optically encoded in a simplified double random phase encoding (DRPE) scheme without using an interferometric setup. From the single encoded intensity pattern recorded by a CCD camera, a compressed double-random-phase-encoded image, i.e., the sparse phase distribution used for optical decryption, is generated by using an iterative phase retrieval technique with the QR code. We compare this technique to two other methods proposed in the literature, i.e., Fresnel-domain information authentication based on the classical DRPE with holographic technique, and information authentication based on DRPE and a phase retrieval algorithm. Simulation results show that QR codes are effective in improving the security and data sparsity of optical information encryption and authentication systems.
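For reference, the classical DRPE operation that underlies the scheme can be written in a few lines of NumPy. This sketch implements plain double random phase encoding and its inverse, not the paper's simplified non-interferometric variant or its compression and phase-retrieval steps.

```python
import numpy as np

def drpe_encode(img, seed=7):
    """Classical double random phase encoding: one random phase mask in the
    input plane, one in the Fourier plane. The two masks are the keys."""
    rng = np.random.default_rng(seed)
    m1 = np.exp(2j * np.pi * rng.random(img.shape))  # input-plane mask
    m2 = np.exp(2j * np.pi * rng.random(img.shape))  # Fourier-plane mask
    return np.fft.ifft2(np.fft.fft2(img * m1) * m2), (m1, m2)

def drpe_decode(cipher, keys):
    """Invert DRPE: undo the Fourier-plane mask, then the input-plane mask."""
    m1, m2 = keys
    return np.fft.ifft2(np.fft.fft2(cipher) / m2) / m1

img = np.zeros((64, 64)); img[20:40, 20:40] = 1.0
cipher, keys = drpe_encode(img)
rec = drpe_decode(cipher, keys)
print(np.allclose(np.abs(rec), img, atol=1e-10))  # exact with the right keys
```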
Zhang, Qingfang; Wang, Cheng
2016-01-01
A central issue in written production concerns how phonological codes influence the output of orthographic codes. We used a picture-word interference paradigm combined with the event-related potential technique to investigate the temporal courses of phonological and orthographic activation and their interplay in Chinese writing. Distractors were orthographically related, phonologically related, orthographically plus phonologically related, or unrelated to picture names. The behavioral results replicated the classic facilitation effect for all three types of relatedness. The ERP results indicated an orthographic effect in the time window of 370–500 ms (onset latency: 370 ms), a phonological effect in the time window of 460–500 ms (onset latency: 464 ms), and an additive pattern of both effects in both time windows, indicating that orthographic codes were accessed earlier than, and independently of, phonological codes in written production. The orthographic activation originates from the semantic system, whereas the phonological effect results from activation spreading from the orthographic lexicon to the phonological lexicon. These findings substantially strengthen the existing evidence that access to orthographic codes is not mediated by phonological information, and they provide important support for the orthographic autonomy hypothesis. PMID:27605911
Continuous integration and quality control for scientific software
NASA Astrophysics Data System (ADS)
Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.
2013-08-01
Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well, and it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, to manage the code basis and to control the different development versions. While each project can be compiled separately, the whole code basis can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, or style issues. In combination with an automatic documentation generator, it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available for scientific communities. One regular customer is already the developer group of the DiFX software correlator project.
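A nightly-build driver of this kind can be a short script invoked by cron. The sketch below is a generic illustration, not the Wettzell infrastructure: the checkout path, report directory and the choice of cppcheck as the static analyser are placeholder assumptions.

```python
#!/usr/bin/env python3
"""Minimal nightly-build driver: build everything from the central Makefile,
then run a static analyser and publish the logs for the report web server."""
import datetime
import pathlib
import subprocess

REPORT_DIR = pathlib.Path("/var/www/ci-reports")   # served as HTML/logs

def run(cmd, logname):
    """Run a command, capturing stdout/stderr into a dated log file."""
    log = REPORT_DIR / f"{datetime.date.today()}_{logname}.log"
    with open(log, "w") as fh:
        return subprocess.run(cmd, stdout=fh, stderr=subprocess.STDOUT).returncode

if __name__ == "__main__":
    REPORT_DIR.mkdir(parents=True, exist_ok=True)
    status = run(["make", "-C", "/srv/checkout", "all"], "build")
    # Example analyser for C/C++ sources; swap in the project's own tools.
    run(["cppcheck", "--enable=all", "/srv/checkout/src"], "static-analysis")
    raise SystemExit(status)
```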
Design of Provider-Provisioned Website Protection Scheme against Malware Distribution
NASA Astrophysics Data System (ADS)
Yagi, Takeshi; Tanimoto, Naoto; Hariu, Takeo; Itoh, Mitsutaka
Vulnerabilities in web applications expose computer networks to security threats, and many websites are used by attackers as hopping sites to attack other websites and user terminals. These incidents prevent service providers from constructing secure networking environments. To protect websites from attacks exploiting vulnerabilities in web applications, service providers use web application firewalls (WAFs). WAFs filter accesses from attackers by using signatures, which are generated based on the exploit codes of previous attacks. However, WAFs cannot filter unknown attacks because the signatures cannot reflect new types of attacks. In service provider environments, the number of exploit codes has recently increased rapidly because of the spread of vulnerable web applications that have been developed through cloud computing. Thus, generating signatures for all exploit codes is difficult. To solve these problems, our proposed scheme detects and filters malware downloads that are sent from websites which have already received exploit codes. In addition, to collect information for detecting malware downloads, web honeypots, which automatically extract the communication records of exploit codes, are used. According to the results of experiments using a prototype, our scheme can filter attacks automatically so that service providers can provide secure and cost-effective network environments.
48 CFR 452.219-70 - Size Standard and NAICS Code Information.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Size Standard and NAICS Code Information. 452.219-70 Section 452.219-70 Federal Acquisition Regulations System DEPARTMENT OF... System Code(s) and business size standard(s) describing the products and/or services to be acquired under...
Distributed and Dynamic Storage of Working Memory Stimulus Information in Extrastriate Cortex
Sreenivasan, Kartik K.; Vytlacil, Jason; D'Esposito, Mark
2015-01-01
The predominant neurobiological model of working memory (WM) posits that stimulus information is stored via stable elevated activity within highly selective neurons. Based on this model, which we refer to as the canonical model, the storage of stimulus information is largely associated with lateral prefrontal cortex (lPFC). A growing number of studies describe results that cannot be fully explained by the canonical model, suggesting that it is in need of revision. In the present study, we directly test key elements of the canonical model. We analyzed functional MRI data collected as participants performed a task requiring WM for faces and scenes. Multivariate decoding procedures identified patterns of activity containing information about the items maintained in WM (faces, scenes, or both). While information about WM items was identified in extrastriate visual cortex (EC) and lPFC, only EC exhibited a pattern of results consistent with a sensory representation. Information in both regions persisted even in the absence of elevated activity, suggesting that elevated population activity may not represent the storage of information in WM. Additionally, we observed that WM information was distributed across EC neural populations that exhibited a broad range of selectivity for the WM items rather than restricted to highly selective EC populations. Finally, we determined that activity patterns coding for WM information were not stable, but instead varied over the course of a trial, indicating that the neural code for WM information is dynamic rather than static. Together, these findings challenge the canonical model of WM. PMID:24392897
Avidan, Alexander; Weissman, Charles; Levin, Phillip D
2015-04-01
Quick response (QR) codes containing anesthesia syllabus data were introduced into an anesthesia information management system. The code was generated automatically at the conclusion of each case and was available for resident case logging using a smartphone or tablet. The goal of this study was to evaluate the use and usability/user-friendliness of such a system. Resident case logging practices were assessed prior to introducing the QR codes. QR code use and satisfaction among residents were reassessed at three and six months. Before QR code introduction, only 12/23 (52.2%) residents maintained a case log. Most of the remaining residents (9/23, 39.1%) expected to receive a case list from the anesthesia information management system database at the end of their residency. At three months and six months, 17/26 (65.4%) and 15/25 (60.0%) residents, respectively, were using the QR codes. Satisfaction was rated as very good or good. QR codes for residents' case logging with smartphones or tablets were successfully introduced into an anesthesia information management system and used by most residents. QR codes can be successfully implemented in medical practice to support data transfer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Establishing Malware Attribution and Binary Provenance Using Multicompilation Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramshaw, M. J.
2017-07-28
Malware is a serious problem for computer systems and costs businesses and customers billions of dollars a year, in addition to compromising their private information. Detecting malware is particularly difficult because malware source code can be compiled in many different ways and generate many different digital signatures, which causes problems for most anti-malware programs that rely on static signature detection. Our project uses a convolutional neural network to identify malware programs, but these require large amounts of data to be effective. Toward that end, we gather thousands of source code files from publicly available programming contest sites and compile them with several different compilers and flags. Building upon current research, we then transform these binary files into image representations and use them to train a long-term recurrent convolutional neural network that will eventually be used to identify how a malware binary was compiled. This information will include the compiler, the version of the compiler and the options used in compilation, information which can be critical in determining where a malware program came from and even who authored it.
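The binary-to-image step is straightforward: bytes become pixels. The sketch below is our own illustration of this common representation, not the project's code.

```python
import numpy as np

def binary_to_image(path, width=256):
    """Render a compiled binary as a 2-D grayscale array: each byte becomes
    one pixel, rows of a fixed width, zero-padded at the end. Such images
    are a common input representation for CNN-based binary analysis."""
    data = np.frombuffer(open(path, "rb").read(), dtype=np.uint8)
    pad = (-len(data)) % width
    data = np.concatenate([data, np.zeros(pad, dtype=np.uint8)])
    return data.reshape(-1, width)

img = binary_to_image("/bin/ls")   # any compiled executable will do
print(img.shape, img.dtype)
```

Compiling the same source with different compilers and flags produces visibly different byte textures, which is the signal the network is trained to pick up.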
Djordjevic, Ivan B
2011-08-15
In addition to capacity, future high-speed optical transport networks will also be constrained by energy consumption. In order to solve the capacity and energy constraints simultaneously, in this paper we propose the use of energy-efficient hybrid D-dimensional signaling (D>4) that employs all available degrees of freedom for conveying information over a single carrier, including amplitude, phase, polarization and orbital angular momentum (OAM). Given the fact that the OAM eigenstates, associated with the azimuthal phase dependence of the complex electric field, are orthogonal, they can be used as basis functions for multidimensional signaling. Since the information capacity is a linear function of the number of dimensions, D-dimensional signal constellations can significantly improve the overall optical channel capacity. The energy-efficiency problem is solved, in this paper, by designing the D-dimensional signal constellation such that the mutual information is maximized while taking the energy constraint into account. We demonstrate the high potential of the proposed energy-efficient hybrid D-dimensional coded-modulation scheme by Monte Carlo simulations. © 2011 Optical Society of America
Modes of Visual Recognition and Perceptually Relevant Sketch-based Coding for Images
NASA Technical Reports Server (NTRS)
Jobson, Daniel J.
1991-01-01
A review of visual recognition studies is used to define two levels of information requirements. These two levels are related to two primary subdivisions of the spatial frequency domain of images and reflect two distinct different physical properties of arbitrary scenes. In particular, pathologies in recognition due to cerebral dysfunction point to a more complete split into two major types of processing: high spatial frequency edge based recognition vs. low spatial frequency lightness (and color) based recognition. The former is more central and general while the latter is more specific and is necessary for certain special tasks. The two modes of recognition can also be distinguished on the basis of physical scene properties: the highly localized edges associated with reflectance and sharp topographic transitions vs. smooth topographic undulation. The extreme case of heavily abstracted images is pursued to gain an understanding of the minimal information required to support both modes of recognition. Here the intention is to define the semantic core of transmission. This central core of processing can then be fleshed out with additional image information and coding and rendering techniques.
Contribution of finger tracing to the recognition of Chinese characters.
Yim-Ng, Y Y; Varley, R; Andrade, J
2000-01-01
Finger tracing is a simulation of the act of writing without the use of pen and paper. It is claimed to help in the processing of Chinese characters, possibly by providing additional motor coding. In this study, blindfolded subjects were equally good at identifying Chinese characters and novel visual stimuli through passive movements made with the index finger of the preferred hand and those made with the last finger of that hand. This suggests that finger tracing provides a relatively high level of coding specific to individual characters, but non-specific to the motor effectors used. Beginning each stroke from the same location, i.e. removing spatial information, impaired recognition of the familiar characters and the novel nonsense figures. Passively tracing the strokes in a random sequence also impaired recognition of the characters. These results therefore suggest that the beneficial effect of finger tracing on the writing or recall of Chinese characters is mediated by the sequence and spatial information embedded in the motor movements, and that the proprioceptive channel may play a part in mediating visuo-spatial information. Finger tracing may be a useful strategy for the remediation of Chinese language impairments.
What determines successful implementation of inpatient information technology systems?
Spetz, Joanne; Burgess, James F; Phibbs, Ciaran S
2012-03-01
To identify the factors and strategies that were associated with successful implementation of hospital-based information technology (IT) systems in US Department of Veterans Affairs (VA) hospitals, and how these might apply to other hospitals. Qualitative analysis of 118 interviews conducted at 7 VA hospitals. The study focused on the inpatient setting, where nurses are the main patient-care providers; thus, the research emphasized the impact of the Computerized Patient Record System and Bar Code Medication Administration on nurses. Hospitals were selected to represent a range of IT implementation dates, facility sizes, and geography. The subjects included nurses, pharmacists, physicians, IT staff, and managers. Interviews were guided by a semi-structured interview protocol, and a thematic analysis was conducted, with initial codes drawn from the content of the interview guides. Additional themes were proposed as the coding was conducted. Five broad themes arose as factors which affected the process and success of implementation: (1) organizational stability and implementation team leadership, (2) implementation timelines, (3) equipment availability and reliability, (4) staff training, and (5) changes in workflow. Overall, IT implementation success in the VA depended on: (1) whether there was support for change from both leaders and staff, (2) development of a gradual and flexible implementation approach, (3) allocation of adequate resources for equipment and infrastructure, hands-on support, and deployment of additional staff, and (4) how the implementation team planned for setbacks, and continued the process to achieve success. Problems that developed in the early stages of implementation tended to become persistent, and poor implementation can lead to patient harm.
A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics
NASA Technical Reports Server (NTRS)
Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela
2015-01-01
Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
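For readers unfamiliar with MC/DC, the toy function below enumerates, for each condition of a decision, the pairs of test vectors that differ only in that condition yet flip the outcome; covering one such pair per condition is the core MC/DC obligation. This brute-force enumeration is only a didactic sketch and is unrelated to the paper's static partial-evaluation machinery.

```python
from itertools import product

def mcdc_pairs(decision, n_conditions):
    """For each condition index, list the test-vector pairs that differ only
    in that condition and change the decision outcome (the condition's
    MC/DC 'independence pairs')."""
    pairs = {k: [] for k in range(n_conditions)}
    for v in product([False, True], repeat=n_conditions):
        for k in range(n_conditions):
            if v[k]:                        # count each unordered pair once
                continue
            w = v[:k] + (True,) + v[k + 1:]
            if decision(*v) != decision(*w):
                pairs[k].append((v, w))
    return pairs

# Decision with three conditions: (a and b) or c
for k, ps in mcdc_pairs(lambda a, b, c: (a and b) or c, 3).items():
    print(f"condition {k}: {ps}")
```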
Code inspection instructional validation
NASA Technical Reports Server (NTRS)
Orr, Kay; Stancil, Shirley
1992-01-01
The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive) for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are the conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on students' confidence in their ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option will produce the most effective results.
Conceptual-driven classification for coding advise in health insurance reimbursement.
Li, Sheng-Tun; Chen, Chih-Chuan; Huang, Fernando
2011-01-01
With the non-stop increases in medical treatment fees, the economic survival of a hospital in Taiwan relies on the reimbursements received from the Bureau of National Health Insurance, which in turn depend on the accuracy and completeness of the content of the discharge summaries as well as the correctness of their International Classification of Diseases (ICD) codes. The purpose of this research is to reinforce the entire disease classification framework by supporting disease classification specialists in the coding process. This study developed an ICD code advisory system (ICD-AS) that performed knowledge discovery from discharge summaries and suggested ICD codes. Natural language processing and information retrieval techniques based on Zipf's law were applied to process the content of discharge summaries, and fuzzy formal concept analysis was used to analyze and represent the relationships between the medical terms identified by MeSH. In addition, a certainty factor used as reference during the coding process was calculated to account for uncertainty and strengthen the credibility of the outcome. Two sets of 360 and 2,579 textual discharge summaries of patients suffering from cerebrovascular disease were processed to build up ICD-AS and to evaluate the prediction performance. A number of experiments were conducted to investigate the impact of system parameters on accuracy and to compare the proposed model to traditional classification techniques including linear-kernel support vector machines. The comparison results showed that the proposed system achieves better overall performance in terms of several measures. In addition, some useful implication rules were obtained, which improve comprehension of the field of cerebrovascular disease and give insight into the relationships between relevant medical terms. Our system contributes valuable guidance to disease classification specialists in the process of coding discharge summaries, which consequently brings benefits to patients, hospitals, and the healthcare system. Copyright © 2010 Elsevier B.V. All rights reserved.
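The abstract does not give the system's formulas, so the sketch below only illustrates the general shape of such a pipeline: a Zipf-style frequency filter to keep informative terms, and a naive certainty factor scored as the fraction of a code's characteristic terms found in a summary. The term-to-code map, thresholds, and vocabulary are invented for illustration; the actual system uses MeSH terms and fuzzy formal concept analysis.

```python
import re
from collections import Counter

def zipf_filter(documents, low=0.05, high=0.80):
    """Keep the middle band of the frequency-ranked vocabulary: the most
    frequent words are function words, the rarest are mostly noise."""
    counts = Counter(w for d in documents
                     for w in re.findall(r"[a-z]+", d.lower()))
    ranked = [w for w, _ in counts.most_common()]
    return set(ranked[int(low * len(ranked)):int(high * len(ranked))])

def suggest_codes(summary, code_terms, vocabulary):
    """Naive certainty factor per ICD code: the fraction of the code's
    characteristic terms that appear in the filtered summary."""
    words = set(re.findall(r"[a-z]+", summary.lower())) & vocabulary
    cf = {code: len(set(terms) & words) / len(terms)
          for code, terms in code_terms.items()}
    return sorted(cf.items(), key=lambda kv: -kv[1])

# Hypothetical mini knowledge base; vocabulary could come from zipf_filter
code_terms = {"I63.9": ["infarction", "cerebral", "ischemic"],
              "I61.9": ["hemorrhage", "intracerebral"]}
vocab = {"infarction", "cerebral", "ischemic", "hemorrhage", "intracerebral"}
print(suggest_codes("Acute ischemic cerebral infarction, left MCA.",
                    code_terms, vocab))
```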
Rethinking mobile delivery: using Quick Response codes to access information at the point of need.
Lombardo, Nancy T; Morrow, Anne; Le Ber, Jeanne
2012-01-01
This article covers the use of Quick Response (QR) codes to provide instant mobile access to information, digital collections, educational offerings, library website, subject guides, text messages, videos, and library personnel. The array of uses and the value of using QR codes to push customized information to patrons are explained. A case is developed for using QR codes for mobile delivery of customized information to patrons. Applications in use at the Libraries of the University of Utah will be reviewed to provide readers with ideas for use in their library. Copyright © Taylor & Francis Group, LLC
Promotion by the British pharmaceutical industry, 1983-8: a critical analysis of self regulation.
Herxheimer, A; Collier, J
1990-01-01
Since 1958 the Association of the British Pharmaceutical Industry (ABPI) has attempted to regulate the promotion of prescription medicines through its code of practice. This regulation is described and analysed for the six years 1983-8 using the reports on 302 complaints considered by its code of practice committee and annual reports. The complaints came mainly from doctors (143, 48%) and competing companies (103, 33%). The committee found a total of 379 breaches of the code in 192 (63%) of the complaints. Additional breaches were detected by informal scrutiny of advertisements by the ABPI secretariat. Analysis showed that 270 (71%) of these breaches involved possible breaches of the Medicines Act. The rules that forbid misleading or unsubstantiated information and misleading claims or comparisons were broken most often. The committee found the most frequent offenders to be Organon (32 breaches), Smith Kline and French (23), Glaxo (21), A H Robins (18), Bayer (17), Merck Sharp and Dohme (17), and Lederle (16). Often the promotion of one product led to several breaches. The promotional wars over histamine H2 receptor antagonists accounted for 33 breaches. It is estimated that in 1983-8 about 100 breaches of the code were detected each year. In the 18 years 1972-88 the Medicines Act was probably breached over 1200 times. Health ministers, by not enforcing the regulations controlling promotion, have abrogated their responsibility to the ABPI, but the evidence suggests that the code has failed to deter promotional excesses. The ABPI's wish to secure compliance with the code seems weaker than its wish to pre-empt outside criticism and action: its self regulation seems to be a service to itself rather than to the public. It is suggested that the code of practice committee should become publicly accountable, that the majority of its members should represent the health professions and the public, and that effective sanctions are needed. PMID:2106963
Witham, Claire L; Baker, Stuart N
2015-01-01
There is considerable debate over whether the brain codes information using neural firing rate or the fine-grained structure of spike timing. We investigated this issue in spike discharge recorded from single units in the sensorimotor cortex, deep cerebellar nuclei, and dorsal root ganglia in macaque monkeys trained to perform a finger flexion task. The task required flexion to four different displacements against two opposing torques; the eight possible conditions were randomly interleaved. We used information theory to assess coding of task condition in spike rate, discharge irregularity, and spectral power in the 15- to 25-Hz band during the period of steady holding. All three measures coded task information in all areas tested. Information coding was most often independent between irregularity and 15-25 Hz power (60% of units), moderately redundant between spike rate and irregularity (56% of units redundant), and highly redundant between spike rate and power (93%). Most simultaneously recorded unit pairs coded using the same measure independently (86%). Knowledge of two measures often provided extra information about task, compared with knowledge of only one alone. We conclude that sensorimotor systems use both rate and temporal codes to represent information about a finger movement task. As well as offering insights into neural coding, this work suggests that incorporating spike irregularity into algorithms used for brain-machine interfaces could improve decoding accuracy. Copyright © 2015 the American Physiological Society.
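The information measures reported above can be illustrated with a plug-in estimator: discretize a response measure (spike count, irregularity, or band power) and compute its mutual information with the task condition. The sketch below is a bare-bones version running on assumed toy data; the study's method additionally handles estimation bias and the redundancy between measures.

```python
import numpy as np

def mutual_information(conditions, measure, n_bins=8):
    """Plug-in estimate (bits) of the information a discretized response
    measure (e.g. spike count, irregularity, or 15-25 Hz power) carries
    about the task condition (integer labels 0..C-1)."""
    edges = np.quantile(measure, np.linspace(0, 1, n_bins + 1)[1:-1])
    binned = np.digitize(measure, edges)
    joint = np.zeros((len(set(conditions)), n_bins))
    for c, b in zip(conditions, binned):
        joint[c, b] += 1
    p = joint / joint.sum()
    ps, pr = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    return (p[nz] * np.log2(p[nz] / (ps @ pr)[nz])).sum()

# Toy data: 8 interleaved task conditions, counts weakly tuned to condition
rng = np.random.default_rng(0)
cond = rng.integers(0, 8, 2000)
rate = rng.poisson(10 + cond)          # spike counts in the hold period
print(mutual_information(cond, rate))
```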
Xenobiology: State-of-the-Art, Ethics, and Philosophy of New-to-Nature Organisms.
Schmidt, Markus; Pei, Lei; Budisa, Nediljko
The basic chemical constitution of all living organisms in the context of carbon-based chemistry consists of a limited number of small molecules and polymers. Until the twenty-first century, biology was mainly an analytical science; it has now reached a point where it merges with engineering science, paving the way for synthetic biology. One of the objectives of synthetic biology is to try to change the chemical compositions of living cells, that is, to create an artificial biological diversity, which in turn fosters a new sub-field of synthetic biology, xenobiology. In particular, the genetic code in living systems is based on highly standardized chemistry composed of the same "letters" or nucleotides as informational polymers (DNA, RNA) and the 20 amino acids which serve as basic building blocks for proteins. The universality of the genetic code enables not only vertical gene transfer within the same species but also horizontal gene transfer across biological taxa, which requires a high degree of standardization and interconnectivity. Although some minor alterations of the standard genetic code are found in nature (e.g., proteins containing non-canonical amino acids exist in nature, and some organisms use altered coding systems), all structurally deep chemistry changes within living systems are generally lethal, making the creation of artificial biological systems an extremely difficult challenge. In this context, one of the great challenges for bioscience is the development of a strategy for expanding the standard basic chemical repertoire of living cells. Attempts to alter the meaning of the genetic information stored in DNA as an informational polymer by changing the chemistry of the polymer (i.e., xeno-nucleic acids) or by changes in the genetic code have already yielded successful results. In the future this should enable the partial or full redirection of the biological information flow to generate "new" version(s) of the genetic code derived from the "old" biological world. In addition to the scientific challenges, the attempt to increase biochemical diversity also raises important ethical and philosophical issues. Although promoters of this branch of synthetic biology highlight the many potential applications to come (e.g., novel tools for diagnostics and fighting infectious diseases), such developments could also bring risks affecting social, political, and other structures of nearly all societies.
A novel use of QR code stickers after orthopaedic cast application.
Gough, A T; Fieraru, G; Gaffney, Pav; Butler, M; Kincaid, R J; Middleton, R G
2017-07-01
INTRODUCTION We present a novel solution to ensure that information and contact details are always available to patients while in cast. An information sticker containing both telephone numbers and a Quick Response (QR) code is applied to the cast. When scanned with a smartphone, the QR code loads the plaster team's webpage. This contains information and videos about cast care, complications and enhancing recovery. METHODS A sticker was designed and applied to all synthetic casts fitted in our fracture clinic. On cast removal, patients completed a questionnaire about the sticker. A total of 101 patients were surveyed between November 2015 and February 2016. The questionnaire comprised ten binary choice questions. RESULTS The vast majority (97%) of patients had the sticker still on their cast when they returned to clinic for cast removal. Eighty-four per cent of all patients felt reassured by the presence of the QR code sticker. Nine per cent used the contact details on the cast to seek advice. Over half (56%) had a smartphone and a third (33%) of these scanned the QR code. Of those who scanned the code, 95% found the information useful. CONCLUSIONS This study indicates that use of a QR code reassures patients and is an effective tool in the proactive management of potential cast problems. The QR code sticker is now applied to all casts across our trust. In line with NHS England's Five Year Forward View calling for enhanced use of smartphone technology, our trust is continuing to expand its portfolio of patient information accessible via QR codes. Other branches of medicine may benefit from incorporating QR codes as portals to access such information.
Performance Bounds on Two Concatenated, Interleaved Codes
NASA Technical Reports Server (NTRS)
Moision, Bruce; Dolinar, Samuel
2010-01-01
A method has been developed of computing bounds on the performance of a code comprised of two linear binary codes generated by two encoders serially concatenated through an interleaver. Originally intended for use in evaluating the performances of some codes proposed for deep-space communication links, the method can also be used in evaluating the performances of short-block-length codes in other applications. The method applies, more specifically, to a communication system in which the following processes take place: At the transmitter, the original binary information that one seeks to transmit is first processed by an encoder into an outer code (Co) characterized by, among other things, a pair of numbers (n, k), where n (n > k) is the total number of code bits associated with k information bits, and n − k bits are used for correcting or at least detecting errors. Next, the outer code is processed through either a block or a convolutional interleaver. In the block interleaver, the words of the outer code are processed in blocks of I words. In the convolutional interleaver, the interleaving operation is performed bit-wise in N rows with delays that are multiples of B bits. The output of the interleaver is processed through a second encoder to obtain an inner code (Ci) characterized by (ni, ki). The output of the inner code is transmitted over an additive white Gaussian noise channel characterized by a symbol signal-to-noise ratio (SNR) Es/N0 and a bit SNR Eb/N0. At the receiver, an inner decoder generates estimates of bits. Depending on whether a block or a convolutional interleaver is used at the transmitter, the sequence of estimated bits is processed through a block or a convolutional de-interleaver, respectively, to obtain estimates of code words. Then the estimates of the code words are processed through an outer decoder, which generates estimates of the original information along with flags indicating which estimates are presumed to be correct and which are found to be erroneous. From the perspective of the present method, the topic of major interest is the performance of the communication system as quantified in the word-error rate and the undetected-error rate as functions of the SNRs and the total latency of the interleaver and inner code. The method is embodied in equations that describe bounds on these functions. Throughout the derivation of the equations that embody the method, it is assumed that the decoder for the outer code corrects any error pattern of t or fewer errors, detects any error pattern of s or fewer errors, may detect some error patterns of more than s errors, and does not correct any patterns of more than t errors. Because a mathematically complete description of the equations that embody the method and of their derivation would greatly exceed the space available for this article, it must suffice to report that the derivation includes consideration of several complex issues, including relationships between latency and memory requirements for block and convolutional codes, burst error statistics, enumeration of error-event intersections, and effects of different interleaving depths. In a demonstration, the method was used to calculate bounds on the performances of several communication systems, each based on serial concatenation of a (63,56) expurgated Hamming code with a convolutional inner code through a convolutional interleaver.
The bounds calculated by use of the method were compared with results of numerical simulations of the performances of the systems to show the regions where the bounds are tight.
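A minimal numerical illustration of the bounded-distance decoding assumption stated above: if the outer decoder corrects any pattern of t or fewer symbol errors and none with more, the word-error rate on a memoryless channel with symbol-error probability p is bounded by a binomial tail. The (63,56) expurgated Hamming code has minimum distance 4, giving t = 1 here; this toy bound ignores the inner code and the interleaver, which the full method accounts for.

```python
from math import comb

def word_error_bound(n, t, p):
    """P(word error) <= P(more than t of the n symbols are in error) for a
    bounded-distance decoder on a memoryless channel with symbol-error
    probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(t + 1, n + 1))

# (63,56) expurgated Hamming outer code: d = 4, so t = 1 correctable error
for p in (1e-2, 1e-3, 1e-4):
    print(p, word_error_bound(63, 1, p))
```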
On codes with multi-level error-correction capabilities
NASA Technical Reports Server (NTRS)
Lin, Shu
1987-01-01
In conventional coding for error control, all the information symbols of a message are regarded as equally significant, and hence codes are devised to provide equal protection for each information symbol against channel errors. However, on some occasions, some information symbols in a message are more significant than others. As a result, it is desirable to devise codes with multi-level error-correcting capabilities. Another situation where codes with multi-level error-correcting capabilities are desired is in broadcast communication systems. An m-user broadcast channel has one input and m outputs. The single input and each output form a component channel. The component channels may have different noise levels, and hence the messages transmitted over the component channels require different levels of protection against errors. Block codes with multi-level error-correcting capabilities are also known as unequal error protection (UEP) codes. Structural properties of these codes are derived. Based on these structural properties, two classes of UEP codes are constructed.
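As a toy illustration of unequal error protection, and not of the structural constructions derived in the paper, the sketch below simply repeats the significant bits more often than the ordinary ones and decodes by majority vote, so the two message parts see different error rates on the same channel.

```python
import numpy as np

def uep_encode(important_bits, ordinary_bits, r_hi=5, r_lo=3):
    """Toy unequal error protection: repeat significant bits more often."""
    return np.concatenate([np.repeat(important_bits, r_hi),
                           np.repeat(ordinary_bits, r_lo)])

def uep_decode(codeword, n_imp, n_ord, r_hi=5, r_lo=3):
    """Majority-vote decoding of the two protection levels."""
    head, tail = codeword[:n_imp * r_hi], codeword[n_imp * r_hi:]
    maj = lambda blk, r: (blk.reshape(-1, r).sum(axis=1) > r // 2).astype(int)
    return maj(head, r_hi), maj(tail, r_lo)

rng = np.random.default_rng(1)
imp, ordinary = rng.integers(0, 2, 4), rng.integers(0, 2, 8)
cw = uep_encode(imp, ordinary)
noisy = cw ^ (rng.random(cw.size) < 0.1).astype(int)  # binary symmetric channel
print(uep_decode(noisy, 4, 8))
```

The important bits survive a 10% bit-flip channel more reliably than the ordinary ones; real UEP codes achieve the same asymmetry far more efficiently through code structure rather than repetition.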
Air Traffic Controller Working Memory: Considerations in Air Traffic Control Tactical Operations
1993-09-01
[Table-of-contents and body excerpts recovered from the source document] The report covers the human information processing system and air traffic controller memory, including visual, phonetic, and semantic memory codes. Its aim is to raise an awareness of the memory requirements of ATC tactical operations by presenting information on working memory processes that are relevant to them; working memory permeates every aspect of the controller's ability to process air traffic information and control live traffic.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerr, P. L.
An SNM attribute Information Barrier (IB) system was developed for a 2011 US/UK exercise. The system was modified and extensively tested in a 2013-2014 US-UK measurement campaign. This work demonstrated rapid deployment of an IB system for potential treaty use. The system utilizes an Ortec Fission Meter neutron multiplicity counter and custom computer code. The system demonstrates a proof-of-principle automated Pu-240 mass determination with an information barrier. After a software start command is issued, the system automatically acquires and downloads data, performs an analysis, and displays the results. The system conveys the results of Pu mass threshold measurements in a way that does not reveal sensitive information. In full IB mode, only red/green 'lights' are displayed in the software. In test mode, more detailed information is displayed. The code can also read in, analyze, and display results from previously acquired or simulated data. Because the equipment is commercial off-the-shelf (COTS), the system demonstrates a low-cost, short-lead-time technology for treaty SNM attribute measurements. A deployed system will likely require integration of additional authentication and tamper-indicating technologies. This will be discussed for the project in this and future progress reports.
Revilla-López, Guillem; Rodríguez-Ropero, Francisco; Curcó, David; Torras, Juan; Calaza, M. Isabel; Zanuy, David; Jiménez, Ana I.; Cativiela, Carlos; Nussinov, Ruth; Alemán, Carlos
2011-01-01
Recently, we reported a database (NCAD, Non-Coded Amino acids Database; http://recerca.upc.edu/imem/index.htm) that was built to compile information about the intrinsic conformational preferences of non-proteinogenic residues determined by quantum mechanical calculations, as well as bibliographic information about their synthesis, physical and spectroscopic characterization, the experimentally established conformational propensities, and applications (J. Phys. Chem. B 2010, 114, 7413). The database initially contained the information available for α-tetrasubstituted α-amino acids. In this work, we extend NCAD to three families of compounds, which can be used to engineer peptides and proteins incorporating modifications at the –NHCO– peptide bond. These families are: N-substituted α-amino acids, thio-α-amino acids, and diamines and diacids used to build retropeptides. The conformational preferences of these compounds have been analyzed and described based on the information captured in the database. In addition, we provide an example of the utility of the database and of the compounds it compiles in protein and peptide engineering. Specifically, the symmetry of a sequence engineered to stabilize the 3₁₀-helix with respect to the α-helix has been broken, without significantly perturbing the secondary structure, through targeted replacements using the information contained in the database. PMID:21491493
Entanglement-assisted quantum convolutional coding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilde, Mark M.; Brun, Todd A.
2010-04-15
We show how to protect a stream of quantum information from decoherence induced by a noisy quantum communication channel. We exploit preshared entanglement and a convolutional coding structure to develop a theory of entanglement-assisted quantum convolutional coding. Our construction produces a Calderbank-Shor-Steane (CSS) entanglement-assisted quantum convolutional code from two arbitrary classical binary convolutional codes. The rate and error-correcting properties of the classical convolutional codes directly determine the corresponding properties of the resulting entanglement-assisted quantum convolutional code. We explain how to encode our CSS entanglement-assisted quantum convolutional codes starting from a stream of information qubits, ancilla qubits, and shared entangled bits.
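The quantum construction itself is beyond a short sketch, but its classical ingredient is easy to show: a binary convolutional encoder whose rate and error-correcting properties, per the abstract, carry over to the resulting entanglement-assisted code. The (7,5) generator pair below is a common textbook choice, used here purely as an example and not taken from the paper.

```python
def conv_encode(bits, generators=(0o7, 0o5)):
    """Rate-1/2 binary convolutional encoder (constraint length 3): the kind
    of classical code that seeds the CSS entanglement-assisted construction.
    Generator polynomials are given in octal and are illustrative."""
    state, out = 0, []
    k = max(g.bit_length() for g in generators)
    for b in bits:
        state = ((state << 1) | b) & ((1 << k) - 1)  # shift in the new bit
        for g in generators:
            out.append(bin(state & g).count("1") % 2)  # parity of taps
    return out

print(conv_encode([1, 0, 1, 1]))  # two output bits per input bit
```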
Coding for urologic office procedures.
Dowling, Robert A; Painter, Mark
2013-11-01
This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff. Copyright © 2013 Elsevier Inc. All rights reserved.
Journalism as health education: media coverage of a nonbranded pharma web site.
Mackert, Michael; Love, Brad; Holton, Avery E
2011-03-01
As healthcare consumers increasingly use the Internet as a source of health information, direct-to-consumer (DTC) prescription drug advertising online merits additional attention. The purpose of this research was to investigate media coverage of the joint marketing program linking the movie Happy Feet and the nonbranded disease-education Web site FluFacts, a resource from Tamiflu flu treatment manufacturer Roche Laboratories Inc. Twenty-nine articles (n = 29) were found covering the Happy Feet-FluFacts marketing campaign. A coding guide was developed to assess elements of the articles, including those common in the sample and information that ideally would be included in such articles. Two coders independently coded the articles, achieving intercoder agreement of κ = 0.98 before resolving disagreements to arrive at a final dataset. The majority of articles reported that Roche operated FluFacts (51.7%) and mentioned the product Tamiflu (58.6%). Almost half (48.3%) reported FluFacts was an educational resource; yet no articles mentioned other antiviral medications or nonmedical options for preventing the flu. Almost a quarter of the articles (24.1%) provided a call to action, telling readers to visit FluFacts or providing a link for them to do so. Findings suggest that journalists' coverage of this novel campaign, likely one of the goals of the campaign, helped spread the message of the Happy Feet-FluFacts relationship while often omitting other useful health information. Additional research is needed to better understand online DTC campaigns, how consumers react to these campaigns and the resulting media coverage, and how to inform policymakers' decisions regarding DTC advertising online.
Yu, Lianchun; Shen, Zhou; Wang, Chen; Yu, Yuguo
2018-01-01
Selective pressure may drive neural systems to process as much information as possible at the lowest energy cost. Recent experimental evidence revealed that the ratio between synaptic excitation and inhibition (E/I) in local cortex is generally maintained at a certain value, which may influence the efficiency of energy consumption and information transmission of neural networks. To understand this issue more deeply, we constructed a typical recurrent Hodgkin-Huxley network model and studied the general principles that govern the relationship among the E/I synaptic current ratio, the energy cost, and the total amount of information transmission. We observed in such a network that there exists an optimal E/I synaptic current ratio at which information transmission achieves its maximum with relatively low energy cost. The coding energy efficiency, defined as the mutual information divided by the energy cost, achieved its maximum with the balanced synaptic current. Although background noise degrades information transmission and imposes an additional energy cost, we find an optimal noise intensity that yields the largest information transmission and energy efficiency at this optimal E/I synaptic transmission ratio. The maximization of energy efficiency also requires a certain part of the energy cost associated with spontaneous spiking and synaptic activities. We further proved this finding with an analytical solution based on the response function of bistable neurons, and demonstrated that optimal net synaptic currents are capable of maximizing both the mutual information and the energy efficiency. These results revealed that the development of E/I synaptic current balance could lead a cortical network to operate at a highly efficient information transmission rate with a relatively low energy cost. The generality of the neuronal models and the recurrent network configuration used here suggests that the existence of an optimal E/I ratio for highly efficient energy costs and information maximization is a potential principle for cortical circuit networks. Summary: We conducted numerical simulations and mathematical analysis to examine the energy efficiency of neural information transmission in a recurrent network as a function of the ratio of excitatory and inhibitory synaptic connections. We obtained a general solution showing that there exists an optimal E/I synaptic ratio in a recurrent network at which the information transmission, as well as the energy efficiency of the network, achieves a global maximum. These results reflect general mechanisms for sensory coding processes, which may give insight into the energy efficiency of neural communication and coding. PMID:29773979
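To make the quantity being optimized concrete, here is a cartoon of the reported scan with a stand-in response model rather than the Hodgkin-Huxley network: for each E/I ratio, estimate the mutual information between a binary stimulus and the thresholded response, divide by an activity-based energy proxy, and pick the ratio maximizing the quotient. All constants are invented for illustration.

```python
import numpy as np

def mutual_info(x, y):
    """Plug-in mutual information (bits) between two binary variables."""
    joint = np.histogram2d(x, y, bins=2)[0] / len(x)
    px, py = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
    nz = joint > 0
    return (joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum()

def efficiency_scan(ratios, n_trials=4000, seed=0):
    """For each E/I ratio, estimate stimulus information in a thresholded
    response and divide by an activity-based energy proxy; the response
    model is a cartoon, not the Hodgkin-Huxley network of the paper."""
    rng = np.random.default_rng(seed)
    results = []
    for r in ratios:
        stim = rng.integers(0, 2, n_trials)
        drive = (r - 1.0) + 0.8 * stim + rng.normal(0.0, 0.5, n_trials)
        spike = (drive > 0.5).astype(int)     # fire when net drive is high
        energy = 1.0 + 4.0 * spike.mean()     # resting cost + activity cost
        results.append((r, mutual_info(stim, spike) / energy))
    return max(results, key=lambda t: t[1])   # most efficient E/I ratio

print(efficiency_scan(np.linspace(0.5, 2.0, 16)))
```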
Schinkel, Sanne; Schouten, Barbara C; van Weert, Julia C M
2013-02-01
This study aims to assess the unfulfilled information needs of native-Dutch and Turkish-Dutch general practitioner (GP) patients in the Netherlands. In addition, the relation between perceived and recorded information provision by GPs is studied. Unfulfilled information needs of native-Dutch (N=117) and Turkish-Dutch patients (N=74) were assessed through pre- and post-consultation questionnaires. Audiotapes of GP consultations were made to code GPs' information provision. Turkish-Dutch patients experience more unfulfilled information needs than native-Dutch patients, in particular those who identify equally with Dutch and Turkish culture. Overall, perceived information provision is hardly related to recorded information provision. GPs insufficiently provide Turkish-Dutch patients, and to a lesser extent native-Dutch patients as well, with the information they need. GPs should be trained in giving adequate, tailored information to patients with various ethnic and cultural backgrounds. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Exploring the role of contextual information in bloodstain pattern analysis: A qualitative approach.
Osborne, Nikola K P; Taylor, Michael C; Zajac, Rachel
2016-03-01
During Bloodstain Pattern Analysis (BPA), an analyst may encounter various sources of contextual information. Although contextual bias has emerged as a valid concern for the discipline, little is understood about how contextual information informs BPA. To address this issue, we asked 15 experienced bloodstain pattern analysts from New Zealand and Australia to think aloud as they classified bloodstain patterns from two homicide cases. Analysts could request items of contextual information and were required to state how each item would inform their analysis. Pathology reports and additional photographs of the scene were the most commonly requested items of information. We coded analysts' reasons for requesting contextual information, and the way in which they integrated this information, using thematic analysis. We identified considerable variation in both of these variables, raising important questions about the role and necessity of contextual information in decisions about bloodstain pattern evidence. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Lawrence, Renée H; Tomolo, Anne M
2011-03-01
Although practice-based learning and improvement (PBLI) is now recognized as a fundamental and necessary skill set, we are still in need of tools that yield specific information about gaps in knowledge and application to help nurture the development of quality improvement (QI) skills in physicians in a proficient and proactive manner. We developed a questionnaire and coding system as an assessment tool to evaluate and provide feedback regarding PBLI self-efficacy, knowledge, and application skills for residency programs and related professional requirements. Five nationally recognized QI experts/leaders reviewed and completed our questionnaire. Through an iterative process, a coding system based on identifying key variables needed for ideal responses was developed to score project proposals. The coding system comprised 14 variables related to the QI projects, and an additional 30 variables related to the core knowledge concepts related to PBLI. A total of 86 residents completed the questionnaire, and 2 raters coded their open-ended responses. Interrater reliability was assessed by percentage agreement and Cohen κ for individual variables and Lin concordance correlation for total scores for knowledge and application. Discriminative validity (t test to compare known groups) and coefficient of reproducibility as an indicator of construct validity (item difficulty hierarchy) were also assessed. Interrater reliability estimates were good (percentage of agreements, above 90%; κ, above 0.4 for most variables; concordances for total scores were R = .88 for knowledge and R = .98 for application). Despite the residents' limited range of experiences in the group with prior PBLI exposure, our tool met our goal of differentiating between the 2 groups in our preliminary analyses. Correcting for chance agreement identified some variables that are potentially problematic. Although additional evaluation is needed, our tool may prove helpful and provide detailed information about trainees' progress and the curriculum.
Potential roles of cholinergic modulation in the neural coding of location and movement speed
Dannenberg, Holger; Hinman, James R.; Hasselmo, Michael E.
2016-01-01
Behavioral data suggest that cholinergic modulation may play a role in certain aspects of spatial memory, and neurophysiological data demonstrate neurons that fire in response to spatial dimensions, including grid cells and place cells that respond on the basis of location and running speed. These neurons show firing responses that depend upon the visual configuration of the environment, due to coding in visually-responsive regions of the neocortex. This review focuses on the physiological effects of acetylcholine that may influence the sensory coding of spatial dimensions relevant to behavior. In particular, the local circuit effects of acetylcholine within the cortex regulate the influence of sensory input relative to internal memory representations, via presynaptic inhibition of excitatory and inhibitory synaptic transmission, and the modulation of intrinsic currents in cortical excitatory and inhibitory neurons. In addition, circuit effects of acetylcholine regulate the dynamics of cortical circuits including oscillations at theta and gamma frequencies. These effects of acetylcholine on local circuits and network dynamics could underlie the role of acetylcholine in coding of spatial information for the performance of spatial memory tasks. PMID:27677935
MCNP Version 6.2 Release Notes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Werner, Christopher John; Bull, Jeffrey S.; Solomon, C. J.
Monte Carlo N-Particle, or MCNP®, is a general-purpose Monte Carlo radiation-transport code designed to track many particle types over broad ranges of energies. MCNP Version 6.2 follows the MCNP6.1.1 beta version and has been released to provide the radiation transport community with the latest feature developments and bug fixes for MCNP. Since the last release of MCNP, major work has been conducted to improve the code base, add features, and provide tools that facilitate ease of use of MCNP version 6.2 as well as the analysis of results. These release notes serve as a general guide for the new and improved physics, source, data, tallies, unstructured mesh, code enhancements, and tools. For more detailed information on each of the topics, please refer to the appropriate references or the user manual, which can be found at http://mcnp.lanl.gov. This release of MCNP version 6.2 contains 39 new features in addition to 172 bug fixes and code enhancements. There are still 33 known issues with which users should familiarize themselves (see Appendix).
Information quality measurement of medical encoding support based on usability.
Puentes, John; Montagner, Julien; Lecornu, Laurent; Cauvin, Jean-Michel
2013-12-01
Medical encoding support systems for diagnoses and medical procedures are an emerging technology that is beginning to play a key role in billing, reimbursement, and health policy decisions. A significant problem in exploiting these systems is how to measure the appropriateness of any automatically generated list of codes in terms of fitness for use, i.e. their quality. Until now, only information retrieval performance measurements have been applied to estimate the accuracy of code lists as a quality indicator. Such measurements do not give the value of code lists for practical medical encoding and cannot be used to globally compare the quality of multiple code lists. This paper defines and validates a new encoding information quality measure that addresses the problem of measuring the quality of medical code lists. It is based on a usability study of how expert coders and physicians apply computer-assisted medical encoding. The proposed measure, named ADN, evaluates the Accuracy, Dispersion and Noise of a code list, and is adapted to the variable length and content of generated code lists, coping with the limitations of previous measures. According to the ADN measure, the information quality of a code list is fully represented by a single point within a suitably constrained feature space. Using one scheme, our approach can reliably measure and compare the information quality of hundreds of code lists, showing their practical value for medical encoding. Its pertinence is demonstrated by simulation and by application to real data corresponding to 502 inpatient stays in four clinic departments. Results are compared to the consensus of three expert coders who also coded this anonymized database of discharge summaries, and to five information retrieval measures. Information quality assessment applying the ADN measure showed the degree of encoding-support system variability from one clinic department to another, providing a global evaluation of quality measurement trends. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
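The abstract names the three components of ADN but not their formulas, so the sketch below is a stand-in with assumed definitions (accuracy as recall of the expert codes, dispersion as mean normalized rank of the correct codes in the list, noise as the fraction of spurious codes), shown only to make the "single point in a constrained feature space" idea concrete.

```python
def adn(suggested, reference):
    """Toy stand-in for the ADN triple; these formulas are assumptions for
    illustration, not the measure defined in the paper."""
    suggested, reference = list(suggested), set(reference)
    hits = [i for i, c in enumerate(suggested) if c in reference]
    accuracy = len(set(suggested) & reference) / len(reference)
    # dispersion: how far down the list the correct codes sit (0 = at top)
    dispersion = sum(hits) / (len(hits) * len(suggested)) if hits else 1.0
    noise = sum(c not in reference for c in suggested) / len(suggested)
    return accuracy, dispersion, noise

# Hypothetical generated list vs. expert consensus codes
print(adn(["I63.9", "R51", "I10", "E11.9"], {"I63.9", "I10"}))
```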
Health information management: an introduction to disease classification and coding.
Mony, Prem Kumar; Nagaraj, C
2007-01-01
Morbidity and mortality data constitute an important component of a health information system, and their coding enables uniform data collation and analysis as well as meaningful comparisons between regions or countries. Strengthening the recording and reporting systems for health monitoring is a basic requirement for an efficient health information management system. Increased advocacy for and awareness of a uniform coding system, together with adequate capacity building of physicians, coders, and other allied health and information technology personnel, would pave the way for a valid and reliable health information management system in India. The core requirements for the implementation of disease coding are: (i) support from national/institutional health administrators; (ii) widespread availability of the ICD-10 material for morbidity and mortality coding; (iii) enhanced human and financial resources; and (iv) optimal use of informatics. We describe the methodology of a disease classification and codification system as well as its applications for developing and maintaining an effective health information management system for India.
Allocentric information is used for memory-guided reaching in depth: A virtual reality study.
Klinghammer, Mathias; Schütz, Immo; Blohm, Gunnar; Fiehler, Katja
2016-12-01
Previous research has demonstrated that humans use allocentric information when reaching to remembered visual targets, but most studies have been limited to 2D space. Here, we study allocentric coding of memorized reach targets in 3D virtual reality. In particular, we investigated the use of allocentric information for memory-guided reaching in depth and the role of binocular and monocular (object size) depth cues for coding object locations in 3D space. To this end, we presented a scene with objects on a table, located at different distances from the observer, which served as reach targets or allocentric cues. After free visual exploration of this scene and a short delay, the scene reappeared with one object missing (the reach target). In addition, the remaining objects were shifted horizontally or in depth. When objects were shifted in depth, we also independently manipulated object size by either magnifying or reducing it. After the scene vanished, participants reached to the remembered target location on the blank table. Reaching endpoints deviated systematically in the direction of object shifts, similar to our previous results from 2D presentations. This deviation was stronger for object shifts in depth than in the horizontal plane and was independent of observer-target distance. Reaching endpoints also varied systematically with changes in object size. Our results suggest that allocentric information is used for coding targets for memory-guided reaching in depth, with retinal disparity and vergence as well as object size providing important binocular and monocular depth cues. Copyright © 2016 Elsevier Ltd. All rights reserved.
Coherent concepts are computed in the anterior temporal lobes.
Lambon Ralph, Matthew A; Sage, Karen; Jones, Roy W; Mayberry, Emily J
2010-02-09
In his Philosophical Investigations, Wittgenstein famously noted that the formation of semantic representations requires more than a simple combination of verbal and nonverbal features to generate conceptually based similarities and differences. Classical and contemporary neuroscience has tended to focus upon how different neocortical regions contribute to conceptualization through the summation of modality-specific information. The additional yet critical step of computing coherent concepts has received little attention. Some computational models of semantic memory are able to generate such concepts by the addition of modality-invariant information coded in a multidimensional semantic space. By studying patients with semantic dementia, we demonstrate that this aspect of semantic memory becomes compromised following atrophy of the anterior temporal lobes and, as a result, the patients become increasingly influenced by superficial rather than conceptual similarities.
Craig, Elizabeth; Kerr, Neal; McDonald, Gabrielle
2017-03-01
In New Zealand, there is a paucity of information on children with chronic conditions and disabilities (CCD). One reason is that many are managed in hospital outpatient clinics, where diagnostic coding of health-care events does not occur. This study explores the feasibility of coding paediatric outpatient data to provide health planners with information on children with CCD. Thirty-seven clinicians from six District Health Boards (DHBs) trialled coding over 12 weeks. In five DHBs, the International Classification of Diseases and Related Health Problems, 10th Edition, Australian Modification (ICD-10-AM) and the Systematised Nomenclature of Medicine Clinical Terms (SNOMED-CT) were trialled for 6 weeks each. In one DHB, ICD-10-AM was trialled for 12 weeks. A random sample (30%) of ICD-10-AM coded events was also coded by clinical coders. A mix of paper and electronic methods was used. In total, 2,604 outpatient events were coded in ICD-10-AM and 693 in SNOMED-CT. Dual coding occurred for 770 (29.6%) ICD-10-AM events. Overall, 34% of ICD-10-AM and 40% of SNOMED-CT events were for developmental and behavioural disorders. Chronic medical conditions were also common. Clinicians were concerned about the workload impacts, particularly for paper-based methods. Coders were concerned about clinicians' adherence to coding guidelines and the poor quality of documentation in some notes. Coded outpatient data could provide planners with a rich source of information on children with CCD. However, coding is also resource intensive; thus its costs need to be weighed against the costs of managing a much larger health budget using very limited information. © 2016 Paediatrics and Child Health Division (The Royal Australasian College of Physicians).
Codes of Discipline: Developments, Dimensions, Directions.
ERIC Educational Resources Information Center
Goldsmith, Arthur H.
1982-01-01
Well-drafted codes of discipline can help to eliminate the ambiguity and arbitrariness that often have been associated with school discipline. Discipline codes should be characterized by fairness, fact-finding provisions, completeness of information, frankness, flexibility, informality, firmness, concern with disciplinary suitability,…
Deployable and Inflatable Fendering Apparatus and Method
2009-09-25
[Patent document excerpts recovered from the source] Technology-transfer inquiries are directed to the Technology Partnership Enterprise Office, Naval Undersea Warfare Center, 1176 Howell St., Code 07TP. The invention addresses impact and abrasion damage to a watercraft's hull or other marine structures, noting the limitations of many existing watercraft fender designs. The same air that causes inflation of the bladder also creates additional pressure maintaining the bladder coupled to the base; an air compressor is among the components described.
Extending Case-Based Reasoning (CBR) Approaches to Semi-automated Network Alert Reporting
2013-04-01
[Excerpts from a sample network alert recovered from the source] A host connecting to the flagged domain is likely infected with malware, or may have been exposed to malicious code. Detailed information: the Sourcefire VRT tracks domains believed to be generated by malware; after applying an extensive whitelist, the VRT pulls out the most commonly visited domains and adds them to its list of indicators of malicious software. The VRT recommends ClamAV for Windows 3.0. Contributors: Sourcefire Vulnerability Research Team.
User's Manual for LEWICE Version 3.2
NASA Technical Reports Server (NTRS)
Wright, William
2008-01-01
A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 3.2 of this software, which is called LEWICE. This version differs from release 2.0 due to the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air applications, the addition of automated Navier-Stokes analysis, an empirical model for supercooled large droplets (SLD) and a pneumatic boot option. An extensive effort was also undertaken to compare the results against the database of electrothermal results which have been generated in the NASA Glenn Icing Research Tunnel (IRT) as was performed for the validation effort for version 2.0. This report will primarily describe the features of the software related to the use of the program. Appendix A has been included to list some of the inner workings of the software or the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this software.
Chazard, Emmanuel; Mouret, Capucine; Ficheur, Grégoire; Schaffar, Aurélien; Beuscart, Jean-Baptiste; Beuscart, Régis
2014-04-01
Medical free-text records provide rich information about patients, but often need to be de-identified by removing Protected Health Information (PHI) whenever identification of the patient is not required. Pattern matching techniques require pre-defined dictionaries, and machine learning techniques require an extensive training set. Methods exist in French, but they either give weak results or are not freely available. The objective is to define and evaluate FASDIM, a Fast And Simple De-Identification Method for French medical free-text records. FASDIM consists of removing all the words that are not present in an authorized word list, and removing all the numbers except those that match a list of protection patterns. The corresponding lists are extended over the course of the iterations of the method. For the evaluation, the workload is estimated during record de-identification. The efficiency of the de-identification is assessed by independent medical experts on 508 discharge letters that were randomly selected and de-identified by FASDIM. Finally, the letters are encoded before and after de-identification according to 3 terminologies (ATC, ICD10, CCAM) and the codes are compared. The construction of the list of authorized words is progressive: 12 h for the first 7,000 letters, 16 additional hours for 20,000 additional letters. The recall (proportion of PHI removed) is 98.1%, the precision (proportion of PHI among the removed tokens) is 79.6%, and the F-measure (harmonic mean) is 87.9%. On average, 30.6 terminology codes are encoded per letter, and 99.02% of those codes are preserved despite the de-identification. FASDIM achieves good results in French and is freely available. It is easy to implement and does not require any predefined dictionary. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
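Because the method is fully specified by two rules, a faithful toy version is short; the sketch below applies them to one sentence. The authorized word list, the protection pattern, and the mask token are illustrative choices, and the real system builds its lists iteratively over thousands of letters.

```python
import re

NUMBER = re.compile(r"\d")

def fasdim(text, authorized, protection_patterns):
    """Sketch of the FASDIM rule described above: a word survives only if it
    is on the authorized list, a number only if it matches a protection
    pattern (e.g. a dosage); everything else is masked."""
    out = []
    for token in text.split():
        if NUMBER.search(token):
            keep = any(re.fullmatch(p, token) for p in protection_patterns)
            out.append(token if keep else "XXX")
        else:
            out.append(token if token.lower().strip(".,;:") in authorized
                       else "XXX")
    return " ".join(out)

# Illustrative run: the name and phone number are not on the authorized list
words = {"the", "patient", "received", "of", "amoxicillin", "daily"}
print(fasdim("Mr Dupont received 500mg of amoxicillin daily, tel 0612345678",
             words, [r"\d+mg"]))
```

The output masks "Mr Dupont" and the phone number while preserving the dosage and the clinically relevant words, mirroring the recall/precision trade-off the evaluation reports.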
Energy coding in biological neural networks
Zhang, Zhikang
2007-01-01
Motivated by experimental findings that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, we present a new theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Because the energy coding model reveals mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results of neuro-electrophysiology, but also quantitatively explain recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we estimate that the theory has very important consequences for quantitative research of cognitive function. PMID:19003513
Ekinci, Yunus Levent
2016-01-01
This paper presents an easy-to-use open source computer algorithm (code) for estimating the depths of isolated single thin dike-like source bodies by using numerical second-, third-, and fourth-order horizontal derivatives computed from observed magnetic anomalies. The approach does not require a priori information and uses some filters of successive graticule spacings. The computed higher-order horizontal derivative datasets are used to solve nonlinear equations for depth determination. The solutions are independent from the magnetization and ambient field directions. The practical usability of the developed code, designed in MATLAB R2012b (MathWorks Inc.), was successfully examined using some synthetic simulations with and without noise. The algorithm was then used to estimate the depths of some ore bodies buried in different regions (USA, Sweden, and Canada). Real data tests clearly indicated that the obtained depths are in good agreement with those of previous studies and drilling information. Additionally, a state-of-the-art inversion scheme based on particle swarm optimization produced comparable results to those of the higher-order horizontal derivative analyses in both synthetic and real anomaly cases. Accordingly, the proposed code is verified to be useful in interpreting isolated single thin dike-like magnetized bodies and may be an alternative processing technique. The open source code can be easily modified and adapted to suit the benefits of other researchers.
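The first processing step, computing successive horizontal derivatives of the observed anomaly, can be sketched numerically. The snippet below (Python, assuming NumPy) differentiates a synthetic profile; the anomaly model and spacing are illustrative, and the paper's nonlinear depth solver and graticule-spacing filters are not reproduced.

```python
import numpy as np

# Sketch of the derivative-computation step only: numerical second-, third-,
# and fourth-order horizontal derivatives of a magnetic anomaly profile.
# The synthetic anomaly and spacing are illustrative test inputs.
x = np.linspace(-500.0, 500.0, 401)      # profile coordinates (m)
dx = x[1] - x[0]
z = 40.0                                 # synthetic source depth (m)
anomaly = 1000.0 * z / (x**2 + z**2)     # simple symmetric test anomaly (nT)

d1 = np.gradient(anomaly, dx)            # first horizontal derivative
d2 = np.gradient(d1, dx)                 # second
d3 = np.gradient(d2, dx)                 # third
d4 = np.gradient(d3, dx)                 # fourth

# Higher-order derivatives sharpen the response over the source and suppress
# regional trends, which is what makes them useful for depth estimation.
peak = np.argmax(np.abs(d2))
print(f"second-derivative extremum at x = {x[peak]:.1f} m")
```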
Goz, Eli; Zafrir, Zohar; Tuller, Tamir
2018-04-30
Understanding how viruses co-evolve with their hosts and adapt various genomic level strategies in order to ensure their fitness may have essential implications in unveiling the secrets of viral evolution, and in developing new vaccines and therapeutic approaches. Here, based on a novel genomic analysis of 2,625 different viruses and 439 corresponding host organisms, we provide evidence of universal evolutionary selection for high-dimensional 'silent' patterns of information hidden in the redundancy of the viral genetic code. Our model suggests that long substrings of nucleotides in the coding regions of viruses from all classes often also repeat in the corresponding viral hosts from all domains of life. Selection for these substrings cannot be explained solely by phenomena such as codon usage bias, horizontal gene transfer, and the encoded proteins. Genes encoding structural proteins responsible for building the core of the viral particles were found to include more host-repeating substrings, and these substrings tend to appear in the middle parts of the viral coding regions. In addition, in human viruses these substrings tend to be enriched with motifs related to transcription factors and RNA-binding proteins. The host-repeating substrings are possibly related to the evolutionary pressure on viruses to interact effectively with the host's intracellular factors and to escape efficiently from the host's immune system. Contact: tamirtul@post.tau.ac.il (TT). Supplementary data are available at Bioinformatics online.
Multiple description distributed image coding with side information for mobile wireless transmission
NASA Astrophysics Data System (ADS)
Wu, Min; Song, Daewon; Chen, Chang Wen
2005-03-01
Multiple description coding (MDC) is a source coding technique that codes the source information into multiple descriptions and transmits them over different channels in a packet network or error-prone wireless environment, to achieve graceful degradation if parts of the descriptions are lost at the receiver. In this paper, we propose a multiple description distributed wavelet zerotree image coding system for mobile wireless transmission. We provide two innovations to achieve excellent error resilience. First, when MDC is applied to wavelet subband based image coding, it is possible to introduce correlation between the descriptions in each subband. We use this correlation, together with a potentially error-corrupted description, as side information in the decoding, formulating MDC decoding as a Wyner-Ziv decoding problem. If part of a description is lost but its correlation information is still available, the proposed Wyner-Ziv decoder can recover the description by using the correlation information and the error-corrupted description as side information. Secondly, within each description, a single-bitstream wavelet zerotree coder is very vulnerable to channel errors: the first bit error may cause the decoder to discard all subsequent bits, whether or not they are correctly received. We therefore integrate multiple description scalar quantization (MDSQ) with multiple wavelet tree image coding to reduce error propagation. We first group wavelet coefficients into multiple trees according to the parent-child relationship and then code them separately with the SPIHT algorithm to form multiple bitstreams. Such decomposition reduces error propagation and therefore improves the error-correcting capability of the Wyner-Ziv decoder. Experimental results show that the proposed scheme not only exhibits excellent error resilience but also demonstrates graceful degradation with respect to the packet loss rate.
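The tree-grouping step can be illustrated with a small sketch. The code below uses the standard SPIHT parent-child relation (valid outside the coarsest LL band) to collect each root's descendants, then assigns trees to descriptions round-robin; the round-robin split is an illustrative choice rather than the paper's exact grouping.

```python
# Sketch of grouping wavelet coefficients into multiple spatial-orientation
# trees. Outside the coarsest LL band, the standard SPIHT parent-child
# relation applies: parent (i, j) has children (2i, 2j), (2i, 2j+1),
# (2i+1, 2j), (2i+1, 2j+1).
SIZE, LL = 16, 4          # toy 16x16 transform with a 4x4 coarsest LL band

def descendants(i, j):
    """All descendants of coefficient (i, j) inside the SIZE x SIZE plane."""
    out, frontier = [], [(i, j)]
    while frontier:
        frontier = [(c, d)
                    for (a, b) in frontier
                    for (c, d) in ((2*a, 2*b), (2*a, 2*b+1),
                                   (2*a+1, 2*b), (2*a+1, 2*b+1))
                    if c < SIZE and d < SIZE]
        out.extend(frontier)
    return out

# Tree roots live in the three coarsest detail bands (HL, LH, HH), not in LL.
roots = [(i, j) for i in range(2 * LL) for j in range(2 * LL)
         if not (i < LL and j < LL)]

n_descriptions = 2
groups = {k: [] for k in range(n_descriptions)}
for idx, root in enumerate(roots):
    k = idx % n_descriptions              # round-robin over descriptions
    groups[k].append(root)
    groups[k].extend(descendants(*root))

print({k: len(v) for k, v in groups.items()})   # roughly equal-sized groups
```

Because each description then carries whole trees, a bit error in one bitstream is confined to the trees of that description rather than corrupting the entire image.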
Vector Adaptive/Predictive Encoding Of Speech
NASA Technical Reports Server (NTRS)
Chen, Juin-Hwey; Gersho, Allen
1989-01-01
A vector adaptive/predictive technique for digital encoding of speech signals yields decoded speech of very good quality after transmission at a coding rate of 9.6 kb/s, and of reasonably good quality at 4.8 kb/s, while requiring only 3 to 4 million multiplications and additions per second. It combines the advantages of adaptive/predictive coding with those of code-excited linear prediction, which yields speech of high quality but requires 600 million multiplications and additions per second at an encoding rate of 4.8 kb/s. The vector adaptive/predictive coding technique thus bridges the gaps in performance and complexity between adaptive/predictive coding and code-excited linear prediction.
Energy information data base: report number codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1979-09-01
Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with the codes each has used. (RWR)
Jiang, Zhehan; Skorupski, William
2017-12-12
In many behavioral research areas, multivariate generalizability theory (mG theory) has typically been used to investigate the reliability of certain multidimensional assessments. However, traditional mG-theory estimation, namely estimation using frequentist approaches, has limits, leading researchers to fail to take full advantage of the information that mG theory can offer regarding the reliability of measurements. Alternatively, Bayesian methods provide more information than frequentist approaches can offer. This article presents instructional guidelines on how to implement mG-theory analyses in a Bayesian framework; in particular, BUGS code is presented to fit commonly seen designs from mG theory, including single-facet designs, two-facet crossed designs, and two-facet nested designs. In addition to concrete examples that are closely related to the selected designs and the corresponding BUGS code, a simulated dataset is provided to demonstrate the utility and advantages of the Bayesian approach. This article is intended to serve as a tutorial reference for applied researchers and methodologists conducting mG-theory studies.
Distinct cortical codes and temporal dynamics for conscious and unconscious percepts
Salti, Moti; Monto, Simo; Charles, Lucie; King, Jean-Remi; Parkkonen, Lauri; Dehaene, Stanislas
2015-01-01
The neural correlates of consciousness are typically sought by comparing the overall brain responses to perceived and unperceived stimuli. However, this comparison may be contaminated by non-specific attention, alerting, performance, and reporting confounds. Here, we pursue a novel approach, tracking the neuronal coding of consciously and unconsciously perceived contents while keeping behavior identical (blindsight). EEG and MEG were recorded while participants reported the spatial location and visibility of a briefly presented target. Multivariate pattern analysis demonstrated that considerable information about spatial location traverses the cortex on blindsight trials, but that starting ≈270 ms post-onset, information unique to consciously perceived stimuli emerges in superior parietal and superior frontal regions. Conscious access appears characterized by the entry of the perceived stimulus into a series of additional brain processes, each restricted in time, while the failure of conscious access results in the breaking of this chain and a subsequent slow decay of the lingering unconscious activity. DOI: http://dx.doi.org/10.7554/eLife.05652.001 PMID:25997100
Zydziak, Nicolas; Konrad, Waldemar; Feist, Florian; Afonin, Sergii; Weidner, Steffen; Barner-Kowollik, Christopher
2016-01-01
Designing artificial macromolecules with absolute sequence order represents a considerable challenge. Here we report an advanced light-induced avenue to monodisperse sequence-defined functional linear macromolecules up to decamers via a unique photochemical approach. The versatility of the synthetic strategy—combining sequential and modular concepts—enables the synthesis of perfect macromolecules varying in chemical constitution and topology. Specific functions are placed at arbitrary positions along the chain via the successive addition of monomer units and blocks, leading to a library of functional homopolymers, alternating copolymers and block copolymers. The in-depth characterization of each sequence-defined chain confirms the precision nature of the macromolecules. Decoding of the functional information contained in the molecular structure is achieved via tandem mass spectrometry without recourse to their synthetic history, showing that the sequence information can be read. We submit that the presented photochemical strategy is a viable and advanced concept for coding individual monomer units along a macromolecular chain. PMID:27901024
Diehl, Geoffrey W.; Hon, Olivia J.; Leutgeb, Stefan; Leutgeb, Jill K.
2017-01-01
The medial entorhinal cortex (mEC) has been identified as a hub for spatial information processing by the discovery of grid, border, and head-direction cells. Here we find that in addition to these well-characterized classes, nearly all of the remaining two-thirds of mEC cells can be categorized as spatially selective. We refer to these cells as non-grid spatial cells and confirmed that their spatial firing patterns were unrelated to running speed and highly reproducible within the same environment. However, in response to manipulations of environmental features, such as box shape or box color, non-grid spatial cells completely reorganized their spatial firing patterns. At the same time, grid cells retained their spatial alignment and predominantly responded with redistributed firing rates across their grid fields. Thus, mEC contains a joint representation of both spatial and environmental feature content, with specialized cell types showing different types of integrated coding of multimodal information. PMID:28343867
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jansen, S.D.
1981-09-01
The report was prepared as part of the Ohio River Basin Energy Study (ORBES), a multidisciplinary policy research program. The ORBES region consists of all of Kentucky, most of West Virginia, substantial parts of Illinois, Indiana, and Ohio, and southwestern Pennsylvania. The inventory lists installed electrical generating capacity in commercial service as of December 1, 1976, and scheduled capacity additions and removals between 1977 and 1986 in the six ORBES states (Illinois, Indiana, Kentucky, Ohio, Pennsylvania, and West Virginia). The following information is included for each electrical generating unit: unit ID code, company index, whether joint or industrial ownership, plant name, whether inside or outside the ORBES region, FIPS county code, type of unit, size in megawatts, type of megawatt rating, status of unit, date of commercial operation (actual or scheduled), scheduled retirement date (if any), primary fuel, alternate fuel, type of cooling, source of cooling water, and source of information.
[Quality assurance using routine data. Is outcome quality now measurable?].
Kostuj, T; Smektala, R
2010-12-01
Health service quality in Germany can be shown by the data from the external quality assurance program (BQS), but as these records are limited to the period of in-hospital stay, no information about outcome after discharge from hospital can be obtained. Secondary routine administrative data contain information about long-term outcome, such as mortality, subsequent revision, and the need for care following surgical treatment due to a hip fracture. Experience in the use of secondary data dealing with treatment of hip fractures from the BQS is available in our department. In addition, we analyzed routine administrative data from the health insurance companies Knappschaft Bahn-See and AOK in a cooperative study with the WIdO (scientific institute of the AOK). These routine data clearly show a bias because of poor coding quality as well as the broad room for interpretation of some of the ICD-10 codes used. Consequently, quality assurance using routine data is less valid than register-based conclusions. Nevertheless, medical expertise is necessary to avoid misinterpretation of routine administrative data.
Health Care Information in African-American Churches
Harmon, Brook E.; Kim, Sei-Hill; Blake, Christine E.; Hébert, James R.
2014-01-01
Churches are a trusted resource in African American communities; however, little is known about their presentation of health care information. This study characterized health care information disseminated by 11 African American churches. Content analysis conducted on print media systematically collected over one year used a coding scheme with .77 intercoder reliability. Health care information was identified in 243 items and represented three topics (screening, medical services, health insurance). Screening was the most common topic (n=156), flyers/handouts were most often used (n=90), and the church was the most common source (n=71). Using chi-square tests, information was assessed over time, with health insurance information showing a statistically significant increase (χ² = 6.08, p < .05). Study churches provided health care information at varying levels of detail, with most coming from church and community publications. Future research should examine additional characteristics of health care information, its presence in other churches and community settings, and how exposure influences behaviors. PMID:24509024
PDB file parser and structure class implemented in Python.
Hamelryck, Thomas; Manderick, Bernard
2003-11-22
The biopython project provides a set of bioinformatics tools implemented in Python. Recently, biopython was extended with a set of modules that deal with macromolecular structure. Biopython now contains a parser for PDB files that makes the atomic information available in an easy-to-use but powerful data structure. The parser and data structure deal with features that are often left out or handled inadequately by other packages, e.g. atom and residue disorder (if point mutants are present in the crystal), anisotropic B factors, multiple models and insertion codes. In addition, the parser performs some sanity checking to detect obvious errors. The Biopython distribution (including source code and documentation) is freely available (under the Biopython license) from http://www.biopython.org
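A short usage sketch of this parser and its Structure/Model/Chain/Residue/Atom hierarchy follows; the file name is a placeholder.

```python
from Bio.PDB import PDBParser

# Usage sketch of the Bio.PDB parser described above ("1abc.pdb" is a
# placeholder for a real PDB file on disk).
parser = PDBParser(QUIET=True)               # suppress sanity-check warnings
structure = parser.get_structure("example", "1abc.pdb")

# The data structure follows the Structure/Model/Chain/Residue/Atom hierarchy.
for model in structure:
    for chain in model:
        for residue in chain:
            het, seq_id, icode = residue.id   # insertion codes are exposed here
            for atom in residue:
                if atom.is_disordered():      # atom-level disorder is handled
                    print(chain.id, seq_id, atom.get_name(), "disordered")
```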
Adaptive transmission based on multi-relay selection and rate-compatible LDPC codes
NASA Astrophysics Data System (ADS)
Su, Hualing; He, Yucheng; Zhou, Lin
2017-08-01
In order to adapt to dynamically changing channel conditions and improve the transmission reliability of the system, a cooperative system of rate-compatible low-density parity-check (RC-LDPC) codes combined with a multi-relay selection protocol is proposed. Traditional relay selection protocols consider only the channel state information (CSI) of the source-relay and relay-destination links. The multi-relay selection protocol proposed in this paper additionally takes the CSI between relays into account in order to obtain more opportunities for collaboration. Additionally, the ideas of hybrid automatic repeat request (HARQ) and rate compatibility are introduced. Simulation results show that the transmission reliability of the system can be significantly improved by the proposed protocol.
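For orientation, the conventional baseline that the proposed protocol extends can be sketched as max-min selection over source-relay and relay-destination SNRs. The inter-relay CSI term that distinguishes the paper's protocol is not reproduced here, and the SNR values are randomly drawn illustrations.

```python
import random

# Conventional relay selection baseline: pick the relay maximizing
# min(source->relay SNR, relay->destination SNR). The paper's protocol
# additionally exploits relay-to-relay CSI, which this sketch omits.
random.seed(1)
relays = {f"relay{k}": {"sr": random.uniform(0, 20), "rd": random.uniform(0, 20)}
          for k in range(4)}   # illustrative SNRs in dB

def max_min_selection(relays):
    # The end-to-end quality of a two-hop path is limited by its weaker hop,
    # hence the max-min criterion.
    return max(relays, key=lambda r: min(relays[r]["sr"], relays[r]["rd"]))

best = max_min_selection(relays)
print(best, relays[best])
```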
Development of Low Cost Satellite Communications System for Helicopters and General Aviation
NASA Technical Reports Server (NTRS)
Farazian, K.; Abbe, B.; Divsalar, D.; Raphaeli, D.; Tulintseff, A.; Wu, T.; Hinedi, S.
1994-01-01
In this paper, the development of a low-cost satellite communications (SATCOM) system for helicopters and General Aviation (GA) aircraft is described. System design and standards analysis have been conducted to meet the low-cost, light-weight, small-size, and low-power system requirements for helicopter and GA aircraft environments. Other specific issues investigated include coding schemes, spatial diversity, and antenna arraying techniques. Coding schemes employing channel state information (CSI) and interleaving have been studied in order to mitigate severe banking-angle fading and the periodic RF signal blockage due to the helicopter rotor blades. In addition, space diversity and antenna arraying techniques have been investigated to further reduce the fading effects and increase the link margin.
Combinatorial neural codes from a mathematical coding theory perspective.
Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L
2013-07-01
Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
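The two code properties at issue, error-correcting capability and distance structure, are easy to compute for a toy example. The sketch below (arbitrary toy codewords, not receptive field codes estimated from data) builds the pairwise Hamming distance matrix and the minimum distance, which bounds the number of correctable errors.

```python
import numpy as np

# Toy binary code: its minimum Hamming distance bounds error correction,
# while the full pairwise distance structure can mirror stimulus
# relationships. The codewords are arbitrary illustrative examples.
code = np.array([[1, 1, 0, 0, 0],
                 [0, 1, 1, 0, 0],
                 [0, 0, 1, 1, 0],
                 [0, 0, 0, 1, 1]])

# Pairwise Hamming distances via broadcasting.
dist = (code[:, None, :] != code[None, :, :]).sum(axis=2)
print(dist)

# Minimum distance over distinct pairs; a code with minimum distance d can
# correct floor((d - 1) / 2) substitution errors.
d_min = dist[np.triu_indices(len(code), k=1)].min()
print("d_min =", d_min, "-> corrects", (d_min - 1) // 2, "errors")
```

Here overlapping codewords keep pairwise distances small (d_min = 2, so no errors are correctable), even though the code is redundant, which is the trade-off the paper highlights.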
Developing Information Power Grid Based Algorithms and Software
NASA Technical Reports Server (NTRS)
Dongarra, Jack
1998-01-01
This was an exploratory study to enhance our understanding of problems involved in developing large scale applications in a heterogeneous distributed environment. It is likely that the large scale applications of the future will be built by coupling specialized computational modules together. For example, efforts now exist to couple ocean and atmospheric prediction codes to simulate a more complete climate system. These two applications differ in many respects. They have different grids, the data is in different unit systems, and the algorithms for integrating in time are different. In addition, the code for each application is likely to have been developed on different architectures and tends to have poor performance when run on an architecture for which the code was not designed, if it runs at all. Architectural differences may also induce differences in data representation which affect precision and convergence criteria as well as data transfer issues. In order to couple such dissimilar codes some form of translation must be present. This translation should be able to handle interpolation from one grid to another as well as construction of the correct data field in the correct units from available data. Even if a code is to be developed from scratch, a modular approach will likely be followed in that standard scientific packages will be used to do the more mundane tasks such as linear algebra or Fourier transform operations. This approach allows the developers to concentrate on their science rather than becoming experts in linear algebra or signal processing. Problems associated with this development approach include difficulties associated with data extraction and translation from one module to another, module performance on different nodal architectures, and others. In addition to these data and software issues there exist operational issues such as platform stability and resource management.
Extension of analog network coding in wireless information exchange
NASA Astrophysics Data System (ADS)
Chen, Cheng; Huang, Jiaqing
2012-01-01
Ever since the concept of analog network coding (ANC) was put forward by S. Katti, much attention has been focused on how to utilize analog network coding to take advantage of wireless interference, which used to be considered generally harmful, to improve throughput performance. Previously, only the case of two nodes that need to exchange information had been fully discussed, while the issue of extending analog network coding to more than two nodes remained undeveloped. In this paper, we propose a practical transmission scheme to extend analog network coding to more than two nodes that need to exchange information among themselves. We start with the case of three nodes that need to exchange information and demonstrate that, using our algorithm, the throughput achieves a 33% and 20% increase compared with that of traditional transmission scheduling and digital network coding, respectively. Then, we generalize the algorithm so that it can handle any number of nodes. We also discuss some technical issues and throughput analysis as well as the bit error rate.
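The two-node ANC primitive that this extension builds on can be shown in a few lines: both nodes transmit at once, the relay forwards the superposition, and each node cancels its own known signal. Noise and fading are omitted in this idealized sketch.

```python
# Minimal numeric sketch of the two-node analog network coding primitive:
# A and B transmit simultaneously, the relay receives the superposed signal
# and rebroadcasts it, and each node subtracts its own contribution to
# recover the other's symbol. Noise and fading are omitted.
a, b = 0.7, -1.3            # symbols transmitted by nodes A and B

superposed = a + b          # relay hears the sum (first time slot)
rebroadcast = superposed    # relay amplifies-and-forwards (second time slot)

recovered_at_A = rebroadcast - a   # A knows its own symbol, so it extracts b
recovered_at_B = rebroadcast - b   # B likewise extracts a
print(recovered_at_A, recovered_at_B)   # -1.3 0.7

# Two slots instead of four (A->relay, relay->B, B->relay, relay->A): this is
# the interference-as-resource gain that the extension to >2 nodes builds on.
```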
Mobile Code: The Future of the Internet
1999-01-01
Code (mobile agents) can be sent to multiple proxies or servers for "customization" (e.g., re-formatting, filtering, metasearch) to address information overload and diversified ... Mobile code is necessary, rather than client-side code, since many customization features (such as information monitoring) do not work if the ... economic foundation for Web sites: many Web sites earn money solely from advertisements. If these sites allow mobile agents to easily access the content ...
Performance Analysis of New Binary User Codes for DS-CDMA Communication
NASA Astrophysics Data System (ADS)
Usha, Kamle; Jaya Sankar, Kottareddygari
2016-03-01
This paper analyzes new binary spreading codes through their correlation properties and also presents their performance over the additive white Gaussian noise (AWGN) channel. The proposed codes are constructed using Gray and inverse Gray codes. In this paper, an n-bit Gray code appended with its n-bit inverse Gray code to construct 2n-length binary user codes is discussed. Like Walsh codes, these binary user codes are available in sizes of powers of two; additionally, code sets of length 6 and its even multiples are also available. The simple construction technique and the generation of code sets of different sizes are the salient features of the proposed codes. Walsh codes and Gold codes are considered for comparison in this paper, as these are popularly used for synchronous and asynchronous multi-user communications, respectively. In the current work the auto- and cross-correlation properties of the proposed codes are compared with those of Walsh codes and Gold codes. The performance of the proposed binary user codes for both synchronous and asynchronous direct-sequence CDMA communication over the AWGN channel is also discussed. The proposed binary user codes are found to be suitable for both synchronous and asynchronous DS-CDMA communication.
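A sketch of the construction and a zero-lag correlation check follows. The standard Gray map g(i) = i XOR (i >> 1) is used; reading "inverse Gray code" as the bitwise complement of the Gray codeword is an assumption made for illustration and may differ from the authors' definition.

```python
import numpy as np

# Sketch of the 2n-length user-code construction. g(i) = i XOR (i >> 1) is
# the standard n-bit Gray code; treating the "inverse Gray code" as the
# bitwise complement of the Gray codeword is an assumption for illustration.
def user_code(i, n):
    gray = i ^ (i >> 1)
    bits = [(gray >> k) & 1 for k in reversed(range(n))]
    inverse = [1 - b for b in bits]              # assumed "inverse" = complement
    return np.array([1 if b == 0 else -1 for b in bits + inverse])  # BPSK map

n = 3
codes = np.array([user_code(i, n) for i in range(2 ** n)])  # 2n-length codes

# Zero-lag correlation matrix: diagonal entries are autocorrelation peaks
# (equal to the code length 2n); off-diagonal entries are cross-correlations
# between distinct user codes.
corr = codes @ codes.T
print(corr)
```

Note that appending the complement balances each codeword (equal numbers of +1 and -1 chips), which helps keep cross-correlations low.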
Lam, Raymond; Kruger, Estie; Tennant, Marc
2014-12-01
One disadvantage of the remarkable achievements in dentistry is that treatment options have never been more varied or confusing. This has made the concept of Evidence Based Dentistry more applicable to modern dental practice. Despite merit in the concept, whereby clinical decisions are guided by scientific evidence, there are problems with establishing a scientific base. Nowhere is this more challenging than in modern dentistry, where the gap between rapidly developing products/procedures and their evidence base is widening. Furthermore, the burden of oral disease remains high at the population level. These problems have prompted new approaches to enhancing research. The aim of this paper is to outline how a modified approach to dental coding may benefit clinical and population level research. Using publicly accessible data obtained from the Australian Chronic Disease Dental Scheme (CDDS) and item codes contained within the Australian Schedule of Dental Services and Glossary, a suggested approach to dental informatics is illustrated. A selection of item codes is expanded with the addition of suffixes. These suffixes provide circumstantial information that will assist in assessing clinical outcomes such as success rates and prognosis. The use of item codes in administering the CDDS yielded a large database of item codes. These codes are amenable to dental informatics, which has been shown to enhance research at both the clinical and population level. This is a cost-effective method to supplement existing research methods. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
1987-01-01
This handbook is a guide for the use of all personnel engaged in handling NASA files. It is issued in accordance with the regulations of the National Archives and Records Administration, in the Code of Federal Regulations Title 36, Part 1224, Files Management; and the Federal Information Resources Management Regulation, Subpart 201-45.108, Files Management. It is intended to provide a standardized classification and filing scheme to achieve maximum uniformity and ease in maintaining and using agency records. It is a framework for consistent organization of information in an arrangement that will be useful to current and future researchers. The NASA Uniform Files Index coding structure is composed of the subject classification table used for NASA management directives and the subject groups in the NASA scientific and technical information system. It is designed to correlate files throughout NASA and it is anticipated that it may be useful with automated filing systems. It is expected that in the conversion of current files to this arrangement it will be necessary to add tertiary subjects and make further subdivisions under the existing categories. Established primary and secondary subject categories may not be changed arbitrarily. Proposals for additional subject categories of NASA-wide applicability, and suggestions for improvement in this handbook, should be addressed to the Records Program Manager at the pertinent installation who will forward it to the NASA Records Management Office, Code NTR, for approval. This handbook is issued in loose-leaf form and will be revised by page changes.
Robust video transmission with distributed source coded auxiliary channel.
Wang, Jiajun; Majumdar, Abhik; Ramchandran, Kannan
2009-12-01
We propose a novel solution to the problem of robust, low-latency video transmission over lossy channels. Predictive video codecs, such as MPEG and H.26x, are very susceptible to prediction mismatch between encoder and decoder or "drift" when there are packet losses. These mismatches lead to a significant degradation in the decoded quality. To address this problem, we propose an auxiliary codec system that sends additional information alongside an MPEG or H.26x compressed video stream to correct for errors in decoded frames and mitigate drift. The proposed system is based on the principles of distributed source coding and uses the (possibly erroneous) MPEG/H.26x decoder reconstruction as side information at the auxiliary decoder. The distributed source coding framework depends upon knowing the statistical dependency (or correlation) between the source and the side information. We propose a recursive algorithm to analytically track the correlation between the original source frame and the erroneous MPEG/H.26x decoded frame. Finally, we propose a rate-distortion optimization scheme to allocate the rate used by the auxiliary encoder among the encoding blocks within a video frame. We implement the proposed system and present extensive simulation results that demonstrate significant gains in performance both visually and objectively (on the order of 2 dB in PSNR over forward error correction based solutions and 1.5 dB in PSNR over intrarefresh based solutions for typical scenarios) under tight latency constraints.
How Clean is your Local Air? Here's an app for that
NASA Astrophysics Data System (ADS)
Maskey, M.; Yang, E.; Christopher, S. A.; Keiser, K.; Nair, U. S.; Graves, S. J.
2011-12-01
Air quality is a vital element of our environment. Accurate and localized air quality information is critical for characterizing environmental impacts at the local and regional levels. Advances in location-aware handheld devices and air quality modeling have enabled a group of UAHuntsville scientists to develop a mobile app, LocalAQI, that informs users of current conditions and forecasts, up to twenty-four hours ahead, of air quality indices. The air quality index is based on the Community Multiscale Air Quality Modeling System (CMAQ). UAHuntsville scientists have used satellite remote sensing products as inputs to CMAQ, resulting in forecast guidance for particulate matter air quality. The CMAQ output is processed to compute a standardized air quality index. Currently, the air quality index is available for the eastern half of the United States. LocalAQI consists of two main views: an air quality index view and a map view. The air quality index view displays current air quality for the zip code of a location of interest. The air quality index value is translated into a color-coded advisory system. In addition, users are able to cycle through available hourly forecasts for a location. This location-aware app defaults to the current air quality of the user's location. The map view displays color-coded air quality information for the eastern US with the ability to animate through the available forecasts. The app is developed using a cross-platform native application development tool, Appcelerator; hence LocalAQI is available for iOS- and Android-based phones and tablets.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-22
... production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code [email protected] . List of Subjects Environmental protection, Agricultural commodities, Feed additives, Food additives, Pesticides and pests, Reporting and recordkeeping requirements. Dated: December 9, 2010. Lois...
Present Scenario of Long Non-Coding RNAs in Plants
Bhatia, Garima; Goyal, Neetu; Sharma, Shailesh; Upadhyay, Santosh Kumar; Singh, Kashmir
2017-01-01
Small non-coding RNAs have been extensively studied in plants over the last decade. In contrast, genome-wide identification of plant long non-coding RNAs (lncRNAs) has only recently gained momentum. LncRNAs are now being recognized as important players in gene regulation, and their potent regulatory roles are being studied comprehensively in eukaryotes. LncRNAs were first reported in humans in 1992. Since then, research in animals, particularly in humans, has rapidly progressed, and a vast amount of data has been generated, collected, and organized using computational approaches. Additionally, numerous studies have been conducted to understand the roles of these long RNA species in several diseases. However, the status of lncRNA investigation in plants lags behind that in animals (especially humans). Efforts are being made in this direction using computational tools and high-throughput sequencing technologies, such as the lncRNA microarray technique, RNA sequencing (RNA-seq), and RNA capture sequencing (RNA CaptureSeq). Given the current scenario, significant amounts of data have been produced regarding plant lncRNAs, and this amount is likely to increase in subsequent years. In this review we document brief information about lncRNAs and the status of their research in plants, along with the plant-specific resources/databases for information retrieval on lncRNAs. PMID:29657289
Sum of the Magnitude for Hard Decision Decoding Algorithm Based on Loop Update Detection.
Meng, Jiahui; Zhao, Danfeng; Tian, Hai; Zhang, Liang
2018-01-15
In order to improve the performance of the hard-decision decoding algorithm for non-binary low-density parity-check (LDPC) codes and to reduce decoding complexity, a sum-of-the-magnitude hard-decision decoding algorithm based on loop update detection is proposed. This also supports the reliability, stability, and high transmission rates required of 5G mobile communication. The algorithm is based on the hard-decision decoding algorithm (HDA) and uses soft information from the channel to calculate the reliability, while the sum of the variable nodes' (VN) magnitudes is excluded when computing the reliability of the parity checks. At the same time, the reliability information of the variable node is considered and a loop update detection algorithm is introduced. The bits corresponding to the erroneous codeword are flipped multiple times, searched in order of most likely error probability, to finally find the correct codeword. Simulation results show that the performance of one of the improved schemes is better than the weighted symbol flipping (WSF) algorithm under different hexadecimal numbers by about 2.2 dB and 2.35 dB at a bit error rate (BER) of 10^-5 over an additive white Gaussian noise (AWGN) channel, respectively. Furthermore, the average number of decoding iterations is significantly reduced.
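As an illustration of the hard-decision flipping family to which WSF and the proposed scheme belong, here is a simplified binary bit-flipping decoder; the paper's non-binary arithmetic, magnitude-based reliabilities, and loop update detection are not reproduced.

```python
import numpy as np

# Simplified binary Gallager-style bit-flipping decoder, illustrating the
# hard-decision symbol-flipping family to which WSF and the proposed
# algorithm belong. Non-binary symbols and channel reliabilities are omitted.
H = np.array([[1, 1, 0, 1, 0, 0],     # toy parity-check matrix
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]])

def bit_flip_decode(y, H, max_iters=20):
    c = y.copy()
    for _ in range(max_iters):
        syndrome = H @ c % 2
        if not syndrome.any():
            return c                  # all parity checks satisfied
        # Count, for each bit, how many unsatisfied checks it participates
        # in, and flip the bit(s) involved in the most failures.
        failures = syndrome @ H
        c[failures == failures.max()] ^= 1
    return c

codeword = np.zeros(6, dtype=int)     # the all-zero codeword is always valid
received = codeword.copy()
received[2] ^= 1                      # inject a single bit error
print(bit_flip_decode(received, H))   # recovers the all-zero codeword
```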
Upgrades of Two Computer Codes for Analysis of Turbomachinery
NASA Technical Reports Server (NTRS)
Chima, Rodrick V.; Liou, Meng-Sing
2005-01-01
Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery." The affected programs are: Swift, a code for three-dimensional (3D) multiblock analysis; and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by the addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include the addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, the addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and a modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and the addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.
Robust information propagation through noisy neural circuits
Pouget, Alexandre
2017-01-01
Sensory neurons give highly variable responses to stimulation, which can limit the amount of stimulus information available to downstream circuits. Much work has investigated the factors that affect the amount of information encoded in these population responses, leading to insights about the role of covariability among neurons, tuning curve shape, etc. However, the informativeness of neural responses is not the only relevant feature of population codes; of potentially equal importance is how robustly that information propagates to downstream structures. For instance, to quantify the retina’s performance, one must consider not only the informativeness of the optic nerve responses, but also the amount of information that survives the spike-generating nonlinearity and noise corruption in the next stage of processing, the lateral geniculate nucleus. Our study identifies the set of covariance structures for the upstream cells that optimize the ability of information to propagate through noisy, nonlinear circuits. Within this optimal family are covariances with “differential correlations”, which are known to reduce the information encoded in neural population activities. Thus, covariance structures that maximize information in neural population codes, and those that maximize the ability of this information to propagate, can be very different. Moreover, redundancy is neither necessary nor sufficient to make population codes robust against corruption by noise: redundant codes can be very fragile, and synergistic codes can—in some cases—optimize robustness against noise. PMID:28419098
McDiarmid, Roy W.; Heyer, W. Ronald; Donnelly, Maureen A.; McDiarmid, Roy W.; Hayek, Lee-Ann C.; Foster, Mercedes S.
1994-01-01
The many individual salamanders, frogs, caecilians, and their larvae encountered during the course of an inventory or monitoring project will have to be identified to species. Depending on the goals and sampling method(s) used, some individuals will be identified from a distance by their calls; others will be handled. At the same time, some will be marked for recapture, and others will be sampled as vouchers. For each, certain minimum data should be recorded. In this section, data pertaining to locality and sampling methodology are considered; information on microhabitats and specimen vouchers is covered in sections that follow. I feel strongly that the data outlined here should be the minimum for any project. Investigators with specific goals may require additional types of data as well. Standardized, printed sheets containing the required data categories provide a convenient, inexpensive, and effective way to ensure that all the desired information is recorded in a consistent format. Data sheets should be well organized, printed on good-quality paper (75%-100% cotton content), and include extra space (e.g., the other side of the sheet) for notes that do not fit preestablished categories. Data should be recorded in the field with permanent (waterproof) ink as simply and directly as possible. I strongly recommend against the use of data codes in the field; it is too easy to forget codes or to enter the wrong code. Original data sheets can be photocopied for security, but they should not be copied by hand. If data are to be coded for computer analysis, the original or photocopied sheets should be used for data entry to minimize transcription errors. Some workers prefer recording information on small tape recorders; this also works well if a list of the standard data categories is checked during taping to ensure that all required information is recorded. Information recorded on tapes should be transcribed to data sheets or into a computer within 24 hours of the sample.
Views of Health Information Management Staff on the Medical Coding Software in Mashhad, Iran.
Kimiafar, Khalil; Hemmati, Fatemeh; Banaye Yazdipour, Alireza; Sarbaz, Masoumeh
2018-01-01
Systematic evaluation of Health Information Technology (HIT) and users' views leads to the modification and development of these technologies in accordance with user needs. The purpose of this study was to investigate the views of Health Information Management (HIM) staff on the quality of medical coding software. A descriptive cross-sectional study was conducted between May and July 2016 in 26 hospitals (academic and non-academic) in Mashhad, north-eastern Iran. The study population consisted of the chairs of HIM departments and medical coders (58 staff). Data were collected through a valid and reliable questionnaire and analyzed using SPSS version 16.0. In the staff's view, among the advantages of coding software, reducing coding time had the highest average (mean = 3.82) while cost reduction had the lowest (mean = 3.20). Meanwhile, concern about losing job opportunities was the least important disadvantage (15.5%) of using coding software. In general, the results of this study showed that coding software has deficiencies in some cases. Designers and developers of health information coding software should pay more attention to technical aspects, in-work reminders, support for selecting proper codes through access to coding rules, maintenance services, links to other relevant databases, and the possibility of providing brief and detailed reports in different formats.
Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service
Hatano, Kenji; Ohe, Kazuhiko
2003-01-01
An information retrieval system for the Japanese Standard Disease-Code Master using XML Web Services has been developed. XML Web Services are a new distributed processing system based on standard internet technologies. With the seamless remote method invocation of XML Web Services, users are able to get the latest disease code master information from their rich desktop applications or internet web sites, which refer to this service. PMID:14728364
3D video coding: an overview of present and upcoming standards
NASA Astrophysics Data System (ADS)
Merkle, Philipp; Müller, Karsten; Wiegand, Thomas
2010-07-01
An overview of existing and upcoming 3D video coding standards is given. Various different 3D video formats are available, each with individual pros and cons. The 3D video formats can be separated into two classes: video-only formats (such as stereo and multiview video) and depth-enhanced formats (such as video plus depth and multiview video plus depth). Since all these formats consist of at least two video sequences and possibly additional depth data, efficient compression is essential for the success of 3D video applications and technologies. For the video-only formats the H.264 family of coding standards already provides efficient and widely established compression algorithms: H.264/AVC simulcast, H.264/AVC stereo SEI message, and H.264/MVC. For the depth-enhanced formats standardized coding algorithms are currently being developed. New and specially adapted coding approaches are necessary, as the depth or disparity information included in these formats has significantly different characteristics than video and is not displayed directly, but used for rendering. Motivated by evolving market needs, MPEG has started an activity to develop a generic 3D video standard within the 3DVC ad-hoc group. Key features of the standard are efficient and flexible compression of depth-enhanced 3D video representations and decoupling of content creation and display requirements.
NASA Astrophysics Data System (ADS)
Miki, Nobuhiko; Kishiyama, Yoshihisa; Higuchi, Kenichi; Sawahashi, Mamoru; Nakagawa, Masao
In the Evolved UTRA (UMTS Terrestrial Radio Access) downlink, Orthogonal Frequency Division Multiplexing (OFDM) based radio access was adopted because of its inherent immunity to multipath interference and flexible accommodation of different spectrum arrangements. This paper presents the optimum adaptive modulation and channel coding (AMC) scheme when multiple resource blocks (RBs) are simultaneously assigned to the same user under frequency- and time-domain channel-dependent scheduling in downlink OFDMA radio access with single-antenna transmission. We start by presenting selection methods for the modulation and coding scheme (MCS) employing mutual information, both for RB-common and RB-dependent modulation schemes. Simulation results show that, irrespective of the application of power adaptation to RB-dependent modulation, the improvement in the achievable throughput of the RB-dependent modulation scheme compared to that of the RB-common modulation scheme is slight, i.e., 4 to 5%. In addition, the number of required control signaling bits in the RB-dependent modulation scheme is greater than that for the RB-common modulation scheme. Therefore, we conclude that the RB-common modulation and channel coding rate scheme is preferred when multiple RBs of the same coded stream are assigned to one user in the case of single-antenna transmission.
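The RB-common selection idea can be sketched as mutual-information averaging across the assigned RBs. The sketch below approximates per-RB mutual information by Shannon capacity capped at the modulation order, a crude stand-in for the BICM capacity tables a real link adaptation would use; the MCS set, rates, and acceptance rule are illustrative.

```python
import numpy as np

# Sketch of RB-common MCS selection via mutual-information averaging. The
# per-RB mutual information of an m-bit modulation is approximated here by
# min(log2(1 + SNR), m) bits/symbol, a crude stand-in for BICM capacity
# tables; the MCS list and rates are illustrative.
MCS = [("QPSK 1/2", 2, 0.5), ("16QAM 1/2", 4, 0.5),
       ("16QAM 3/4", 4, 0.75), ("64QAM 3/4", 6, 0.75)]

def select_common_mcs(snr_db_per_rb):
    snr = 10 ** (np.asarray(snr_db_per_rb) / 10)
    best, best_tput = None, -1.0
    for name, m, rate in MCS:
        mi = np.minimum(np.log2(1 + snr), m).mean()   # average MI over the RBs
        # Accept the MCS only if the averaged MI supports its information rate.
        if mi >= m * rate and m * rate > best_tput:
            best, best_tput = name, m * rate
    return best

print(select_common_mcs([12.0, 15.0, 9.0, 18.0]))     # one MCS for all RBs
```

Because a single MCS is signaled for the whole allocation, the control overhead stays flat as the number of assigned RBs grows, which is the advantage the paper weighs against the small throughput loss.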
2012-01-01
Background The feline genome is valuable to the veterinary and model organism genomics communities because the cat is an obligate carnivore and a model for endangered felids. The initial public release of the Felis catus genome assembly provided a framework for investigating the genomic basis of feline biology. However, the entire set of protein coding genes has not been elucidated. Results We identified and characterized 1227 protein coding feline sequences, of which 913 map to public sequences and 314 are novel. These sequences have been deposited into NCBI's GenBank database and complement public genomic resources by providing additional protein coding sequences that fill in some of the gaps in the feline genome assembly. Through functional and comparative genomic analyses, we gained an understanding of the role of these sequences in feline development, nutrition and health. Specifically, we identified 104 orthologs of human genes associated with Mendelian disorders. We detected negative selection within sequences with gene ontology annotations associated with intracellular trafficking, cytoskeleton and muscle functions. We detected relatively less negative selection on protein sequences encoding extracellular networks, apoptotic pathways and mitochondrial gene ontology annotations. Additionally, we characterized feline cDNA sequences that have mouse orthologs associated with clinical, nutritional and developmental phenotypes. Together, this analysis provides an overview of the value of our cDNA sequences and enhances our understanding of how the feline genome is similar to, and different from, other mammalian genomes. Conclusions The cDNA sequences reported here expand existing feline genomic resources by providing high-quality sequences annotated with comparative genomic information providing functional, clinical, nutritional and orthologous gene information. PMID:22257742
Cresswell, Kathrin; Morrison, Zoe; Kalra, Dipak; Sheikh, Aziz
2012-01-01
We sought to understand how clinical information relating to the management of depression is routinely coded in different clinical settings and the perspectives of and implications for different stakeholders with a view to understanding how these may be aligned. Qualitative investigation exploring the views of a purposefully selected range of healthcare professionals, managers, and clinical coders spanning primary and secondary care. Our dataset comprised 28 semi-structured interviews, a focus group, documents relating to clinical coding standards and participant observation of clinical coding activities. We identified a range of approaches to coding clinical information including templates and order entry systems. The challenges inherent in clearly establishing a diagnosis, identifying appropriate clinical codes and possible implications of diagnoses for patients were particularly prominent in primary care. Although a range of managerial and research benefits were identified, there were no direct benefits from coded clinical data for patients or professionals. Secondary care staff emphasized the role of clinical coders in ensuring data quality, which was at odds with the policy drive to increase real-time clinical coding. There was overall no evidence of clear-cut direct patient care benefits to inform immediate care decisions, even in primary care where data on patients with depression were more extensively coded. A number of important secondary uses were recognized by healthcare staff, but the coding of clinical data to serve these ends was often poorly aligned with clinical practice and patient-centered considerations. The current international drive to encourage clinical coding by healthcare professionals during the clinical encounter may need to be critically examined.
Fault-tolerance in Two-dimensional Topological Systems
NASA Astrophysics Data System (ADS)
Anderson, Jonas T.
This thesis is a collection of ideas with the general goal of building, at least in the abstract, a local fault-tolerant quantum computer. The connection between quantum information and topology has proven to be an active area of research in several fields. The introduction of the toric code by Alexei Kitaev demonstrated the usefulness of topology for quantum memory and quantum computation. Many quantum codes used for quantum memory are modeled by spin systems on a lattice, with operators that extract syndrome information placed on vertices or faces of the lattice. It is natural to wonder whether the useful codes in such systems can be classified. This thesis presents work that leverages ideas from topology and graph theory to explore the space of such codes. Homological stabilizer codes are introduced and it is shown that, under a set of reasonable assumptions, any qubit homological stabilizer code is equivalent to either a toric code or a color code. Additionally, the toric code and the color code correspond to distinct classes of graphs. Many systems have been proposed as candidate quantum computers. It is very desirable to design quantum computing architectures with two-dimensional layouts and low complexity in parity-checking circuitry. Kitaev's surface codes provided the first example of codes satisfying this property. They provided a new route to fault tolerance with more modest overheads and thresholds approaching 1%. The recently discovered color codes share many properties with the surface codes, such as the ability to perform syndrome extraction locally in two dimensions. Some families of color codes admit a transversal implementation of the entire Clifford group. This work investigates color codes on the 4.8.8 lattice known as triangular codes. I develop a fault-tolerant error-correction strategy for these codes in which repeated syndrome measurements on this lattice generate a three-dimensional space-time combinatorial structure. I then develop an integer program that analyzes this structure and determines the most likely set of errors consistent with the observed syndrome values. I implement this integer program to find the threshold for depolarizing noise on small versions of these triangular codes. Because the threshold for magic-state distillation is likely to be higher than this value and because logical
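The toric code's check structure referred to above is concrete enough to build directly. The sketch below constructs the star (X-type) and plaquette (Z-type) parity-check matrices on an L x L torus and verifies that they commute over GF(2); the edge-indexing convention is one standard choice.

```python
import numpy as np

# Sketch of the toric code's parity-check structure on an L x L torus:
# qubits live on the 2*L*L edges; each vertex contributes an X-type check on
# its four incident edges and each face a Z-type check on its boundary edges.
L = 3
n = 2 * L * L                       # horizontal edges, then vertical edges

def h_edge(r, c): return (r % L) * L + (c % L)            # horizontal edge
def v_edge(r, c): return L * L + (r % L) * L + (c % L)    # vertical edge

Hx = np.zeros((L * L, n), dtype=int)   # vertex (star) checks
Hz = np.zeros((L * L, n), dtype=int)   # face (plaquette) checks
for r in range(L):
    for c in range(L):
        v = r * L + c
        # Star at vertex (r, c): the four edges touching it.
        for e in (h_edge(r, c), h_edge(r, c - 1), v_edge(r, c), v_edge(r - 1, c)):
            Hx[v, e] = 1
        # Plaquette with corner (r, c): the four edges around the face.
        for e in (h_edge(r, c), h_edge(r + 1, c), v_edge(r, c), v_edge(r, c + 1)):
            Hz[v, e] = 1

# X and Z checks must commute: every star shares an even number of edges
# with every plaquette, i.e. Hx @ Hz.T = 0 over GF(2).
assert not (Hx @ Hz.T % 2).any()
print("stars and plaquettes commute; qubits:", n, "checks:", 2 * L * L)
```

Syndrome extraction in the decoder the thesis describes amounts to evaluating these checks repeatedly and matching the resulting space-time defect pattern to the most likely error.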
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-17
...] Agency Information Collection Activities; Proposed Collection; Comment Request; Bar Code Label... allow 60 days for public comment in response to the notice. This notice solicits comments on the bar... technology. Bar Code Label Requirement for Human Drug and Biological Products--(OMB Control Number 0910-0537...
Code of Federal Regulations, 2014 CFR
2014-10-01
... definitions apply: Code set means any set of codes used to encode data elements, such as tables of terms... code sets inherent to a transaction, and not related to the format of the transaction. Data elements... information in a transaction. Data set means a semantically meaningful unit of information exchanged between...
Code of Federal Regulations, 2013 CFR
2013-10-01
... definitions apply: Code set means any set of codes used to encode data elements, such as tables of terms... code sets inherent to a transaction, and not related to the format of the transaction. Data elements... information in a transaction. Data set means a semantically meaningful unit of information exchanged between...
Development of structured ICD-10 and its application to computer-assisted ICD coding.
Imai, Takeshi; Kajino, Masayuki; Sato, Megumi; Ohe, Kazuhiko
2010-01-01
This paper presents: (1) a framework for the formal representation of ICD-10, which functions as a bridge between ontological information and natural language expressions; and (2) a methodology for using the formally described ICD-10 for computer-assisted ICD coding. First, we analyzed and structured the meanings of categories in 15 chapters of ICD-10. Then we expanded the structured ICD-10 (S-ICD10) by adding subordinate concepts and labels derived from Japanese Standard Disease Names. The information model used to describe the formal representation was refined repeatedly. The resultant model includes 74 types of semantic links. We also developed an ICD coding module based on S-ICD10 and a 'Coding Principle,' which achieved high accuracy (>70%) for four chapters. These results not only demonstrate the basic feasibility of our coding framework but might also inform the development of the information model for the formal description framework in the ICD-11 revision.
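As a toy illustration of the general idea, coding as matching a disease name against formal representations built from typed semantic links, consider the following sketch. The link types, entries, and matching rule are invented for illustration and are not the S-ICD10 model (which defines 74 link types).

```python
# Toy illustration only: categories carry typed semantic links, and a disease
# name is matched against them to suggest a code. Link types, entries, and
# the matching rule are invented and are not the S-ICD10 model.
STRUCTURED_CODES = {
    "J15.1": {"hasPathogen": "pseudomonas", "hasSite": "lung",
              "hasCondition": "pneumonia"},
    "J15.2": {"hasPathogen": "staphylococcus", "hasSite": "lung",
              "hasCondition": "pneumonia"},
}

def suggest_code(disease_name):
    words = set(disease_name.lower().split())
    best, best_hits = None, 0
    for code, links in STRUCTURED_CODES.items():
        # Count how many link values of this category appear in the name.
        hits = sum(1 for value in links.values() if value in words)
        if hits > best_hits:
            best, best_hits = code, hits
    return best

print(suggest_code("pneumonia due to pseudomonas"))   # -> J15.1
```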
Methods for Coding Tobacco-Related Twitter Data: A Systematic Review
Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai
2017-01-01
Background As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. Objective The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Methods Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. Results E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter’s Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Conclusions Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter’s databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. PMID:28363883
Modelling, Information, Processing, and Control
1989-01-15
[OCR residue from the report documentation page; the abstract is not recoverable. Legible fragments indicate that funding supported graduate research assistants and short-term consultants and visitors, and cite D. L. Russell, "A Floquet Decomposition for Volterra Equations with Periodic Kernel and a Transform."]
Ethical perspectives in neuroscience nursing practice.
Murphy, W J; Olsen, B J
1999-09-01
The role of neuroscience nurses in relation to ethical issues has become increasingly complex. Knowledge of ethical principles and theories assists the nurse in the development of a theoretical basis for resolution of ethical issues or concerns. Additionally, the nurse must possess information regarding practice codes or standards as well as legislative requirements. The nurse must act as an advocate for the patient and society through active participation in institutional ethics committees and legislative forums.
Applying thematic analysis theory to practice: a researcher's experience.
Tuckett, Anthony G
2005-01-01
This article describes an experience of thematic analysis. In order to answer the question 'What does analysis look like in practice?' it describes in brief how the methodology of grounded theory, the epistemology of social constructionism, and the theoretical stance of symbolic interactionism inform analysis. Additionally, analysis is examined by evidencing the systematic processes--here termed organising, coding, writing, theorising, and reading--that led the researcher to develop a final thematic schema.
Improvements to Busquet's Non LTE algorithm in NRL's Hydro code
NASA Astrophysics Data System (ADS)
Klapisch, M.; Colombant, D.
1996-11-01
Implementation of the Non LTE model RADIOM (M. Busquet, Phys. Fluids B, 5, 4191 (1993)) in NRL's RAD2D Hydro code in conservative form was reported previously (M. Klapisch et al., Bull. Am. Phys. Soc., 40, 1806 (1995)). While the results were satisfactory, the algorithm was slow and did not always converge. We describe here modifications that address these two shortcomings. The modified method is quicker and more stable than the original, and it also gives information about the validity of the fitting. It turns out that the number and distribution of groups in the multigroup diffusion opacity tables - a basis for the computation of radiation effects in the ionization balance in RADIOM - have a large influence on the robustness of the algorithm. These modifications give insight into the algorithm and allow one to check that the obtained average charge state is the true average. In addition, code optimization greatly reduced computing time: the ratio of Non LTE to LTE computing times is now between 1.5 and 2.
A review of high-speed, convective, heat-transfer computation methods
NASA Technical Reports Server (NTRS)
Tauber, Michael E.
1989-01-01
The objective of this report is to provide useful engineering formulations and to instill a modest degree of physical understanding of the phenomena governing convective aerodynamic heating at high flight speeds. Some physical insight is not only essential to the application of the information presented here, but also to the effective use of computer codes which may be available to the reader. A discussion is given of cold-wall, laminar boundary layer heating. A brief presentation of the complex boundary layer transition phenomenon follows. Next, cold-wall turbulent boundary layer heating is discussed. This topic is followed by a brief coverage of separated flow-region and shock-interaction heating. A review of heat protection methods follows, including the influence of mass addition on laminar and turbulent boundary layers. Next is a discussion of finite-difference computer codes and a comparison of some results from these codes. An extensive list of references is also provided from sources such as the various AIAA journals and NASA reports which are available in the open literature.
Complementary codes for odor identity and intensity in olfactory cortex
Bolding, Kevin A; Franks, Kevin M
2017-01-01
The ability to represent both stimulus identity and intensity is fundamental for perception. Using large-scale population recordings in awake mice, we find that distinct coding strategies facilitate non-interfering representations of odor identity and intensity in piriform cortex. Simply knowing which neurons were activated is sufficient to accurately represent odor identity, with no additional information about identity provided by spike time or spike count. Decoding analyses indicate that cortical odor representations are not sparse. Odorant concentration had no systematic effect on spike counts, indicating that rate cannot encode intensity. Instead, odor intensity can be encoded by temporal features of the population response. We found that a subpopulation of rapid, largely concentration-invariant responses was followed by another population of responses whose latencies systematically decreased at higher concentrations. Cortical inhibition transforms olfactory bulb output to sharpen these dynamics. Our data therefore reveal complementary coding strategies that can selectively represent distinct features of a stimulus. DOI: http://dx.doi.org/10.7554/eLife.22630.001 PMID:28379135
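A toy decoder consistent with the scheme the abstract describes - identity read out from which neurons fire, intensity from response latencies - is sketched below on synthetic data; it is an illustration, not the authors' analysis code.

```python
import numpy as np

# Synthetic population: each odor activates a fixed subset of neurons.
rng = np.random.default_rng(0)
n_neurons = 50
templates = {odor: rng.random(n_neurons) < 0.1 for odor in ("A", "B")}

def decode_identity(active: np.ndarray) -> str:
    # Match the set of activated neurons against stored templates (Jaccard overlap).
    def jaccard(a, b):
        return (a & b).sum() / max((a | b).sum(), 1)
    return max(templates, key=lambda o: jaccard(active, templates[o]))

def decode_intensity(latencies_ms: np.ndarray) -> float:
    # Shorter latencies at higher concentration -> use inverse mean latency.
    return 1.0 / latencies_ms.mean()

active = templates["A"] | (rng.random(n_neurons) < 0.02)  # noisy presentation of odor A
print(decode_identity(active), decode_intensity(rng.uniform(20, 60, size=10)))
```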
Neighboring block based disparity vector derivation for multiview compatible 3D-AVC
NASA Astrophysics Data System (ADS)
Kang, Jewon; Chen, Ying; Zhang, Li; Zhao, Xin; Karczewicz, Marta
2013-09-01
3D-AVC, being developed under the Joint Collaborative Team on 3D Video Coding (JCT-3V), significantly outperforms Multiview Video Coding plus Depth (MVC+D), which simultaneously encodes texture views and depth views with the multiview extension of H.264/AVC (MVC). However, when 3D-AVC is configured to support multiview compatibility, in which texture views are decoded without depth information, the coding performance degrades significantly. The reason is that the advanced coding tools incorporated into 3D-AVC do not perform well without a disparity vector converted from the depth information. In this paper, we propose a disparity vector derivation method utilizing only the information of texture views. Motion information of neighboring blocks is used to determine a disparity vector for a macroblock, so that the derived disparity vector can be used efficiently by the coding tools in 3D-AVC. The proposed method significantly improves the coding gain of 3D-AVC in the multiview compatible mode, yielding about 20% BD-rate savings in the coded views and 26% BD-rate savings in the synthesized views on average.
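As a rough illustration of the neighboring-block idea, the sketch below scans spatial neighbors of the current macroblock for an inter-view (disparity) vector and reuses the first one found. The block structure, neighbor order, and zero-vector fallback are assumptions for illustration; the normative 3D-AVC process is defined in the JCT-3V specification.

```python
# Hypothetical neighbor records: {'ref_is_interview': bool, 'mv': (dx, dy)}.
def derive_disparity_vector(neighbors):
    """Scan neighbors (e.g., left, above, above-right) for a disparity vector."""
    for blk in neighbors:
        if blk is not None and blk["ref_is_interview"]:
            return blk["mv"]          # reuse the neighbor's disparity vector
    return (0, 0)                     # fallback when no neighbor provides one

print(derive_disparity_vector([None,
                               {"ref_is_interview": False, "mv": (3, 0)},
                               {"ref_is_interview": True,  "mv": (-7, 0)}]))  # -> (-7, 0)
```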
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-19
... Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). If you have any questions regarding the applicability of this action to a... protection, Agricultural commodities, Feed additives, Food additives, Pesticides and pests, Reporting and...
78 FR 59265 - FD&C Yellow No. 5; Exemption From the Requirement of a Tolerance
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-26
... (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). B. How... FD&C Yellow No. 5 is an FDA permanently listed color additive used in food, drugs and cosmetics, including drugs and cosmetics for the eye area. FDA's color additive evaluation included the consideration of an extensive set...
Palmer, Cameron S; Franklyn, Melanie; Read-Allsopp, Christine; McLellan, Susan; Niggemeyer, Louise E
2011-05-08
Many trauma registries have used the Abbreviated Injury Scale 1990 Revision Update 98 (AIS98) to classify injuries. In the current AIS version (Abbreviated Injury Scale 2005 Update 2008 - AIS08), injury classification and specificity differ substantially from AIS98, and the mapping tools provided in the AIS08 dictionary are incomplete. As a result, data from different AIS versions cannot currently be compared. The aim of this study was to develop an additional AIS98 to AIS08 mapping tool to complement the current AIS dictionary map, and then to evaluate the completed map (produced by combining these two maps) using double-coded data. The value of additional information provided by free text descriptions accompanying assigned codes was also assessed. Using a modified Delphi process, a panel of expert AIS coders established plausible AIS08 equivalents for the 153 AIS98 codes which currently have no AIS08 map. A series of major trauma patients whose injuries had been double-coded in AIS98 and AIS08 was used to assess the maps; both of the AIS datasets had already been mapped to another AIS version using the AIS dictionary maps. Following application of the completed (enhanced) map with or without free text evaluation, up to six AIS codes were available for each injury. Datasets were assessed for agreement in injury severity measures, and the relative performances of the maps in accurately describing the trauma population were evaluated. The double-coded injuries sustained by 109 patients were used to assess the maps. For data conversion from AIS98, both the enhanced map and the enhanced map with free text description resulted in higher levels of accuracy and agreement with directly coded AIS08 data than the currently available dictionary map. Paired comparisons demonstrated significant differences between direct coding and the dictionary maps, but not with either of the enhanced maps. The newly-developed AIS98 to AIS08 complementary map enabled transformation of the trauma population description given by AIS98 into an AIS08 estimate which was statistically indistinguishable from directly coded AIS08 data. It is recommended that the enhanced map should be adopted for dataset conversion, using free text descriptions if available.
Second-Order Asymptotics for the Classical Capacity of Image-Additive Quantum Channels
NASA Astrophysics Data System (ADS)
Tomamichel, Marco; Tan, Vincent Y. F.
2015-08-01
We study non-asymptotic fundamental limits for transmitting classical information over memoryless quantum channels, i.e. we investigate the amount of classical information that can be transmitted when a quantum channel is used a finite number of times and a fixed, non-vanishing average error is permissible. In this work we consider the classical capacity of quantum channels that are image-additive, including all classical to quantum channels, as well as the product state capacity of arbitrary quantum channels. In both cases we show that the non-asymptotic fundamental limit admits a second-order approximation that illustrates the speed at which the rate of optimal codes converges to the Holevo capacity as the blocklength tends to infinity. The behavior is governed by a new channel parameter, called channel dispersion, for which we provide a geometrical interpretation.
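In the notation standard for this literature, the second-order approximation described above takes the following form, with Φ⁻¹ the inverse of the standard normal cumulative distribution function:

\[
\log M^{*}(n,\varepsilon) \;=\; nC \;+\; \sqrt{nV}\,\Phi^{-1}(\varepsilon) \;+\; O(\log n),
\]

where \(M^{*}(n,\varepsilon)\) is the maximum number of messages distinguishable with average error \(\varepsilon\) over \(n\) channel uses, \(C\) is the Holevo capacity, and \(V\) is the channel dispersion.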
Siderits, Richard; Yates, Stacy; Rodriguez, Arelis; Lee, Tina; Rimmer, Cheryl; Roche, Mark
2011-01-01
Quick Response (QR) Codes are standard in supply management and seen with increasing frequency in advertisements. They are now present regularly in healthcare informatics and education. These 2-dimensional square bar codes, originally designed by the Toyota car company, are license-free and have a published international standard. The codes can be generated by free online software and the resulting images incorporated into presentations. The images can be scanned by "smart" phones and tablets using either the iOS or Android platforms, which link the device with the information represented by the QR code (uniform resource locator or URL, online video, text, v-calendar entries, short message service [SMS] and formatted text). Once linked to the device, the information can be viewed at any time after the original presentation, saved in the device or to a Web-based "cloud" repository, printed, or shared with others via email or Bluetooth file transfer. This paper describes how we use QR codes in our tumor board presentations, discusses their benefits, how QR codes differ from Web links, and how QR codes facilitate the distribution of educational content.
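Since the abstract notes that such codes can be generated with free software, here is a minimal sketch using the open-source Python qrcode package (pip install qrcode[pil]); the URL is a placeholder.

```python
import qrcode

# Generate a QR code image for a presentation slide and save it as a PNG.
img = qrcode.make("https://example.org/tumor-board/case-materials")
img.save("tumor_board_qr.png")  # embed the PNG in the presentation
```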
Online Hierarchical Sparse Representation of Multifeature for Robust Object Tracking
Qu, Shiru
2016-01-01
Object tracking based on sparse representation has given promising results in recent years. However, trackers under the sparse representation framework tend to overemphasize sparsity and ignore the correlation of visual information. In addition, sparse coding methods encode each local region independently and ignore the spatial neighborhood information of the image. In this paper, we propose a robust tracking algorithm. First, multiple complementary features are used to describe the object appearance; the appearance model of the tracked target is modeled by instantaneous and stable appearance features simultaneously. A two-stage sparse-coding method that takes into consideration the spatial neighborhood information of the image patch and the computational burden is used to compute the reconstructed object appearance. Then, the reliability of each tracker is measured by the tracking likelihood function of the transient and reconstructed appearance models. Finally, the most reliable tracker is obtained by a well-established particle filter framework; the training set and the template library are incrementally updated based on the current tracking results. Experimental results on different challenging video sequences show that the proposed algorithm performs well, with superior tracking accuracy and robustness. PMID:27630710
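For readers unfamiliar with the underlying operation, below is a generic sparse-coding sketch via ISTA (iterative soft-thresholding) for the lasso problem min_x 0.5·||Dx − y||² + λ||x||₁. It illustrates the kind of local sparse encoding such trackers build on; it is not the authors' two-stage method.

```python
import numpy as np

def ista(D, y, lam=0.1, n_iter=200):
    """Solve the lasso by iterative soft-thresholding."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = D.T @ (D @ x - y)              # gradient of the quadratic term
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(1)
D = rng.standard_normal((32, 64)); D /= np.linalg.norm(D, axis=0)  # dictionary
y = D[:, 3] * 2.0 + 0.01 * rng.standard_normal(32)                  # noisy atom 3
print(np.nonzero(np.round(ista(D, y), 2))[0])  # indices of large coefficients
```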
Supranormal orientation selectivity of visual neurons in orientation-restricted animals.
Sasaki, Kota S; Kimura, Rui; Ninomiya, Taihei; Tabuchi, Yuka; Tanaka, Hiroki; Fukui, Masayuki; Asada, Yusuke C; Arai, Toshiya; Inagaki, Mikio; Nakazono, Takayuki; Baba, Mika; Kato, Daisuke; Nishimoto, Shinji; Sanada, Takahisa M; Tani, Toshiki; Imamura, Kazuyuki; Tanaka, Shigeru; Ohzawa, Izumi
2015-11-16
Altered sensory experience in early life often leads to remarkable adaptations so that humans and animals can make the best use of the available information in a particular environment. By restricting visual input to a limited range of orientations in young animals, this investigation shows that stimulus selectivity, e.g., the sharpness of tuning of single neurons in the primary visual cortex, is modified to match a particular environment. Specifically, neurons tuned to an experienced orientation in orientation-restricted animals show sharper orientation tuning than neurons in normal animals, whereas the opposite was true for neurons tuned to non-experienced orientations. This sharpened tuning appears to be due to elongated receptive fields. Our results demonstrate that restricted sensory experiences can sculpt the supranormal functions of single neurons tailored for a particular environment. The above findings, in addition to the minimal population response to orientations close to the experienced one, agree with the predictions of a sparse coding hypothesis in which information is represented efficiently by a small number of activated neurons. This suggests that early brain areas adopt an efficient strategy for coding information even when animals are raised in a severely limited visual environment where sensory inputs have an unnatural statistical structure.
The Extent of Consumer Product Involvement in Paediatric Injuries
Catchpoole, Jesani; Walker, Sue; Vallmuur, Kirsten
2016-01-01
A challenge in utilising health sector injury data for Product Safety purposes is that clinically coded data have limited ability to inform regulators about product involvement in injury events, given data entry is bound by a predefined set of codes. Text narratives collected in emergency departments can potentially address this limitation by providing relevant product information with additional accompanying context. This study aims to identify and quantify consumer product involvement in paediatric injuries recorded in emergency department-based injury surveillance data. A total of 7743 paediatric injuries were randomly selected from Queensland Injury Surveillance Unit database and associated text narratives were manually reviewed to determine product involvement in the injury event. A Product Involvement Factor classification system was used to categorise these injury cases. Overall, 44% of all reviewed cases were associated with consumer products, with proximity factor (25%) being identified as the most common involvement of a product in an injury event. Only 6% were established as being directly due to the product. The study highlights the importance of utilising injury data to inform product safety initiatives where text narratives can be used to identify the type and involvement of products in injury cases. PMID:27399744
Davis, Matthew H.
2016-01-01
Successful perception depends on combining sensory input with prior knowledge. However, the underlying mechanism by which these two sources of information are combined is unknown. In speech perception, as in other domains, two functionally distinct coding schemes have been proposed for how expectations influence representation of sensory evidence. Traditional models suggest that expected features of the speech input are enhanced or sharpened via interactive activation (Sharpened Signals). Conversely, Predictive Coding suggests that expected features are suppressed so that unexpected features of the speech input (Prediction Errors) are processed further. The present work is aimed at distinguishing between these two accounts of how prior knowledge influences speech perception. By combining behavioural, univariate, and multivariate fMRI measures of how sensory detail and prior expectations influence speech perception with computational modelling, we provide evidence in favour of Prediction Error computations. Increased sensory detail and informative expectations have additive behavioural and univariate neural effects because they both improve the accuracy of word report and reduce the BOLD signal in lateral temporal lobe regions. However, sensory detail and informative expectations have interacting effects on speech representations shown by multivariate fMRI in the posterior superior temporal sulcus. When prior knowledge was absent, increased sensory detail enhanced the amount of speech information measured in superior temporal multivoxel patterns, but with informative expectations, increased sensory detail reduced the amount of measured information. Computational simulations of Sharpened Signals and Prediction Errors during speech perception could both explain these behavioural and univariate fMRI observations. However, the multivariate fMRI observations were uniquely simulated by a Prediction Error and not a Sharpened Signal model. The interaction between prior expectation and sensory detail provides evidence for a Predictive Coding account of speech perception. Our work establishes methods that can be used to distinguish representations of Prediction Error and Sharpened Signals in other perceptual domains. PMID:27846209
MacDonald, Leslie A; Pulley, LeaVonne; Hein, Misty J; Howard, Virginia J
2014-02-10
Coronary heart disease and stroke are major contributors to preventable mortality. Evidence links work conditions to these diseases; however, occupational data are perceived to be difficult to collect for large population-based cohorts. We report methodological details and the feasibility of conducting an occupational ancillary study for a large U.S. prospective cohort being followed longitudinally for cardiovascular disease and stroke. Current and historical occupational information were collected from active participants of the REasons for Geographic And Racial Differences in Stroke (REGARDS) Study. A survey was designed to gather quality occupational data among this national cohort of black and white men and women aged 45 years and older (enrolled 2003-2007). Trained staff conducted Computer-Assisted Telephone Interviews (CATI). After a brief pilot period, interviewers received additional training in the collection of narrative industry and occupation data before administering the survey to remaining cohort members. Trained coders used a computer-assisted coding system to assign U.S. Census codes for industry and occupation. All data were double coded; discrepant codes were independently resolved. Over a 2-year period, 17,648 participants provided consent and completed the occupational survey (87% response rate). A total of 20,427 jobs were assigned Census codes. Inter-rater reliability was 80% for industry and 74% for occupation. Less than 0.5% of the industry and occupation data were uncodable, compared with 12% during the pilot period. Concordance between the current and longest-held jobs was moderately high. The median time to collect employment status plus narrative and descriptive job information by CATI was 1.6 to 2.3 minutes per job. Median time to assign Census codes was 1.3 minutes per rater. The feasibility of conducting high-quality occupational data collection and coding for a large heterogeneous population-based sample was demonstrated. We found that training for interview staff was important in ensuring that narrative responses for industry and occupation were adequately specified for coding. Estimates of survey administration time and coding from digital records provide an objective basis for planning future studies. The social and environmental conditions of work are important understudied risk factors that can be feasibly integrated into large population-based health studies.
Quantum steganography and quantum error-correction
NASA Astrophysics Data System (ADS)
Shaw, Bilal A.
Quantum error-correcting codes have been the cornerstone of research in quantum information science (QIS) for more than a decade. Without their conception, quantum computers would be a footnote in the history of science. When researchers embraced the idea that we live in a world where the effects of a noisy environment cannot completely be stripped away from the operations of a quantum computer, the natural way forward was to import classical coding theory into the quantum arena, giving birth to quantum error-correcting codes which could help mitigate the debilitating effects of decoherence on quantum data. We first talk about the six-qubit quantum error-correcting code and show its connections to entanglement-assisted error-correcting coding theory and then to subsystem codes. This code bridges the gap between the five-qubit (perfect) and Steane codes. We discuss two methods to encode one qubit into six physical qubits. Each of the two examples corrects an arbitrary single-qubit error. The first example is a degenerate six-qubit quantum error-correcting code. We explicitly provide the stabilizer generators, encoding circuits, codewords, logical Pauli operators, and logical CNOT operator for this code. We also show how to convert this code into a non-trivial subsystem code that saturates the subsystem Singleton bound. We then prove that a six-qubit code without entanglement assistance cannot simultaneously possess a Calderbank-Shor-Steane (CSS) stabilizer and correct an arbitrary single-qubit error. A corollary of this result is that the Steane seven-qubit code is the smallest single-error-correcting CSS code. Our second example is the construction of a non-degenerate six-qubit CSS entanglement-assisted code. This code uses one bit of entanglement (an ebit) shared between the sender (Alice) and the receiver (Bob) and corrects an arbitrary single-qubit error. The code we obtain is globally equivalent to the Steane seven-qubit code and thus corrects an arbitrary error on the receiver's half of the ebit as well. We prove that this code is the smallest code with a CSS structure that uses only one ebit and corrects an arbitrary single-qubit error on the sender's side. We discuss the advantages and disadvantages of each of the two codes. In the second half of this thesis we explore the relatively uncharted area of quantum steganography. Steganography is the process of hiding secret information by embedding it in an "innocent" message. We present protocols for hiding quantum information in a codeword of a quantum error-correcting code passing through a channel. Using either a shared classical secret key or shared entanglement, Alice disguises her information as errors in the channel. Bob can retrieve the hidden information, but an eavesdropper (Eve) with the power to monitor the channel, but without the secret key, cannot distinguish the message from channel noise. We analyze how difficult it is for Eve to detect the presence of secret messages, and estimate rates of steganographic communication and secret key consumption for certain protocols. We also provide an example of how Alice hides quantum information in the perfect code when the underlying channel between Bob and her is the depolarizing channel. Using this scheme Alice can hide up to four stego-qubits.
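A classical toy analogue of the steganographic idea - a shared key selects positions in a codeword, and message bits masquerade as channel errors at those positions - is sketched below. This is purely illustrative and is not the quantum protocol of the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 31
key_positions = rng.choice(n, size=4, replace=False)  # shared secret key

def embed(codeword, message_bits):
    """Flip codeword bits at key positions so the message looks like noise."""
    out = codeword.copy()
    out[key_positions] ^= np.array(message_bits)
    return out

def extract(received, original_codeword):
    """Bob, knowing the key, reads the 'errors' back out as message bits."""
    return (received ^ original_codeword)[key_positions]

cw = rng.integers(0, 2, size=n)
stego = embed(cw, [1, 0, 1, 1])
print(extract(stego, cw))  # -> [1 0 1 1]
```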
Bar Coding and Tracking in Pathology.
Hanna, Matthew G; Pantanowitz, Liron
2016-03-01
Bar coding and specimen tracking are intricately linked to pathology workflow and efficiency. In the pathology laboratory, bar coding facilitates many laboratory practices, including specimen tracking, automation, and quality management. Data obtained from bar coding can be used to identify, locate, standardize, and audit specimens to achieve maximal laboratory efficiency and patient safety. Variables that need to be considered when implementing and maintaining a bar coding and tracking system include assets to be labeled, bar code symbologies, hardware, software, workflow, and laboratory and information technology infrastructure as well as interoperability with the laboratory information system. This article addresses these issues, primarily focusing on surgical pathology. Copyright © 2016 Elsevier Inc. All rights reserved.
Clustering of neural code words revealed by a first-order phase transition
NASA Astrophysics Data System (ADS)
Huang, Haiping; Toyoizumi, Taro
2016-06-01
A network of neurons in the central nervous system collectively represents information by its spiking activity states. Typically observed states, i.e., code words, occupy only a limited portion of the state space due to constraints imposed by network interactions. Geometrical organization of code words in the state space, critical for neural information processing, is poorly understood due to its high dimensionality. Here, we explore the organization of neural code words using retinal data by computing the entropy of code words as a function of Hamming distance from a particular reference codeword. Specifically, we report that the retinal code words in the state space are divided into multiple distinct clusters separated by entropy-gaps, and that this structure is shared with well-known associative memory networks in a recallable phase. Our analysis also elucidates a special nature of the all-silent state. The all-silent state is surrounded by the densest cluster of code words and located within a reachable distance from most code words. This code-word space structure quantitatively predicts typical deviation of a state-trajectory from its initial state. Altogether, our findings reveal a non-trivial heterogeneous structure of the code-word space that shapes information representation in a biological network.
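A minimal sketch of the entropy-versus-Hamming-distance analysis described above, run on synthetic binary words rather than retinal data: bin observed code words by their Hamming distance from a reference word and compute the entropy of the word distribution within each shell.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(7)
# Synthetic code words: the all-silent state is made deliberately common.
words = [tuple(rng.integers(0, 2, 10) * (rng.random() < 0.3)) for _ in range(5000)]
ref = tuple([0] * 10)                                   # reference: all-silent state

counts_by_d = {}
for w, c in Counter(words).items():
    d = sum(a != b for a, b in zip(w, ref))             # Hamming distance to reference
    counts_by_d.setdefault(d, []).append(c)

for d in sorted(counts_by_d):
    p = np.array(counts_by_d[d], float); p /= p.sum()
    print(d, round(-(p * np.log2(p)).sum(), 3))         # entropy of words at distance d
```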
Visual Search Asymmetries within Color-Coded and Intensity-Coded Displays
ERIC Educational Resources Information Center
Yamani, Yusuke; McCarley, Jason S.
2010-01-01
Color and intensity coding provide perceptual cues to segregate categories of objects within a visual display, allowing operators to search more efficiently for needed information. Even within a perceptually distinct subset of display elements, however, it may often be useful to prioritize items representing urgent or task-critical information.…
Delays in using chromatic and luminance information to correct rapid reaches.
Kane, Adam; Wade, Alex; Ma-Wyatt, Anna
2011-09-07
People can use feedback to make online corrections to movements but only if there is sufficient time to integrate the new information and make the correction. A key variable in this process is therefore the speed at which the new information about the target location is coded. Conduction velocities for chromatic signals are lower than for achromatic signals so it may take longer to correct reaches to chromatic stimuli. In addition to this delay, the sensorimotor system may prefer achromatic information over the chromatic information as delayed information may be less valuable when movements are made under time pressure. A down-weighting of chromatic information may result in additional latencies for chromatically directed reaches. In our study, participants made online corrections to reaches to achromatic, (L-M)-cone, and S-cone stimuli. Our chromatic stimuli were carefully adjusted to minimize stimulation of achromatic pathways, and we equated stimuli both in terms of detection thresholds and also by their estimated neural responses. Similar stimuli were used throughout the subjective adjustments and final reaching experiment. Using this paradigm, we found that responses to achromatic stimuli were only slightly faster than responses to (L-M)-cone and S-cone stimuli. We conclude that the sensorimotor system treats chromatic and achromatic information similarly and that the delayed chromatic responses primarily reflect early conduction delays.
Cresswell, Kathrin; Morrison, Zoe; Sheikh, Aziz; Kalra, Dipak
2012-01-01
Background We sought to understand how clinical information relating to the management of depression is routinely coded in different clinical settings and the perspectives of and implications for different stakeholders with a view to understanding how these may be aligned. Materials and Methods Qualitative investigation exploring the views of a purposefully selected range of healthcare professionals, managers, and clinical coders spanning primary and secondary care. Results Our dataset comprised 28 semi-structured interviews, a focus group, documents relating to clinical coding standards and participant observation of clinical coding activities. We identified a range of approaches to coding clinical information including templates and order entry systems. The challenges inherent in clearly establishing a diagnosis, identifying appropriate clinical codes and possible implications of diagnoses for patients were particularly prominent in primary care. Although a range of managerial and research benefits were identified, there were no direct benefits from coded clinical data for patients or professionals. Secondary care staff emphasized the role of clinical coders in ensuring data quality, which was at odds with the policy drive to increase real-time clinical coding. Conclusions There was overall no evidence of clear-cut direct patient care benefits to inform immediate care decisions, even in primary care where data on patients with depression were more extensively coded. A number of important secondary uses were recognized by healthcare staff, but the coding of clinical data to serve these ends was often poorly aligned with clinical practice and patient-centered considerations. The current international drive to encourage clinical coding by healthcare professionals during the clinical encounter may need to be critically examined. PMID:22937106
Reinventing radiology reimbursement.
Marshall, John; Adema, Denise
2005-01-01
Lee Memorial Health System (LMHS), located in southwest Florida, consists of 5 hospitals, a home health agency, a skilled nursing facility, multiple outpatient centers, walk-in medical centers, and primary care physician offices. LMHS annually performs more than 300,000 imaging procedures with gross imaging revenues exceeding $350 million. In fall 2002, LMHS received the results of an independent audit of its IR coding. The overall IR coding error rate was determined to be 84.5%. The projected net financial impact of these errors was an annual reimbursement loss of $182,000. To address the issues of coding errors and reimbursement loss, LMHS implemented its clinical reimbursement specialist (CRS) system in October 2003, as an extension of financial services' reimbursement division. LMHS began with CRSs in 3 service lines: emergency department, cardiac catheterization, and radiology. These 3 CRSs coordinate all facets of their respective areas' chargemaster, patient charges, coding, and reimbursement functions while serving as a resident coding expert within their clinical areas. The radiology reimbursement specialist (RRS) combines an experienced radiologic technologist, interventional technologist, medical records coder, financial auditor, reimbursement specialist, and biller into a single position. The RRS's radiology experience and technologist knowledge are key assets to resolving coding conflicts and handling complex interventional coding. In addition, performing a daily charge audit and an active code review are essential if an organization is to eliminate coding errors. One of the inherent effects of eliminating coding errors is the capturing of additional RVUs and units of service. During its first year, based on account level detail, the RRS system increased radiology productivity through the additional capture of just more than 3,000 RVUs and 1,000 additional units of service. In addition, the physicians appreciate having someone who "keeps up with all the coding changes" and looks out for the charges. By assisting a few physicians' staff with coding questions, providing coding updates, and allowing them to sit in on educational sessions, at least 2 physicians have transferred some of their volume to LMHS from a competitor. The provision of a "clean account," without coding errors, allows the biller to avoid the rework and billing delays caused by coding issues. During the first quarter of the RRS system, the billers referred an average of 9 accounts per day for coding resolution. During the fourth quarter of the system, these referrals were reduced to less than one per day. Prior to the RRS system, resolving these issues took an average of 4 business days. Now the conflicts are resolved within 24 hours.
78 FR 55210 - Pennsylvania Regulatory Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-10
... revise its program at 25 Pa. Code 86.1, 86.3, and 86.17, to reflect the addition of new definitions and... application fee'' at 25 Pa. Code 86.1 Pennsylvania proposes the addition of a new term; the definition of... Pennsylvania definition of ``minor amendment,'' found at 25 Pa. Code 92a.2, directly mirrors, with a few...
Functional analysis of ultra high information rates conveyed by rat vibrissal primary afferents
Chagas, André M.; Theis, Lucas; Sengupta, Biswa; Stüttgen, Maik C.; Bethge, Matthias; Schwarz, Cornelius
2013-01-01
Sensory receptors determine the type and the quantity of information available for perception. Here, we quantified and characterized the information transferred by primary afferents in the rat whisker system using neural system identification. Quantification of “how much” information is conveyed by primary afferents, using the direct method (DM), a classical information theoretic tool, revealed that primary afferents transfer huge amounts of information (up to 529 bits/s). Information theoretic analysis of instantaneous spike-triggered kinematic stimulus features was used to gain functional insight on “what” is coded by primary afferents. Amongst the kinematic variables tested—position, velocity, and acceleration—primary afferent spikes encoded velocity best. The other two variables contributed to information transfer, but only if combined with velocity. We further revealed three additional characteristics that play a role in information transfer by primary afferents. Firstly, primary afferent spikes show preference for well separated multiple stimuli (i.e., well separated sets of combinations of the three instantaneous kinematic variables). Secondly, neurons are sensitive to short strips of the stimulus trajectory (up to 10 ms pre-spike time), and thirdly, they show spike patterns (precise doublet and triplet spiking). In order to deal with these complexities, we used a flexible probabilistic neuron model fitting mixtures of Gaussians to the spike-triggered stimulus distributions, which quantitatively captured the contribution of the mentioned features and allowed us to achieve a full functional analysis of the total information rate indicated by the DM. We found that instantaneous position, velocity, and acceleration explained about 50% of the total information rate. Adding a 10 ms pre-spike interval of stimulus trajectory achieved 80–90%. The final 10–20% were found to be due to non-linear coding by spike bursts. PMID:24367295
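For reference, the direct method mentioned above estimates the information rate as the difference between total and noise entropy rates of binarized spike "words" (following the classical formulation; the notation here is generic, not the authors'):

\[
I_{\mathrm{DM}} \;=\; \frac{1}{T}\Big(H_{\mathrm{total}} - H_{\mathrm{noise}}\Big),
\qquad
H \;=\; -\sum_{w} p(w)\,\log_2 p(w),
\]

where the words \(w\) are spike patterns of duration \(T\), \(H_{\mathrm{total}}\) is computed from the word distribution across the whole stimulus, and \(H_{\mathrm{noise}}\) from the average of the word distributions at fixed times of a repeated stimulus.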
Maund, Emma; Tendal, Britta; Hróbjartsson, Asbjørn; Lundh, Andreas; Gøtzsche, Peter C
2014-06-04
To assess the effects of coding and coding conventions on summaries and tabulations of adverse events data on suicidality within clinical study reports. Systematic electronic search for adverse events of suicidality in tables, narratives, and listings of adverse events in individual patients within clinical study reports. Where possible, for each event we extracted the original term reported by the investigator, the term as coded by the medical coding dictionary, medical coding dictionary used, and the patient's trial identification number. Using the patient's trial identification number, we attempted to reconcile data on the same event between the different formats for presenting data on adverse events within the clinical study report. 9 randomised placebo controlled trials of duloxetine for major depressive disorder submitted to the European Medicines Agency for marketing approval. Clinical study reports obtained from the EMA in 2011. Six trials used the medical coding dictionary COSTART (Coding Symbols for a Thesaurus of Adverse Reaction Terms) and three used MedDRA (Medical Dictionary for Regulatory Activities). Suicides were clearly identifiable in all formats of adverse event data in clinical study reports. Suicide attempts presented in tables included both definitive and provisional diagnoses. Suicidal ideation and preparatory behaviour were obscured in some tables owing to the lack of specificity of the medical coding dictionary, especially COSTART. Furthermore, we found one event of suicidal ideation described in narrative text that was absent from tables and adverse event listings of individual patients. The reason for this is unclear, but may be due to the coding conventions used. Data on adverse events in tables in clinical study reports may not accurately represent the underlying patient data because of the medical dictionaries and coding conventions used. In clinical study reports, the listings of adverse events for individual patients and narratives of adverse events can provide additional information, including original investigator reported adverse event terms, which can enable a more accurate estimate of harms. © Maund et al 2014.
Using QR codes to enable quick access to information in acute cancer care.
Upton, Joanne; Olsson-Brown, Anna; Marshall, Ernie; Sacco, Joseph
2017-05-25
Quick access to toxicity management information ensures timely access to steroids/immunosuppressive treatment for cancer patients experiencing immune-related adverse events, thus reducing length of hospital stays or avoiding hospital admission entirely. This article discusses a project to add a QR (quick response) code to a patient-held immunotherapy alert card. As QR code generation is free and the immunotherapy clinical management algorithms were already publicly available through the trust's clinical network website, the costs of integrating a QR code into the alert card, after printing, were low, while the potential benefits are numerous. Patient-held alert cards are widely used for patients receiving anti-cancer treatment, and this established standard of care has been modified to enable rapid access of information through the incorporation of a QR code.
A Combinatorial Geometry Computer Description of the M9 ACE (Armored Combat Earthmover) Vehicle
1984-12-01
The program requires as input the M9 target description as processed by the Geometric Information for Targets (GIFT) computer code. The first step is... model of the target. This COM-GEOM target description is used as input to the GIFT code. Among other things, the GIFT code traces shotlines through a COM-GEOM description from any specified aspect, listing pertinent information about each component hit.
Frohlich, Dennis Owen; Zmyslinski-Seelig, Anne
2012-01-01
The purpose of this study was to explore the types of social support messages YouTube users posted on medical videos. Specifically, the study compared messages posted on inflammatory bowel disease-related videos and ostomy-related videos. Additionally, the study analyzed the differences in social support messages posted on lay-created videos and professionally created videos. Conducting a content analysis, the researchers unitized the comments on each video; the total number of thought units amounted to 5,960. Researchers coded each thought unit using a coding scheme modified from a previous study. YouTube users posted informational support messages most frequently (65.1%), followed by emotional support messages (18.3%) and instrumental support messages (8.2%).
Novel numerical and graphical representation of DNA sequences and proteins.
Randić, M; Novic, M; Vikić-Topić, D; Plavsić, D
2006-12-01
We have introduced novel numerical and graphical representations of DNA, which offer a simple and unique characterization of DNA sequences. The numerical representation of a DNA sequence is given as a sequence of real numbers derived from a unique graphical representation of the standard genetic code. There is no loss of information on the primary structure of a DNA sequence associated with this numerical representation. The novel representations are illustrated with the coding sequences of the first exon of beta-globin gene of half a dozen species in addition to human. The method can be extended to proteins as is exemplified by humanin, a 24-aa peptide that has recently been identified as a specific inhibitor of neuronal cell death induced by familial Alzheimer's disease mutant genes.
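As a generic illustration only - the paper's actual values are derived from its specific graphical representation of the genetic code, which is not reproduced here - a numerical representation of a DNA sequence might look like the following.

```python
# Hypothetical nucleotide-to-number assignment; a placeholder, not the
# authors' mapping, which is derived from their graphical genetic-code layout.
VALUES = {"A": 1.0, "C": 2.0, "G": 3.0, "T": 4.0}

def numeric_representation(seq: str) -> list[float]:
    """Map a DNA string to a sequence of real numbers, one per nucleotide."""
    return [VALUES[nt] for nt in seq]

print(numeric_representation("ATGGTG"))  # -> [1.0, 4.0, 3.0, 3.0, 4.0, 3.0]
```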
Graphical User Interface for the NASA FLOPS Aircraft Performance and Sizing Code
NASA Technical Reports Server (NTRS)
Lavelle, Thomas M.; Curlett, Brian P.
1994-01-01
XFLOPS is an X-Windows/Motif graphical user interface for the aircraft performance and sizing code FLOPS. This new interface simplifies entering data and analyzing results, thereby reducing analysis time and errors. Data entry is simpler because input windows are used for each of the FLOPS namelists. These windows contain fields to input the variable's values along with help information describing the variable's function. Analyzing results is simpler because output data are displayed rapidly. This is accomplished in two ways. First, because the output file has been indexed, users can view particular sections with the click of a mouse button. Second, because menu picks have been created, users can plot engine and aircraft performance data. In addition, XFLOPS has a built-in help system and complete on-line documentation for FLOPS.
Potential digitization/compression techniques for Shuttle video
NASA Technical Reports Server (NTRS)
Habibi, A.; Batson, B. H.
1978-01-01
The Space Shuttle initially will be using a field-sequential color television system but it is possible that an NTSC color TV system may be used for future missions. In addition to downlink color TV transmission via analog FM links, the Shuttle will use a high resolution slow-scan monochrome system for uplink transmission of text and graphics information. This paper discusses the characteristics of the Shuttle video systems, and evaluates digitization and/or bandwidth compression techniques for the various links. The more attractive techniques for the downlink video are based on a two-dimensional DPCM encoder that utilizes temporal and spectral as well as the spatial correlation of the color TV imagery. An appropriate technique for distortion-free coding of the uplink system utilizes two-dimensional HCK codes.
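To make the DPCM idea concrete, here is a minimal spatial-only 2-D DPCM encoder sketch: each pixel is predicted from its reconstructed left and upper neighbours and the prediction residual is uniformly quantized. The Shuttle coder described above additionally exploits temporal and spectral correlation, which this sketch omits.

```python
import numpy as np

def dpcm2d_encode(img: np.ndarray, step: int = 8) -> np.ndarray:
    """Return quantized prediction residuals for an 8-bit grayscale image."""
    img = img.astype(int)
    recon = np.zeros_like(img)            # decoder-side reconstruction
    q = np.zeros_like(img)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            left = recon[r, c - 1] if c else 128
            top = recon[r - 1, c] if r else 128
            pred = (left + top) // 2                     # 2-D predictor
            q[r, c] = round((img[r, c] - pred) / step)   # quantized residual
            recon[r, c] = pred + q[r, c] * step          # track what decoder sees
    return q

print(dpcm2d_encode(np.full((4, 4), 140))[:2])
```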
Biological Information Transfer Beyond the Genetic Code: The Sugar Code
NASA Astrophysics Data System (ADS)
Gabius, H.-J.
In the era of genetic engineering, cloning, and genome sequencing the focus of research on the genetic code has received an even further accentuation in the public eye. In attempting, however, to understand intra- and intercellular recognition processes comprehensively, the two biochemical dimensions established by nucleic acids and proteins are not sufficient to satisfactorily explain all molecular events in, for example, cell adhesion or routing. The consideration of further code systems is essential to bridge this gap. A third biochemical alphabet forming code words with an information storage capacity second to no other substance class in rather small units (words, sentences) is established by monosaccharides (letters). As hardware oligosaccharides surpass peptides by more than seven orders of magnitude in the theoretical ability to build isomers, when the total of conceivable hexamers is calculated. In addition to the sequence complexity, the use of magnetic resonance spectroscopy and molecular modeling has been instrumental in discovering that even small glycans can often reside in not only one but several distinct low-energy conformations (keys). Intriguingly, conformers can display notably different capacities to fit snugly into the binding site of nonhomologous receptors (locks). This process, experimentally verified for two classes of lectins, is termed "differential conformer selection." It adds potential for shifts of the conformer equilibrium to modulate ligand properties dynamically and reversibly to the well-known changes in sequence (including anomeric positioning and linkage points) and in pattern of substitution, for example, by sulfation. In the intimate interplay with sugar receptors (lectins, enzymes, and antibodies) the message of coding units of the sugar code is deciphered. Their recognition will trigger postbinding signaling and the intended biological response. Knowledge about the driving forces for the molecular rendezvous, i.e., contributions of bidentate or cooperative hydrogen bonds, dispersion forces, stacking, and solvent rearrangement, will enable the design of high-affinity ligands or mimetics thereof. They embody clinical applications reaching from receptor localization in diagnostic pathology to cell type-selective targeting of drugs and inhibition of undesired cell adhesion in bacterial/viral infections, inflammation, or metastasis.
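The peptide side of the isomer comparison above is straightforward to make explicit; the oligosaccharide total depends on which structural degrees of freedom are counted, so only the peptide figure is shown here:

\[
\#\{\text{hexapeptides}\} \;=\; 20^{6} \;=\; 6.4\times 10^{7},
\]

while each glycosidic bond in a hexasaccharide additionally varies in linkage position, anomeric configuration (\(\alpha\) or \(\beta\)), ring size, and branching, multiplying the count per position and producing the orders-of-magnitude excess cited above.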
The Coding of Biological Information: From Nucleotide Sequence to Protein Recognition
NASA Astrophysics Data System (ADS)
Štambuk, Nikola
The paper reviews the classic results of Swanson, Dayhoff, Grantham, Blalock and Root-Bernstein, which link genetic code nucleotide patterns to the protein structure, evolution and molecular recognition. Symbolic representation of the binary addresses defining particular nucleotide and amino acid properties is discussed, with consideration of: structure and metric of the code, direct correspondence between amino acid and nucleotide information, and molecular recognition of the interacting protein motifs coded by the complementary DNA and RNA strands.
Information Theory, Inference and Learning Algorithms
NASA Astrophysics Data System (ADS)
Mackay, David J. C.
2003-10-01
Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
Anonymization of DICOM Electronic Medical Records for Radiation Therapy
Newhauser, Wayne; Jones, Timothy; Swerdloff, Stuart; Newhauser, Warren; Cilia, Mark; Carver, Robert; Halloran, Andy; Zhang, Rui
2014-01-01
Electronic medical records (EMR) and treatment plans are used in research on patient outcomes and radiation effects. In many situations researchers must remove protected health information (PHI) from EMRs. The literature contains several studies describing the anonymization of generic Digital Imaging and Communication in Medicine (DICOM) files and DICOM image sets, but no publications were found that discuss the anonymization of DICOM radiation therapy plans, a key component of an EMR in a cancer clinic. In addition, we were unable to find a commercial software tool that met the minimum requirements for anonymization and preservation of data integrity for radiation therapy research. The purpose of this study was to develop a prototype software code to meet the requirements for the anonymization of radiation therapy treatment plans and to develop a way to validate that code and demonstrate that it properly anonymized treatment plans and preserved data integrity. We extended an open-source code to process all relevant PHI and to allow for the automatic anonymization of multiple EMRs. The prototype code successfully anonymized multiple treatment plans in less than 1 minute per patient. We also tested commercial optical character recognition (OCR) algorithms for the detection of burned-in text on the images, but they were unable to reliably recognize text. In addition, we developed and tested an image filtering algorithm that allowed us to isolate and redact alpha-numeric text from a test radiograph. Validation tests verified that PHI was anonymized and that data integrity, such as the relationship between DICOM unique identifiers (UIDs), was preserved. PMID:25147130
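The study's own extended code is not reproduced here; as a sketch of the tag-level redaction step only, the following uses the open-source pydicom library (an assumption of convenience, not the tool the authors extended) and a hypothetical, deliberately incomplete list of PHI attributes.

```python
import pydicom

# Hypothetical subset of PHI attributes; a real tool must cover the full
# DICOM confidentiality profile plus RT-plan-specific fields.
PHI_TAGS = ["PatientName", "PatientID", "PatientBirthDate", "OtherPatientIDs"]

def anonymize(in_path, out_path):
    """Blank a few PHI attributes and drop vendor-private elements."""
    ds = pydicom.dcmread(in_path)
    for tag in PHI_TAGS:
        if hasattr(ds, tag):
            setattr(ds, tag, "")       # blank each PHI attribute present
    ds.remove_private_tags()           # remove vendor-private data elements
    ds.save_as(out_path)
```

Note that a production tool must also remap DICOM UIDs consistently across all files of a plan to preserve the relationships the study validated; this sketch deliberately omits that step.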
Continuous-variable quantum network coding for coherent states
NASA Astrophysics Data System (ADS)
Shang, Tao; Li, Ke; Liu, Jian-wei
2017-04-01
As far as the spectral characteristic of quantum information is concerned, the existing quantum network coding schemes can be regarded as discrete-variable quantum network coding schemes. Considering the practical advantages of continuous variables, in this paper we explore two feasible continuous-variable quantum network coding (CVQNC) schemes. Basic operations and CVQNC schemes are both provided. The first scheme is based on Gaussian cloning and ADD/SUB operators and can transmit two coherent states across the network with a fidelity of 1/2, while the second scheme utilizes continuous-variable quantum teleportation and can transmit two coherent states perfectly. By encoding classical information on quantum states, quantum network coding schemes can be utilized to transmit classical information. Scheme analysis shows that, compared with the discrete-variable paradigms, the proposed CVQNC schemes provide better network throughput from the viewpoint of classical information transmission. By modulating the amplitude and phase quadratures of coherent states with classical characters, the first and second schemes can transmit 4 log_2 N and 2 log_2 N bits of information per network use, respectively.
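As a worked example of the quoted throughput formulas (illustrative arithmetic only, with a hypothetical alphabet size):

```python
from math import log2

def bits_per_network_use(N, scheme):
    """Scheme 1 (Gaussian cloning + ADD/SUB): 4*log2(N) bits per use;
    scheme 2 (CV teleportation): 2*log2(N) bits per use."""
    return (4 if scheme == 1 else 2) * log2(N)

# With a hypothetical alphabet of N = 256 modulation symbols:
print(bits_per_network_use(256, 1), bits_per_network_use(256, 2))  # 32.0 16.0
```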
Patel, Mehul D; Rose, Kathryn M; Owens, Cindy R; Bang, Heejung; Kaufman, Jay S
2012-03-01
Occupational data are a common source of workplace exposure and socioeconomic information in epidemiologic research. We compared the performance of two occupation coding methods, automated software and a manual coder, using occupation and industry titles from U.S. historical records. We collected parental occupational data from 1920-40s birth certificates, Census records, and city directories on 3,135 deceased individuals in the Atherosclerosis Risk in Communities (ARIC) study. Unique occupation-industry narratives were assigned codes by a manual coder and by the Standardized Occupation and Industry Coding software program. We calculated agreement between the coding methods for classification into major Census occupational groups. The automated coding software assigned codes to 71% of occupations and 76% of industries. Of the subset coded by software, 73% of occupation codes and 69% of industry codes matched between automated and manual coding. For major occupational groups, agreement improved to 89% (kappa = 0.86). Automated occupational coding is a cost-efficient alternative to manual coding. However, some manual coding is required to handle incomplete information. We found substantial variability between coders in the assignment of specific occupations, although less for major groups.
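The agreement statistic reported (kappa = 0.86) is standard Cohen's kappa, which corrects raw agreement for chance; a minimal sketch of the generic computation (not the study's code, with hypothetical example labels):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical assignments:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    p_obs = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_exp = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (p_obs - p_exp) / (1 - p_exp)

a = ["manager", "clerk", "laborer", "clerk", "manager"]   # hypothetical codes
b = ["manager", "clerk", "clerk", "clerk", "manager"]
print(cohens_kappa(a, b))  # 0.666...
```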
VizieR Online Data Catalog: Mercury-T code (Bolmont+, 2015)
NASA Astrophysics Data System (ADS)
Bolmont, E.; Raymond, S. N.; Leconte, J.; Hersant, F.; Correia, A. C. M.
2015-11-01
The major addition to Mercury provided in Mercury-T is the inclusion of tidal forces and torques. We also added the effects of general relativity and rotation-induced deformation. We explain in the following sections how these effects were incorporated in the code. We also give the planet and star/BD/Jupiter parameters that are implemented in the code. The link to this code and the manual can also be found here: http://www.emelinebolmont.com/research-interests (2 data files).
Bradshaw, Debbie; Groenewald, Pamela; Bourne, David E.; Mahomed, Hassan; Nojilana, Beatrice; Daniels, Johan; Nixon, Jo
2006-01-01
OBJECTIVE: To review the quality of the coding of the cause of death (COD) statistics and assess the mortality information needs of the City of Cape Town. METHODS: Using an action research approach, a study was set up to investigate the quality of COD information, the accuracy of COD coding and consistency of coding practices in the larger health subdistricts. Mortality information needs and the best way of presenting the statistics to assist health managers were explored. FINDINGS: Useful information was contained in 75% of death certificates, but nearly 60% had only a single cause certified; 55% of forms were coded accurately. Disagreement was mainly because routine coders coded the immediate instead of the underlying COD. An abridged classification of COD, based on causes of public health importance, prevalent causes and selected combinations of diseases was implemented with training on underlying cause. Analysis of the 2001 data identified the leading causes of death and premature mortality and illustrated striking differences in the disease burden and profile between health subdistricts. CONCLUSION: Action research is particularly useful for improving information systems and revealed the need to standardize the coding practice to identify underlying cause. The specificity of the full ICD classification is beyond the level of detail on the death certificates currently available. An abridged classification for coding provides a practical tool appropriate for local level public health surveillance. Attention to the presentation of COD statistics is important to enable the data to inform decision-makers. PMID:16583080
P-Code-Enhanced Encryption-Mode Processing of GPS Signals
NASA Technical Reports Server (NTRS)
Young, Lawrence; Meehan, Thomas; Thomas, Jess B.
2003-01-01
A method of processing signals in a Global Positioning System (GPS) receiver has been invented to enable the receiver to recover some of the information that is otherwise lost when GPS signals are encrypted at the transmitters. The need for this method arises because, at the option of the military, precision GPS code (P-code) is sometimes encrypted by a secret binary code, denoted the A code. Authorized users can recover the full signal with knowledge of the A-code. However, even in the absence of knowledge of the A-code, one can track the encrypted signal by use of an estimate of the A-code. The present invention is a method of making and using such an estimate. In comparison with prior such methods, this method makes it possible to recover more of the lost information and obtain greater accuracy.
HIPAA brings new requirements, new opportunities.
Moynihan, J J; McLure, M L
2000-03-01
The passage of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) brought with it the need for Federal rules to implement the act's simplification and cost-reduction efforts. HHS has published proposed rules related to security for the electronic transmission of health information, privacy of individually identifiable health information, transactions and code sets, and national provider and employer identifiers. Additional proposed rules will be published this year for claims attachments and health plan identifiers. Although HIPAA does not require providers to conduct business electronically, the new standards give providers the opportunity to reduce healthcare administrative costs significantly and undertake electronic commerce efficiently and cost-effectively.
3D Encoding of Musical Score Information and the Playback Method Used by the Cellular Phone
NASA Astrophysics Data System (ADS)
Kubo, Hitoshi; Sugiura, Akihiko
Recently, 3G cellular phones that can record movies have become widespread as their digital camera functions have improved. 2D codes offer accurate readout and high operability, and they have spread as a means of transmitting information. However, the symbol becomes larger and more complex as the amount of information in a 2D code increases. 3D codes were proposed to solve this, but they require special readout equipment and are specialized for augmented-reality applications, which makes them difficult to apply to cellular phones. We therefore propose a 3D code that can be recognized with the movie-shooting function of a cellular phone, and we use it to encode musical score information. We apply Gray code to properties of the music to encode it, and we verify the effectiveness of the approach.
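Binary-reflected Gray code maps adjacent integers to codewords that differ in a single bit, which limits the damage of single-bit readout errors; a minimal sketch of the generic conversion (the authors' mapping onto score symbols is not specified here):

```python
def to_gray(n: int) -> int:
    """Binary-reflected Gray code: consecutive values differ in one bit."""
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    """Invert the Gray mapping by folding the bits back down."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Round-trip check over one byte's worth of values:
assert all(from_gray(to_gray(i)) == i for i in range(256))
```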
NASA Astrophysics Data System (ADS)
Zhao, Shengmei; Wang, Le; Liang, Wenqiang; Cheng, Weiwen; Gong, Longyan
2015-10-01
In this paper, we propose a high-performance optical encryption (OE) scheme based on computational ghost imaging (GI) with a QR code and the compressive sensing (CS) technique, named the QR-CGI-OE scheme. N random phase screens, generated by Alice, serve as a secret key shared with her authorized user, Bob. The information is first encoded by Alice with a QR code, and the QR-coded image is then encrypted with the aid of a computational ghost imaging optical system. Here, the measurement results from the GI optical system's bucket detector are the encrypted information and are transmitted to Bob. With the key, Bob decrypts the encrypted information to obtain the QR-coded image using the GI and CS techniques, and further recovers the information by QR decoding. The experimental and numerically simulated results show that authorized users can recover the original image completely, whereas eavesdroppers cannot acquire any information about the image even when the eavesdropping ratio (ER) is as high as 60% at the given number of measurements. In the proposed scheme, the number of bits sent from Alice to Bob is reduced considerably and the robustness is enhanced significantly. Meanwhile, the number of measurements in the GI system is reduced and the quality of the reconstructed QR-coded image is improved.
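The bucket-detector correlation at the heart of computational GI can be sketched as follows: a toy simulation in which hypothetical random intensity patterns stand in for the phase screens and a plain second-order correlation replaces the paper's CS recovery and QR layer.

```python
import numpy as np

rng = np.random.default_rng(0)
obj = np.zeros((32, 32))
obj[8:24, 8:24] = 1.0                       # toy object (stand-in for a QR image)

M = 4000                                    # number of measurements
patterns = rng.random((M, 32, 32))          # the shared key: random patterns
# Bucket detector: one total-intensity value per pattern.
bucket = np.tensordot(patterns, obj, axes=([1, 2], [0, 1]))

# Second-order correlation reconstruction: G = <B*I> - <B><I>.
G = (bucket[:, None, None] * patterns).mean(0) - bucket.mean() * patterns.mean(0)
```

Without the key (the patterns), the bucket values alone carry no spatial information, which is the intuition behind the scheme's security claim.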
1987-12-01
… mRNA), and S1-analysis showed that anti-IgM and phorbol esters … suppressed mRNA coding for the secreted form of IgM, showing that these additives affect … constructs were utilized containing the prokaryotic CAT gene … [cel]lular viruses within a few hours in different body fluids and may be used for general virus diagnosis. Thiophilic adsorption for the puri[fication] … and can be an alternative method to protein A affinity chromatography, especial[ly] …
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-28
... e-Tag Information to Commission Staff; Notice Specifying webRegistry Code. In Order No. 771, the... stated that, "following issuance of this Final Rule and the Commission's registration in the OATI web... in the Purchasing-Seller Entity section of OATI webRegistry. This code should be used to designate...
Chan, Wen-Ling; Yang, Wen-Kuang; Huang, Hsien-Da; Chang, Jan-Gowth
2013-01-01
RNA interference (RNAi) is a gene silencing process within living cells, which is controlled by the RNA-induced silencing complex in a sequence-specific manner. In flies and mice, pseudogene transcripts can be processed into short interfering RNAs (siRNAs) that regulate protein-coding genes through the RNAi pathway. Following these findings, we constructed an innovative and comprehensive database to elucidate siRNA-mediated mechanisms in human transcribed pseudogenes (TPGs). To investigate TPGs producing siRNAs that regulate protein-coding genes, we mapped the TPGs to small RNAs (sRNAs) that were supported by publicly available deep sequencing data from various sRNA libraries and constructed the TPG-derived siRNA-target interactions. In addition, we also show that TPGs can act as targets for miRNAs that actually regulate the parental gene. To enable the systematic compilation and updating of these results and additional information, we have developed a database, pseudoMap, capturing various types of information, including sequence data, TPG and cognate annotation, deep sequencing data, RNA-folding structure, gene expression profiles, miRNA annotation and target prediction. To our knowledge, pseudoMap is the first database to demonstrate two mechanisms of human TPGs: encoding siRNAs and decoying miRNAs that target the parental gene. pseudoMap is freely accessible at http://pseudomap.mbc.nctu.edu.tw/.
A Repository of Codes of Ethics and Technical Standards in Health Informatics
Zaïane, Osmar R.
2014-01-01
We present a searchable repository of codes of ethics and standards in health informatics. It is built using state-of-the-art search algorithms and technologies. The repository will be potentially beneficial for public health practitioners, researchers, and software developers in finding and comparing ethics topics of interest. Public health clinics, clinicians, and researchers can use the repository platform as a one-stop reference for various ethics codes and standards. In addition, the repository interface is built for easy navigation, fast search, and side-by-side comparative reading of documents. Our selection criteria for codes and standards are twofold: first, to maintain intellectual property rights, we index only codes and standards freely available on the internet. Second, major international, regional, and national health informatics bodies across the globe are surveyed with the aim of understanding the landscape in this domain. We also look at prevalent technical standards in health informatics from major bodies such as the International Standards Organization (ISO) and the U.S. Food and Drug Administration (FDA). Our repository contains codes of ethics from the International Medical Informatics Association (IMIA), the iHealth Coalition (iHC), the American Health Information Management Association (AHIMA), the Australasian College of Health Informatics (ACHI), the British Computer Society (BCS), and the UK Council for Health Informatics Professions (UKCHIP), with room for adding more in the future. Our major contribution is enhancing the findability of codes and standards related to health informatics ethics by compilation and unified access through the health informatics ethics repository. PMID:25422725
Mu, Chuang; Wang, Ruijia; Li, Tianqi; Li, Yuqiang; Tian, Meilin; Jiao, Wenqian; Huang, Xiaoting; Zhang, Lingling; Hu, Xiaoli; Wang, Shi; Bao, Zhenmin
2016-08-01
Long non-coding RNA (lncRNA) structurally resembles mRNA but cannot be translated into protein. Although the systematic identification and characterization of lncRNAs have been increasingly reported in model species, information concerning non-model species is still lacking. Here, we report the first systematic identification and characterization of lncRNAs in two sea cucumber species: (1) Apostichopus japonicus during lipopolysaccharide (LPS) challenge and in healthy tissues and (2) Holothuria glaberrima during radial organ complex regeneration, using RNA-seq datasets and bioinformatics analysis. We identified A. japonicus and H. glaberrima lncRNAs that were differentially expressed during LPS challenge and radial organ complex regeneration, respectively. Notably, the predicted lncRNA-microRNA-gene trinities revealed that, in addition to targeting protein-coding transcripts, miRNAs might also target lncRNAs, thereby participating in a potential novel layer of regulatory interactions among non-coding RNA classes in echinoderms. Furthermore, the constructed coding-non-coding network implied the potential involvement of lncRNA-gene interactions in the regulation of several important genes (e.g., Toll-like receptor 1 [TLR1] and transglutaminase-1 [TGM1]) in response to LPS challenge and radial organ complex regeneration in sea cucumbers. Overall, this pioneering systematic identification, annotation, and characterization of lncRNAs in echinoderms paves the way for similar studies and future genetic, genomic, and evolutionary research in non-model species.
Conserved mechanisms of vocalization coding in mammalian and songbird auditory midbrain.
Woolley, Sarah M N; Portfors, Christine V
2013-11-01
The ubiquity of social vocalizations among animals provides the opportunity to identify conserved mechanisms of auditory processing that subserve communication. Identifying auditory coding properties that are shared across vocal communicators will provide insight into how human auditory processing leads to speech perception. Here, we compare auditory response properties and neural coding of social vocalizations in auditory midbrain neurons of mammalian and avian vocal communicators. The auditory midbrain is a nexus of auditory processing because it receives and integrates information from multiple parallel pathways and provides the ascending auditory input to the thalamus. The auditory midbrain is also the first region in the ascending auditory system where neurons show complex tuning properties that are correlated with the acoustics of social vocalizations. Single unit studies in mice, bats and zebra finches reveal shared principles of auditory coding including tonotopy, excitatory and inhibitory interactions that shape responses to vocal signals, nonlinear response properties that are important for auditory coding of social vocalizations and modulation tuning. Additionally, single neuron responses in the mouse and songbird midbrain are reliable, selective for specific syllables, and rely on spike timing for neural discrimination of distinct vocalizations. We propose that future research on auditory coding of vocalizations in mouse and songbird midbrain neurons adopt similar experimental and analytical approaches so that conserved principles of vocalization coding may be distinguished from those that are specialized for each species. This article is part of a Special Issue entitled "Communication Sounds and the Brain: New Directions and Perspectives". Copyright © 2013 Elsevier B.V. All rights reserved.
Advanced technology development for image gathering, coding, and processing
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.
1990-01-01
Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.
NASA Astrophysics Data System (ADS)
Yang, Qianli; Pitkow, Xaq
2015-03-01
Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.
Comparison of Three Information Sources for Smoking Information in Electronic Health Records
Wang, Liwei; Ruan, Xiaoyang; Yang, Ping; Liu, Hongfang
2016-01-01
OBJECTIVE The primary aim was to compare independent and joint performance of retrieving smoking status through different sources, including narrative text processed by natural language processing (NLP), patient-provided information (PPI), and diagnosis codes (ie, International Classification of Diseases, Ninth Revision [ICD-9]). We also compared the performance of retrieving smoking strength information (ie, heavy/light smoker) from narrative text and PPI. MATERIALS AND METHODS Our study leveraged an existing lung cancer cohort for smoking status, amount, and strength information, which was manually chart-reviewed. On the NLP side, smoking-related electronic medical record (EMR) data were retrieved first. A pattern-based smoking information extraction module was then implemented to extract smoking-related information. After that, heuristic rules were used to obtain smoking status-related information. Smoking information was also obtained from structured data sources based on diagnosis codes and PPI. Sensitivity, specificity, and accuracy were measured using patients with coverage (ie, the proportion of patients whose smoking status/strength can be effectively determined). RESULTS NLP alone has the best overall performance for smoking status extraction (patient coverage: 0.88; sensitivity: 0.97; specificity: 0.70; accuracy: 0.88); combining PPI with NLP further improved patient coverage to 0.96. ICD-9 does not provide additional improvement to NLP and its combination with PPI. For smoking strength, combining NLP with PPI has slight improvement over NLP alone. CONCLUSION These findings suggest that narrative text could serve as a more reliable and comprehensive source for obtaining smoking-related information than structured data sources. PPI, the readily available structured data, could be used as a complementary source for more comprehensive patient coverage. PMID:27980387
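The reported metrics follow the standard confusion-matrix definitions, computed only over patients with coverage; a minimal sketch of the generic formulas (not the study's code, with hypothetical counts):

```python
def smoking_metrics(tp, fp, tn, fn):
    """Standard confusion-matrix metrics; counts are taken only over
    patients whose smoking status could be determined (the 'coverage')."""
    sensitivity = tp / (tp + fn)              # smokers correctly identified
    specificity = tn / (tn + fp)              # non-smokers correctly identified
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

print(smoking_metrics(tp=70, fp=9, tn=21, fn=2))  # hypothetical counts
```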
Hybrid concatenated codes and iterative decoding
NASA Technical Reports Server (NTRS)
Divsalar, Dariush (Inventor); Pollara, Fabrizio (Inventor)
2000-01-01
Several improved turbo code apparatuses and methods. The invention encompasses several classes: (1) A data source is applied to two or more encoders with an interleaver between the source and each of the second and subsequent encoders. Each encoder outputs a code element which may be transmitted or stored. A parallel decoder provides the ability to decode the code elements to derive the original source information d without use of a received data signal corresponding to d. The output may be coupled to a multilevel trellis-coded modulator (TCM). (2) A data source d is applied to two or more encoders with an interleaver between the source and each of the second and subsequent encoders. Each of the encoders outputs a code element. In addition, the original data source d is output from the encoder. All of the output elements are coupled to a TCM. (3) At least two data sources are applied to two or more encoders with an interleaver between each source and each of the second and subsequent encoders. The output may be coupled to a TCM. (4) At least two data sources are applied to two or more encoders with at least two interleavers between each source and each of the second and subsequent encoders. (5) At least one data source is applied to one or more serially linked encoders through at least one interleaver. The output may be coupled to a TCM. The invention includes a novel way of terminating a turbo coder.
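The class (1) structure can be sketched as follows, with a toy accumulator standing in for each recursive component encoder and a random interleaver between the source and the second encoder. This is an illustration of parallel concatenation under those assumptions, not the patented design; real turbo coders use stronger recursive systematic convolutional codes, trellis termination, and puncturing.

```python
import numpy as np

def accumulator(bits):
    """Toy recursive component encoder: p_t = p_{t-1} XOR u_t."""
    out, state = [], 0
    for b in bits:
        state ^= int(b)
        out.append(state)
    return np.array(out)

def turbo_encode(u, rng=np.random.default_rng(1)):
    """Parallel concatenation: systematic bits plus one parity stream per
    component encoder, the second encoder fed through an interleaver."""
    pi = rng.permutation(len(u))           # random interleaver
    p1 = accumulator(u)                    # first component encoder
    p2 = accumulator(u[pi])                # second encoder sees interleaved data
    return np.concatenate([u, p1, p2])     # rate ~1/3 stream of code elements

u = np.random.default_rng(0).integers(0, 2, 16)
codeword = turbo_encode(u)
```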
Feasibility of a computer-assisted feedback system between dispatch centre and ambulances.
Lindström, Veronica; Karlsten, Rolf; Falk, Ann-Charlotte; Castrèn, Maaret
2011-06-01
The aim of the study was to evaluate the feasibility of a newly developed computer-assisted feedback system between dispatch centre and ambulances in Stockholm, Sweden. A computer-assisted feedback system based on a Finnish model was designed to fit the Swedish emergency medical system. Feedback codes were identified and divided into three categories; assessment of patients' primary condition when ambulance arrives at scene, no transport by the ambulance and level of priority. Two ambulances and one emergency medical communication centre (EMCC) in Stockholm participated in the study. A sample of 530 feedback codes sent through the computer-assisted feedback system was reviewed. The information on the ambulance medical records was compared with the feedback codes used and 240 assignments were further analyzed. The used feedback codes sent from ambulance to EMCC were correct in 92% of the assignments. The most commonly used feedback code sent to the emergency medical dispatchers was 'agree with the dispatchers' assessment'. In addition, in 160 assignments there was a mismatch between emergency medical dispatchers and ambulance nurse assessments. Our results have shown a high agreement between medical dispatchers and ambulance nurse assessment. The feasibility of the feedback codes seems to be acceptable based on the small margin of error. The computer-assisted feedback system may, when used on a daily basis, make it possible for the medical dispatchers to receive feedback in a structural way. The EMCC organization can directly evaluate any changes in the assessment protocol by structured feedback sent from the ambulance.
Kawano, Tomonori
2013-03-01
There have been a wide variety of approaches for handling pieces of DNA as "unplugged" tools for digital information storage and processing, including a series of studies in security-related areas such as DNA-based digital barcodes, watermarks and cryptography. In the present article, novel designs of artificial genes are proposed as media for storing digitally compressed image data for bio-computing purposes, whereas natural genes principally encode proteins. Furthermore, the proposed system allows a cryptographic application of DNA through biochemically editable designs with capacity for steganographic embedment of numeric data. As a model case of the image-coding DNA technique, numerically and biochemically combined protocols are employed for ciphering given "passwords" and/or secret numbers using DNA sequences. The "passwords" of interest were decomposed into single letters and translated into font images coded on separate DNA chains, with coding regions in which the images are encoded based on a novel run-length encoding rule, and non-coding regions designed for the biochemical editing and remodeling processes that reveal the hidden orientation of the letters composing the original "passwords." The latter processes require molecular biological tools for digestion and ligation of the fragmented DNA molecules, targeting the polymerase chain reaction-engineered termini of the chains. Lastly, additional protocols for steganographic overwriting of numeric data of interest over the image-coding DNA are also discussed.
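Run-length encoding of a binary font-image row can be sketched generically as follows; the paper defines its own run-length rule and its mapping onto DNA bases, neither of which is reproduced here.

```python
def rle_encode(bits):
    """Encode a binary row as (value, run_length) pairs."""
    runs, prev, count = [], bits[0], 0
    for b in bits:
        if b == prev:
            count += 1
        else:
            runs.append((prev, count))
            prev, count = b, 1
    runs.append((prev, count))
    return runs

print(rle_encode("0011110010"))
# [('0', 2), ('1', 4), ('0', 2), ('1', 1), ('0', 1)]
```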
Abdulhameed, Hunida E; Hammami, Muhammad M; Mohamed, Elbushra A Hameed
2011-08-01
The consistency of codes governing disclosure of terminal illness to patients and families in Islamic countries has not previously been studied. To review available codes on disclosure of terminal illness in Islamic countries. DATA SOURCE AND EXTRACTION: Data were extracted through searches on Google and PubMed. Codes related to disclosure of terminal illness to patients or families were abstracted and then classified independently by the three authors. Codes for 14 Islamic countries were located. Five codes were silent regarding informing the patient, seven allowed concealment, one mandated disclosure and one prohibited disclosure. Five codes were silent regarding informing the family, four allowed disclosure and five mandated/recommended disclosure. The Islamic Organization for Medical Sciences code was silent on both issues. Codes regarding disclosure of terminal illness to patients and families differed markedly among Islamic countries. One-third of the codes were silent, and those that addressed disclosure tended to favour a paternalistic/utilitarian, family-centred approach over an autonomous, patient-centred approach.
48 CFR 204.7202-1 - CAGE codes.
Code of Federal Regulations, 2014 CFR
2014-10-01
... issued by DLA Logistics Information Service. (Their address is: Customer Service, Federal Center, 74... Logistics Information Service assigns or records and maintains CAGE codes to identify commercial and... Volume 7 of DoD 4100.39-M, Federal Logistics Information System (FLIS) Procedures Manual, prescribe use...
48 CFR 204.7202-1 - CAGE codes.
Code of Federal Regulations, 2011 CFR
2011-10-01
... issued by DLA Logistics Information Service. (Their address is: Customer Service, Federal Center, 74... Logistics Information Service assigns or records and maintains CAGE codes to identify commercial and... Volume 7 of DoD 4100.39-M, Federal Logistics Information System (FLIS) Procedures Manual, prescribe use...
48 CFR 204.7202-1 - CAGE codes.
Code of Federal Regulations, 2013 CFR
2013-10-01
... issued by DLA Logistics Information Service. (Their address is: Customer Service, Federal Center, 74... Logistics Information Service assigns or records and maintains CAGE codes to identify commercial and... Volume 7 of DoD 4100.39-M, Federal Logistics Information System (FLIS) Procedures Manual, prescribe use...
48 CFR 204.7202-1 - CAGE codes.
Code of Federal Regulations, 2012 CFR
2012-10-01
... issued by DLA Logistics Information Service. (Their address is: Customer Service, Federal Center, 74... Logistics Information Service assigns or records and maintains CAGE codes to identify commercial and... Volume 7 of DoD 4100.39-M, Federal Logistics Information System (FLIS) Procedures Manual, prescribe use...
CFD Code Development for Combustor Flows
NASA Technical Reports Server (NTRS)
Norris, Andrew
2003-01-01
During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation and code application. For model development, this has included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics and assumed PDF work. Many of these models were then implemented in the code, and in addition many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models and turbulence-chemistry interaction methodology. Validation of all new models and code improvements were also performed, while application of the code to the ZCET program and also the NPSS GEW combustor program were also performed. Several important items remain under development, including the NOx post processing, assumed PDF model development and chemical kinetic development. It is expected that this work will continue under the new grant.
ERIC Educational Resources Information Center
Adkins, Megan; Wajciechowski, Misti R.; Scantling, Ed
2013-01-01
Quick response codes, better known as QR codes, are small barcodes scanned to receive information about a specific topic. This article explains QR code technology and the utility of QR codes in the delivery of physical education instruction. Consideration is given to how QR codes can be used to accommodate learners of varying ability levels as…
Efficient Type Representation in TAL
NASA Technical Reports Server (NTRS)
Chen, Juan
2009-01-01
Certifying compilers generate proofs for low-level code that guarantee safety properties of the code. Type information is an essential part of safety proofs. But the size of type information remains a concern for certifying compilers in practice. This paper demonstrates type representation techniques in a large-scale compiler that achieves both concise type information and efficient type checking. In our 200,000-line certifying compiler, the size of type information is about 36% of the size of pure code and data for our benchmarks, which to the best of our knowledge is the best reported result. The type checking time is about 2% of the compilation time.
Error Control Coding Techniques for Space and Satellite Communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.; Takeshita, Oscar Y.; Cabral, Hermano A.
1998-01-01
It is well known that the BER performance of a parallel concatenated turbo-code improves roughly as 1/N, where N is the information block length. However, it has been observed by Benedetto and Montorsi that for most parallel concatenated turbo-codes, the FER performance does not improve monotonically with N. In this report, we study the FER of turbo-codes, and the effects of their concatenation with an outer code. Two methods of concatenation are investigated: across several frames and within each frame. Some asymmetric codes are shown to have excellent FER performance with an information block length of 16384. We also show that the proposed outer coding schemes can improve the BER performance as well by eliminating pathological frames generated by the iterative MAP decoding process.
Parzeller, Markus; Zedler, Barbara
2013-01-01
The article deals with the new regulations in the German Civil Code (BGB) which came into effect in Germany on 26 Feb 2013 as the Patient Rights Act (PatRG). In Part I, the legislative procedure, the treatment contract and the contracting parties (Section 630a Civil Code), the applicable regulations (Section 630b Civil Code) and the obligations to cooperate and inform (Section 630c Civil Code) are discussed and critically analysed.
Li, Xiang-Yao; Wang, Ning; Wang, Yong-Jie; Zuo, Zhen-Xing; Koga, Kohei; Luo, Fei
2014-01-01
Temporal properties of spike firing in the central nervous system (CNS) are critical for neuronal coding and the precision of information storage. Chronic pain has been reported to affect cognitive and emotional functions, in addition to triggering long-term plasticity in sensory synapses and behavioral sensitization. Less is known about possible changes in the temporal precision of cortical neurons under chronic pain conditions. In the present study, we investigated the temporal precision of action potential firing in the anterior cingulate cortex (ACC) by using both in vivo and in vitro electrophysiological approaches. We found that peripheral inflammation caused by complete Freund's adjuvant (CFA) increased the standard deviation (SD) of spike latency (also called jitter) in ∼51% of recorded neurons in the ACC of adult rats in vivo. Similar increases in jitter were found in ACC neurons using in vitro brain slices from adult mice with peripheral inflammation or nerve injury. Bath application of the glutamate receptor antagonists CNQX and AP5 abolished the enhancement of jitter induced by CFA injection or nerve injury, suggesting that the increased jitter depends on glutamatergic synaptic transmission. Activation of adenylyl cyclases (ACs) by bath application of forskolin increased jitter, whereas genetic deletion of AC1 abolished the change in jitter caused by CFA inflammation. Our study provides strong evidence for long-term changes in the temporal precision of information coding in cortical neurons after peripheral injuries, and suggests a neuronal mechanism for the cognitive and emotional impairment caused by chronic pain. PMID:25100600
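Jitter here is simply the standard deviation of spike latency across repeated trials; a minimal sketch with hypothetical latencies:

```python
import numpy as np

def spike_jitter(latencies_ms):
    """Jitter = sample standard deviation of spike latency across trials."""
    return float(np.std(latencies_ms, ddof=1))

control = [12.1, 12.3, 11.9, 12.2, 12.0]   # hypothetical pre-treatment latencies
cfa     = [11.0, 13.5, 12.8, 10.9, 14.1]   # hypothetical post-CFA latencies
print(spike_jitter(control), spike_jitter(cfa))  # jitter increases after CFA
```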
Phase synchronization motion and neural coding in dynamic transmission of neural information.
Wang, Rubin; Zhang, Zhikang; Qu, Jingyi; Cao, Jianting
2011-07-01
In order to explore the dynamic characteristics of neural coding in the transmission of neural information in the brain, a model of neural network consisting of three neuronal populations is proposed in this paper using the theory of stochastic phase dynamics. Based on the model established, the neural phase synchronization motion and neural coding under spontaneous activity and stimulation are examined, for the case of varying network structure. Our analysis shows that, under the condition of spontaneous activity, the characteristics of phase neural coding are unrelated to the number of neurons participated in neural firing within the neuronal populations. The result of numerical simulation supports the existence of sparse coding within the brain, and verifies the crucial importance of the magnitudes of the coupling coefficients in neural information processing as well as the completely different information processing capability of neural information transmission in both serial and parallel couplings. The result also testifies that under external stimulation, the bigger the number of neurons in a neuronal population, the more the stimulation influences the phase synchronization motion and neural coding evolution in other neuronal populations. We verify numerically the experimental result in neurobiology that the reduction of the coupling coefficient between neuronal populations implies the enhancement of lateral inhibition function in neural networks, with the enhancement equivalent to depressing neuronal excitability threshold. Thus, the neuronal populations tend to have a stronger reaction under the same stimulation, and more neurons get excited, leading to more neurons participating in neural coding and phase synchronization motion.
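The authors' three-population model is not reproduced here; as a generic illustration of stochastic phase dynamics and of the synchronization index such models track, here is a Kuramoto-type sketch under assumed parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
N, K, dt, steps = 100, 1.5, 0.01, 2000
omega = rng.normal(1.0, 0.1, N)            # intrinsic frequencies
theta = rng.uniform(0, 2 * np.pi, N)       # initial phases

for _ in range(steps):
    # Mean-field coupling plus weak noise (stochastic phase dynamics).
    coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
    theta += dt * (omega + K * coupling) + np.sqrt(dt) * 0.05 * rng.normal(size=N)

r = abs(np.exp(1j * theta).mean())         # order parameter: 1 = full synchrony
print(f"phase synchronization index r = {r:.2f}")
```

Larger coupling coefficients K drive the order parameter toward 1, which parallels the abstract's point that coupling strength governs phase synchronization and hence neural coding.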
An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Allison, E-mail: lewis.allison10@gmail.com; Smith, Ralph; Williams, Brian
For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.
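For a linear-Gaussian surrogate, the sequential selection step can be sketched as choosing the design condition whose evaluation maximizes the mutual information with the parameters; this is a minimal illustration under those assumptions (hypothetical feature map and prior), not the paper's implementation.

```python
import numpy as np

def next_design(candidates, Sigma, noise_var, features):
    """Greedy Bayesian design for y = features(x)^T theta + noise:
    pick x maximizing the information gain 0.5*log(1 + f' Sigma f / s^2)."""
    gains = [0.5 * np.log1p(features(x) @ Sigma @ features(x) / noise_var)
             for x in candidates]
    return candidates[int(np.argmax(gains))]

# Hypothetical quadratic feature map and prior parameter covariance:
features = lambda x: np.array([1.0, x, x**2])
Sigma = np.diag([1.0, 0.5, 0.1])
print(next_design(np.linspace(0, 2, 21), Sigma, noise_var=0.04, features=features))
```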
Tennant, David Robin; Bruyninckx, Chris
2018-03-01
Consumer exposure assessments for food additives are incomplete without information about the proportions of foods in each authorised category that contain the additive. Such information has been difficult to obtain, but the Mintel Global New Products Database (GNPD) provides information about product launches across Europe over the past 20 years. These data can be searched to identify products with specific additives listed on product labels, and the numbers compared with total product launches for food and drink categories in the same database to determine the frequency of occurrence. There are uncertainties associated with the data, but these can be managed by adopting a cautious and conservative approach. GNPD data can be mapped with authorised food categories and with food descriptions used in the EFSA Comprehensive European Food Consumption Surveys Database for exposure modelling. The data, when presented as percent occurrence, could be incorporated into the EFSA ANS Panel's 'brand-loyal/non-brand-loyal' exposure model in a quantitative way. Case studies of preservative, antioxidant, colour and sweetener additives showed that the impact of including occurrence data is greatest in the non-brand-loyal scenario. Recommendations for future research include identifying occurrence data for alcoholic beverages, linking regulatory food codes, FoodEx and GNPD product descriptions, developing the use of occurrence data for carry-over foods and improving understanding of brand loyalty in consumer exposure models.
The Impact of Causality on Information-Theoretic Source and Channel Coding Problems
ERIC Educational Resources Information Center
Palaiyanur, Harikrishna R.
2011-01-01
This thesis studies several problems in information theory where the notion of causality comes into play. Causality in information theory refers to the timing of when information is available to parties in a coding system. The first part of the thesis studies the error exponent (or reliability function) for several communication problems over…
Information theoretical assessment of image gathering and coding for digital restoration
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.; John, Sarah; Reichenbach, Stephen E.
1990-01-01
The process of image-gathering, coding, and restoration is presently treated in its entirety rather than as a catenation of isolated tasks, on the basis of the relationship between the spectral information density of a transmitted signal and the restorability of images from the signal. This 'information-theoretic' assessment accounts for the information density and efficiency of the acquired signal as a function of the image-gathering system's design and radiance-field statistics, as well as for the information efficiency and data compression that are obtainable through the combination of image gathering with coding to reduce signal redundancy. It is found that high information efficiency is achievable only through minimization of image-gathering degradation as well as signal redundancy.
Coding of vocalizations by single neurons in ventrolateral prefrontal cortex.
Plakke, Bethany; Diltz, Mark D; Romanski, Lizabeth M
2013-11-01
Neuronal activity in single prefrontal neurons has been correlated with behavioral responses, rules, task variables and stimulus features. In the non-human primate, neurons recorded in ventrolateral prefrontal cortex (VLPFC) have been found to respond to species-specific vocalizations. Previous studies have found multisensory neurons which respond to simultaneously presented faces and vocalizations in this region. Behavioral data suggests that face and vocal information are inextricably linked in animals and humans and therefore may also be tightly linked in the coding of communication calls in prefrontal neurons. In this study we therefore examined the role of VLPFC in encoding vocalization call type information. Specifically, we examined previously recorded single unit responses from the VLPFC in awake, behaving rhesus macaques in response to 3 types of species-specific vocalizations made by 3 individual callers. Analysis of responses by vocalization call type and caller identity showed that ∼19% of cells had a main effect of call type with fewer cells encoding caller. Classification performance of VLPFC neurons was ∼42% averaged across the population. When assessed at discrete time bins, classification performance reached 70 percent for coos in the first 300 ms and remained above chance for the duration of the response period, though performance was lower for other call types. In light of the sub-optimal classification performance of the majority of VLPFC neurons when only vocal information is present, and the recent evidence that most VLPFC neurons are multisensory, the potential enhancement of classification with the addition of accompanying face information is discussed and additional studies recommended. Behavioral and neuronal evidence has shown a considerable benefit in recognition and memory performance when faces and voices are presented simultaneously. In the natural environment both facial and vocalization information is present simultaneously and neural systems no doubt evolved to integrate multisensory stimuli during recognition. This article is part of a Special Issue entitled "Communication Sounds and the Brain: New Directions and Perspectives". Copyright © 2013 Elsevier B.V. All rights reserved.
Hanford Facility Dangerous Waste Permit Application for T Plant Complex
DOE Office of Scientific and Technical Information (OSTI.GOV)
BARNES, B.M.
2002-09-01
The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, operating treatment, storage, and/or disposal units, such as the T Plant Complex (this document, DOE/RL-95-36). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needs defined by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. A checklist indicating where information is contained in the T Plant Complex permit application documentation, in relation to the Washington State Department of Ecology guidance, is located in the Contents Section. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the T Plant Complex permit application documentation makes cross-reference to the General Information Portion, rather than duplicating text.
Zou, Ding; Djordjevic, Ivan B
2016-09-05
In this paper, we propose a rate-adaptive FEC scheme based on LDPC codes, together with its software-reconfigurable unified FPGA architecture. By FPGA emulation, we demonstrate that the proposed class of rate-adaptive LDPC codes based on shortening, with overheads from 25% to 42.9%, provides coding gains ranging from 13.08 dB to 14.28 dB at a post-FEC BER of 10^-15 for BPSK transmission. In addition, the proposed rate-adaptive LDPC coding has been demonstrated in combination with higher-order modulations, including QPSK, 8-QAM, 16-QAM, 32-QAM, and 64-QAM, which covers a wide range of signal-to-noise ratios. Furthermore, we apply unequal error protection by employing different LDPC codes on different bits in 16-QAM and 64-QAM, which results in an additional 0.5 dB gain compared to conventional LDPC-coded modulation with the same code rate.
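Shortening fixes the parity budget while reducing the number of information bits, so the overhead m/(k - s) grows with the number of shortened bits s; a worked sketch of that arithmetic with hypothetical mother-code dimensions (not the authors' construction):

```python
def shortened_params(k, m, s):
    """Shorten s info bits of an (n = k + m, k) mother code: the result is a
    (k - s + m, k - s) code; parity count m is unchanged, so the overhead
    m / (k - s) rises as s grows."""
    k_s = k - s
    return k_s / (k_s + m), m / k_s        # (code rate, overhead)

# Hypothetical dimensions: a 25%-overhead mother code pushed toward ~42.9%.
print(shortened_params(k=40000, m=10000, s=0))      # rate 0.80, overhead 0.25
print(shortened_params(k=40000, m=10000, s=16667))  # overhead ~0.429
```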
Wang, Yong-Mei; Tian, Xue-Tao; Zhang, Hui; Yang, Zhong-Rui; Yin, Xue-Bo
2018-06-21
Counterfeiting is a global epidemic that is compelling the development of new anticounterfeiting strategies. Herein, we report a novel multiple-anticounterfeiting encoding strategy of invisible fluorescent quick response (QR) codes with emission color as the information storage unit. The strategy requires red, green, and blue (RGB) light-emitting materials for different emission colors as the encrypting information, a single excitation wavelength for all of the emissions for practicability, and ultraviolet (UV) excitation for invisibility under daylight. Therefore, RGB light-emitting nanoscale metal-organic frameworks (NMOFs) are designed as inks to construct the colorful light-emitting boxes for information encryption, while three black vertex boxes are used for positioning. Full-color emissions are obtained by mixing the trichromatic NMOF inks with an inkjet printer. The encrypted information capacity is easily adjusted by the number of light-emitting boxes and the effectively unlimited number of emission colors. The information is decoded with specific excitation light at 275 nm, making the QR codes invisible under daylight. The composition of the inks, the invisibility, the inkjet printing, and the abundant encrypted information all contribute to multiple anticounterfeiting. The proposed QR code pattern holds great potential for advanced anticounterfeiting.
NASA Technical Reports Server (NTRS)
Gardner, Kevin D.; Liu, Jong-Shang; Murthy, Durbha V.; Kruse, Marlin J.; James, Darrell
1999-01-01
AlliedSignal Engines, in cooperation with NASA GRC (National Aeronautics and Space Administration Glenn Research Center), completed an evaluation of recently developed aeroelastic computer codes using test cases from the AlliedSignal Engines fan blisk and turbine databases. Test data included strain gage, performance, and steady-state pressure information obtained for conditions where synchronous or flutter vibratory conditions were found to occur. The aeroelastic codes evaluated included the quasi 3-D UNSFLO (MIT-developed/AE-modified quasi 3-D aeroelastic computer code), the 2-D FREPS (NASA-developed Forced Response Prediction System aeroelastic computer code), and the 3-D TURBO-AE (NASA/Mississippi State University-developed 3-D aeroelastic computer code). Unsteady pressure predictions for the turbine test case were used to evaluate the forced-response prediction capabilities of each of the three aeroelastic codes. Additionally, one of the fan flutter cases was evaluated using TURBO-AE. The UNSFLO and FREPS evaluation predictions showed good agreement with the experimental test data trends, but quantitative improvements are needed. UNSFLO over-predicted turbine blade response reductions, while FREPS under-predicted them. The inviscid TURBO-AE turbine analysis predicted no discernible blade response reduction, indicating the necessity of including viscous effects for this test case. For the TURBO-AE fan blisk test case, significant effort was expended in getting the viscous version of the code to give converged steady flow solutions for the transonic flow conditions. Once converged, the steady solutions provided an excellent match with test data and the calibrated DAWES (AlliedSignal 3-D viscous steady flow CFD solver). However, the effort expended in establishing quality steady-state solutions prevented exercising the unsteady portion of the TURBO-AE code during the present program. AlliedSignal recommends that unsteady pressure measurement data be obtained for both test cases examined, for use in aeroelastic code validation.
Dickinson, Dwight; Ramsey, Mary E; Gold, James M
2007-05-01
In focusing on potentially localizable cognitive impairments, the schizophrenia meta-analytic literature has overlooked the largest single impairment: on digit symbol coding tasks. To compare the magnitude of the schizophrenia impairment on coding tasks with impairments on other traditional neuropsychological instruments. MEDLINE and PsycINFO electronic databases and reference lists from identified articles. English-language studies from 1990 to present, comparing performance of patients with schizophrenia and healthy controls on coding tasks and cognitive measures representing at least 2 other cognitive domains. Of 182 studies identified, 40 met all criteria for inclusion in the meta-analysis. Means, standard deviations, and sample sizes were extracted for digit symbol coding and 36 other cognitive variables. In addition, we recorded potential clinical moderator variables, including chronicity/severity, medication status, age, and education, and potential study design moderators, including coding task variant, matching, and study publication date. Main analyses synthesized data from 37 studies comprising 1961 patients with schizophrenia and 1444 comparison subjects. Combination of mean effect sizes across studies by means of a random effects model yielded a weighted mean effect for digit symbol coding of g = -1.57 (95% confidence interval, -1.66 to -1.48). This effect compared with a grand mean effect of g = -0.98 and was significantly larger than effects for widely used measures of episodic memory, executive functioning, and working memory. Moderator variable analyses indicated that clinical and study design differences between studies had little effect on the coding task effect. Comparison with previous meta-analyses suggested that current results were representative of the broader literature. Subsidiary analysis of data from relatives of patients with schizophrenia also suggested prominent coding task impairments in this group. The 5-minute digit symbol coding task, reliable and easy to administer, taps an information processing inefficiency that is a central feature of the cognitive deficit in schizophrenia and deserves systematic investigation.
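For readers unfamiliar with the synthesis step, the pooled estimate quoted above comes from weighting each study by its precision. A hedged sketch in standard notation (the abstract does not state which between-study variance estimator was used; DerSimonian-Laird is the usual default):

```latex
\bar{g} = \frac{\sum_{i=1}^{k} w_i \, g_i}{\sum_{i=1}^{k} w_i},
\qquad
w_i = \frac{1}{v_i + \hat{\tau}^2},
\qquad
\mathrm{CI}_{95\%} = \bar{g} \pm \frac{1.96}{\sqrt{\sum_{i=1}^{k} w_i}}
```

Here g_i and v_i are study i's effect size and within-study variance, τ̂² is the estimated between-study variance, and k is the number of studies; the reported interval (-1.66, -1.48) is symmetric about g = -1.57, consistent with this form.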
Convergence Acceleration and Documentation of CFD Codes for Turbomachinery Applications
NASA Technical Reports Server (NTRS)
Marquart, Jed E.
2005-01-01
The development and analysis of turbomachinery components for industrial and aerospace applications has been greatly enhanced in recent years through the advent of computational fluid dynamics (CFD) codes and techniques. Although the use of this technology has greatly reduced the time required to perform analysis and design, there still remains much room for improvement in the process. In particular, there is a steep learning curve associated with most turbomachinery CFD codes, and the computation times need to be reduced in order to facilitate their integration into standard work processes. Two turbomachinery codes have recently been developed by Dr. Daniel Dorney (MSFC) and Dr. Douglas Sondak (Boston University). These codes are entitled Aardvark (for 2-D and quasi 3-D simulations) and Phantom (for 3-D simulations). The codes utilize the General Equation Set (GES), structured grid methodology, and overset O- and H-grids. The codes have been used with success by Drs. Dorney and Sondak, as well as others within the turbomachinery community, to analyze engine components and other geometries. One of the primary objectives of this study was to establish a set of parametric input values which will enhance convergence rates for steady-state simulations, as well as reduce the runtime required for unsteady cases. The goal is to reduce the turnaround time for CFD simulations, thus permitting more design parametrics to be run within a given time period. In addition, other code enhancements to reduce runtimes were investigated and implemented. The other primary goal of the study was to develop enhanced user's manuals for Aardvark and Phantom. These manuals are intended to answer most questions for new users, as well as provide valuable detailed information for the experienced user. The existence of detailed user's manuals will enable new users to become proficient with the codes, as well as reduce the dependency of new users on the code authors. In order to achieve the objectives listed, the following tasks were accomplished: 1) Parametric Study Of Preconditioning Parameters And Other Code Inputs; 2) Code Modifications To Reduce Runtimes; 3) Investigation Of Compiler Options To Reduce Code Runtime; and 4) Development/Enhancement of User's Manuals for Aardvark and Phantom.
Hide and Seek: Exploiting and Hardening Leakage-Resilient Code Randomization
2016-05-30
Rudd, Robert (MIT Lincoln Laboratory); Hobson, Thomas (MIT Lincoln Laboratory); … Irvine; Sadeghi, Ahmad-Reza (TU Darmstadt); Okhravi, Hamed (MIT Lincoln Laboratory). Abstract: Information leakage vulnerabilities can allow adversaries to … bypass mitigations based on code randomization. This discovery motivates numerous techniques that diminish direct and indirect information leakage: (i…
ERIC Educational Resources Information Center
Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela
2015-01-01
Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…
Sum of the Magnitude for Hard Decision Decoding Algorithm Based on Loop Update Detection
Meng, Jiahui; Zhao, Danfeng; Tian, Hai; Zhang, Liang
2018-01-01
In order to improve the performance of hard-decision decoding for non-binary low-density parity-check (LDPC) codes and to reduce decoding complexity, a sum-of-magnitude hard-decision decoding algorithm based on loop update detection is proposed; such decoders support the reliability, stability and high transmission rates required by 5G mobile communication. The algorithm builds on the hard-decision decoding algorithm (HDA) and uses the soft information from the channel to calculate reliability, while the sum of the variable nodes' (VN) magnitudes is excluded from the computation of parity-check reliability. At the same time, the reliability information of the variable node is considered and a loop update detection algorithm is introduced. Bits of the erroneous code word are flipped repeatedly, searched in order of decreasing error probability, until the correct code word is found. Simulation results show that one of the improved schemes outperforms the weighted symbol flipping (WSF) algorithm by about 2.2 dB and 2.35 dB, for two different hexadecimal code configurations, at a bit error rate (BER) of 10^-5 over an additive white Gaussian noise (AWGN) channel. Furthermore, the average number of decoding iterations is significantly reduced. PMID:29342963
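For intuition about how flipping decoders of this family work, the sketch below implements the classic binary Gallager-style bit-flipping loop: compute the syndrome, count failed checks per bit, flip the most suspect bits, repeat. This is a simplified stand-in, not the paper's non-binary sum-of-magnitude algorithm; the parity-check matrix and received word are toy values.

```python
import numpy as np

def bit_flip_decode(H: np.ndarray, y: np.ndarray, max_iter: int = 50) -> np.ndarray:
    """Binary hard-decision bit-flipping decoding.
    H: (m, n) parity-check matrix over GF(2); y: hard-decision received word."""
    x = y.copy()
    for _ in range(max_iter):
        syndrome = H @ x % 2                 # which parity checks fail
        if not syndrome.any():
            break                            # valid code word found
        # count, per bit, how many failed checks it participates in
        fails = H.T @ syndrome
        x[fails == fails.max()] ^= 1         # flip the most suspect bit(s)
    return x

# toy (7,4) Hamming-style example
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
y = np.array([1, 0, 1, 0, 0, 1, 0])  # code word (1,0,1,1,0,1,0) with bit 4 flipped
print(bit_flip_decode(H, y))         # -> [1 0 1 1 0 1 0]
```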
APPRIS 2017: principal isoforms for multiple gene sets
Rodriguez-Rivas, Juan; Di Domenico, Tomás; Vázquez, Jesús; Valencia, Alfonso
2018-01-01
Abstract: The APPRIS database (http://appris-tools.org) uses protein structural and functional features and information from cross-species conservation to annotate splice isoforms in protein-coding genes. APPRIS selects a single protein isoform, the 'principal' isoform, as the reference for each gene based on these annotations. A single main splice isoform reflects the biological reality for most protein-coding genes, and APPRIS principal isoforms are the best predictors of these main protein isoforms. Here, we present the updates to the database and new developments, including the addition of three new species (chimpanzee, Drosophila melanogaster and Caenorhabditis elegans), the expansion of APPRIS to cover the RefSeq gene set and the UniProtKB proteome for six species, and refinements in the core methods that make up the annotation pipeline. In addition, APPRIS now provides a measure of reliability for individual principal isoforms and is updated with each release of the GENCODE/Ensembl and RefSeq reference sets. The individual GENCODE/Ensembl, RefSeq and UniProtKB reference gene sets for six organisms have been merged to produce common sets of splice variants. PMID:29069475
Barriers to success: physical separation optimizes event-file retrieval in shared workspaces.
Klempova, Bibiana; Liepelt, Roman
2017-07-08
Sharing tasks with other persons can simplify our work and life, but seeing and hearing other people's actions may also be very distracting. The joint Simon effect (JSE) is a standard measure of referential response coding when two persons share a Simon task. Sequential modulations of the joint Simon effect (smJSE) are interpreted as a measure of event-file processing, where an event file contains stimulus information, response information and information about the currently relevant control state in a given social situation. This study tested the effects of physical (Experiment 1) and virtual (Experiment 2) separation of shared workspaces on referential coding and event-file processing using a joint Simon task. In Experiment 1, participants performed this task in individual (go-nogo), joint and standard Simon task conditions with and without a transparent curtain (physical separation) placed along the imagined vertical midline of the monitor. In Experiment 2, participants performed the same tasks with and without background music (virtual separation). For response times, physical separation enhanced event-file retrieval, indicated by a larger smJSE in the joint Simon task with the curtain than without it (Experiment 1), but did not change referential response coding. In line with this, we also found evidence in the error rates for enhanced event-file processing through physical separation in the joint Simon task. Virtual separation affected neither event-file processing nor referential coding, but generally slowed response times in the joint Simon task. For errors, virtual separation hampered event-file processing in the joint Simon task. For the cognitively more demanding standard two-choice Simon task, we found music to have a degrading effect on event-file retrieval for response times. Our findings suggest that adding a physical separation optimizes event-file processing in shared workspaces, while music seems to induce a more relaxed task processing mode under shared task conditions. In addition, music had an interfering impact on joint error processing and, more generally, when dealing with a more complex task in isolation.
A Multi-Label Classification Approach for Coding Cancer Information Service Chat Transcripts
Rios, Anthony; Vanderpool, Robin; Shaw, Pam
2017-01-01
National Cancer Institute's (NCI) Cancer Information Service (CIS) offers an online instant-messaging-based information service called LiveHelp to patients, family members, friends, and other cancer information consumers. A cancer information specialist (IS) 'chats' with a consumer and provides information on a variety of topics, including clinical trials. After a LiveHelp chat session is finished, the IS codes about 20 different elements of metadata about the session in electronic contact record forms (ECRF), which are later used for quality control and reporting. Besides straightforward elements like age and gender, more specific elements to be coded include the purpose of contact, the subjects of interaction, and the different responses provided to the consumer, the latter two often taking on multiple values. As such, ECRF coding is a time-consuming task, and automating this process could help ISs focus more on their primary goal of helping consumers with valuable cancer-related information. As a first attempt at this task, we explored multi-label and multi-class text classification approaches to code the purpose, subjects of interaction, and responses provided, based on the chat transcripts. With a sample dataset of 673 transcripts, we achieved example-based F-scores of 0.67 (for subjects) and 0.58 (for responses). We also achieved label-based micro F-scores of 0.65 (for subjects), 0.62 (for responses), and 0.61 (for purpose). To our knowledge this is the first attempt at automatic coding of LiveHelp transcripts, and our initial results on this small corpus indicate promising future directions for the task. PMID:28736775
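The task described is standard multi-label text classification: each transcript may carry several subject and response codes simultaneously. A minimal sketch of one common baseline (binary-relevance logistic regression over TF-IDF features, evaluated with the label-based micro F-score the paper reports; the transcripts, labels, and settings below are placeholders, not CIS data):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.metrics import f1_score

# placeholder transcripts and multi-label annotations
texts = ["patient asks about clinical trials", "caller requests screening info"]
labels = [["trials"], ["screening", "referral"]]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(labels)                 # one binary column per label
X = TfidfVectorizer().fit_transform(texts)

# binary relevance: one logistic-regression classifier per label
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
pred = clf.predict(X)
print(f1_score(Y, pred, average="micro"))     # label-based micro F-score
```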
Automating annotation of information-giving for analysis of clinical conversation.
Mayfield, Elijah; Laws, M Barton; Wilson, Ira B; Penstein Rosé, Carolyn
2014-02-01
Coding of clinical communication for fine-grained features such as speech acts has produced a substantial literature. However, annotation by humans is laborious and expensive, limiting the application of these methods. We aimed to show that, through machine learning, computers could code certain categories of speech acts with sufficient reliability to make useful distinctions among clinical encounters. The data were transcripts of 415 routine outpatient visits of HIV patients which had previously been coded for speech acts using the Generalized Medical Interaction Analysis System (GMIAS); 50 had also been coded for larger-scale features using the Comprehensive Analysis of the Structure of Encounters System (CASES). We aggregated selected speech acts into information-giving and information-requesting, then trained the machine to annotate automatically using logistic regression classification. We evaluated reliability by per-speech-act accuracy. We used multiple regression to predict patient reports of communication quality from post-visit surveys, using the patient and provider information-giving to information-requesting ratio (briefly, the information-giving ratio) and patient gender. Automated coding shows moderate agreement with human coding (accuracy 71.2%, κ=0.57), with high correlation between machine and human estimates of the information-giving ratio (r=0.96). The regression significantly predicted four of five patient-reported measures of communication quality (r=0.263-0.344). The information-giving ratio is a useful and intuitive measure for predicting patient perception of provider-patient communication quality. These predictions can be made with automated annotation, which is a practical option for studying large collections of clinical encounters with objectivity, consistency, and low cost, providing greater opportunity for training and reflection for care providers.
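Both quantities that drive this study, chance-corrected agreement and the information-giving ratio, are simple to compute. A minimal sketch with hypothetical speech-act labels (sklearn's cohen_kappa_score implements the κ statistic reported above):

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score

human   = ["give", "request", "give", "give", "other"]    # hypothetical codes
machine = ["give", "request", "give", "request", "other"]

print(accuracy_score(human, machine))      # raw per-speech-act accuracy
print(cohen_kappa_score(human, machine))   # chance-corrected agreement (kappa)

# information-giving ratio for one visit: giving acts / requesting acts
giving = machine.count("give")
requesting = machine.count("request")
print(giving / requesting)
```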
Critical Care Coding for Neurologists.
Nuwer, Marc R; Vespa, Paul M
2015-10-01
Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.
Coding of Neuroinfectious Diseases.
Barkley, Gregory L
2015-12-01
Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.
Diagnostic Coding for Epilepsy.
Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R
2016-02-01
Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-15
... manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This listing is not intended to be... Environmental protection, Agricultural commodities, Feed additives, Food additives, Pesticides and pests...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-12
...). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS... Environmental protection, Agricultural commodities, Feed additives, Food additives, Pesticides and pests...
The impact of rare variation on gene expression across tissues.
Li, Xin; Kim, Yungil; Tsang, Emily K; Davis, Joe R; Damani, Farhan N; Chiang, Colby; Hess, Gaelen T; Zappala, Zachary; Strober, Benjamin J; Scott, Alexandra J; Li, Amy; Ganna, Andrea; Bassik, Michael C; Merker, Jason D; Hall, Ira M; Battle, Alexis; Montgomery, Stephen B
2017-10-11
Rare genetic variants are abundant in humans and are expected to contribute to individual disease risk. While genetic association studies have successfully identified common genetic variants associated with susceptibility, these studies are not practical for identifying rare variants. Efforts to distinguish pathogenic variants from benign rare variants have leveraged the genetic code to identify deleterious protein-coding alleles, but no analogous code exists for non-coding variants. Therefore, ascertaining which rare variants have phenotypic effects remains a major challenge. Rare non-coding variants have been associated with extreme gene expression in studies using single tissues, but their effects across tissues are unknown. Here we identify gene expression outliers, or individuals showing extreme expression levels for a particular gene, across 44 human tissues by using combined analyses of whole genomes and multi-tissue RNA-sequencing data from the Genotype-Tissue Expression (GTEx) project v6p release. We find that 58% of underexpression and 28% of overexpression outliers have nearby conserved rare variants compared to 8% of non-outliers. Additionally, we developed RIVER (RNA-informed variant effect on regulation), a Bayesian statistical model that incorporates expression data to predict a regulatory effect for rare variants with higher accuracy than models using genomic annotations alone. Overall, we demonstrate that rare variants contribute to large gene expression changes across tissues and provide an integrative method for interpretation of rare variants in individual genomes.
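The notion of a multi-tissue expression outlier can be captured in a few lines: z-score each tissue across individuals, then aggregate per individual. The sketch below illustrates the idea on simulated data; it is not the GTEx pipeline or the RIVER model, and the |median z| > 2 cutoff is an arbitrary choice for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
# expression of one gene: rows = individuals, cols = tissues (simulated)
expr = rng.normal(size=(200, 44))
expr[7] += 4.0                          # plant one multi-tissue outlier

z = (expr - expr.mean(axis=0)) / expr.std(axis=0)   # z-score per tissue
median_z = np.median(z, axis=1)                     # aggregate across tissues

outliers = np.where(np.abs(median_z) > 2)[0]
print(outliers)                                     # -> [7]
```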
MODFLOW 2.0: A program for predicting moderator flow patterns
NASA Astrophysics Data System (ADS)
Peterson, P. F.; Paik, I. K.
1991-07-01
Sudden changes in the temperature of flowing liquids can result in transient buoyancy forces which strongly impact the flow hydrodynamics via flow stratification. These effects have been studied for the case of potential flow of stratified liquids to line sinks, but not for moderator flow in SRS reactors. Standard codes, such as TRAC and COMMIX, do not have the capability to capture the stratification effect, due to strong numerical diffusion which smears away the hot/cold fluid interface. A related problem with standard codes is the inability to track plumes injected into the liquid flow, again due to numerical diffusion. The combined effects of buoyant stratification and plume dispersion have been identified as important in the operation of the Supplementary Safety System, which injects neutron-poison ink into SRS reactors to provide safe shutdown in the event of safety rod failure. The MODFLOW code discussed here provides transient moderator flow pattern information with stratification effects, and tracks the location of ink plumes in the reactor. The code, written in Fortran, is compiled for Macintosh II computers, and includes subroutines for interactive control and graphical output. With the graphics capabilities removed, the code can also be compiled on other computers. With graphics, in addition to the capability to perform safety-related computations, MODFLOW also provides an easy tool for becoming familiar with flow distributions in SRS reactors.
Death of a dogma: eukaryotic mRNAs can code for more than one protein
Mouilleron, Hélène; Delcourt, Vivian; Roucou, Xavier
2016-01-01
mRNAs carry the genetic information that is translated by ribosomes. The traditional view of a mature eukaryotic mRNA is a molecule with three main regions, the 5′ UTR, the protein coding open reading frame (ORF) or coding sequence (CDS), and the 3′ UTR. This concept assumes that ribosomes translate one ORF only, generally the longest one, and produce one protein. As a result, in the early days of genomics and bioinformatics, one CDS was associated with each protein-coding gene. This fundamental concept of a single CDS is being challenged by increasing experimental evidence indicating that annotated proteins are not the only proteins translated from mRNAs. In particular, mass spectrometry (MS)-based proteomics and ribosome profiling have detected productive translation of alternative open reading frames. In several cases, the alternative and annotated proteins interact. Thus, the expression of two or more proteins translated from the same mRNA may offer a mechanism to ensure the co-expression of proteins which have functional interactions. Translational mechanisms already described in eukaryotic cells indicate that the cellular machinery is able to translate different CDSs from a single viral or cellular mRNA. In addition to summarizing data showing that the protein coding potential of eukaryotic mRNAs has been underestimated, this review aims to challenge the single translated CDS dogma. PMID:26578573
Bright, T.J.
2013-01-01
Summary. Background: Many informatics studies use content analysis to generate functional requirements for system development. Explication of this translational process from qualitative data to functional requirements can strengthen the understanding and scientific rigor when applying content analysis in informatics studies. Objective: To describe a user-centered approach transforming emergent themes derived from focus group data into functional requirements for informatics solutions, and to illustrate these methods through the development of an antibiotic clinical decision support system (CDS). Methods: The approach consisted of five steps: 1) identify unmet therapeutic planning information needs via Focus Group Study-I, 2) develop a coding framework of therapeutic planning themes to refine the domain scope to antibiotic therapeutic planning, 3) identify functional requirements of an antibiotic CDS system via Focus Group Study-II, 4) discover informatics solutions and functional requirements from coded data, and 5) determine the types of information needed to support the antibiotic CDS system and link them with the identified informatics solutions and functional requirements. Results: The coding framework for Focus Group Study-I revealed unmet therapeutic planning needs. Twelve subthemes emerged and were clustered into four themes; analysis indicated a need for an antibiotic CDS intervention. Focus Group Study-II included five types of information needs. Comments from the Barrier/Challenge to information access and Function/Feature themes produced three informatics solutions and 13 functional requirements of an antibiotic CDS system. Comments from the Patient, Institution, and Domain themes generated required data elements for each informatics solution. Conclusion: This study presents one example explicating content analysis of focus group data and the analysis process to functional requirements from narrative data. Illustration of this 5-step method was used to develop an antibiotic CDS system, resolving unmet antibiotic prescribing needs. As a reusable approach, these techniques can be refined and applied to resolve unmet information needs with informatics interventions in additional domains. PMID:24454586
Carroll, John A; Smith, Helen E; Scott, Donia; Cassell, Jackie A
2016-01-01
Background: Electronic medical records (EMRs) are revolutionizing health-related research. One key issue for study quality is the accurate identification of patients with the condition of interest. Information in EMRs can be entered as structured codes or unstructured free text. The majority of research studies have used only the coded parts of EMRs for case detection, which may bias findings, miss cases, and reduce study quality. This review examines whether incorporating information from text into case-detection algorithms can improve research quality. Methods: A systematic search returned 9659 papers, 67 of which reported on the extraction of information from the free text of EMRs with the stated purpose of detecting cases of a named clinical condition. Methods for extracting information from text and the technical accuracy of case-detection algorithms were reviewed. Results: Studies mainly used US hospital-based EMRs, and extracted information from text for 41 conditions using keyword searches, rule-based algorithms, and machine learning methods. There was no clear difference in case-detection algorithm accuracy between rule-based and machine learning methods of extraction. Inclusion of information from text resulted in a significant improvement in algorithm sensitivity and area under the receiver operating characteristic curve in comparison to codes alone (median sensitivity 78% (codes + text) vs 62% (codes), P = .03; median area under the curve 95% (codes + text) vs 88% (codes), P = .025). Conclusions: Text in EMRs is accessible, especially with open-source information extraction algorithms, and significantly improves case detection when combined with codes. More harmonization of reporting within EMR studies is needed, particularly standardized reporting of algorithm accuracy metrics like positive predictive value (precision) and sensitivity (recall). PMID:26911811
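The 'codes + text' versus 'codes alone' contrast reduces to how the case-detection predicate is built. A minimal sketch of a keyword-augmented detector (diabetes chosen as the example condition; the ICD-9 prefix rule, the regex, and the records are illustrative, not drawn from the review):

```python
import re

def is_case(record: dict) -> bool:
    """Flag a patient as a diabetes case from codes and/or free text."""
    coded = any(c.startswith("250") for c in record["icd9_codes"])
    text = bool(re.search(r"\bdiabet\w*\b", record["notes"], re.IGNORECASE))
    return coded or text    # the text rule rescues cases the codes miss

records = [
    {"icd9_codes": ["250.00"], "notes": "stable on metformin"},
    {"icd9_codes": [], "notes": "long history of type 2 diabetes"},
    {"icd9_codes": [], "notes": "no chronic conditions"},
]
print([is_case(r) for r in records])   # [True, True, False]
```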
Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.
Mørk, Søren; Holmes, Ian
2012-03-01
Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate hidden Markov model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders and two novel model structures. We test standard versions as well as ADPH length-modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two currently most used model structures is best performing in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.
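The benchmarked structures are variations on one core object: an HMM over coding/non-coding states scored against a nucleotide sequence. A minimal Viterbi sketch of such a two-state model, written in Python rather than PRISM for self-containedness (all transition and emission probabilities are invented for illustration):

```python
import numpy as np

states = ["noncoding", "coding"]
trans = np.log([[0.9, 0.1],      # hypothetical transition probabilities
                [0.2, 0.8]])
emit = {"noncoding": {"A": .25, "C": .25, "G": .25, "T": .25},
        "coding":    {"A": .15, "C": .35, "G": .35, "T": .15}}

def viterbi(seq: str) -> list:
    """Most likely state path for a nucleotide sequence."""
    n, k = len(seq), len(states)
    dp = np.full((n, k), -np.inf)          # log-probability table
    back = np.zeros((n, k), dtype=int)     # backpointers
    for j, s in enumerate(states):
        dp[0, j] = np.log(0.5) + np.log(emit[s][seq[0]])
    for i in range(1, n):
        for j, s in enumerate(states):
            scores = dp[i - 1] + trans[:, j]
            back[i, j] = scores.argmax()
            dp[i, j] = scores.max() + np.log(emit[s][seq[i]])
    path = [dp[-1].argmax()]
    for i in range(n - 1, 0, -1):          # trace back the best path
        path.append(back[i, path[-1]])
    return [states[j] for j in reversed(path)]

print(viterbi("ATATGCGCGCGCATAT"))   # GC-rich middle should come out 'coding'
```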
Illusory conjunctions in simultanagnosia: coarse coding of visual feature location?
McCrea, Simon M; Buxbaum, Laurel J; Coslett, H Branch
2006-01-01
Simultanagnosia is a disorder characterized by an inability to see more than one object at a time. We report a simultanagnosic patient (ED) with bilateral posterior infarctions who produced frequent illusory conjunctions on tasks involving form and surface features (e.g., a red T) and form alone. ED also produced "blend" errors in which features of one familiar perceptual unit appeared to migrate to another familiar perceptual unit (e.g., "RO" read as "PQ"). ED often misread scrambled letter strings as a familiar word (e.g., "hmoe" read as "home"). Finally, ED's success in reporting two letters in an array was inversely related to the distance between the letters. These findings are consistent with the hypothesis that ED's illusory conjunctions reflect coarse coding of visual feature location that is ameliorated in part by top-down information from object and word recognition systems; the findings are also consistent, however, with Treisman's Feature Integration Theory. Moreover, the data provide further support for the claim that the dorsal parieto-occipital cortex is implicated in the binding of visual feature information.
Laser electro-optic system for rapid three-dimensional /3-D/ topographic mapping of surfaces
NASA Technical Reports Server (NTRS)
Altschuler, M. D.; Altschuler, B. R.; Taboada, J.
1981-01-01
It is pointed out that the generic utility of a robot in a factory/assembly environment could be substantially enhanced by providing a vision capability to the robot. A standard videocamera for robot vision provides a two-dimensional image which contains insufficient information for a detailed three-dimensional reconstruction of an object. Approaches which supply the additional information needed for the three-dimensional mapping of objects with complex surface shapes are briefly considered and a description is presented of a laser-based system which can provide three-dimensional vision to a robot. The system consists of a laser beam array generator, an optical image recorder, and software for controlling the required operations. The projection of a laser beam array onto a surface produces a dot pattern image which is viewed from one or more suitable perspectives. Attention is given to the mathematical method employed, the space coding technique, the approaches used for obtaining the transformation parameters, the optics for laser beam array generation, the hardware for beam array coding, and aspects of image acquisition.
Design of 9.271-pressure-ratio 5-stage core compressor and overall performance for first 3 stages
NASA Technical Reports Server (NTRS)
Steinke, Ronald J.
1986-01-01
Overall aerodynamic design information is given for all five stages of an axial-flow core compressor (74A) having a 9.271 pressure ratio and a 29.710 kg/sec flow. For the inlet stage group (first three stages), detailed blade element design information and experimental overall performance are given. At the rotor 1 inlet, the tip speed was 430.291 m/sec and the hub-to-tip radius ratio was 0.488. A low number of blades per row was achieved by the use of low-aspect-ratio blading of moderate solidity. The high-reaction stages have about equal energy addition. The radial energy addition was varied to give constant total pressure at the rotor exit. The blade element profile and shock losses and the incidence and deviation angles were based on relevant experimental data. Blade shapes are mostly double circular arc. Analysis by a three-dimensional Euler code verified the experimentally measured high flow at design speed and IGV-stator setting angles. An optimization code gave an optimal IGV-stator reset schedule for higher measured efficiency at all speeds.
Portelli, Geoffrey; Barrett, John M; Hilgen, Gerrit; Masquelier, Timothée; Maccione, Alessandro; Di Marco, Stefano; Berdondini, Luca; Kornprobst, Pierre; Sernagor, Evelyne
2016-01-01
How a population of retinal ganglion cells (RGCs) encodes the visual scene remains an open question. Going beyond individual RGC coding strategies, results in salamander suggest that the relative latencies of an RGC pair encode spatial information. Thus, a population code based on this concerted spiking could be a powerful mechanism to transmit visual information rapidly and efficiently. Here, we tested this hypothesis in mouse by recording simultaneous light-evoked responses from hundreds of RGCs, at pan-retinal level, using a new generation of large-scale, high-density multielectrode array consisting of 4096 electrodes. Interestingly, we did not find any RGCs exhibiting a clear latency tuning to the stimuli, suggesting that in mouse, individual RGC pairs may not provide sufficient information. We show that a significant amount of information is encoded synergistically in the concerted spiking of large RGC populations. Thus, the RGC population response described with relative activities, or ranks, provides more relevant information than classical independent spike-count- or latency-based codes. In particular, we report for the first time that when considering the relative activities across the whole population, the wave of first stimulus-evoked spikes is an accurate indicator of stimulus content. We show that this coding strategy coexists with classical neural codes, and that it is more efficient and faster. Overall, these novel observations suggest that already at the level of the retina, concerted spiking provides a reliable and fast strategy to rapidly transmit new visual scenes.
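The contrast the authors draw, spike-count or latency codes versus relative-activity (rank) codes, can be made concrete with a toy decoder. A minimal sketch on simulated first-spike latencies (population size, latencies, and noise level are invented; real analyses operate on hundreds to thousands of RGCs):

```python
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(1)
# prototype first-spike latencies (ms) of 6 RGCs for two stimuli
proto = {"A": np.array([12, 25, 18, 40, 33, 22]),
         "B": np.array([30, 14, 26, 19, 41, 35])}

def rank_code(latencies: np.ndarray) -> np.ndarray:
    """Relative-activity code: who fires first, second, ... in the population."""
    return rankdata(latencies)

trial = proto["A"] + rng.normal(0, 3, size=6)   # noisy repeat of stimulus A
# decode by matching rank patterns (Spearman-style similarity)
sims = {s: np.corrcoef(rank_code(trial), rank_code(p))[0, 1]
        for s, p in proto.items()}
print(max(sims, key=sims.get))                  # -> 'A' (with this seed)
```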
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2012 CFR
2012-10-01
.... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2013 CFR
2013-10-01
.... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...
Robertson, Ann R R; Fernando, Bernard; Morrison, Zoe; Kalra, Dipak; Sheikh, Aziz
2015-03-27
Globally, diabetes mellitus presents a substantial and increasing burden to individuals, health care systems and society. Structuring and coding of information in the electronic health record underpin attempts to improve sharing and searching for information. Digital records for those with long-term conditions are expected to bring direct benefits and secondary-use benefits, and potentially to support patient self-management. We sought to investigate if, how and why records for adults with diabetes were structured and coded, and to explore a range of UK stakeholders' perceptions of current practice in the National Health Service. We carried out a qualitative, theoretically informed case study of documenting health care information for diabetes in family practice and hospital settings in England, using semi-structured interviews, observations, systems demonstrations and documentary data. We conducted 22 interviews and four on-site observations. With respect to secondary uses - research, audit, public health and service planning - interviewees clearly articulated the benefits of highly structured and coded diabetes data, and it was believed that benefits would expand through linkage to other datasets. Direct, more marginal, clinical benefits in terms of managing and monitoring diabetes and perhaps encouraging patient self-management were also reported. We observed marked differences in levels of record structuring and/or coding between family practices, where it was high, and the hospital. We found little evidence that structured and coded data were being exploited to improve information sharing between care settings. High levels of data structuring and coding in records for diabetes patients have the potential to be exploited more fully, and lessons might be learned from successful developments elsewhere in the UK. A first step would be for hospitals to attain levels of health information technology infrastructure and systems use commensurate with family practices.
The search for person-related information in general practice: a qualitative study.
Schrans, Diego; Avonts, Dirk; Christiaens, Thierry; Willems, Sara; de Smet, Kaat; van Boven, Kees; Boeckxstaens, Pauline; Kühlein, Thomas
2016-02-01
General practice is person-focused. Contextual information influences the clinical decision-making process in primary care. Currently, person-related information (PeRI) is neither recorded in a systematic way nor coded in the electronic medical record (EMR), and is therefore not usable for scientific purposes. The objective was to search for classes of PeRI influencing the process of care. GPs from nine countries worldwide were asked to write down narrative case histories in which personal factors played a role in decision-making. In an inductive process, the case histories were consecutively coded according to classes of PeRI. The classes found were deductively applied to the following cases and refined until saturation was reached. Then, the classes were grouped into code families and further clustered into domains. The inductive analysis of 32 case histories resulted in 33 defined PeRI codes, classifying all person-related information in the cases. The 33 codes were grouped into the following seven mutually exclusive code families: 'aspects between patient and formal care provider', 'social environment and family', 'functioning/behaviour', 'life history/non-medical experiences', 'personal medical information', 'socio-demographics' and 'work-/employment-related information'. The code families were clustered into four domains: 'social environment and extended family', 'medicine', 'individual' and 'work and employment'. As PeRI is used in the process of decision-making, it should be part of the EMR. The PeRI classes we identified might form the basis of a new contextual classification, mainly for research purposes, which might help to create evidence of the person-centredness of general practice.
Language Codes and Memory Codes.
ERIC Educational Resources Information Center
Liberman, Alvin M.; And Others
Paraphrase, as it reflects the processes of remembering rather than those of forgetting, implies that language is best transmitted in one form and stored in another. The dual representation of linguistic information that is implied by paraphrase is important for storing information that has been received and for transmitting information that has…
ERIC Educational Resources Information Center
Meadows, William C.
2011-01-01
Interest in North American Indian code talkers continues to increase. In addition to numerous works about the Navajo code talkers, several publications on other groups of Native American code talkers--including the Choctaw, Comanche, Hopi, Meskwaki, Canadian Cree--and about code talkers in general have appeared. This article chronicles recent…
NASA Astrophysics Data System (ADS)
You, Minli; Lin, Min; Wang, Shurui; Wang, Xuemin; Zhang, Ge; Hong, Yuan; Dong, Yuqing; Jin, Guorui; Xu, Feng
2016-05-01
Medicine counterfeiting is a serious issue worldwide, involving potentially devastating health repercussions, so advanced anti-counterfeit technology for drugs has aroused intensive interest. However, existing anti-counterfeit technologies suffer from drawbacks such as high cost, complex fabrication processes, sophisticated operation and an inability to authenticate drug ingredients. In this contribution, we developed a smart-phone-recognition-based upconversion fluorescent three-dimensional (3D) quick response (QR) code for the tracking and anti-counterfeiting of drugs. We first formulated three colored inks incorporating upconversion nanoparticles with RGB (i.e., red, green and blue) emission colors. Using a modified inkjet printer, we printed a series of colors by precisely regulating the overlap of these three inks. Meanwhile, we developed a multilayer printing and splitting technology, which significantly increases the information storage capacity per unit area. As an example, we directly printed the upconversion fluorescent 3D QR code on the surface of drug capsules. The 3D QR code consisted of three different color layers, with each layer encoding information on a different aspect of the drug. A smart phone app was designed to decode the multicolor 3D QR code, providing the authenticity and related information of the drug. The developed technology possesses merits in terms of low cost, ease of operation, high throughput and high information capacity, and thus holds great potential for drug anti-counterfeiting. Electronic supplementary information (ESI) available: calculation details of the UCNP content per 3D QR code and the decoding process of the 3D QR code. See DOI: 10.1039/c6nr01353h
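The layer-splitting idea can be mimicked digitally: pack three independent binary layers into the RGB channels of one printed image, then recover each channel at read time. In the sketch below, the layers are random bit matrices standing in for three QR codes, and channel thresholding stands in for reading each emission color under 275 nm excitation; a real implementation would generate and decode the layers with QR libraries.

```python
import numpy as np

rng = np.random.default_rng(42)
# three binary layers, each standing in for one QR code of the drug's metadata
layers = [rng.integers(0, 2, size=(21, 21), dtype=np.uint8) for _ in range(3)]

# merge: assign each layer to one RGB channel of the printed image
printed = np.stack([l * 255 for l in layers], axis=-1)   # (21, 21, 3) image

# split: a reader recovers each layer by thresholding its own channel
recovered = [(printed[..., c] > 127).astype(np.uint8) for c in range(3)]

print(all(np.array_equal(a, b) for a, b in zip(layers, recovered)))  # True
```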
NASA Astrophysics Data System (ADS)
Zhao, Bei; Zhong, Yanfei; Zhang, Liangpei
2016-06-01
Land-use classification of very high spatial resolution remote sensing (VHSR) imagery is one of the most challenging tasks in the field of remote sensing image processing. Land-use classification is difficult to address with land-cover classification techniques, owing to the complexity of land-use scenes, and scene classification is considered one of the most promising ways to address it. The commonly used scene classification methods for VHSR imagery are all derived from the computer vision community, which mainly deals with terrestrial image recognition. Differing from terrestrial images, VHSR images are taken by looking down with airborne and spaceborne sensors, which leads to distinct lighting conditions and spatial configurations of land cover in VHSR imagery. Considering these distinct characteristics, two questions should be answered: (1) Which type or combination of information is suitable for VHSR imagery scene classification? (2) Which scene classification algorithm is best for VHSR imagery? In this paper, an efficient spectral-structural bag-of-features scene classifier (SSBFC) is proposed to combine the spectral and structural information of VHSR imagery. SSBFC utilizes the first- and second-order statistics (the mean and standard deviation values, MeanStd) as the statistical spectral descriptor for the spectral information of the VHSR imagery, and uses dense scale-invariant feature transform (SIFT) as the structural feature descriptor. In the experimental results, the spectral information works better than the structural information, while the combination of spectral and structural information is better than either single type of information. Taking the characteristics of the spatial configuration into consideration, SSBFC uses the whole image scene as the scope of the pooling operator, instead of the scope generated by a spatial pyramid (SP) commonly used in terrestrial image classification. The experimental results show that the whole image as the scope of the pooling operator performs better than the scope generated by the SP. In addition, SSBFC codes and pools the spectral and structural features separately to avoid mutual interference between the two; the coding vectors of the spectral and structural features are then concatenated into a final coding vector. Finally, SSBFC classifies the final coding vector with a support vector machine (SVM) with a histogram intersection kernel (HIK). Compared with the latest scene classification methods, the experimental results on three VHSR datasets demonstrate that the proposed SSBFC performs better than the other classification methods for VHSR image scenes.
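The pipeline is conventional bag-of-features machinery applied per feature stream. A minimal sketch of the spectral stream, whole-image pooling, and a histogram-intersection-kernel SVM (patch size, codebook size, and the random scenes are placeholders; the dense-SIFT structural stream, omitted here, would be coded and pooled the same way and its vector concatenated before classification):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def meanstd_patches(img, size=8):
    """Spectral descriptor: per-band mean and std of each size x size patch."""
    h, w, b = img.shape
    feats = [np.hstack([p.mean(axis=(0, 1)), p.std(axis=(0, 1))])
             for i in range(0, h - size + 1, size)
             for j in range(0, w - size + 1, size)
             for p in [img[i:i + size, j:j + size]]]
    return np.array(feats)

def bof_vector(feats, codebook):
    """Whole-image pooling: histogram of codeword assignments."""
    words = codebook.predict(feats)
    return np.bincount(words, minlength=codebook.n_clusters).astype(float)

rng = np.random.default_rng(0)
scenes = [rng.random((64, 64, 4)) for _ in range(20)]   # fake 4-band scenes
labels = [0] * 10 + [1] * 10

all_feats = np.vstack([meanstd_patches(s) for s in scenes])
codebook = KMeans(n_clusters=32, n_init=4, random_state=0).fit(all_feats)
X = np.array([bof_vector(meanstd_patches(s), codebook) for s in scenes])

def hik(A, B):
    """Histogram intersection kernel between two sets of pooled vectors."""
    return np.array([[np.minimum(a, b).sum() for b in B] for a in A])

clf = SVC(kernel="precomputed").fit(hik(X, X), labels)
print(clf.predict(hik(X[:2], X)))
```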
2012-03-01
… advanced antenna systems; AMC, adaptive modulation and coding; AWGN, additive white Gaussian noise; BPSK, binary phase shift keying; BS, base station; BTC, block turbo coding … QAM-16, and QAM-64, and coding types include convolutional coding (CC), convolutional turbo coding (CTC), block turbo coding (BTC), zero-terminating …
Cross-cultural perspectives on research participation and informed consent.
Barata, Paula C; Gucciardi, Enza; Ahmad, Farah; Stewart, Donna E
2006-01-01
This study examined Portuguese Canadian and Caribbean Canadian immigrants' perceptions of health research and informed consent procedures. Six focus groups (three in each cultural group) involving 42 participants, plus two individual interviews, were conducted. The focus groups began with a general question about health research. This was followed by three short role-plays between the moderator and the assistant. The role-plays involved a fictional health research study in which a patient is approached for recruitment, is read a consent form, and is asked to sign. The role-plays stopped at key moments, at which time focus group participants were asked questions about their understanding and their perceptions. Focus group transcripts were coded in QSR NUDIST software using open coding and then compared across cultural groups. Six overriding themes emerged: two were common to both the Portuguese and Caribbean transcripts, one emphasizing the importance of trust and mistrust, and the other highlighting the need and desire for more information about health research. However, these themes were expressed somewhat differently in the two groups. In addition, there were four overriding themes that were specific to only one cultural group. In the Portuguese groups, there was an overwhelming positive regard for the research process and an emphasis on verbal as opposed to written information. The Caribbean participants qualified their participation in research studies and repeatedly raised images of invasive research.
Hiriscau, Ioana E; Stingelin-Giles, Nicola; Stadler, Christina; Schmeck, Klaus; Reiter-Theil, Stella
2014-06-01
Conducting prevention research with children and adolescents raises ethical challenges, especially regarding confidentiality. Research with children and adolescents often applies methodologies that aim at the disclosure of sensitive information about practices that impact adolescent mental and physical health, such as sexual activity, smoking, alcohol consumption, illegal drug use, and self-damaging and suicidal behaviour (ideation and attempts). The scope of this article is to review normative documents that cover topics relevant to confidentiality when conducting research with children and adolescents. A systematic literature search in MEDLINE was performed to identify relevant international and European guidelines and codes of ethics that cover health, behavioural and social science research; additionally, the European Research Ethics website was consulted as a double check. However, none of the documents aimed at biomedical, behavioural or social research offers concrete support in resolving practical research ethics problems regarding confidentiality. The codes lack clarity about the circumstances in which the researcher might have an obligation to breach confidentiality by disclosing sensitive information, and little information is given on what kind of information, if disclosed, might justify breaching confidentiality. The findings demonstrate a need for normative documents to address the ethical questions regarding confidentiality arising in research practice explicitly and specifically. Moreover, further forms of ethical guidance should be developed to support ethical research with children and adolescents.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-30
... manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This listing is not intended to be... Environmental protection, Agricultural commodities, Feed additives, Food additives, Pesticides and pests...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-17
...). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). B. What should I..., Feed additives, Food additives, Pesticides and pests, Reporting and recordkeeping requirements. Dated...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-28
...). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). B. What should I... protection, Agricultural commodities, Feed additives, Food additives, Pesticides and pests, Reporting and...
1988-01-01
Article 162 of this Mexican Code provides, among other things, that "Every person has the right freely, responsibly, and in an informed fashion to determine the number and spacing of his or her children." When a marriage is involved, this right is to be observed by the spouses "in agreement with each other." The civil codes of the following states contain the same provisions: 1) Baja California (Art. 159 of the Civil Code of 28 April 1972 as revised in Decree No. 167 of 31 January 1974); 2) Morelos (Art. 255 of the Civil Code of 26 September 1949 as revised in Decree No. 135 of 29 December 1981); 3) Queretaro (Art. 162 of the Civil Code of 29 December 1950 as revised in the Act of 9 January 1981); 4) San Luis Potosi (Art. 147 of the Civil Code of 24 March 1946 as revised on 13 June 1978); 5) Sinaloa (Art. 162 of the Civil Code of 18 June 1940 as revised in Decree No. 28 of 14 October 1975); 6) Tamaulipas (Art. 146 of the Civil Code of 21 November 1960 as revised in Decree No. 20 of 30 April 1975); 7) Veracruz-Llave (Art. 98 of the Civil Code of 1 September 1932 as revised in the Act of 30 December 1975); and 8) Zacatecas (Art. 253 of the Civil Code of 9 February 1965 as revised in Decree No. 104 of 13 August 1975). The Civil Codes of Puebla and Tlaxcala provide for this right only in the context of marriage, with the spouses in agreement. See Art. 317 of the Civil Code of Puebla of 15 April 1985 and Article 52 of the Civil Code of Tlaxcala of 31 August 1976 as revised in Decree No. 23 of 2 April 1984. The Family Code of Hidalgo requires, as a formality of marriage, a certification that the spouses are aware of methods of controlling fertility, responsible parenthood, and family planning. In addition, Article 22 of the Civil Code of the Federal District provides that the legal capacity of natural persons is acquired at birth and lost at death; however, from the moment of conception the individual comes under the protection of the law, which is valid with respect to the individual as far as the effects of this law provide, as if the individual were already born.
LPSN—list of prokaryotic names with standing in nomenclature
Parte, Aidan C.
2014-01-01
The List of Prokaryotic Names with Standing in Nomenclature (LPSN; http://www.bacterio.net) is a database that lists the names of prokaryotes (Bacteria and Archaea) that have been validly published in the International Journal of Systematic and Evolutionary Microbiology, directly or by inclusion in a Validation List, under the Rules of the International Code of Nomenclature of Bacteria. Currently there are 15 974 taxa listed. In addition, LPSN has an up-to-date classification of prokaryotes and information on prokaryotic nomenclature and culture collections. PMID:24243842
KSOS Secure Unix Verification Plan (Kernelized Secure Operating System).
1980-12-01
shall be handled as proprietary information until 5 April 1978. After that time, the Government may distribute the document as it sees fit. UNIX and PWB… WDL-TR7809, KSOS Verification Plan, Section I, Introduction: "The purpose… funding, additional tools may be available by the time they are needed for KSOS verification. We intend to use the best available technology in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-09-01
The furniture and fixtures industry encompasses companies that manufacture household, office, store, public building, and restaurant furniture and fixtures. The second section provides background information on the size, geographic distribution, employment, production, sales, and economic condition of the Wood Furniture and Fixtures industry. The types of facilities covered by the document are also characterized in terms of their Standard Industrial Classification (SIC) codes. Additionally, this section contains a list of the largest companies in terms of sales.
Gene and genon concept: coding versus regulation
2007-01-01
We analyse here the definition of the gene in order to distinguish, on the basis of modern insight in molecular biology, what the gene is coding for, namely a specific polypeptide, and how its expression is realized and controlled. Before the coding role of the DNA was discovered, a gene was identified with a specific phenotypic trait, from Mendel through Morgan up to Benzer. Subsequently, however, molecular biologists ventured to define a gene at the level of the DNA sequence in terms of coding. As is becoming ever more evident, the relations between information stored at DNA level and functional products are very intricate, and the regulatory aspects are as important and essential as the information coding for products. This approach thus led to a conceptual hybrid that confused coding, regulatory, and functional aspects. In this essay, we develop a definition of the gene that once again starts from the functional aspect. A cellular function can be represented by a polypeptide or an RNA. In the case of the polypeptide, its biochemical identity is determined by the mRNA prior to translation, and that is where we locate the gene. The steps from specific, but possibly separated, sequence fragments at the DNA level to that final mRNA can then be analysed in terms of regulation. For that purpose, we coin the new term "genon". In that manner, we can clearly separate product and regulative information while keeping the fundamental relation between coding and function without the need to introduce a conceptual hybrid. In mRNA, the program regulating the expression of a gene is superimposed onto and added to the coding sequence in cis; we call it the genon. The complementary external control of a given mRNA by trans-acting factors is incorporated in its transgenon. A consequence of this definition is that, in eukaryotes, the gene is, in most cases, not yet present at the DNA level. Rather, it is assembled by RNA processing, including differential splicing, from various pieces, as steered by the genon. It finally emerges as an uninterrupted nucleic acid sequence at the mRNA level just prior to translation, in faithful correspondence with the amino acid sequence to be produced as a polypeptide. After translation, the genon has fulfilled its role and expires. The distinction between the protein coding information as materialised in the final polypeptide and the processing information represented by the genon allows us to set up a new information theoretic scheme. The standard sequence information determined by the genetic code expresses the relation between coding sequence and product. Backward analysis asks from which coding region in the DNA a given polypeptide originates. The (more interesting) forward analysis asks in how many polypeptides of how many different types a given DNA segment is expressed. This concerns the control of the expression process for which we have introduced the genon concept. Thus, the information theoretic analysis can capture the complementary aspects of coding and regulation, of gene and genon. PMID:18087760
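To make the gene/genon distinction concrete, here is a minimal, purely illustrative Python sketch (the sequences, exon names, and tiny codon table are all hypothetical, not taken from the essay): the "gene" only materializes as an uninterrupted mRNA once a genon, modelled here as an ordered list of exon choices, has steered its assembly from separated DNA fragments.

    # Hypothetical sketch: the gene exists only at mRNA level, after the
    # genon (an exon-selection program acting in cis) assembles it from
    # separated DNA fragments. All data below are illustrative.

    CODON_TABLE = {
        "AUG": "M", "UUU": "F", "GGU": "G", "UAA": "*",  # toy subset
    }

    def transcribe(dna: str) -> str:
        """Toy transcription: DNA coding strand to RNA (T -> U)."""
        return dna.replace("T", "U")

    def assemble_mrna(exons: dict, genon: list) -> str:
        """The genon selects and orders exons into one uninterrupted mRNA."""
        return "".join(transcribe(exons[name]) for name in genon)

    def translate(mrna: str) -> str:
        """Read codons until a stop codon; yields the polypeptide."""
        peptide = []
        for i in range(0, len(mrna) - 2, 3):
            aa = CODON_TABLE.get(mrna[i:i + 3], "?")
            if aa == "*":
                break
            peptide.append(aa)
        return "".join(peptide)

    # One DNA locus, two genons (differential splicing) -> two distinct genes.
    exons = {"e1": "ATG", "e2": "TTT", "e3": "GGT", "stop": "TAA"}
    for genon in (["e1", "e2", "stop"], ["e1", "e3", "e2", "stop"]):
        mrna = assemble_mrna(exons, genon)
        print(genon, "->", mrna, "->", translate(mrna))

Running the sketch prints one polypeptide per genon, which mirrors the forward analysis above: a single DNA segment is expressed in as many distinct products as there are genons acting on it.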
Mazor, Kathleen M; Rubin, Donald L; Roblin, Douglas W; Williams, Andrew E; Han, Paul K J; Gaglio, Bridget; Cutrona, Sarah L; Costanza, Mary E; Wagner, Joann L
2016-08-01
Patient question-asking is essential to shared decision making. We sought to describe patients' questions when faced with cancer prevention and screening decisions, and to explore differences in question-asking as a function of health literacy with respect to spoken information (health literacy-listening). Four hundred and thirty-three (433) adults listened to simulated physician-patient interactions discussing (i) prophylactic tamoxifen for breast cancer prevention, (ii) PSA testing for prostate cancer and (iii) colorectal cancer screening, and identified questions they would have. Health literacy-listening was assessed using the Cancer Message Literacy Test-Listening (CMLT-Listening). Two authors developed a coding scheme, which was applied to all questions. Analyses examined whether participants scoring above or below the median on the CMLT-Listening asked a similar variety of questions. Questions were coded into six major function categories: risks/benefits, procedure details, personalizing information, additional information, decision making and credibility. Participants who scored higher on the CMLT-Listening asked a greater variety of risks/benefits questions; those who scored lower asked a greater variety of questions seeking to personalize information. This difference persisted after adjusting for education. Patients' health literacy-listening is associated with distinctive patterns of question utilization following cancer screening and prevention counselling. Providers should not only be responsive to the question functions the patient favours, but should also seek to ensure that the patient is exposed to the full range of information needed for shared decision making. © 2015 The Authors. Health Expectations Published by John Wiley & Sons Ltd.
Layered Wyner-Ziv video coding.
Xu, Qian; Xiong, Zixiang
2006-12-01
Following recent theoretical works on successive Wyner-Ziv coding (WZC), we propose a practical layered Wyner-Ziv video coder using the DCT, nested scalar quantization, and irregular LDPC code based Slepian-Wolf coding (or lossless source coding with side information at the decoder). Our main novelty is to use the base layer of a standard scalable video coder (e.g., MPEG-4/H.26L FGS or H.263+) as the decoder side information and perform layered WZC for quality enhancement. Similar to FGS coding, there is no performance difference between layered and monolithic WZC when the enhancement bitstream is generated in our proposed coder. Using an H.26L coded version as the base layer, experiments indicate that WZC gives slightly worse performance than FGS coding when the channel (for both the base and enhancement layers) is noiseless. However, when the channel is noisy, extensive simulations of video transmission over wireless networks conforming to the CDMA2000 1X standard show that H.26L base layer coding plus Wyner-Ziv enhancement layer coding are more robust against channel errors than H.26L FGS coding. These results demonstrate that layered Wyner-Ziv video coding is a promising new technique for video streaming over wireless networks.
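As a rough illustration of the binning idea behind nested scalar quantization, consider the Python sketch below. It is not the authors' implementation: the LDPC-based Slepian-Wolf stage is replaced by direct transmission of the coset index, and the step size and nesting ratio are made-up parameters.

    # Minimal sketch of the coset ("binning") idea in Wyner-Ziv coding.
    # The encoder fine-quantizes a sample and transmits only the coset
    # index; the decoder resolves the ambiguity using side information
    # (e.g., the base-layer reconstruction).

    STEP = 1.0   # fine quantizer step size (illustrative)
    M = 4        # nesting ratio: number of cosets (2 bits per sample)

    def encode(x: float) -> int:
        """Fine-quantize x, keep only the coset index (log2(M) bits)."""
        q = round(x / STEP)
        return q % M

    def decode(coset: int, side_info: float) -> float:
        """Pick the coset member closest to the side information."""
        q_side = round(side_info / STEP)
        # Candidate quantizer indices in the coset around the side info.
        candidates = [q_side + ((coset - q_side) % M) - k * M
                      for k in (-1, 0, 1)]
        best = min(candidates, key=lambda q: abs(q * STEP - side_info))
        return best * STEP

    x, y = 3.2, 3.6          # source sample and correlated side information
    c = encode(x)            # only the 2-bit coset index is transmitted
    print(c, decode(c, y))   # -> reconstruction 3.0, near the source 3.2

The decoder succeeds as long as the side information lies within roughly M*STEP/2 of the fine reconstruction, which is precisely the source/side-information correlation that Wyner-Ziv coding exploits.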
On the linear programming bound for linear Lee codes.
Astola, Helena; Tabus, Ioan
2016-01-01
Based on an invariance-type property of the Lee-compositions of a linear Lee code, additional equality constraints can be introduced into the linear programming problem of linear Lee codes. In this paper, we formulate this property in terms of an action of the multiplicative group of the field [Formula: see text] on the set of Lee-compositions. We show some useful properties of certain sums of Lee-numbers, which are the eigenvalues of the Lee association scheme, appearing in the linear programming problem of linear Lee codes. Using the additional equality constraints, we formulate the linear programming problem of linear Lee codes in a very compact form, leading to fast execution, which allows the bounds to be computed efficiently for large parameter values of the linear codes.
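For readers unfamiliar with the Lee metric underlying this bound, the following Python sketch computes Lee weights, the minimum Lee distance, and the Lee-composition of a toy linear code over Z_5. The generator matrix is arbitrary, and the LP formulation itself is not reproduced here.

    # Lee metric basics: the Lee weight of x in Z_q is min(x, q - x), and
    # the Lee-composition of a codeword counts how many coordinates carry
    # each Lee value 0..q//2. Toy code over Z_5 from a generator matrix.

    from itertools import product

    Q = 5  # alphabet Z_5

    def lee_weight(x: int) -> int:
        return min(x % Q, (-x) % Q)

    def lee_composition(word):
        """Counts of coordinates at each Lee value 0..Q//2."""
        comp = [0] * (Q // 2 + 1)
        for x in word:
            comp[lee_weight(x)] += 1
        return tuple(comp)

    # Arbitrary generator matrix of a length-4 linear code over Z_5.
    G = [(1, 0, 2, 3),
         (0, 1, 4, 1)]

    codewords = set()
    for coeffs in product(range(Q), repeat=len(G)):
        word = tuple(sum(c * g for c, g in zip(coeffs, col)) % Q
                     for col in zip(*G))
        codewords.add(word)

    d_lee = min(sum(lee_weight(x) for x in w) for w in codewords if any(w))
    print("codewords:", len(codewords), "min Lee distance:", d_lee)
    print("example Lee-composition:", lee_composition((1, 0, 2, 3)))

The Lee-composition computed here is the object whose invariance under the multiplicative group action supplies the extra equality constraints described in the abstract.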
Optimization of algorithm of coding of genetic information of Chlamydia
NASA Astrophysics Data System (ADS)
Feodorova, Valentina A.; Ulyanov, Sergey S.; Zaytsev, Sergey S.; Saltykov, Yury V.; Ulianova, Onega V.
2018-04-01
A new method for coding genetic information using coherent optical fields is developed. A universal technique for transforming the nucleotide sequence of a bacterial gene into a laser speckle pattern is suggested. Reference speckle patterns are generated for the nucleotide sequences of the omp1 gene of typical wild strains of Chlamydia trachomatis genovars D, E, F, G, J and K, as well as of Chlamydia psittaci serovar I. The algorithm for coding gene information into a speckle pattern is optimized, using fully developed speckles with Gaussian statistics as the optimization criterion for the gene-based speckle patterns.
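The abstract does not disclose the actual coding algorithm, but the general pipeline it describes (nucleotide sequence -> phase structure -> coherent far field -> speckle intensity) can be sketched in numpy as follows. The nucleotide-to-phase mapping, mask size, and diffuser term below are all assumptions, not the authors' method.

    # Hypothetical sketch: map a nucleotide sequence onto a 2-D phase mask
    # and take the far-field intensity (via FFT) as its "speckle pattern".

    import numpy as np

    PHASE = {"A": 0.0, "C": 0.5 * np.pi, "G": np.pi, "T": 1.5 * np.pi}

    def gene_to_speckle(seq: str, n: int = 64, seed: int = 0) -> np.ndarray:
        rng = np.random.default_rng(seed)
        phases = np.array([PHASE[c] for c in seq])
        mask = np.resize(phases, (n, n))           # tile sequence onto a 2-D mask
        mask += rng.uniform(0, 2 * np.pi, (n, n))  # fixed diffuser-like phase
        field = np.fft.fft2(np.exp(1j * mask))     # far-field (Fraunhofer) pattern
        return np.abs(field) ** 2                  # speckle intensity

    speckle = gene_to_speckle("ATGCGTACCGTA" * 5)
    # Fully developed speckle has exponentially distributed intensity, so
    # the contrast (std / mean) should be close to 1.
    print("contrast:", speckle.std() / speckle.mean())

For fully developed speckle the intensity is exponentially distributed, so a contrast near 1 is a simple stand-in for the Gaussian-statistics optimization criterion mentioned in the abstract.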
EJSCREEN Data--2015 Public Release
EJSCREEN is an environmental justice (EJ) screening and mapping tool that provides EPA with a nationally consistent dataset and methodology for calculating EJ indexes, which can be used for highlighting places that may be candidates for further review, analysis, or outreach as the agency develops programs, policies, and other activities. The tool provides both summary and detailed information, at the Census block group level or for a user-defined area, on both demographic and environmental indicators. The summary information is in the form of EJ indexes, which combine demographic information with a single environmental indicator (such as proximity to traffic) and can help identify communities living in areas with greater potential for environmental and health impacts. The tool also provides additional detailed demographic and environmental information to supplement screening analyses. EJSCREEN displays this information in color-coded maps, bar charts, and standard reports. Users should keep in mind that screening tools are subject to substantial uncertainty in their demographic and environmental data, particularly when looking at small geographic areas, such as Census block groups. Data on the full range of environmental impacts and demographic factors in any given location are almost certainly not available directly through this tool, and its initial results should be supplemented with additional information and local knowledge before making any judgments about potential EJ concerns.
Connection anonymity analysis in coded-WDM PONs
NASA Astrophysics Data System (ADS)
Sue, Chuan-Ching
2008-04-01
A coded wavelength division multiplexing passive optical network (WDM PON) is presented for fiber to the home (FTTH) systems to protect against eavesdropping. The proposed scheme applies spectral amplitude coding (SAC) with a unipolar maximal-length sequence (M-sequence) code matrix to generate a specific signature address (coding) and to retrieve its matching address codeword (decoding) by exploiting the cyclic properties inherent in array waveguide grating (AWG) routers. In addition to ensuring the confidentiality of user data, the proposed coded-WDM scheme is also a suitable candidate for the physical layer with connection anonymity. Under the assumption that the eavesdropper applies a photo-detection strategy, it is shown that the coded WDM PON outperforms the conventional TDM PON and WDM PON schemes in terms of a higher degree of connection anonymity. Additionally, the proposed scheme allows the system operator to partition the optical network units (ONUs) into appropriate groups so as to achieve a better degree of anonymity.
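The multiple-access-interference cancellation property that SAC schemes with unipolar M-sequence codes rely on can be checked in a few lines of numpy. This toy model ignores the AWG routing, photodetector noise, and the eavesdropping analysis of the paper; the LFSR taps and data bits are arbitrary.

    # Balanced detection with unipolar M-sequence SAC codes: correlating
    # the received spectral power with a user's codeword and with its
    # complement, then differencing, cancels all non-matching users.

    import numpy as np

    def m_sequence(taps=(3, 1), length=7):
        """Generate a binary M-sequence with a small LFSR (length 7)."""
        state = [1, 0, 0]
        seq = []
        for _ in range(length):
            seq.append(state[-1])
            fb = state[taps[0] - 1] ^ state[taps[1] - 1]
            state = [fb] + state[:-1]
        return np.array(seq)

    base = m_sequence()
    codes = np.array([np.roll(base, k) for k in range(7)])  # cyclic signatures

    data = np.array([1, 0, 1, 1, 0, 1, 0])   # one bit per ONU
    received = data @ codes                   # superposed spectral power

    for u in range(7):
        decoded = received @ codes[u] - received @ (1 - codes[u])
        print(f"ONU {u}: balanced output {decoded:+d} -> bit {int(decoded > 0)}")

For length-7 M-sequence codes (weight 4), the matched user's contribution survives as 4 times its data bit, while every interfering user contributes equally (2) to both correlation branches and cancels; this balanced-detection property is what the scheme's data-confidentiality argument builds on.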